The internet of things is becoming the next cloud battleground

Sensors and connected devices are popping up everywhere, and the data they’re producing has to be processed somewhere. While the easy, immediate work happens locally, the more complex work (predictive analytics, visualizing data on mobile apps, talking to other devices or applications) happens in the cloud. And cloud computing providers are already beginning their fight to house all that data and all those workloads.
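
To make that split concrete, here’s a minimal Python sketch of the division of labor: a simple threshold check runs locally on the device, while batches of raw readings get shipped upstream for heavier analysis. The endpoint URL, sensor driver and threshold are all hypothetical stand-ins, not any particular provider’s API.

```python
import json
import random
import time
import urllib.request

# Hypothetical cloud ingestion endpoint; swap in a real one.
CLOUD_ENDPOINT = "https://example.com/ingest"
ALERT_THRESHOLD_C = 30.0  # illustrative temperature cutoff
BATCH_SIZE = 10


def read_temperature():
    """Stand-in for a real sensor driver."""
    return 20.0 + random.random() * 15.0


def run_edge_loop():
    batch = []
    while True:
        reading = {"sensor_id": "temp-01", "ts": time.time(),
                   "celsius": read_temperature()}
        # The easy, immediate work stays local: a simple threshold alert.
        if reading["celsius"] > ALERT_THRESHOLD_C:
            print("local alert:", reading)
        batch.append(reading)
        # The complex work (analytics, dashboards) happens upstream, so
        # ship readings to the cloud in batches rather than one at a time.
        if len(batch) >= BATCH_SIZE:
            body = json.dumps(batch).encode()
            req = urllib.request.Request(
                CLOUD_ENDPOINT, data=body,
                headers={"Content-Type": "application/json"})
            urllib.request.urlopen(req)
            batch = []
        time.sleep(1)
```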

As it stands, the internet of things, like the web and mobile economies from which it grew, runs largely on Amazon Web Services. But there’s no guarantee the status quo will hold. As part of its broader home-automation plans, for example, Google is already buying up large AWS users such as Nest and Dropcam. Dropcam co-founder and CEO Greg Duffy told me last year that his company runs “the largest inbound streaming service on the entire internet” — bigger than even YouTube. If those companies eventually move onto Google’s infrastructure, AWS will lose both revenue and some banner use cases.

However, the competition, which stepped up to another level in the past week, won’t be won by M&A alone, or even by offering the lowest prices on computing and storage. Cloud providers will also have to prove they’re the most capable platform for handling the particular needs of the internet of things. AWS has a stream-processing service called Kinesis that was surely created at least partly with connected devices in mind. Google’s new Cloud Dataflow service is likewise designed to process streaming data as it hits the network, and then analyze it more deeply later.
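
For a taste of what feeding device data into Kinesis looks like, here’s a minimal sketch using boto3, the AWS SDK for Python. The stream name, region and record schema are assumptions for illustration; only the put_record call itself is actual Kinesis API.

```python
import json
import time

import boto3  # AWS SDK for Python

# Assumes a stream already exists, e.g.:
#   aws kinesis create-stream --stream-name iot-readings --shard-count 1
kinesis = boto3.client("kinesis", region_name="us-east-1")


def publish_reading(sensor_id, value):
    record = {"sensor_id": sensor_id, "ts": time.time(), "value": value}
    # The partition key picks the shard; keying on the sensor ID keeps
    # each device's readings ordered within its shard.
    kinesis.put_record(
        StreamName="iot-readings",  # hypothetical stream name
        Data=json.dumps(record).encode(),
        PartitionKey=sensor_id,
    )


publish_reading("thermostat-42", 21.5)
```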

Download This Episode

Subscribe in iTunes

The Structure Show RSS Feed

Microsoft, meanwhile, might actually have the most compelling internet of things service among the three largest cloud providers. It’s offering a limited-release service called Azure Intelligent Systems Service, which the company claims will let users not just collect, store and process device data, but also connect devices and services and even manage them.

It seems probable that AWS and Google will eventually release similar services of their own. The simplicity and latency savings that could come from having everything stored and managed on the same platform, by the same tools, and possibly within the same data center could be too much to resist. Like many shifts in technology, it’s probably easy enough to get started with a single use case and maybe a single type of device or sensor, but companies might start begging for help as they begin measuring more things with new devices and complexity skyrockets.

I thought about this even more after the connected-cities panel I led at our Structure Connect conference earlier this week (the audio from which is included in the podcast embed above, and the video of which is embedded below). We spoke about how cities are dipping their toes into the connected world by doing things such as measuring open parking spots and installing intelligent light bulbs. However, as City of San Jose CIO Vijay Sammeta explained, his team is already looking into lots of new use cases, including vehicle-to-vehicle communications to prevent accidents and joining parking data with economic data to try to bring shoppers to areas that have both available parking and businesses in need of patrons.

From left to right: Vijay Sammeta, CIO, City of San Jose; Maciej Kranz, VP of Corporate Technology Group, Cisco; Charlie Catlett, Senior Computer Scientist, Argonne National Laboratory and University of Chicago; Derrick Harris, Senior Writer, Gigaom. Credit: Jakub Mosur

Charlie Catlett, a senior computer scientist at Argonne National Laboratory outside of Chicago, described a project his institution is about to deploy that will measure air quality and other environmental factors. One goal of the project is to correlate that data with public health data in order to help determine how external factors affect health in different parts of Chicago. A 40-sensor deployment is rolling out in November, but Catlett said he has a verbal commitment from the city to deploy 1,000 of them.
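
For a rough sense of the kind of analysis Catlett described, a first pass at correlating sensor data with health data might look like the pandas sketch below. The file names and columns are invented for illustration; nothing here reflects the actual Argonne project’s data.

```python
import pandas as pd

# Hypothetical CSVs standing in for the real feeds: sensor readings and
# public-health statistics, both keyed by neighborhood.
sensors = pd.read_csv("sensor_readings.csv")  # columns: neighborhood, pm25
health = pd.read_csv("health_stats.csv")      # columns: neighborhood, asthma_rate

# Average particulate readings per neighborhood, then line them up with
# the health figures for the same neighborhoods.
air = sensors.groupby("neighborhood", as_index=False)["pm25"].mean()
merged = air.merge(health, on="neighborhood")

# A first-pass check of whether particulate levels track asthma rates.
print(merged["pm25"].corr(merged["asthma_rate"]))
```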

Cloud providers have been fighting fiercely for lucrative government accounts for years, over everything from city-government email to large private-cloud environments for the CIA. One factor intensifying the competition is that so many governments now consider cloud computing (some by mandate) a good place to host new and non-critical workloads. As governments large and small get serious about quantifying the infrastructure and even the environments they manage, possibly deploying thousands of nodes at a time, they’re going to start looking for platforms to handle all that data and all that gear, and cloud providers are going to line up to deliver them.

Image copyright Shutterstock / asharkyu.
