Michael Hay

Will Cloud Computing Survive the Internet of Things? (The Force of Internet of Things)

Blog Post created by Michael Hay on Jul 28, 2014

The world includes hundreds of millions of interconnected devices, or things, ranging from handheld mobile platforms to systems that will observe the universe.  At a coarse level this is what the Internet of Things (IoT) is all about: devices connected, either directly or through a proxy, to an intranet or to the Internet.  But is there a more robust, formally authored standard definition?  Unfortunately the standards work here is less mature than that for Cloud Computing.  As recently as 2013 the European Union’s Internet of Things Architecture team stated, “…after many years of heavy discussion, there is still no clear and common definition of the concept [21].”  For the purposes of this paper the definition from the International Telecommunication Union is used:

Internet of things (IoT): A global infrastructure for the information society, enabling advanced services by interconnecting (physical and virtual) things based on existing and evolving interoperable information and communication technologies [22].



While necessary, the ITU definition doesn’t sufficiently characterize the size, types, or key uses of the IoT.  At a coarse level Cisco provides a sense of scale, predicting roughly 50 billion connected things in 2020 versus 12.5 billion in 2010 [23].  However, more is needed to convey just how impactful the IoT is today and will be tomorrow.  To that end this section characterizes the consumer, scientific, and industrial aspects of the IoT, with the aim of illustrating how impactful this force will be.  It is further anticipated that this characterization will set the stage for the key question posed about the impact of the IoT on Cloud Computing.


Consumer Internet of Things

[Figure: hue.png, Phillips' description of the bridge used in the Hue product platform [37]]

As mentioned, Apple sold 77 million devices in the fiscal quarter that closed in December 2013 [12].  According to Apple, sales of the iPhone alone grew from 47.8 million to 51 million units, a 6% increase over the same quarter a year prior.  While a staggering number, it does not well characterize the Consumer Internet of Things (CIoT).  Earlier in this paper a combined device and controlling application was called out: Hue from Phillips.  Hue is an exemplary case of the transformation of a device, the light bulb, from humble unconnected-ness to profound connected-ness.  The result is a system that affords users new use cases, obsoleting old devices, such as the simple light-switch timers found in hardware stores, in the process.  Obsolescence occurs because Hue includes an application to control the platform, allowing users to create “light recipes” that can be set to start and stop randomly or on a schedule [24].  Most interesting, and important to this discussion, is the architecture that Hue employs.  Phillips’ system comprises three elements: various LED light bulbs, a controlling application, and the bridge.  According to the Hue website, the bridge provides local, direct control of the LED light bulbs and also connects the Hue platform back into the “My Hue” cloud application.  The bridge in the Hue platform is an illustrative architectural pattern of the IoT, described in general terms in “Data Management for the Internet of Things: Design Primitives and Solution” and in Enabling Things to Talk: Designing IoT Solutions with the IoT Architectural Reference Model [25] [21].  Abu-Elkheir et al. take a data-centric view, recommending an architectural pattern with, “…temporal, real-time data stored near at the objects’ generating this data, and persistent, long-term data that is to be used for analysis catalogued and stored at dedicated facilities [25].”

[Figure: iot-arch.png, a network architecture model taken from the Internet of Things Architecture taskforce [21]]

Are there other examples in the consumer space that repeat the pattern described by Abu-Elkheir in general and Hue in practice?  Yes.  Another example of this architecture is the sensor and monitoring platforms placed on bicycles.  A company called Wahoo Fitness offers a series of Bluetooth Low Energy (BLE) sensors and displays that track and report cadence, speed, and heart rate [26].  These sensors are physically attached to the bicycle and the rider, and network-attached, via BLE, to a mobile platform, like an iPhone, running a cyclometer application.  The mobile platform acts as a bi-directional gateway: it captures local sensor data, redirects data to a display in real time, and correlates results with GPS data gathered from cloud-computing platforms [21] [26].  Once a ride is completed, the rider can upload the locally stored data from the gateway device, the iPhone, to a centralized Cloud Computing application to track ride performance over time.
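The gateway pattern that both the Hue bridge and Wahoo's phone application embody, local real-time handling plus deferred bulk upload, can be sketched in a few lines. This is a hypothetical illustration of the primitive recommended by Abu-Elkheir et al. [25], not either vendor's implementation; the class and method names are invented:

```python
import time
from collections import deque

class SensorGateway:
    """Hypothetical sketch of the IoT gateway pattern: temporal,
    real-time data is kept near the objects generating it, while
    persistent, long-term data is shipped in batches to a cloud
    back end for analysis."""

    def __init__(self, batch_size=100):
        self.buffer = deque()          # temporal store, near the sensors
        self.batch_size = batch_size

    def on_reading(self, sensor_id, value):
        # Real-time path: data stays local for immediate display or control.
        self.buffer.append((time.time(), sensor_id, value))

    def flush_to_cloud(self, upload):
        """Persistent path: drain the local buffer in batches through the
        caller-supplied `upload` callable (e.g. an HTTPS POST).  Returns
        the number of readings sent."""
        sent = 0
        while self.buffer:
            batch = [self.buffer.popleft()
                     for _ in range(min(self.batch_size, len(self.buffer)))]
            upload(batch)
            sent += len(batch)
        return sent
```

A display or control loop would read the local buffer directly, while `flush_to_cloud` would run only when a suitable uplink is available, much as the cyclist uploads a ride after the fact.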


While not discussed by either Wahoo or Phillips, there is another important facet of pervasive human data collection: the human digital shadow or footprint.  The idea that an individual is tracked through financial transactions, grocery stores, and airports is well covered in the press and media.  In fact EMC Corporation sponsored research with IDC on the concept of the digital shadow, and the two developed a small application that estimates the size of one’s daily digital shadow or footprint from the answers to a few questions [27].  I downloaded the application and investigated the results: my annual estimated data production exceeds 1.7 tebibytes (TiB), with over 4.7 gibibytes (GiB) produced daily.  An important discovery from using the tool is that its estimate fails to include data gathered by emerging IoT offerings like the Hue or Wahoo’s fitness sensing platform.  The implication is that the digital footprint estimate derived from the tool may fall short, and, if not updated, will become more inaccurate in the future.


[Figure: digital shadow.png, an estimate of the author's digital shadow from the EMC/IDC tool [27]]

Growth of one’s digital footprint is relevant because, as with Wahoo and Hue, one inevitably needs to persist and process the data, likely using a remotely hosted Cloud Computing application.  Doing so requires uploading one’s locally produced data over home or mobile Internet connections.  Assuming a daily data production upper bound of 4.7 GiB, uploading this amount every day for a month requires a monthly bandwidth capacity of no less than 143 GiB and a sustained upload rate of roughly 0.47 megabits/second1.  For reference, my monthly mobile bandwidth subscription is capped at 10 GiB, including uploads and downloads, and the measured upload bandwidth of my home Internet service ranges from 2 megabits/second to 3 megabits/second.  Therefore if I uploaded my digital footprint daily I would exhaust my monthly mobile subscription in just over two days, and my home Internet connection would be tied up for four to six hours for every day of data collected.  This simple accounting demonstrates a fundamental conflict between the projected growth of digital footprints and the relatively meager carrying capacity of consumer Internet connections [5].  Moreover, when the CIoT becomes prevalent the situation will likely worsen, because not only will the footprint grow, but new requirements, such as low-latency design targets, may emerge.  Do these architectural patterns and problems exist beyond the CIoT?  They do indeed.  The following section examines the impact of the IoT in the industrial and scientific sectors.
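The accounting above can be reproduced with a few lines of arithmetic. The 30.4-day month and the 2-3 megabit/second home uplink are the figures assumed in the text:

```python
GiB = 2**30            # bytes per gibibyte
DAY = 24 * 60 * 60     # seconds per day

daily_bytes = 4.7 * GiB
monthly_bytes = daily_bytes * 30.4            # ~143 GiB per month

# Sustained rate needed to keep pace with daily production.
sustained_mbps = daily_bytes * 8 / DAY / 1e6  # ~0.47 megabits/second

# Mobile: days until a 10 GiB monthly cap is exhausted.
days_to_cap = 10 * GiB / daily_bytes          # just over two days

# Home: hours to push one day's data through a 2-3 Mbit/s uplink.
hours_at_2mbps = daily_bytes * 8 / 2e6 / 3600
hours_at_3mbps = daily_bytes * 8 / 3e6 / 3600
```

At these rates a 10 GiB mobile cap is exhausted in just over two days, and a single day's data monopolizes a 2 megabit/second uplink for more than five hours.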


Scientific and Industrial Internet of Things

Mankind has yearned to know what exists beyond the boundaries of current experience, pushing the art of the possible since well before Columbus’ “discovery of the New World.”  A project in Australia and South Africa now tests the art of the possible and will reveal secrets of the universe.  The project is called the Square Kilometer Array (SKA), and its design target is a telescope with one square kilometer of collecting area for observing the Universe [15].  Projected design targets for data collection reach into the exabyte scale: IBM and ASTRON report that when operational, in 2024, the system’s antennas will produce 14 exabytes2 of data per day [15].

[Figure 3-4: ska.png, an IBM informational graphic illustrating several facts about the Square Kilometer Array [39]]

If the earlier metaphor of expressing an exabyte in high-definition movies were used, the daily data production would be 280,000,000 movies, representing over 63 thousand years of viewing time.  (Additional facts about the Square Kilometer Array are available in figure 3-4.)  Going back to the illustrated conflict between one’s digital shadow and consumer network carrying capacity, one might wonder if a similar situation exists here.  Again the answer is yes: a conflict exists between the antennas’ estimated data production, 14 exabytes per day, and the carrying capacity of commercially available networking technologies back to the central data center.  The entities engaged in this effort have determined that the way to resolve the conflict is to implement the gateway model and the approach recommended by Abu-Elkheir et al. [25] [21].  That is to say, the SKA team will likely resolve the challenge of extreme data production by, “…doing much of the initial processing out in the field, close to the antennas [15].”  The scale of this system is extraordinary, and other groups are planning similar-scale systems in roughly the same time frame or sooner, as described earlier in this paper.  However, even with two or more exemplary cases, the scale of these systems is unique and not the norm.
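To see why field processing is unavoidable at this scale, a quick back-of-the-envelope calculation helps; the decimal exabyte and the 14-to-1 reduction are taken from the figures cited above:

```python
EB = 10**18         # bytes, decimal exabyte
DAY = 24 * 60 * 60  # seconds per day

antenna_bytes_per_day = 14 * EB

# Sustained throughput needed to move the raw antenna output in real time.
required_bps = antenna_bytes_per_day * 8 / DAY   # ~1.3 petabits/second

# Processing near the antennas winnows 14 EB/day down to ~1 EB/day
# before anything crosses the long-haul network back to the data center.
reduction = 1 - 1 / 14                           # ~93% removed at the edge
```

Roughly 1.3 petabits per second of sustained long-haul capacity would be required without edge processing, which is precisely why the initial processing happens out in the field, close to the antennas.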


Boeing’s new 787 is the most connected plane in the air today.  Virgin Atlantic proclaims that they expect to, “…get upwards of half a terabyte of data from a single flight from all of the different devices which are Internet connected [28].”  Coping with this data is a tall order, and the remainder of the article discusses the storage and processing challenges; specifically, the scale of data coming from these platforms has caused Virgin to rethink its data storage and processing architectures to include cloud storage paradigms [28].  However, to fully understand how the 787 delivers its data payload, deeper investigation is needed.  An initial read of “Boeing 787s to create half a terabyte of data per flight, says Virgin Atlantic” may lead one to believe that the plane is constantly communicating with the Internet [28].  Yet this is not entirely true.  Only limited sets of data are communicated in real time during flight, leaving the bulk of the data capture to occur at the end of the flight via private wireless networking infrastructure at airports.  One point of interest in Boeing’s documentation concerns past maintenance procedures, which required ground staff to go to the airplane for data retrieval, manually gathering disks or downloading over a cable to a laptop.  With the advent of the 787’s wireless download capabilities, Boeing has automated this manual gathering of data, which may later be stored in a traditional manner, in private clouds, or in public clouds.  The Common Core System (CCS) powers the advanced data collection capabilities of the 787.  The CCS includes Common Computing Resources (CCR), a Common Data Network (CDN), and Remote Data Concentrators (RDC).

Each 787 had two CCR cabinets, with eight general processing modules, network switches, and two fiber-optic translator modules in each cabinet. The CDN was made up of network switches inside the CCR cabinets as well as throughout the aircraft. Provided by GE Aerospace’s Cheltenham site in the United Kingdom, the RDCs replaced dedicated wiring and concentrated signals from the aircraft’s twenty-one remote sensors and effectors, feeding them into the network. The effectors sent signals to make units such as actuators move [29].



Existence of the CCS and its subordinate elements brings up another point relevant to the Industrial Internet of Things (IIoT): timely onboard control of elements is mandatory for security and safety.
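The point can be made concrete with a small sketch. Nothing below is Boeing's design; it is a hypothetical router illustrating the IIoT constraint that effector commands are handled on board with bounded latency, while bulk telemetry waits for the post-flight wireless download:

```python
class OnboardDataRouter:
    """Hypothetical sketch of the IIoT constraint noted above: control
    traffic is acted on locally and immediately, while bulk telemetry is
    logged for later offload to private or public cloud storage."""

    def __init__(self):
        self.telemetry_log = []   # bulk data, downloaded after the flight

    def handle(self, message, actuate):
        if message["class"] == "control":
            # Safety path: act immediately and locally; effector commands
            # are never routed through a remote cloud service.
            actuate(message["target"], message["command"])
        else:
            # Telemetry path: persist locally for the wireless download
            # at the gate, then archive in a traditional store or a cloud.
            self.telemetry_log.append(message)
```

The same split appears in the CCS description above: RDCs feed sensor and effector signals into the onboard network, while the accumulated flight data leaves the aircraft only after landing.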


Finally, another segment that has implemented nearly the same gateway pattern is auto racing: Indycar and Formula One (F1) [21] [25].  While not as sophisticated as the Square Kilometer Array or the Boeing 787, auto racing exhibits similar architectural design patterns via Engine Control Units (ECUs) and supporting peripherals.  Both racing series require the use of specific ECUs supplied by companies like McLaren.  The ECUs embedded in the cars are serious computing and storage systems, including modern processors, peripheral interconnects for sensors, wired and wireless networking, and flash storage ranging from 1 GiB to 8 GiB in capacity.  Utilization of these platforms is critical in both F1 and Indycar racing.  For example, as of 2013 the Lotus F1 car has over 300 onboard sensors, which produce 25 GiB of data per race lap, available through both proprietary wireless technologies and standard wired 10/100 megabits/second Ethernet [31].  Collected data is used during and after the race, with 60% of the data being used to optimize the car’s behavior.

[Table: Specifications of two generations of McLaren's Engine Control Units [30]]

    Specification                              Previous gen.    Current gen.
    Application processing power               955 MIPS         4000 MIPS
    Memory (RAM)                               40 MiB           512 MiB
    Memory protection                          Limited          Full
    Logged data capacity                       1 GiB            8 GiB
    CAN busses                                 6                11
    FlexRay busses                             0                2
    Standard analog input rate capability      1 kHz            10 kHz
    Fast analog input sampling rate            10 kHz           100 kHz
    Internal accelerometer                     None             Tri-axis ±10G
    Maximum number of logged channels          512              4000
    Ethernet link maximum speed                100 Mbps         1 Gbps

While somewhat anecdotal, the author conducted informal interviews with an Indycar racing team.  Nearly 200 sensors ride on the Hitachi/Penske car, and 32 channels of data can be communicated wirelessly back to the pit for real-time analysis during a race.  Interestingly, during practice laps the team does not use the wireless channels, because competitors attempt to spy on the data feeds to gather intelligence about the Hitachi/Penske car.  At the conclusion of a practice run, a standard Ethernet cable is pulled to the car and the data is downloaded for deeper inspection.  This reinforces that security is a required factor when considering designs for IoT applications [21].
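Streaming only 32 of roughly 200 available channels amounts to a priority-based selection under a fixed telemetry budget. A minimal sketch, with invented channel names and priorities, might look like this:

```python
def select_live_channels(channels, budget=32):
    """Hypothetical sketch: pick the highest-priority subset of logged
    channels to stream over the limited wireless link, leaving the rest
    on the car's flash storage for the wired download after the run.
    `channels` maps channel name -> priority (higher = more important)."""
    ranked = sorted(channels, key=channels.get, reverse=True)
    return set(ranked[:budget])
```

Everything not selected stays in the ECU's onboard storage until the standard Ethernet download at the end of the run.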



From changes in the humble light bulb to advanced telescopes that will observe the heavens, connected things are emerging as a force.  A survey of various systems showed that, due to network bandwidth constraints, scale, safety, and security, existing IT models like Cloud Computing are challenged to support these evolving systems.  Architectural patterns, like the gateway model, and accompanying recommendations are emerging to move processing as close to the data source as possible [21] [25].  In practice these models are being deployed to overcome actual and perceived technology limitations.


  1. The author used dimensional analysis to compute the required upload bandwidth for 4.7 GiB/day.  The figures assume that no other parallel activity, such as viewing movies or working from home, occurs on the connection.
  2. The previously cited data about the aggregate system producing nearly 1 exabyte of data per day does not refer to the data production at the actual antennas or telescopes.  This suggests that the processing elements hinted at near the antennas winnow the data set from 14 exabytes down to roughly 1 exabyte.
  3. Image & Table Credits:
    1. Phillips' description of the bridge used in the Hue product platform [37].
    2. A network architecture model taken from the Internet of Things Architecture taskforce [21].
    3. An estimate of the author's digital shadow from the EMC/IDC tool [27].
    4. An IBM informational graphic illustrating several facts about the Square Kilometer Array [39].
    5. Specifications of McLaren's Engine Control Units [30].


  1. Will Cloud Computing Survive the Internet of Things? (Introduction) - includes all references represented in this post
  2. Will Cloud Computing Survive the Internet of Things? (The Force of Cloud Computing) - second installment in this series
  3. Will Cloud Computing Survive the Internet of Things? (Reflections and Conclusions) - the final installment in this series