Michael Hay

Will Cloud Computing Survive the Internet of Things? (The Force of Cloud Computing )

Blog Post created by Michael Hay on Jul 19, 2014

Are centralized advanced data centers emerging as the modern version of centralized power utilities?  In what may be a seminal effort on the topic of Cloud Computing, Nicholas Carr posits exactly this [2].  If true, Carr’s hypothesis implies that data processing and application platforms will be centralized, leaving little in the average company’s data center.  Before delving into detail, it is important to ground the discussion in a robust definition of Cloud Computing.  Ideally, a relevant international organization like ISO would have defined the term, making it possible to cite that definition.  However, ISO’s definition of Cloud Computing remains in development [3].  For the purposes of this paper, NIST’s Cloud Computing definition will be taken as the best available.


Cloud Computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models [4].


NIST describes the five essential characteristics of Cloud Computing: on-demand self-service; broad network access; resource pooling; rapid elasticity; and measured service.  The deployment models germane to the discussion include Private Cloud; Community Cloud; Public Cloud; and Hybrid Cloud.  In these deployment archetypes, NIST’s words do not preclude various physical instantiations of a Cloud Computing platform.  For example, when considering the Private Cloud it is stated that, “…it may exist on or off premises [4].”  By contrast, when clarifying the Public Cloud deployment model there is a clear statement that the assets are maintained within the confines of a Public Cloud provider.  This raises a follow-up question regarding Carr’s hypothesis: did Carr get it right or did he get it wrong?  While the definition is important to the discussion, it does not describe the physical layer.  Therefore further explanation is required before this question can be answered.



How pervasive is Cloud Computing?  Unfortunately, there is no single reference that can answer this question.  For example, one could sum the revenues that major Public Cloud providers generate from their services.  However, this measure fails to incorporate data about other cloud deployment models like Private Clouds and potentially Community Clouds.  Another approach would be to use industry analyst reports, press releases and market-sizing data.  IDC proclaims, “…spending on public IT cloud services will reach $47.4 billion in 2013 and is expected to be more than $107 billion in 2017 [5].”  Again, this covers revenues for the Public Cloud only.  Therefore the paper will provide a sense of pervasiveness and scale by reporting several diverse Cloud Computing statistics and facts.


Whether measured by the number of public-facing active web sites, revenues or significant cloud deals, Amazon Web Services appears to play a significant role in Cloud Computing as a Public Cloud provider.  For instance, Netcraft reports that Amazon Web Services (AWS) hosts 11.6 million web sites, 2.1 million of them with active content [6].  AWS1 financial results are quite significant for Amazon, with Charles Babcock of InformationWeek estimating revenues of US $3.2 billion for 2013 [7].  Beyond counting web sites and revenues, many people indirectly use services on AWS without knowing it.  For example NETFLIX, an Amazon Instant Video competitor, depends heavily on AWS; users who watch any digital content from NETFLIX therefore indirectly use AWS [8].  Further, San Jose State’s online learning management system (LMS) is based on Canvas, from a company named Instructure [9].  This LMS runs its entire platform, including the storage of classroom collateral, on AWS [10].  Beyond AWS there are many other pervasively consumed Public Cloud services.  Apple Inc. offers 5 gigabytes of free storage on its iCloud service to users who have an Apple mobile device and log in.  iCloud is used to store pictures, documents, device backups, music and movies.2  The number of iOS devices shipped in a quarter suggests that Apple’s service may consume over 367 petabytes of usable digital storage per quarter [11] [12].  While it is possible to cite other cases, it should be clear that the deployment of cloud services is pervasive, impacts many peoples’ daily lives, and is a serious business.  Is there more to measure?
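The footnote’s back-of-envelope arithmetic can be sketched in a few lines of Python.  The 5 gigabyte quota and 77 million devices are the post’s figures; treating the quota as binary gigabytes (GiB) and converting to binary petabytes (PiB) is my assumption about how the 367 figure is reached:

```python
# Rough estimate of iCloud storage implied by one quarter of iOS shipments.
# Inputs come from the post; the binary (GiB -> PiB) conversion is an
# assumption that reproduces the ~367 petabyte figure.
FREE_QUOTA_GIB = 5                    # free iCloud storage per device
DEVICES_SHIPPED = 77_000_000          # devices sold in the quarter [12]

total_gib = FREE_QUOTA_GIB * DEVICES_SHIPPED   # 385,000,000 GiB
total_pib = total_gib / 2**20                  # GiB -> PiB (divide by 1024**2)
print(round(total_pib))                        # -> 367
```

Had decimal units been used throughout, the same inputs would yield 385 petabytes, so the exact figure depends on the unit convention assumed.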


Scale by the Numbers

A way to gauge Cloud Computing’s scale is to look at proxy measures like space, computational power, digital storage and energy consumption.  For instance, Google3 operates 12 data centers across the Americas, Asia Pacific and Europe for its Cloud Computing infrastructure [13].  Additionally, in 2013 former Microsoft CEO Steve Ballmer stated that Microsoft’s Cloud Computing servers numbered “over a million,” bigger than Amazon’s but smaller than Google’s [14].  A million servers is a difficult number to grasp, so ExtremeTech calculated the amount of power needed to run them.  Assuming that each server consumes 200 Watts directly and 50 Watts indirectly, the total power consumption would be 250 megawatts, roughly the draw of 177,000 US households [14].  This is a staggering amount of energy for Microsoft’s cloud services to consume.
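ExtremeTech’s estimate is simple to reproduce.  A minimal sketch follows, where the 200 W direct and 50 W indirect figures come from the article, and the roughly 1.41 kW average US household draw is my assumption made to back into the 177,000-household comparison:

```python
# Power needed to run a million cloud servers, per the post's assumptions.
SERVERS = 1_000_000
DIRECT_WATTS = 200        # power drawn by the server itself
INDIRECT_WATTS = 50       # cooling, distribution and other overhead

total_watts = SERVERS * (DIRECT_WATTS + INDIRECT_WATTS)
total_megawatts = total_watts / 1e6
print(total_megawatts)    # -> 250.0

# Assumed average US household draw (~1.41 kW) recovers the comparison:
households = total_watts / 1_412
print(round(households))  # roughly 177,000 households
```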


Another measure of scale for Cloud Computing is planned and actual digital storage consumption.  Interviews with end users suggest a variety of large-scale cloud deployments in production, in early stages of implementation, or on the drawing board.  In extreme cases users have deployed, or are planning, deployments at or beyond the exabyte scale of digital storage capacity.  In at least one instance a user believes that the deployment will consist of storage class memory technologies like NAND flash by 2016.  Like the previously cited power figure of 250 megawatts, the exabyte scale is vast and hard to comprehend, so a frame of reference is required for solid understanding.  Consider the following illustration: if a single high-definition movie consumes approximately 50 gigabytes of digital storage capacity, then a single exabyte holds about 20,000,000 high-definition movies, which equates to 4,566 years of sequential viewing.  That is a lot of digital storage capacity.  The Square Kilometer Array (SKA) is another example of a cloud-like project.  This project aims to create a multimode telescope, located in Australia, with a collecting area of 1 million square meters to observe the heavens.  At its peak in 2024 the SKA will, “…generate as much as an exabyte of data each day that will need to be stored for later analysis using computers [15].”  (More on this project is discussed later in the paper, in the section titled The Force of the Internet of Things.)  If the Information Technology platforms used in SKA are cloud in nature, it is an illustrative open project at the exabyte scale.
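The movie arithmetic can be checked with a short Python snippet.  Decimal storage units and a 2-hour average runtime per movie are assumptions of mine that are consistent with the quoted figures:

```python
# How many 50 GB high-definition movies fit in one exabyte, and how long
# would they take to watch back to back?
EXABYTE_GB = 10**9          # 1 EB = 1,000,000,000 GB in decimal units
GB_PER_MOVIE = 50           # per the post's assumption
HOURS_PER_MOVIE = 2         # assumed average runtime

movies = EXABYTE_GB // GB_PER_MOVIE                    # -> 20,000,000
viewing_years = movies * HOURS_PER_MOVIE / (24 * 365)
print(movies, round(viewing_years))                    # -> 20000000 4566
```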


Vendor and Open Offerings

Are technology vendors seeing a shift towards Cloud Computing acquisition models in their customer base?  The websites of Cisco, Hitachi Data Systems, IBM and HP indicate that these companies provide converged infrastructure offerings that can be used to construct a private cloud; vendor-specific definitions are cited below.


  • Cisco Unified Computing System (UCS) and servers unify computing, networking, management, virtualization, and storage access into a single integrated architecture. This unique architecture enables end-to-end server visibility, management, and control in both bare metal and virtual environments, and facilitates the move to Cloud Computing and IT-as-a-Service with Fabric-Based Infrastructure [16].
  • Hitachi Unified Compute Platform (UCP) family is a broad portfolio of converged infrastructure solutions. The family combines best-of-breed storage, server, networking and software management in fully integrated packages designed for mission-critical workloads [17].
  • The PureFlex™ System is a fully integrated system with unified management of compute, storage, networking and virtualization resources that utilize built in Patterns of Expertise based on IBM's decades of experience and thousands of client deployments [18].
  • HP ConvergedSystem is a portfolio of complete, engineered systems optimized for virtualization, cloud, and big data. The portfolio delivers a total systems experience that simplifies IT through quick deployment, intuitive management, and system-level support, all built from best-in-class components [19].


These offerings embody many facets of the NIST definition.  Specifically, these converged offerings provide a shared pool of computing resources, inclusive of servers, networking, storage and applications, that can be managed flexibly and released as needed [4].  Moreover, these companies identify Cloud Computing as a key utilization pattern.  Since these Information Technology companies offer converged systems that meet NIST’s definition, it is not a stretch to state that they are witnessing a shift in customer acquisition models towards Cloud Computing.


Beyond these four vendors, are there movements towards open Cloud Computing as well?  Yes.  Projects like OpenStack represent open source movements focused on Cloud Computing.  With a robust community and ecosystem, this project offers a range of interaction possibilities [20].  OpenStack does not package hardware and software together into a combined system, but it meets the spirit and intent of NIST’s definition with a slight twist: it supplies software versions of server, network and storage technologies that can be flexibly deployed on demand, allowing users to choose a uniform, off-the-shelf hardware platform while the various cloud disciplines are delivered in software.  A critical facet of any open source community’s health is technical contribution and industry adoption of the project; in other words, contribution and adoption are a proxy for forecasting the project’s sustainability and potential for success.  The OpenStack community includes 330 companies participating at various levels, and the member list is a veritable who’s who of the technology sector, including AT&T, Hitachi, Cisco, VMware, Red Hat, and HP.  More than that, material products and services are being developed, either as proper public cloud infrastructures through companies like Rackspace and HP, or as private cloud enabling software stacks through companies like Red Hat and Mirantis [20].



From IDC’s market proclamations to pervasiveness, NIST’s definition, scale, vendor offerings, and open source projects, is it safe to say that Cloud Computing is a force?  IDC predicts cloud services growth at a 23.5% Compound Annual Growth Rate, five times faster than the rest of the Information Technology industry [5].  IDC’s predictive measures, coupled with the data reported in this section, indicate that Cloud Computing is a force.  What about Carr’s hypothesis: is Cloud Computing to be considered a modern utility?  This is more complex to answer, yet the reported data on Cloud Computing offers some indicators that his hypothesis holds.  For example Amazon, Google, Apple, and Microsoft are constructing “mega data centers” where users can acquire and release applications, servers, networking and storage resources on demand.  If these mega data centers offering Cloud Computing utilities become the norm for Information Technology (IT) platforms, how will they support the next generation of connected things like tools, trains, towels and cars?  The paper will pose an answer to this question later.  First it is important to characterize the force of the Internet of Things...


  1. In his article Babcock points out that Amazon doesn’t break out AWS revenues directly [7].
  2. I took a simple approach to compute the total amount of usable space required by Apple for their most recent financial quarter.  367 petabytes was computed by obtaining the product of 5 gigabytes and the total number of devices sold in their most recent quarter, 77 million, and converting the result to petabytes [12].
  3. While recent information about data center scale has been made public by the likes of Apple, Google, Microsoft and Facebook, it should be noted that this was what they were willing to say.  An open and accurate accounting of the real server population, storage capacities, consumed floor space, and so on is generally considered trade secret data and is therefore not exposed.



  1. Will Cloud Computing Survive the Internet of Things? (Introduction) - includes all references represented in this post
  2. Will Cloud Computing Survive the Internet of Things? (The Force of Internet of Things) - the next installment in this series
  3. Will Cloud Computing Survive the Internet of Things? (Reflections and Conclusions) - the final installment in this series