Toll Free: 1-888-928-9744

Mega Data Centers: Market Shares, Strategies, and Forecasts, Worldwide, 2017 to 2023

Published: Mar, 2017 | Pages: 418 | Publisher: WinterGreen Research
Industry: ICT | Report Format: Electronic (PDF)

Mega data centers represent a quantum change in computing.  They are building-sized, single cloud computing units that function automatically, an entirely new dimension for computing.  Each building costs about $1 billion and manages web traffic and applications as one integrated computing unit.

The value of automated process to business has been clear since the inception of computing: automated process replaces manual process.  Recently, automation has taken a sudden leap forward, and that leap has come in the form of the mega data center.

Mega data centers replace enterprise data centers, and many hyperscale cloud computing centers mired in manual-process spending patterns, by implementing automated infrastructure management and automated application integration.

In enterprise data centers mired in manual process, the vast majority of IT administrative expenditure goes to maintenance rather than to long-term strategic initiatives.

Business growth depends on intelligent technology spending, not on manual labor spending.  Manual labor is slow and error prone; spending on manual process is counterproductive compared with spending on automation.  So many IT processes have been manual, tedious, and error prone that they have held companies back relative to the competition.  Mega data centers eliminate that problem.  Companies that invested in mega data centers and automated data center process have seen astounding growth, while companies stuck with ordinary data centers mired in manual process, whether enterprise data centers or hyperscale cloud data centers, remain in slow-growth mode.

The only way to realign IT cost structures is to automate infrastructure management and orchestration.  Mega data centers automate server and connectivity management.  For example, Cisco UCS Director automates everything beyond the input mechanisms: Cisco UCS automates switching and storage, along with hypervisor, operating system, and virtual machine provisioning.
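The automated infrastructure management described above generally follows a declarative pattern: operators state the desired configuration, and the system converges the actual inventory toward it. The sketch below is a minimal, hypothetical illustration of that reconciliation idea; the host names and the `reconcile` function are illustrative assumptions, not a real Cisco UCS Director API.

```python
# Hedged sketch of declarative infrastructure reconciliation: declare desired
# state, compare it with actual state, and emit the converging actions.
# All names here are illustrative, not drawn from any real product API.

desired = {"web-01": {"os": "linux", "vm_count": 4},
           "web-02": {"os": "linux", "vm_count": 4}}

actual = {"web-01": {"os": "linux", "vm_count": 2}}  # current inventory

def reconcile(desired, actual):
    """Return the list of actions needed to converge actual onto desired."""
    actions = []
    for host, spec in desired.items():
        if host not in actual:
            actions.append(("provision", host, spec))   # host missing entirely
        elif actual[host] != spec:
            actions.append(("update", host, spec))      # host drifted from spec
    for host in actual:
        if host not in desired:
            actions.append(("decommission", host, None))  # host no longer wanted
    return actions

for action in reconcile(desired, actual):
    print(action)
```

Running the sketch yields one `update` for the drifted host and one `provision` for the missing one; the point is that no human walks the inventory by hand.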

As this leap forward happened, many companies were left with an enterprise data center that has become a bottleneck: there is more digital traffic than the traditional enterprise data center can get through.  Existing enterprise data centers are built with category (Cat) Ethernet cable that is not fast enough to handle the quantities of data now flowing through them.  As these key parts of the economy bottleneck the flow of digital data, there is a serious problem.  Companies that want to grow need to embrace cloud computing and data center innovation to correct it.

Conventional wisdom holds that cloud computing is the answer, but that is not the whole story: only the portion of cloud computing that embraces automated process provides significant competitive advantage.  Not all cloud computing works.  The new kid on the computing block is the mega data center.

All manner of devices will have electronics that generate digital information.  The connected home will provide security on every door, window, and room, accessible from a smartphone.  Refrigerators and heaters will send information so they can be turned on and off remotely.  In industry, workflow is being automated so that robots are active beyond a single process, extending to multi-process information management.

All this takes a great deal of analytics, operation on data in place, and always-on access to all the data.  Clos architecture mega data centers implement the type of architecture that a data center needs in order to operate in an effective, efficient manner.
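A Clos (leaf-spine) fabric gets its efficiency from a simple structural property: every leaf switch connects to every spine switch, so traffic between any two leaves can take as many equal-cost paths as there are spines. The snippet below is a toy illustration of that property; the switch counts are arbitrary assumptions, not figures from the report.

```python
# Minimal sketch of a two-tier Clos (leaf-spine) fabric.  Every leaf connects
# to every spine, so any pair of distinct leaves is joined by one equal-cost
# path per spine.  Switch counts here are illustrative only.

SPINES = [f"spine-{i}" for i in range(4)]
LEAVES = [f"leaf-{i}" for i in range(8)]

def paths(src_leaf, dst_leaf):
    """Enumerate the equal-cost leaf -> spine -> leaf paths between two leaves."""
    if src_leaf == dst_leaf:
        return []  # same-leaf traffic never crosses the spine layer
    return [(src_leaf, spine, dst_leaf) for spine in SPINES]

routes = paths("leaf-0", "leaf-5")
print(len(routes))  # one path per spine
```

Because capacity scales by adding spines (more parallel paths) or leaves (more ports), no single circuit or device is critical, which is precisely what makes the fabric amenable to automation.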

Robots, drones, and automated vehicles all generate enormous quantities of data, with the growth rate for IoT reaching 23% by the end of the forecast period.  Trillion-dollar markets are evolving in multiple segments.  IoT is in the early stages of an explosive growth cycle.  The rapid adoption of the Pokémon Go phenomenon raised awareness of, and expectations for, augmented reality (AR) and digital enhancement of the surroundings.  Digital enhancement, as IoT, is a human explanation of our existing surroundings.  Digital economic leveraging of data provides better management of the natural world and of the machines we use to perform work.

Clos architecture data centers are needed to manage all the data coming from the implementation of automated process everywhere.  

IoT is set to become an indispensable part of people’s lives.  Digital real-time processing using mega data centers is poised to take off.  Digital images become as much a part of the real world as the things we can touch and feel as they are integrated into everyday life; the reality is augmented by the digital images.  Augmented reality is a misnomer to the extent that it implies reality has something superimposed on it.  Instead, the reality exists, and the digital images blend in to enhance the experience of reality, making it more understandable or more interesting.  The reality is not changed and not made better; it is understood better.

Use cases for IoT proliferate.  Pokémon Go illustrates the huge game market opportunity looming on ubiquitous smartphones.  Adoption of IoT technology in the enterprise is growing.  AR headsets and glasses are used in manufacturing, logistics, remote service, retail, medicine, and education.  One popular AR application provides ‘see-what-I-see’ functionality, enabling off-site specialists to deliver real-time guidance and expertise to troubleshoot an issue.  Others superimpose step-by-step process information on dials and switches in workflow situations.

Functional automated vehicles are already driving around as Uber cars in San Francisco, generating IoT data that is used for navigation and for transaction processing.  With 200.8 billion IoT endpoints predicted to be in service by 2023, the time is right to leverage the business value of IoT by building Clos architecture mega data centers that manage the onslaught of digital data cost effectively.

According to Susan Eustis, lead author of the study, “Organizations are hampered by siloed enterprise data center systems that inhibit growth and increase costs.  Even the components inside the data center are siloed: servers, database servers, storage, networking equipment.  Mega data centers function as universal IoT platforms that overcome legacy limitations and simplify device integration, enabling connectivity and data exchange.  Industrial end-to-end process automation markets are anticipated to reach $7 trillion by 2027, growing at a rapid pace and providing remarkable growth for companies able to build new data center capacity efficiently.”

Pokémon Go grew to a massive 45 million daily active users within two months in the market, with revenue reaching $250 million for the vendor Niantic by September 2016, starting from zero.  This kind of growth demands the scalability and economy of a Clos architecture mega data center.

Phenomenal growth is anticipated to come from implementation of step-by-step procedural virtual reality modules used to manage systems.  Every business executive in the world wants an IT structure agile enough to manage phenomenal growth, should that be necessary; the aim is to construct augmented reality modules that address the issues mega data centers bring.  IoT takes the data from sensors, superimposes analytics on the collected data, turns the data into information, and streams alerts back to the users who need to take action.
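The sensors-to-analytics-to-alerts pipeline just described can be sketched in a few lines. The example below is a minimal, hypothetical illustration: the sensor names, window size, and threshold are assumptions for demonstration, not values from the report.

```python
# Hedged sketch of the IoT pipeline: take sensor readings, apply a simple
# analytic (threshold on a rolling mean), and stream alerts back to users.
# Sensor names and thresholds are hypothetical.

from collections import deque
from statistics import mean

WINDOW = 3        # readings per rolling window
THRESHOLD = 75.0  # e.g. degrees Celsius

def alerts(readings):
    """Yield an alert whenever the rolling mean exceeds the threshold."""
    window = deque(maxlen=WINDOW)  # bounded buffer drops the oldest reading
    for sensor, value in readings:
        window.append(value)
        if len(window) == WINDOW and mean(window) > THRESHOLD:
            yield f"ALERT {sensor}: rolling mean {mean(window):.1f} > {THRESHOLD}"

stream = [("boiler-1", 70.0), ("boiler-1", 76.0), ("boiler-1", 82.0),
          ("boiler-1", 74.0)]
for msg in alerts(stream):
    print(msg)
```

At mega data center scale the same pattern runs over billions of endpoints, which is where the analytics-in-place and always-on access discussed above come in.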

The mega data center market grew from $459.7 million in 2015 to $1.6 billion in 2016 and is anticipated to reach USD $359.7 billion in 2023, astoundingly rapid growth for a market that is not yet well defined.  The increasing scope of applications across different industries, including manufacturing, medical, retail, gaming, and automotive, is expected to drive demand over the forecast period to these unprecedented levels, reaching into trillion-dollar market arenas soon.  IoT technology is in a nascent stage with huge growth potential, and has attracted large investments contributing to the industry's growth.
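As a hedged arithmetic check on the scale of that forecast: taking the report's figures of roughly $1.6 billion in 2016 and $359.7 billion in 2023 at face value, the implied compound annual growth rate (CAGR) can be computed directly.

```python
# Hedged check of the implied growth rate between two of the report's figures.
# The input values are taken from the text as given, not independently verified.

def cagr(start, end, years):
    """Compound annual growth rate between a start and end value."""
    return (end / start) ** (1 / years) - 1

rate = cagr(1.6e9, 359.7e9, 2023 - 2016)
print(f"{rate:.1%}")  # roughly 117% per year
```

A growth rate above 100% per year means the market more than doubles annually over the forecast period, which matches the report's characterization of astoundingly rapid growth.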

WinterGreen Research is an independent research organization funded by the sale of market research studies all over the world and by the implementation of ROI models used to calculate the total cost of ownership of equipment, services, and software.  The company has 35 distributors worldwide, including Global Information Info Shop, Market Research.com, Research and Markets, electronics.ca, and Thompson Financial.  It conducts its business with integrity.

The increasingly global nature of science, technology, and engineering reflects the implementation of the globally integrated enterprise.  Customers trust WinterGreen Research to work alongside them to ensure the success of their participation in a particular market segment.

WinterGreen Research supports various market segment programs and provides trusted technical services to marketing departments.  It carries out accurate market share and forecast analysis services for a range of commercial and government customers globally.  These are vital market research support solutions requiring trust and integrity.

Companies Profiled 

Market Leaders
•	Amazon 
•	Facebook 
•	Google 
•	Microsoft

Market Participants
•	Rackspace 
•	Raging Wire 
•	Cisco 
•	Cern

Key Topics

•	Mega Data Center 
•	Clos Architecture 
•	Two Layer Data Center Architecture 
•	Wearable Computer Workplace Functions 
•	Internet of Things AR 
•	Digital traffic 
•	Automated infrastructure management 
•	Automated application integration 
•	Data Center infrastructure orchestration 
•	Data Center application orchestration 
•	Mega data center automation 
•	Server management 
•	Connectivity management 
•	Quantum change in computing 
•	Digital enhancement as IoT 
•	Intelligent Cloud Segment
Table of Contents

MEGA DATA CENTERS EXECUTIVE SUMMARY 23

Mega Data Center Scale and Automation 23
Mega Data Centers Have Stepped In To Do The Job Of Automated Process 25
Cloud 2.0 Mega Data Center Fabric Implementation 27
Cloud 2.0 Mega Data Center Different from the Hyperscale Cloud 29
Cloud 2.0 Mega Data Center Automatic Rules and Push-Button Actions 31
Making Individual Circuits And Devices Unimportant Is A Primary Aim Of Fabric Architecture 32
Digital Data Expanding Exponentially, Global IP Traffic Passes Zettabyte (1000 Exabytes) Threshold 34
Google Kubernetes Open Source Container Control System 35
Google Kubernetes a Defacto Standard Container Management System 36
Google Shift from Bare Metal To Container Controllers 37
Cloud 2.0 Mega Data Center Market Driving Forces 37
Mega Data Center Market Shares 41
Cloud Datacenter, Co-Location, and Social Media Cloud, Revenue Market Shares, Dollars, Worldwide, 2016 42
Cloud 2.0 Mega Data Center Market Forecasts 43

1. MEGA DATACENTERS: MARKET DESCRIPTION AND MARKET DYNAMICS 45
1.1 Data Center Manager Not Career Track for CEO 45
1.1.1 Colocation Shared Infrastructure 48
1.1.2 Power and Data Center Fault Tolerance 51
1.2 Fiber High Bandwidth Datacenters 54
1.3 100 Gbps Headed For The Data Center 55
1.3.1 100 Gbps Adoption 56
1.4 Scale: Cloud 2.0 Mega Data Center Containers 58
1.4.1 Data Center Architectures Evolving 58
1.4.2 High-Performance Cloud Computing Market Segments 61
1.4.3 Cisco CRS-3 Core Routing Platform 62
1.5 Evolution of Data Center Strategy 62
1.6 Cabling in The Datacenter 65
1.6.1 Datacenter Metrics 69
1.6.1 Digitalization Forcing Data Centers to Evolve 70
1.6.2 A One-Stop Shop 70
1.6.3 Growing With Business 71

2. MEGA DATA CENTERS MARKET SHARES AND FORECASTS 72
2.1 Mega Data Center Scale and Automation 72
2.1.1 Cloud 2.0 Mega Data Center Fabric Implementation 75
2.1.2 Cloud 2.0 Mega Data Center Different from the Hyperscale Cloud 77
2.1.3 Cloud 2.0 Mega Data Center Automatic Rules and Push-Button Actions 78
2.1.4 Making Individual Circuits And Devices Unimportant Is A Primary Aim Of Fabric Architecture 79
2.1.5 Digital Data Expanding Exponentially, Global IP Traffic Passes Zettabyte (1000 Exabytes) Threshold 81
2.1.6 Google Kubernetes Open Source Container Control System 82
2.1.7 Google Kubernetes Defacto Standard Container Management System 83
2.1.8 Google Shift from Bare Metal To Container Controllers 84
2.1.9 Cloud 2.0 Mega Data Center Market Driving Forces 84
2.2 Mega Data Center Market Shares 88
2.2.1 Cloud 2.0 Mega Datacenter Cap Ex Spending Market Shares Dollars, Worldwide, 2016 89
2.2.2 Amazon Capex for Cloud 2.0 Mega Data Centers 93
2.2.3 Amazon (AWS) Cloud 94
2.2.4 Amazon Datacenter Footprint 94
2.2.5 Cloud 2.0 Mega Data Center Social Media and Search Revenue Market Shares, Dollars, 2016 94
2.2.6 Microsoft Azure 99
2.2.7 Microsoft Data Center, Dublin, 550,000 Sf 100
2.2.8 Microsoft Data Center Container Area in Chicago. 101
2.2.9 Microsoft Quincy Data Centers, 470,000 Square Feet 103
2.2.10 Microsoft San Antonio Data Center, 470,000 SF 104
2.2.11 Microsoft 3rd Data Center in Bexar Could Employ 150 105
2.2.12 Microsoft Builds the Intelligent Cloud Platform 106
2.2.13 Microsoft's Datacenter Footprint 107
2.2.14 Microsoft Datacenter Footprint 108
2.2.15 Google Datacenter Footprint 109
2.2.16 Google Datacenter Footprint 110
2.2.17 Facebook Datacenter Footprint 111
2.2.18 Facebook Datacenter Footprint 112
2.3 Cloud 2.0 Mega Data Center Market Forecasts 113
2.3.1 Market Segments: Web Social Media, Web Wireless Apps, Enterprise / Business Transactions, Co-Location, And Broadcast / Communications 115
2.3.2 Cloud 2.0 Mega Data Center Is Changing The Hardware And Data Center Markets 120
2.4 Mega-Datacenter: Internet Giants Continue To Increase Capex 121
2.4.1 Amazon Datacenter Footprint 122
2.4.2 Service Tiers and Applications 122
2.4.3 Cloud 2.0 Mega Data Center Segments 123
2.4.4 Mega Data Center Positioning 124
2.4.5 Cloud 2.0 Mega Data Centers 126
2.5 Hyperscale Datacenter Future 127
2.6 Data Expanding And Tools Used To Share, Store, And Analyze Evolving At Phenomenal Rates 133
2.6.1 Video Traffic 134
2.6.2 Cisco Analysis of Business IP Traffic 135
2.6.3 Increasing Video Definition: By 2020, More Than 40 Percent of Connected Flat-Panel TV Sets Will Be 4K 142
2.6.4 M2M Applications 144
2.6.5 Applications, For Telemedicine And Smart Car Navigation Systems, Require Greater Bandwidth And Lower Latency 146
2.6.6 Explosion of Data Inside Cloud 2.0 Mega Data Center with Multi-Threading 151
2.6.7 Cloud 2.0 Mega Data Center Multi-Threading Automates Systems Integration 151
2.6.8 Fixed Broadband Speeds (in Mbps), 2015–2020 151
2.6.9 Internet Traffic Trends 155
2.6.10 Internet of Things 158
2.6.11 The Rise of the Converged “Digital Enterprise” 159
2.6.12 Enterprise Data Centers Give Way to Commercial Data Centers 159
2.6.13 Types of Cloud Computing 160
2.7 Cloud Mega Data Center Regional Market Analysis 160
2.7.1 Amazon, Google Detail Next Round of Cloud Data Center Launches 162
2.7.1 Cloud Data Centers Market in Europe 164
2.7.2 Cloud Data Centers Market in Ireland 165
2.7.3 Japanese Data Centers 165

3. MEGA DATA CENTER INFRASTRUCTURE DESCRIPTION 167
3.1 Amazon Cloud 167
3.1.1 Amazon AWS Regions and Availability Zones 168
3.1.2 Amazon Addresses Enterprise Cloud Market, Partnering With VMware 170
3.1.3 AWS Achieves High Availability Through Multiple Availability Zones 172
3.1.4 AWS Improving Continuity Replication Between Regions 172
3.1.5 Amazon (AWS) Meeting Compliance and Data Residency Requirements 173
3.1.6 AWS Step Functions Software 174
3.1.7 Amazon QuickSight Software 175
3.1.8 Amazon North America 177
3.1.9 AWS Server Scale 180
3.1.10 AWS Network Scale 181
3.2 Facebook 190
3.2.1 Dupont Fabros Constructing Second Phase In Acc7 Represents An Expanded Relationship with Facebook 192
3.2.2 Facebook $1B Cloud 2.0 Mega Data Center in Texas 193
3.2.3 Facebook $300 Million Cloud 2.0 Mega Data Center in Iowa 193
3.2.4 Fort Worth Facebook Mega-Data Center 196
3.2.5 Facebook Forest City, N.C. Cloud 2.0 mega data center 198
3.2.6 Data Center Fabric, The Next-Generation Facebook Data Center Network 199
3.2.1 Facebook Altoona Data Center Networking Fabric 200
3.2.2 Facebook Clusters and Limits Of Clusters 205
3.2.3 Facebook Fabric 210
3.2.4 Facebook Network Technology 214
3.2.5 Facebook Fabric Gradual Scalability 216
3.2.6 Facebook Mega Datacenter Physical Infrastructure 217
3.2.7 Facebook Large Fabric Network Automation 219
3.2.8 Facebook Fabric Data Center Transparent Transition 226
3.2.9 Facebook Large-Scale Network 227
3.3 Google Meta Data Centers 233
3.3.1 Google Datacenter Network 234
3.3.2 Google Office Productivity Dynamic Architecture 235
3.3.3 Google Search Engine Dynamic Architecture 238
3.3.4 BigFiles 239
3.3.5 Repository 239
3.3.6 Google Clos Networks 240
3.3.7 Google B4 Datacenter WAN, a SDN 243
3.3.8 Google Programmable Access To Network Stack 245
3.3.9 Google Compute Engine Load Balancing 249
3.3.10 Google Compute Engine (GCE) TCP Stream Performance Improvements 252
3.3.11 Google The Dalles, Oregon Cloud 2.0 Mega Data Center 257
3.3.12 Lenoir, North Carolina 258
3.3.13 Google Hamina, Finland 259
3.3.14 Google Mayes County 261
3.3.15 Google Douglas County 263
3.3.16 Google Cloud 2.0 Mega Data Center St Ghislain, Belgium 266
3.3.17 Google Council Bluffs, Iowa Cloud 2.0 Mega Data Center 267
3.3.18 Google Douglas County Cloud 2.0 Mega Data Center 270
3.3.19 Google $300m Expansion of Existing Metro Atlanta Data Center 272
3.3.20 Google B4 SDN Initiative Benefits: Not Need To Be A Network Engineer To Control A Network; Can Do It At An Application Level 274
3.3.21 Google Cloud 2.0 Mega Data Center in Finland 276
3.3.22 Google Switches Provide Scale-Out: Server And Storage Expansion 279
3.3.23 Google and Microsoft 25G Ethernet Consortium 287
3.3.24 Google Workload Definitions 289
3.3.25 Google Kubernetes Container 292
3.3.26 Google Optical Networking 293
3.3.27 Google Data Center Efficiency Measurements 295
3.3.28 Google Measuring and Improving Energy Use 295
3.3.29 Google Comprehensive Approach to Measuring PUE 296
3.3.30 Q3 2016 PUE Performance 298
3.4 Microsoft 303
3.4.1 Microsoft .Net Dynamically Defines Reusable Modules 308
3.4.2 Microsoft Combines Managed Modules into Assemblies 309
3.4.3 Microsoft Architecture Dynamic Modular Processing 309
3.4.4 Microsoft Builds Azure Cloud Data Centers in Canada 311
3.4.5 Microsoft Dublin Cloud 2.0 mega data center 312
3.4.6 Microsoft Data Center Largest in U.S. 313
3.4.7 Microsoft Crafts Homegrown Linux For Azure Switches 314
3.4.8 Microsoft Azure Cloud Switch 316
3.4.9 Microsoft Azure CTO Cloud Building 318
3.4.10 Microsoft Cloud 2.0 Mega Data Center Multi-Tenant Containers 319
3.4.11 Microsoft Managed Clustering and Container Management: Docker and Mesos 321
3.4.12 Kubernetes From Google or Mesos 322
3.4.13 Microsoft Second Generation Open Cloud Servers 322
3.4.14 Azure Active Directory 322
3.4.15 Microsoft Azure Stack Platform Brings The Suite Of Azure Services To The Corporate Datacenter 324
3.4.16 Hardware Foundation For Microsoft Azure Stack 331

4. MEGA DATACENTERS RESEARCH AND TECHNOLOGY 337
4.1 Enterprise IT Control Centers 337
4.2 Open Compute Project (OCP), 339
4.2.1 Microsoft Investment in Open Compute 341
4.2.2 Microsoft Leverages Open Compute Project to Bring Benefit to Enterprise Customers 342
4.3 Open Source Foundation 342
4.3.1 OSPF Neighbor Relationship Over Layer 3 MPLS VPN 343
4.4 Dynamic Systems 346
4.4.1 Robust, Enterprise-Quality Fault Tolerance 346
4.5 Cache / Queue 348
4.6 Multicast 350
4.7 Performance Optimization 351
4.8 Fault Tolerance 352
4.8.1 Gateways 353
4.8.2 Promise Of Web Services 353
4.9 IP Addressing And Directory Management 354
4.9.1 Dynamic Visual Representations 356
4.9.2 Application Integration 357
4.9.3 Point Applications 358
4.9.4 Fault Tolerance and Redundancy Solutions 359
4.9.5 Goldman Sachs Open Compute Project 360
4.10 Robust, Quality Cloud Computing 361
4.11 Networking Performance 368

5. MEGA DATACENTERS COMPANY PROFILES 370
5.1 Amazon 370
5.1.1 Amazon Business 370
5.1.2 Amazon Competition 370
5.1.3 Amazon Description 372
5.1.4 Amazon Revenue 376
5.2 Facebook 377
5.2.1 Facebook Technology 378
5.2.2 Facebook Sales and Operations 378
5.2.3 Facebook Management Discussion 378
5.2.4 Facebook Revenue 380
5.2.5 Facebook 381
5.2.6 Facebook App Draining Smart Phone Batteries 382
5.2.7 Facebook Messaging Provides Access to User Behavioral Data 382
5.2.8 Facebook Creating Better Ads 383
5.2.9 Facebook Next Generation Services 383
5.2.10 Facebook Platform 384
5.2.11 Facebook Free Basics 385
5.2.12 Facebook AI 385
5.2.13 Facebook Revenue 386
5.2.14 Facebook Revenue Growth Priorities: 387
5.2.15 Facebook Average Revenue Per User ARPU 388
5.2.16 Facebook Geographical Information 389
5.2.17 Facebook WhatsApp 389
5.2.18 Facebook WhatsApp Focusing on Growth 390
5.3 Google 391
5.3.1 Google Revenue 391
5.3.2 Google 393
5.3.3 Google Search Technology 393
5.3.4 Google Recognizes World Is Increasingly Mobile 394
5.3.5 Google Nest 394
5.3.6 Google / Nest Protect 395
5.3.7 Google / Nest Safety History 396
5.3.8 Google / Nest Learning Thermostat 398
5.3.9 Google Chromecast 399
5.4 Microsoft 400
5.4.1 Microsoft Builds the Intelligent Cloud Platform 401
5.4.2 Microsoft Targets Personal Computing 403
5.4.3 Microsoft Reportable Segments 403
5.4.4 Skype and Microsoft 407
5.4.5 Microsoft / Skype / GroupMe Free Group Messaging 408
5.4.6 Microsoft SOA 409
5.4.7 Microsoft .Net Open Source 411
5.4.8 Microsoft Competition 412
5.4.9 Microsoft Revenue 413
WinterGreen Research, Inc.
WinterGreen Research Research Methodology 415
List of Figures

Figure 1. Cloud 2.0 Mega Data Center Market Driving Forces 39
Figure 2. Cloud Datacenter, Co-Location, and Social Media Revenue Market Shares, Dollars, Worldwide, 2016, Image 42
Figure 3. Cloud 2.0 Mega Datacenter Market Forecast, Dollars, Worldwide, 2017-2023 44
Figure 4. RagingWire Colocation N+1 Shared Infrastructure 48
Figure 5. RagingWire Colocation N+1 Dedicated Infrastructure 50
Figure 6. RagingWire Data Center Maintenance on N+1 Dedicated System Reduces Fault Tolerance to N 52
Figure 7. RagingWire Data Center Stays Fault Tolerant During Maintenance with 2N+2 System 53
Figure 8. 100 Gbps Adoption 57
Figure 9. Data Center Technology Shifting 59
Figure 10. Data Center Technology Shift 60
Figure 11. IT Cloud Evolution 65
Figure 12. Facebook Networking Infrastructure Fabric 67
Figure 13. Datacenter Metrics 69
Figure 14. Cloud 2.0 Mega Data Center Market Driving Forces 86
Figure 15. Cloud 2.0 Mega Datacenter Cap Ex Spending Market Shares Dollars, Worldwide, 2016 89
Figure 16. Large Internet Company Cap Ex Market Shares, Dollars, Worldwide, 2013 to 2016 91
Figure 17. Cloud 2.0 Mega Data Center Cap Ex Market Shares, Dollars, Worldwide, 2013 to 2016 92
Figure 18. Cloud 2.0 Mega Data Center Cap Ex Market Shares, Dollars, Worldwide, 2016 93
Figure 19. Cloud 2.0 Mega Data Center Social Media and Search Revenue Market Shares, Dollars, 2016, Image 95
Figure 20. Cloud 2.0 Mega Data Center Social Media and Search Revenue Market Shares, Dollars, 2016 96
Figure 21. 538,000 SF: i/o Data Centers and Microsoft Phoenix One, Phoenix, Ariz. 97
Figure 22. Phoenix, Arizona i/o Data Center Design Innovations 98
Figure 23. Microsoft Data Center, Dublin, 550,000 Sf 100
Figure 24. Container Area In The Microsoft Data Center In Chicago 101
Figure 25. An aerial view of the Microsoft data center in Quincy, Washington 103
Figure 26. Microsoft San Antonio Data Centers, 470,000 SF 104
Figure 27. Microsoft 3rd Data Center in Bexar Could Employ 150 105
Figure 28. Cloud 2.0 Mega Datacenter Market Forecast, Dollars, Worldwide, 2017-2023 114
Figure 29. Cloud 2.0 Mega Datacenter Market Shares Dollar, Forecast, Worldwide, 2017-2023 115
Figure 30. Cloud 2.0 Mega Datacenter Market Shares Percent, Forecast, Worldwide, 2017-2023 116
Figure 31. Market Driving Forces for Cloud 2.0 Mega Data Centers 117
Figure 32. Market Challenges of Cloud 2.0 Mega Data Centers 118
Figure 33. Key Components And Topology Of A Mega Datacenter 128
Figure 34. Datacenter Topology without Single Managed Entities 129
Figure 35. Key Challenges Enterprise IT Datacenters: 130
Figure 36. Software Defined Datacenter 132
Figure 37. Cisco VNI Forecast Overview 136
Figure 38. The Cisco VNI Forecast—Historical Internet Context 137
Figure 39. Global Devices and Connections Growth 138
Figure 40. Average Number of Devices and Connections per Capita 139
Figure 41. Global IP Traffic by Devices 140
Figure 42. Global Internet Traffic by Device Type 141
Figure 43. Global 4K Video Traffic 142
Figure 44. Global IPv6-Capable Devices and Connections Forecast 2015–2020 143
Figure 45. Projected Global Fixed and Mobile IPv6 Traffic Forecast 2015–2020 144
Figure 46. Global M2M Connection Growth 145
Figure 47. Global M2M Connection Growth by Industries 146
Figure 48. Global M2M Traffic Growth: Exabytes per Month 147
Figure 49. Global Residential Services Adoption and Growth 148
Figure 50. Global IP Traffic by Application Category 149
Figure 51. Mobile Video Growing Fastest; Online Video and Digital TV Grow Similarly 150
Figure 52. Global Cord Cutting Generates Double the Traffic 150
Figure 53. Fixed Broadband Speeds (in Mbps), 2015–2020 152
Figure 54. Future of Wi-Fi as Wired Complement 153
Figure 55. Global IP Traffic, Wired and Wireless* 154
Figure 56. Global Internet Traffic, Wired and Wireless 155
Figure 57. Cisco VNI Forecasts 194 EB per Month of IP Traffic by 2020 157
Figure 58. Cisco Forecast of Global Devices and Connections Growth 158
Figure 59. Cloud 2.0 Mega Data Center Regional Market Segments, Dollars, 2016, Image 161
Figure 60. Cloud 2.0 Mega Data Center Regional Market Segments, Dollars, 2016 162
Figure 61. Map of Google’s Cloud Data Centers 164
Figure 62. Amazon Zones and Regions 168
Figure 63. Amazon AWS Global Cloud Infrastructure 171
Figure 64. Amazon (AWS) Support for Global IT Presence 173
Figure 65. AWS E Tool Functions 175
Figure 66. AWS E Tool Supported Sources 176
Figure 67. Amazon North America Map 177
Figure 68. Amazon North America List of Locations 178
Figure 69. Example of AWS Region 183
Figure 70. Example of AWS Availability Zone 184
Figure 71. Example of AWS Data Center 185
Figure 72. AWS Network Latency and Variability 186
Figure 73. Amazon (AWS) Regional Data Center 187
Figure 74. A Map of Amazon Web Service Global Infrastructure 188
Figure 75. Rows of Servers Inside an Amazon (AWS) Data Center 189
Figure 76. Facebook DuPont Fabros Technology Ashburn, VA Data Center 191
Figure 77. Facebook Altoona Iowa Cloud 2.0 Mega Data Center 194
Figure 78. Facebook Cloud 2.0 mega data center in Altoona, Iowa Construction Criteria 195
Figure 79. Facebook Fifth Data Center Fort Worth Complex. 198
Figure 80. Facebook Altoona Positioning Of Global Infrastructure 200
Figure 81. Facebook Back-End Service Tiers And Applications Account for Machine-To-Machine Traffic Growth 203
Figure 82. Facebook Back-End Service Tiers And Applications Functions 204
Figure 83. Facebook Cluster-Focused Architecture Limitations 206
Figure 84. Facebook Clusters Fail to Solve a Networking Limitations 208
Figure 85. Facebook Sample Pod: Unit of Network 211
Figure 86. Facebook Data Center Fabric Network Topology 213
Figure 87. Facebook Network Technology 215
Figure 88. Facebook Schematic Fabric-Optimized Datacenter Physical Topology 218
Figure 89. Facebook Automation of Cloud 2.0 mega data center Process 221
Figure 90. Facebook Creating a Modular Cloud 2.0 mega data center Solution 222
Figure 91. Facebook Cloud 2.0 mega data center Fabric High-Level Settings Components 223
Figure 92. Facebook Cloud 2.0 mega data center Fabric Unattended Mode 224
Figure 93. Facebook Data Center Auto Discovery Functions 225
Figure 94. Facebook Automated Process Rapid Deployment Architecture 228
Figure 95. Facebook Fabric Automated Process Rapid Deployment Architecture 229
Figure 96. Facebook Fabric Rapid Deployment 230
Figure 97. Facebook Cloud 2.0 mega data center High Speed Network Implementation Aspects 231
Figure 98. Facebook Cloud 2.0 mega data center High Speed Network Implementation Aspects 232
Figure 99. Google St. Ghislain, Belgium, Europe Data Center 234
Figure 100. Google Dynamic Architecture 236
Figure 101. Google Clos Multistage Switching Network 241
Figure 102. Google Key Principles Used In Designing Datacenter Networks 242
Figure 103. Google Andromeda Cloud Architecture Throughput Benefits 244
Figure 104. Google Andromeda Software Defined Networking (SDN)-Based Substrate Functions 246
Figure 105. Google Andromeda Cloud High-Level Architecture 246
Figure 106. Google Andromeda Performance Factors Of The Underlying Network 248
Figure 107. Google Compute Engine Load Balanced Requests Architecture 250
Figure 108. Google Compute Engine Load Balancing 251
Figure 109. Google Cloud Platform TCP Andromeda Throughput Advantages 253
Figure 110. Google Meta Data Center Locations 255
Figure 111. Google Meta Data Center Locations Map 256
Figure 112. Google Dalles Data Center Cooling Pipes 258
Figure 113. Google Hamina, Finland Data Center 259
Figure 114. Google Lenoir Data Center North Carolina, US 260
Figure 115. Google Data Center in Pryor, Oklahoma 262
Figure 116. Google Douglas County, Georgia Data Center Facility 263
Figure 117. Google Berkeley County, South Carolina, Data Center 265
Figure 118. Google Council Bluffs Iowa Cloud 2.0 Mega Data Center 268
Figure 119. Google Council Bluffs Iowa Cloud 2.0 Mega Data Center Campus Network Room 269
Figure 120. Google Douglas County Cloud 2.0 Mega Data Center 270
Figure 121. Google Team of Technical Experts Develop And Lead Execution Of Global Data Center Sustainability Strategy 271
Figure 122. Google Datacenter Manager Responsibilities 273
Figure 123. Google Meta Data Center 275
Figure 124. Google Server Warehouse in Former Paper Mill 276
Figure 125. Google Data Center in Hamina, Finland 278
Figure 126. Google Traffic Generated by Data Center Servers 279
Figure 127. Google Cloud 2.0 mega data center Multipathing: Implementing Lots And Lots Of Paths Between Each Source And Destination 281
Figure 128. Google Cloud 2.0 mega data center Multipathing: Routing Destinations 282
Figure 129. Google Builds Own Network Switches And Software 282
Figure 130. Google Clos Topology Network Capacity Scalability 283
Figure 131. Google Jupiter Network Delivers 1.3 Pb/Sec Of Aggregate Bisection Bandwidth Across A Datacenter 285
Figure 132. Jupiter Superblock Collection of Jupiter Switches Running SDN Stack Based On Openflow Protocol 286
Figure 133. Google Modernized Switch, Server, Storage And Network Speeds 287
Figure 134. Google Container Controller Positioning 291
Figure 135. Google Data Center Efficiency Measurements 295
Figure 136. Google Data Center PUE Measurement Boundaries 297
Figure 137. Google Continuous PUE Improvement with Quarterly Variation, 2008 to 2017 298
Figure 138. Cumulative Corporate Renewable Energy Purchasing in the United States, Europe, and Mexico, November 2016 300
Figure 139. Images for Microsoft Dublin Cloud 2.0 Mega Data Center 304
Figure 140. Microsoft Azure Data Center 305
Figure 141. Microsoft Dublin Cloud 2.0 mega data center 307
Figure 142. Microsoft .Net Dynamic Definition of Reusable Modules 308
Figure 143. Microsoft .NET Compiling Source Code into Managed Assemblies 310
Figure 144. Microsoft Architecture Dynamic Modular Processing 311
Figure 145. Microsoft-Azure-Stack-Block-Diagram 326
Figure 146. Microsoft-Azure-Platform Stack-Services 328
Figure 147. Microsoft-Cloud Virtual Machine -Platform Stack-Services 329
Figure 148. Microsoft-Azure-Core Management-Services 330
Figure 149. Microsoft Data Centers 335
Figure 150. Multiple Pathways Open To Processing Nodes In The Cloud 2.0 Mega Data Center Functions 338
Figure 151. Layer 3 MPLS VPN Backbone 344
Figure 152. OSPF Network Types 345
Figure 153. Automatic Detection And Recovery From Network And System Failure 347
Figure 154. High Performance And Real-Time Message Throughput 351
Figure 155. Fault Tolerance Features 352
Figure 156. Functions Of An IP Addressing Device 354
Figure 157. Benefits Of an IP Addressing Device 355
Figure 158. Dynamic Visual Representation System Uses 356
Figure 159. Application Integration Health Care Functions 357
Figure 160. Application Integration Industry Functions 358
Figure 161. CERN Cloud Architecture 362
Figure 162. Cern Cloud and Dev 363
Figure 163. CERN Use Cases 364
Figure 164. Cern Hardware Spectrum 365
Figure 165. Cern Operations Containers 366
Figure 166. OpenStack at CERN 367
Figure 167. CERN OpenStack Containers on Clouds 367
Figure 168. Amazon Principal Competitive Factors In The Online Retail Business 371
Figure 169. Amazon Improving Customer Experience Functions 373
Figure 170. Amazon Ways To Achieve Efficiency In Technology For Operations 375
Figure 171. Google / Nest Learning Thermostat 398
Figure 172. Microsoft Productivity and Business Processes Segment 404
Figure 173. Microsoft Intelligent Cloud Segment 405
Figure 174. Microsoft / Skype / GroupMe Free Group Messaging 408
Figure 175. Microsoft Service Orientated Architecture SOA Functions 410 



To request a free sample copy of this report, please complete the form below.

We never share your personal data. Privacy policy
Interested in this report? Get your FREE sample now!
Choose License Type
Single User - US $4200
Multi User - US $8400
Research Assistance

Phone: 1-415-349-0054

Toll Free: 1-888-928-9744

Email: [email protected]

Why buy from us

Custom research service

Speak to the report author to design an exclusive study to serve your research needs.

Quality assurance

A testimonial to service excellence in the form of a BBB "A" accreditation.

Information security

Your personal and confidential information is safe and secure.
