The role of data centers in building a sustainable future

As the demand for data grows, data centers around the world are finding creative ways to reduce energy use and protect the environment.

brandpost

Data is the fabric of our connected world.

The rise of streaming and enterprise cloud adoption have driven an explosive surge in computing demand, giving rise to data centers around the world.

Now, a new wave of demand driven by data-hungry generative AI applications is arriving, and it’s bringing with it increasing environmental pressures.

From 2020 to 2025, data usage is expected to quadruple. As data centers grow, so does the demand for power. To ensure a sustainable future for all, it’s now vital to increase energy efficiency and find creative ways to cut down on power consumption while still supporting the world’s expanding need for data.  

Data centers are changing to support sustainability

As one of the world's largest data center companies, with more than 100 data centers in over 20 countries, NTT recognizes its responsibility to help create a sustainable future and has developed systems and solutions to support energy efficiency and environmental sustainability.

  • In Japan, the Mitaka data center's building structure was designed to curve outward so hot air can escape more easily while cold air enters from below the building. Inside the data vaults, a secondary, highly efficient cooling system supplements traditional conservation techniques.
  • In Santa Clara, California, the biggest challenge is water use. California has experienced drought for many years, so NTT is conscious of how resources are consumed. A highly efficient cooling system moves cold water down across fan walls adjacent to the data center; air blown over the cold coils becomes chilled air that flows into the data vaults. Once the water has warmed after passing over the coils, it is pumped to the roof, re-cooled by outside air, and sent back down. Because the system runs as a continuous closed loop, no fresh water is needed.
  • In Germany, the Berlin data centers capture the heat created within each center rather than simply exhausting it into the environment, reusing it to warm the offices next door. This delivers free heat to neighbors and contributes to carbon-free energy use.

NTT data centers in Japan, California, Germany, and around the world deploy unique solutions today, but each is a small piece of a much bigger effort. New solutions and technologies are continuously under development to address environmental concerns and support global sustainability goals.

Other technological developments are underway, including a transition from electronics to photonics for networking, computing, and other devices, promising low power consumption, high capacity, and ultra-low latency, as well as placing data centers in space to reduce climate impact. NTT is not pursuing this alone: collaborating with partners around the world on innovative solutions like these will make things better today for a more sustainable tomorrow.

Learn more about the future of data centers.



A data center is a physical room, building or facility that houses  IT infrastructure  for building, running, and delivering applications and services, and for storing and managing the data associated with those applications and services.

Data centers have evolved in recent years from privately-owned, tightly-controlled on-premises facilities housing traditional IT infrastructure for the exclusive use of one company, to remote facilities or networks of facilities owned by cloud service providers housing virtualized IT infrastructure for the shared use of multiple companies and customers.


There are different types of data center facilities, and a single company may use more than one type, depending on workloads and business needs.

Enterprise (on-premises) data centers

In this data center model, all IT infrastructure and data is hosted on-premises. Many companies choose to have their own on-premises data centers because they feel they have more control over information security, and can more easily comply with regulations such as the European Union General Data Protection Regulation (GDPR) or the U.S. Health Insurance Portability and Accountability Act (HIPAA). In an enterprise data center, the company is responsible for all deployment, monitoring, and management tasks.

Public cloud data centers

Cloud data centers (also called cloud computing data centers) house IT infrastructure resources for shared use by multiple customers—from scores to millions of customers—via an Internet connection.

Many of the largest cloud data centers—called hyperscale data centers—are run by major cloud service providers like Amazon Web Services (AWS), Google Cloud Platform, IBM Cloud, Microsoft Azure, and Oracle Cloud Infrastructure. In fact, most leading cloud providers run several hyperscale data centers around the world. Typically, cloud service providers also maintain smaller edge data centers located closer to cloud customers (and cloud customers' customers). For real-time, data-intensive workloads such as big data analytics, artificial intelligence (AI), and content delivery applications, edge data centers can help minimize latency, improving overall application performance and customer experience.

Managed data centers and colocation facilities

Managed data centers and colocation facilities are options for organizations that don’t have the space, staff, or expertise to deploy and manage some or all of their IT infrastructure on premises—but prefer not to host that infrastructure using the shared resources of a public cloud data center.

In a managed data center, the client company leases dedicated servers, storage and networking hardware from the data center provider, and the data center provider handles the administration, monitoring and management for the client company.

In a colocation facility, the client company owns all the infrastructure, and leases a dedicated space to host it within the facility. In the traditional colocation model, the client company has sole access to the hardware and full responsibility for managing it; this is ideal for privacy and security but often impractical, particularly during outages or emergencies. Today, most colocation providers offer management and monitoring services for clients who want them.

Managed data centers and colocation facilities are often used to house remote data backup and disaster recovery technology for small and midsized businesses (SMBs).

Most modern data centers—even in-house on-premises data centers—have evolved from traditional IT architecture, where every application or workload runs on its own dedicated hardware, to cloud architecture, in which physical hardware resources—CPUs, storage, networking—are virtualized.  Virtualization  enables these resources to be abstracted from their physical limits, and pooled into capacity that can be allocated across multiple applications and workloads in whatever quantities they require.

Virtualization also enables software-defined infrastructure (SDI)—infrastructure that can be provisioned, configured, run, maintained and ‘spun down’ programmatically, without human intervention.

The combination of cloud architecture and SDI offers many advantages to data centers and their users, including the following:

Optimal utilization of compute, storage, and networking resources. Virtualization enables companies or clouds to serve the most users using the least hardware, and with the least unused or idle capacity.  

Rapid deployment of applications and services. SDI automation makes provisioning new infrastructure as easy as making a request via a self-service portal.  

Scalability. Virtualized IT infrastructure is far easier to scale than traditional IT infrastructure. Even companies using on-premises data centers can add capacity on demand by bursting workloads to the cloud when necessary.  

Variety of services and data center solutions. Companies and clouds can offer users a range of ways to consume and deliver IT, all from the same infrastructure. Choices are made based on workload demands, and include  infrastructure as a service (IaaS) ,  platform as a service (PaaS) , and  software as a service (SaaS) . These services can be offered in a private data center, or as cloud solutions in either a  private cloud ,  public cloud ,  hybrid cloud , or  multicloud  environment.  

Cloud-native development. Containerization and serverless computing, along with a robust open-source ecosystem, enable and accelerate DevOps cycles and application modernization, and make develop-once-deploy-anywhere apps possible.

Servers are powerful computers that deliver applications, services and data to end-user devices. Data center servers come in several form factors:

Rack-mount servers  are wide, flat standalone servers—the size of a small pizza box— designed to be stacked on top of each other in a rack, to save space (vs. a tower or desktop server). Each rack-mount server has its own power supply, cooling fans, network switches, and ports, along with the usual processor, memory, and storage.  

Blade servers  are designed to save even more space. Each blade contains processors, network controllers, memory, and sometimes storage; they're made to fit into a chassis that holds multiple blades and contains the power supply, network management, and other resources for all the blades in the chassis.  

Mainframes  are high-performance computers with multiple processors that can do the work of an entire room of rack-mount or blade servers. The first virtualizable computers, mainframes can process billions of calculations and transactions in real time.

The choice of form factor depends on many factors including available space in the data center, the workloads being run on the servers, the available power, and cost.

Storage systems

Most servers include some local storage capability—called direct-attached storage (DAS)—to enable the most frequently used data (hot data) to remain close to the CPU.

Two other data center storage configurations include network-attached storage (NAS), and a storage area network (SAN).

NAS provides data storage and data access to multiple servers over a standard Ethernet connection. The NAS device is usually a dedicated server with multiple storage media—hard disk drives (HDDs) and/or solid state drives (SSDs).

Like NAS, a SAN enables shared storage, but a SAN uses a separate network for the data and consists of a more complex mix of multiple storage servers, application servers, and storage management software.

A single data center may use all three storage configurations—DAS, NAS, and SAN—as well as  file storage ,  block storage  and  object storage  types.

The data center network, consisting of various types of switches, routers and fiber optics, carries network traffic across the servers (called east/west traffic), and to/from the servers to the clients (called north/south traffic).

As noted above, a data center’s network services are typically virtualized. This enables the creation of software-defined overlay networks, built on top of the network’s physical infrastructure, to accommodate specific security controls or service level agreements (SLAs).

Power supply and cable management

Data centers need to be always-on, at every level. Most servers feature dual power supplies. Battery-powered uninterruptible power supplies (UPS) protect against power surges and brief power outages. Powerful generators can kick in if a more severe power outage occurs.

With thousands of servers connected by various cables, cable management is an important data center design concern. If cables are too near to each other, they can cause cross-talk, which can negatively impact data transfer rates and signal transmission. Also, if too many cables are packed together, they can generate excessive heat. Data center construction and expansion must consider building codes and industry standards to ensure cabling is efficient and safe.

Redundancy and disaster recovery

Data center downtime is costly to data center providers and to their customers, and data center operators and architects go to great lengths to increase resiliency of their systems. These measures include everything from redundant arrays of independent disks (RAIDs) to protect against data loss or corruption in the case of storage media failures, to backup data center cooling infrastructure that keeps servers running at optimal temperatures, even if the primary cooling system fails.
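The RAID arrays mentioned above survive a drive failure because parity blocks let lost data be recomputed. A minimal sketch of RAID 5-style XOR parity follows; it is illustrative only, not tied to any particular controller or filesystem:

```python
def xor_blocks(*blocks: bytes) -> bytes:
    """XOR equal-length byte blocks together (RAID 5-style parity)."""
    result = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            result[i] ^= b
    return bytes(result)

# Parity is written alongside the data blocks:
d1, d2 = b"\x0f\xf0", b"\xaa\x55"
parity = xor_blocks(d1, d2)

# If d1 is lost to a drive failure, XOR of the survivors recovers it:
recovered = xor_blocks(parity, d2)
assert recovered == d1
```

Reconstruction is the same XOR applied to the surviving blocks, which is why a single-parity array tolerates exactly one failed drive.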

Many large data center providers have data centers located in geographically distinct regions, so that if a natural disaster or political disruption occurs in one region, operations can be failed over to a different region for uninterrupted services.

The Uptime Institute uses a four-tier system to rate the redundancy and resiliency of data centers:

Tier I— Provides basic redundancy capacity components, such as uninterruptible power supply (UPS) and 24/7 cooling, to support IT operations for an office setting or beyond.  

Tier II— Adds additional redundant power and cooling subsystems, such as generators and energy storage devices, for improved safety against disruptions.  

Tier III— Adds redundant components as a key differentiator from other data centers. Tier III facilities require no shutdowns when equipment needs maintenance or replacement.  

Tier IV— Adds fault tolerance by implementing several independent, physically isolated redundant capacity components, so that when a piece of equipment fails there is no impact to IT operations.

Environmental controls

Data centers must be designed and equipped to control environmental factors—most of which are interrelated—that can damage or destroy hardware and lead to expensive or catastrophic downtime.

Temperature:  Most data centers employ some combination of air cooling and liquid cooling to keep servers and other hardware operating in the proper temperature ranges. Air cooling is basically air conditioning—specifically, computer room air conditioning (CRAC) targeted at the entire server room, or at specific rows or racks of servers. Liquid cooling technologies pump liquid directly to processors, or in some cases immerse servers in coolant. Data center providers are turning increasingly to liquid cooling for greater energy efficiency and sustainability—it requires less electricity and less water than air cooling.

Humidity:  High humidity can cause equipment to rust; low humidity can increase the risk of static electricity surges (see below). Humidity control equipment includes the aforementioned CRAC systems, proper ventilation, and humidity sensors.

Static electricity:  As little as 25 volts of static discharge can damage equipment or corrupt data. Data center facilities are outfitted with equipment that monitors static electricity and discharges it safely.

Fire:  For obvious reasons, data centers must be equipped with fire-prevention equipment, and it must be tested regularly.
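Operationally, the environmental controls above reduce to keeping each sensor reading inside a safe band. A toy monitoring check is sketched below; the thresholds are illustrative placeholders, since acceptable ranges vary by facility and standard:

```python
# Illustrative operating bands, not a standard; real facilities tune these.
LIMITS = {
    "temp_c": (18.0, 27.0),        # air temperature at server inlet
    "humidity_pct": (40.0, 60.0),  # relative humidity
}

def out_of_range(readings: dict) -> list:
    """Return the names of readings that fall outside their band."""
    alerts = []
    for name, value in readings.items():
        low, high = LIMITS[name]
        if not (low <= value <= high):
            alerts.append(name)
    return alerts

print(out_of_range({"temp_c": 31.2, "humidity_pct": 45.0}))  # ['temp_c']
```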

Modern converged infrastructure (CI) solutions combine servers, storage and networking into one integrated platform.



The environmental footprint of data centers in the United States

Md Abu Bakar Siddik 1 , Arman Shehabi 2 and Landon Marston 3,1

Published 21 May 2021 • © 2021 The Author(s). Published by IOP Publishing Ltd. Environmental Research Letters, Volume 16, Number 6. Citation: Md Abu Bakar Siddik et al 2021 Environ. Res. Lett. 16 064017. DOI: 10.1088/1748-9326/abfba1


Author affiliations

1 Department of Civil & Environmental Engineering, Virginia Tech, Blacksburg, VA, United States of America

2 Energy Technologies Area, Lawrence Berkeley National Laboratory, Berkeley, CA, United States of America

Author notes

3 Author to whom any correspondence should be addressed.

Arman Shehabi https://orcid.org/0000-0002-1735-6973

Landon Marston https://orcid.org/0000-0001-9116-1691

  • Received 21 January 2021
  • Accepted 26 April 2021
  • Published 21 May 2021

Peer review information

Method: single-anonymous. Revisions: 2. Screened for originality: yes.


Much of the world's data are stored, managed, and distributed by data centers. Data centers require a tremendous amount of energy to operate, accounting for around 1.8% of electricity use in the United States. Large amounts of water are also required to operate data centers, both directly for liquid cooling and indirectly to produce electricity. For the first time, we calculate spatially-detailed carbon and water footprints of data centers operating within the United States, which is home to around one-quarter of all data center servers globally. Our bottom-up approach reveals that one-fifth of data center servers' direct water footprint comes from moderately to highly water stressed watersheds, while nearly half of servers are fully or partially powered by power plants located within water stressed regions. Approximately 0.5% of total US greenhouse gas emissions are attributed to data centers. We investigate tradeoffs and synergies between data centers' water and energy utilization by strategically locating data centers in areas of the country that will minimize one or more environmental footprints. Our study quantifies the environmental implications behind our data creation and storage and shows a path to decrease the environmental footprint of our increasing digital footprint.


1. Introduction

Data centers underpin our digital lives. Though relatively obscure just a couple of decades ago, data centers are now critical to nearly every business, university, and government, as well as those that rely on these organizations. Data centers support servers, digital storage equipment, and network infrastructure for the purpose of large-scale data processing and data storage [1]. Increasing demand for data creation, processing, and storage from existing and emerging technologies, such as online platforms/social media, video streaming, smart and connected infrastructure, autonomous vehicles, and artificial intelligence, has led to exponential growth in data center workloads and compute instances [2].

The global electricity demand of data centers was 205 TWh in 2018, which represents about 1% of total global electricity demand [ 3 ]. The United States houses nearly 30% of data center servers, more than any other country [ 3 – 5 ]. In 2014, 1.8% of US electricity consumption was attributable to data centers, roughly equivalent to the electricity consumption of New Jersey [ 1 ]. Previous studies found power densities per floor area of traditional data centers almost 15–100 times as large as those of typical commercial buildings [ 6 ], and data center power density has increased with the proliferation of compute-intensive workloads [ 7 ]. Though the amount of data center computing workloads has increased nearly 550% between 2010 and 2018, data center electricity consumption has only risen by 6% due to dramatic improvements in energy efficiency and storage-drive density across the industry [ 1 , 3 ]. However, it is unclear whether energy efficiency improvements can continue to offset the energy demand of data centers as the industry is expected to continue its rapid expansion over the next decade [ 8 ].
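The efficiency gain implied by these figures is easy to verify: a 550% workload increase is a factor of 6.5, against a factor of only 1.06 for electricity use over the same period.

```python
workload_growth = 1 + 5.50   # +550% in computing workloads, 2010-2018
energy_growth = 1 + 0.06     # +6% in electricity consumption, same period

# Energy per unit of workload in 2018 relative to 2010:
relative_intensity = energy_growth / workload_growth
print(f"{relative_intensity:.2f}")           # 0.16
print(f"{1 - relative_intensity:.0%} drop")  # 84% drop
```

That is, energy use per unit of computing fell to roughly one-sixth of its 2010 level, which is how workloads could grow so much faster than electricity demand.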

The growing energy demand of data centers has attracted the attention of researchers and policymakers, not only due to the scale of the industry's energy use but also because of the implications that energy consumption has for greenhouse gas (GHG) emissions and water use. Data centers directly and indirectly consume water and emit GHGs in their operation. Most data centers' energy demands are supplied by the electricity grid, which distributes electricity from connected power plants. Electricity generation is the second largest water consumer [9] and the second largest emitter of GHGs in the US [10]. These environmental externalities can be attributed to the place of energy demand using several existing approaches [11, 12].

In addition to the electricity consumed directly by data centers, electricity is used to supply treated water to data centers and treat the wastewater discharged by data centers. Like data centers, water and wastewater facilities are major electricity consumers, responsible for almost 1.8% of total electricity consumption in the US in 2013 [ 13 ]. The electricity required in the provisioning and treatment of water and treatment of discharged wastewater also emits GHGs that can be attributed to data centers. Likewise, water used to generate the electricity used by water and wastewater utilities in their service of data centers contributes to the water footprint of these data centers. Water is also used directly within a data center to dissipate the immense amount of heat that is produced during its operation.

The geographic location [ 14 , 15 ] and the local electricity mix [ 16 ] are strong determinants of a data center's carbon footprint, though these spatial details are often excluded in data center studies. A preliminary water footprint assessment of data centers by Ristic et al [ 17 ] provided a range of water footprints associated with data center operation. Although Ristic et al provided general estimates based on global average water intensity factors, their study highlights the importance of considering both direct and indirect water consumption associated with data center operation. Moreover, Ristic et al highlights the importance of considering the type of power plants supplying electricity to a data center and the type/size of a data center, as each of these factors can significantly impact energy use and indirect water footprint estimates.

In this study we utilize spatially-detailed records of data center operations to provide the first sub-national estimates of data center water and carbon footprints. Here, water footprint is defined as the consumptive blue water use (i.e. surface water and groundwater). The carbon footprint of a data center, expressed as equivalent CO2, is used to represent its global warming potential. Our assessment focuses on the operational environmental footprint of data centers (figure 1), which includes the power plant(s), water supplier, and wastewater treatment plant servicing the data center. The non-operational stages of a data center's life cycle (e.g. manufacturing of servers) consume considerably less energy [18] and are excluded from this study. The spatial detail afforded by our approach enables more accurate estimates of water consumption and GHG emissions associated with data centers than previous studies. Moreover, we evaluate the impact of data center operation on the local water balance and identify data centers located in, or indirectly reliant upon, already water stressed watersheds. We investigate the following questions: (i) What is the direct and indirect operational water footprint of US data centers? (ii) Which watersheds support each data center's water demand and what portion of these watersheds are water stressed? (iii) How much GHG emissions are associated with the operation of data centers? (iv) To what degree can strategic placement of future data centers within the US reduce the industry's operational water and carbon footprints?

Figure 1.  The system boundaries and interlinkages defining the operational water and carbon footprints of data centers. Specific power plants, water utilities, and wastewater treatment (WWT) utilities are connected to each data center through their provisioning of electricity and water. Power plants emit GHGs and consume water in the production of electricity. These environmental impacts are attributed to data centers in proportion to how much electricity the data center uses (red and blue dashed lines connecting facilities). The GHG emissions and water consumption associated with the provisioning of treated water and disposal of wastewater, including the GHGs and water consumed in the generation of the electricity supplied to these facilities, are also attributed to data centers in proportion to their use of these utilities. Data centers do not directly emit GHGs but they do directly consume water to dissipate heat. All these facilities work together to keep data centers operational and contribute to the water and carbon footprint of data centers.
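The proportional attribution described in this caption can be sketched in a few lines; every number below is hypothetical, chosen only to show the mechanics of assigning plant-level emissions and water use to a data center in proportion to electricity drawn:

```python
# Hypothetical supplying plants:
# plant: (annual_generation_MWh, co2e_tonnes, water_consumed_m3)
plants = {
    "gas_plant":  (1_000_000, 400_000, 700_000),
    "coal_plant": (2_000_000, 2_000_000, 2_200_000),
}

# Electricity the data center draws from each plant (MWh), also hypothetical:
draws = {"gas_plant": 50_000, "coal_plant": 10_000}

# Each plant's emissions and water are attributed in proportion to the
# fraction of its generation the data center uses.
carbon = sum(draws[p] / gen * co2 for p, (gen, co2, _) in plants.items())
water = sum(draws[p] / gen * w for p, (gen, _, w) in plants.items())
print(round(carbon), round(water))  # 30000 46000
```

The same proportional rule extends to the water and wastewater utilities serving the data center, whose own electricity use is attributed back through their supplying plants.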


We utilize spatially detailed records on data centers, electricity generation, GHG emissions, and water consumption to determine the carbon footprint and water footprint of data centers in the US. Our approach connects specific power plants, water utilities, and wastewater treatment plants to each data center within the US. All data used in this study are for the year 2018, the most recent year where all data are publicly available. A visual summary of our methods is shown in supplementary figure S1 (available online at stacks.iop.org/ERL/16/064017/mmedia ).

2.1. Data center location and energy use

Information availability on data center location and size varies depending on its type and owner. Ganeshalingam et al [ 4 ] reports likely locations of in-house small and midsize data centers, which house approximately 40% of US servers. Detailed information on colocation and hyperscale data centers is derived from commercial compilations [ 19 – 21 ] that get direct support and input from data center service providers.

Table 1.  Combined direct and indirect water consumption and GHG emissions (carbon equivalence) by data center type. Water intensity and carbon intensity are reported per MWh of electricity used and per computing workload. Better energy utilization, more efficient cooling systems, and increased workloads per deployed server have increased the water efficiency of larger data centers. Computing workloads in hyperscale data centers are almost six times more water efficient than in internal data centers. Workload estimates are based on traditional and cloud workloads from [2, 3].

where PUE_s is the power usage effectiveness of space type s, and A is the floor area of the data center in ft². We account for potential overstatement of data center capacity [4], a lack of distinction between gross and raised floor area, and unfilled rack capacity by scaling our server counts to match the 2018 estimate of servers by data center type [3], as shown in table 1 and figure S2. Scaled server estimates are then spatially distributed in proportion to the current spatial distribution of installed server bases. The number of servers by state is shown in figure S2.
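The rescaling step can be sketched as follows, with hypothetical state counts and national total rather than the study's data:

```python
# Sketch of the scaling step: raw per-state server counts are rescaled so
# their total matches an independent national estimate, preserving the
# spatial distribution. All numbers are illustrative.
raw_counts = {"VA": 120_000, "CA": 90_000, "TX": 60_000}
national_estimate = 360_000

scale = national_estimate / sum(raw_counts.values())
scaled = {state: n * scale for state, n in raw_counts.items()}

print(round(scaled["VA"]))  # 160000
```

Each state keeps the same share of the total; only the magnitude is corrected to the independent national server estimate.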

Power usage effectiveness (PUE) is a key metric of data center energy efficiency [23]. A value of 1.0 is ideal, as it indicates all energy consumed by a data center is used to power computing devices. Energy used for non-computing components, such as lighting and cooling, increases the PUE above 1.0 (see equation (2)). Generally, a data center's PUE is inversely proportional to its size, since larger data centers are better able to optimize their energy usage. Average PUE values and energy use by data center type were taken from Masanet et al [3] and are shown in table 1 and table S1.
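PUE is total facility energy divided by the energy delivered to IT equipment, so the computation itself is a one-liner (energy figures below are illustrative):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy over IT energy (>= 1.0)."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Cooling and lighting overhead pushes PUE above the ideal of 1.0:
print(pue(1_500_000, 1_200_000))  # 1.25
```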

2.2. Electricity generation, water consumption, and GHG emissions

Power plant-specific electricity generation and water consumption data come from the US Energy Information Administration (EIA) [24]. Of the approximately 9000 US power plants, the EIA requires nearly all to report electricity generation; however, only power plants with generation capacity greater than 100 MW (representing three-fourths of total generation) must report water consumption. We assigned national average values of water consumption per unit of electricity generation by fuel type (i.e. water intensity; m³ MWh⁻¹) to all power plants with unspecified water consumption. Operational water footprints of solar and wind power were taken from Macknick et al [25]. Following Grubert [26], we assign all reservoir evaporation to the dam's primary purpose (e.g. hydropower). We connected hydroelectric dams with their respective power plants using data from Grubert [27]. Reservoir-specific evaporation comes from Reitz et al [28].
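The gap-filling step for unreported water consumption can be sketched as below; the intensity values are placeholders, not the actual EIA or Macknick et al figures:

```python
from typing import Optional

# Sketch: plants that do not report water consumption get a national-average
# water intensity (m^3 per MWh) by fuel type. Values here are illustrative.
AVG_INTENSITY_M3_PER_MWH = {"coal": 1.0, "gas": 0.5, "nuclear": 1.1, "wind": 0.0}

def water_consumption(fuel: str, generation_mwh: float,
                      reported_m3: Optional[float]) -> float:
    """Use reported water consumption when available, else fuel-type average."""
    if reported_m3 is not None:
        return reported_m3
    return AVG_INTENSITY_M3_PER_MWH[fuel] * generation_mwh

print(water_consumption("gas", 10_000, None))     # 5000.0 (imputed)
print(water_consumption("gas", 10_000, 5_500.0))  # 5500.0 (reported)
```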

The US Environmental Protection Agency's eGRID database [ 29 ] provided the GHG emissions associated with each power plant. GHG emissions are converted to an equivalent amount of carbon dioxide (CO₂-eq) with the same global warming potential so as to derive a single carbon footprint metric [ 30 ]. Direct GHG emissions during the operation of data centers are negligible [ 18 ] and are therefore not considered in this study.

Data centers, water suppliers, and wastewater treatment plants typically utilize electricity generated from a mix of power plants connected to the electrical grid. Within the grid, electricity supply matches electricity demand by balancing electricity generated within, and transferred into and out of, a power control area (PCA). Though it is infeasible to trace an electron generated by a particular power plant to the final electricity consumer, there are several approaches to relate electricity generation to electricity consumption (Siddik et al [ 31 ] summarize the most common approaches).

Here, we primarily rely on the approach used by Colett et al [ 32 ] and Chini et al [ 33 ] to identify the generative source of electricity supplied to any given data center. This approach assesses electricity generation and distribution at the PCA level, where it is primarily managed. PCA boundaries are derived from the Homeland Infrastructure Foundation-Level Data [ 34 ] and cross-checked against Form EIA-861 [ 35 ], which identifies the PCAs operating in each state. Annual inter-PCA electricity transfers reported by the Federal Energy Regulatory Commission [ 36 ] are also represented within this approach. A data center (as well as water and wastewater utilities) draws on electricity produced within its PCA unless the total demand of all energy consumers within the PCA exceeds local generation, in which case electricity imports from other PCAs are utilized. If a PCA's electricity production equals or exceeds its electricity demand, it is assumed all electricity imports pass through the PCA and are re-exported for utilization in other PCAs. Siddik et al [ 31 ] note that water and carbon footprints are sensitive to the attribution method used to connect power plants to energy consumers. We therefore conduct a sensitivity analysis (see the supporting information for additional details) to test the degree to which our electricity attribution method affects our results. We also test different assumptions regarding the water footprint of hydropower generation, as this too is a key source of uncertainty.

We focus on the annual temporal resolution and assume an average electricity mix proportional to the relative annual generation of each contributing power plant. Though the electricity mix within a PCA can fluctuate hourly depending on balancing measures, these intra-annual variations will not significantly impact our annual-level results. While it is infeasible to determine the precise amount of electricity each power plant provides to each data center, water utility, and wastewater treatment plant, our approach will enable us to estimate where each facility is most likely to draw its electricity. The dependency of a data center on local and imported electricity from other PCAs was calculated using equations ( 3 ) and ( 4 ).
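A minimal sketch of the local-versus-imported split behind equations (3) and (4), under the simplifying assumptions described above (local generation is consumed first; any shortfall is allocated across linked PCAs in proportion to their transfers); the PCA names and volumes are invented:

```python
def supply_shares(total_demand_mwh, local_generation_mwh, imports_by_pca_mwh):
    """Split a PCA's consumption between local generation and imports.
    A sketch of the paper's attribution logic, not its exact notation:
    local generation covers demand first; the remaining share is divided
    among linked PCAs in proportion to the electricity they transfer in."""
    local_share = min(1.0, local_generation_mwh / total_demand_mwh)
    shortfall = 1.0 - local_share
    total_imports = sum(imports_by_pca_mwh.values())
    import_shares = {
        pca: shortfall * (mwh / total_imports) if total_imports else 0.0
        for pca, mwh in imports_by_pca_mwh.items()
    }
    return local_share, import_shares

# A PCA generating 75 MWh against 100 MWh of demand, importing from two neighbors:
local, imports = supply_shares(100, 75, {"PCA_i": 20, "PCA_j": 60})
print(local, imports)  # → 0.75 {'PCA_i': 0.0625, 'PCA_j': 0.1875}
```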

where Import_con is defined as the electricity from a linked PCA i that was consumed within PCA p. Any imported electricity not consumed within PCA p is re-exported.

Adjusted electricity consumption from the PCAs was assigned to the power plants using equation ( 5 ).

2.3. Water consumption and GHG emissions associated with data centers

The indirect water and carbon footprint of each data center consists of water consumption or GHG emissions associated with the generation of (i) electricity utilized during data center operation, (ii) electricity used by water treatment plants for treatment and supply of cooling water to data centers, and (iii) electricity used by wastewater treatment plants to treat the wastewater generated by a data center. The GHG emissions or water consumption of a power plant supplying electricity to a data center is attributed to the data center as follows:
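The attribution rule can be sketched as a simple proportional allocation; the figures below are invented for illustration:

```python
def attributed_footprint(dc_mwh_from_plant, plant_generation_mwh, plant_footprint):
    """Attribute a power plant's annual water consumption (m3) or GHG
    emissions (t CO2-eq) to a data center in proportion to the share of the
    plant's generation that the data center consumes. A sketch of the
    paper's attribution rule, not its exact equation."""
    return plant_footprint * dc_mwh_from_plant / plant_generation_mwh

# A data center drawing 2,000 MWh from a plant generating 1,000,000 MWh
# that consumes 700,000 m3 of water annually:
print(attributed_footprint(2_000, 1_000_000, 700_000))  # → 1400.0
```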

Although the IPCC does not consider water treatment a notable emitter of GHGs [ 37 ], wastewater treatment plants are a major source of GHG emissions [ 38 , 39 ]. In 2017, total GHG emissions from wastewater treatment plants were estimated at 20 million metric tons, with a direct emission rate of 0.3 kg CO₂-eq per m³ of wastewater treated [ 38 , 39 ]. In the absence of facility-specific emission data, we used this average emission rate for all wastewater generated by data center operation [ 39 ]. No direct GHG emissions are assumed to be associated with data center operation at the facility [ 18 ].

The EPA Safe Drinking Water Information System contains information on the location, system type, and source of water for each public water and wastewater utility [ 40 , 41 ]. We assumed the nearest non-transient water treatment plant and wastewater treatment plant services a data center's water demand and wastewater management, respectively. After calculating the water supply requirement of a data center (discussed later in this section), the electricity needed for treatment and distribution of cooling water can be calculated using the data from Pabi et al [ 13 ] (see table S2). Water and wastewater treatment plants were linked to power plants (as described previously) to estimate the indirect water footprint associated with electricity required to distribute and treat water and wastewater used by a data center. We then sum the water consumed by each power plant to directly or indirectly service a data center to determine the total indirect water footprint of that data center. The indirect water footprint associated with each power plant was also aggregated within watershed boundaries to determine which water sources each data center was reliant upon.

Direct water consumption of a data center can be estimated from its heat generation capacity [ 42 ], which is related to the amount of electricity used [ 43 ]. Estimates of data center-specific electricity demand were multiplied by the typical water cooling requirement [ 1 ] of 1.8 m³ MWh⁻¹ to estimate the direct water footprint of each data center. The direct water consumption is assigned to the watershed where the water utility supplying the data center withdraws its water.
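A minimal sketch of this estimate, using the 1.8 m³ MWh⁻¹ cooling requirement cited above:

```python
COOLING_WATER_INTENSITY_M3_PER_MWH = 1.8  # typical requirement cited in the text

def direct_water_footprint_m3(annual_electricity_mwh: float) -> float:
    """On-site cooling water consumption estimated from annual electricity use."""
    return annual_electricity_mwh * COOLING_WATER_INTENSITY_M3_PER_MWH

# A hypothetical data center consuming 1,000 MWh in a year:
print(direct_water_footprint_m3(1_000))  # → 1800.0
```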

Data center wastewater is largely composed of blowdown; that is, the portion of cooling water removed from circulation and replaced with freshwater to prevent excessive concentration of undesirable components [ 44 ]. We assume all data centers utilize potable water supplies and cycle this water until the concentration of dissolved solids is roughly five times that of the supplied water [ 44 ]. We calculate blowdown from data center cooling towers using the following commonly employed approach [ 45 ]:
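One common form of this cooling-tower mass balance (a generic engineering sketch; the paper's exact equation may differ) relates blowdown to evaporation through the cycles of concentration (CoC), where CoC = 5 reflects the assumption that dissolved solids reach roughly five times the supply-water concentration:

```python
def cooling_tower_flows(evaporation_m3, cycles_of_concentration=5.0):
    """Standard cooling-tower water balance:
    blowdown = evaporation / (CoC - 1); make-up = evaporation + blowdown.
    Returns (blowdown, make-up) in m3."""
    blowdown = evaporation_m3 / (cycles_of_concentration - 1.0)
    makeup = evaporation_m3 + blowdown
    return blowdown, makeup

# 400 m3 of evaporative loss at five cycles of concentration:
print(cooling_tower_flows(400.0))  # → (100.0, 500.0)
```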

2.4. Water scarcity footprint

The water scarcity footprint ( WSF ; as defined by ISO 14046 and Boulay et al [ 46 ]) indicates the pressure exerted by consumptive water use on available freshwater within a river basin and determines the potential to prevent other societal and environmental water users from meeting their water demands. We quantified the WSF of data centers using the AWARE method set forth by Boulay et al [ 46 ] (see the Supporting Information for more details). Other societal and environmental water use data, as well as data on natural water availability within each US watershed, come from [ 47 – 49 ].
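Conceptually, the AWARE-based WSF is the volumetric consumption in each subbasin weighted by a scarcity characterization factor; the basin names and CF values below are invented for illustration (real CFs come from Boulay et al [46]):

```python
def water_scarcity_footprint(consumption_by_basin, cf_by_basin):
    """Sum volumetric water consumption (m3) weighted by each basin's
    scarcity characterization factor (m3 eq per m3 consumed)."""
    return sum(v * cf_by_basin[b] for b, v in consumption_by_basin.items())

consumption = {"arid_basin": 1_000.0, "humid_basin": 1_000.0}   # m3 consumed
cf = {"arid_basin": 40.0, "humid_basin": 0.5}                   # scarcity weights

# Equal volumes, very unequal scarcity-weighted impact:
print(water_scarcity_footprint(consumption, cf))  # → 40500.0
```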

3.1. The water footprint of data centers

The total annual operational water footprint of US data centers in 2018 is estimated at 5.13 × 10⁸ m³. Data center water consumption is comprised of three components: (i) water consumed directly by the data center for cooling and other purposes (figure 2(A)), (ii) water consumed indirectly through electricity generation (figure 2(B)), and (iii) water consumed indirectly via the water embedded in the electricity consumption of the water and wastewater utilities servicing the data center (figure 2(C)). The data center industry directly or indirectly draws water from 90% of US watersheds, as shown in figure 3(A).


Figure 2.  The blue water footprint (m³) of US data centers in 2018, resolved to each subbasin (8-digit Hydrologic Unit Code). (A) Direct water footprint of data centers, (B) indirect water footprints associated with electricity utilization by data center equipment, and (C) indirect water footprints associated with treatment of supplied cooling water and treatment of generated wastewater.


Figure 3.  The subbasin or state of direct and indirect environmental impact associated with data center operation. (A) Water footprint (m³). (B) WSF (m³ US-eq water). (C) Carbon footprint (tons CO₂-eq/y).

Roughly three-fourths of US data centers' operational water footprint is from indirect water dependencies. The indirect water footprint of data centers in 2018 due to their electricity demands is 3.83 × 10⁸ m³, while the indirect water footprint attributed to water and wastewater utilities serving data centers is several orders of magnitude smaller (4.50 × 10⁵ m³). Nationally, we estimate that 1 MWh of energy consumption by a data center requires 7.1 m³ of water. However, this national average masks the large spatial variation (range 1.8–105.9 m³) in water demand associated with a data center's energy consumption. Data centers are indirectly dependent on water from every state in the contiguous US, much of which is sourced from power plants drawing water from subbasins in the eastern and western coastal states. Less than one-fifth of the industry's total electricity demand is from data centers in the West and Southwest US (regions as defined by NOAA [ 50 ]; see outlined areas in figures 2–5, and figure S4 for region identification), yet nearly one-third of the industry's indirect water footprint is attributed to data centers in these regions. Indirect water consumption associated with energy production in Southwest subbasins is particularly high, despite the relatively low electricity supplied from this region, due to the disproportionate amount of electricity from water-intensive hydroelectric facilities and the high evaporative potential in this arid region. Conversely, the Southeastern region consumes one-quarter of the electricity used by the industry but only one-fifth of the indirect water, since data centers in this region source their electricity from less water-intensive sources.

On-site, direct water consumption of US data centers in 2018 is estimated at 1.30 × 10⁸ m³. Collectively, data centers are among the top ten water-consuming industrial or commercial industries in the US [ 47 ]. Approximately 1.70 × 10⁷ m³ of water directly consumed by data centers is sourced from a different subbasin than the location of the installed servers. Large direct water consumption in the Northeast, Southeast, and Southwest regions indicates clustering of servers in these regions. Combined direct and indirect water and carbon intensities are broken down by data center type in table 1.

3.2. Reliance of data centers on scarce water supplies

The WSF of data centers in 2018 is 1.29 × 10⁹ m³ of US-equivalent water consumption, more than twice the volumetric water footprint reported in the previous section. The WSF (including both direct and indirect water requirements) per unit of energy consumption is 17.9 m³ US-eq water MWh⁻¹, more than double the nationally averaged water intensity (7.1 m³ MWh⁻¹) that does not account for water scarcity. WSFs that are larger than volumetric water footprints indicate that data centers disproportionately utilize water resources from watersheds experiencing greater water scarcity than average.

Only one-fourth of the volumetric water footprint of data centers resulted from onsite water use, yet more than 40% of the WSF is attributed to direct water consumption. This indicates that the direct water consumption of data centers, which occurs close to where the data center is located, is skewed toward water-stressed subbasins compared to their indirect water consumption, which is distributed more broadly geographically. We find that most of the watersheds that data centers draw from, particularly those in the Eastern US, face little to no water stress on average. In contrast, many of the watersheds in the Western US exhibit high levels of water stress, which is exacerbated by data centers' direct and indirect water demands. Combined, the West and Southwest watersheds supply only 20% of the direct water and 30% of the indirect water consumed by data centers, while hosting approximately 20% of the nation's servers. Yet 70% of the overall WSF occurs in these two regions (figure 3(B)), indicating a disproportionate dependency on scarce waters in the western US.

3.3. GHG emissions attributed to data centers

Total GHG emissions attributed to data centers in 2018 were 3.15 × 10⁷ tons CO₂-eq, almost 0.5% of total GHG emissions in the US [ 10 ]. A little over half (52%) of the total emissions of data center operations are attributed to the Northeast, Southeast, and Central US, which have a high concentration of thermoelectric power plants along with a large number of data centers (figure 3(C)). Almost 30% of the data center industry's emissions occur within the Central US, which relies heavily on coal and natural gas to meet its electricity demand. Yet only 10% of the industry's energy demand comes from the Central US, and just 9% of the water consumption associated with data center operations occurs in this region. Moreover, the Central region is a net exporter of electricity to other regions, providing electricity for data centers located in the Northeast and Southeast regions, which house almost one-third of servers. Yet the generation of less carbon-intensive electricity in the Northeast (hydroelectricity) and Southeast (wind/solar) regions means that while their electricity consumption comprises 34% of data centers' national electricity demand, these regions constitute only 23% of the industry's GHG emissions. The GHG emissions from treating the wastewater generated by data centers are around 550 tons/y (0.002% of the total GHG emissions associated with data centers).

3.4. Where to locate data centers to minimize water and carbon footprints

Our results indicate significant variability in environmental impacts depending on where a data center is located. Here we explore how the geographic placement of a data center can lead to improved environmental outcomes. We find that the total water intensity of a data center can range from 1.8 to 106 m³ MWh⁻¹, the water scarcity intensity from 0.5 to 305 m³ US-eq MWh⁻¹, and the carbon intensity from 0.02 to 1 ton CO₂-eq MWh⁻¹ depending on where the data center is placed (figure 4). Data center placement decisions are complicated by the electricity grid, which displaces environmental impacts from the physical location of a data center.


Figure 4.  A data center's environmental footprint is highly contingent on where it is located. The (A) water intensity (m³ MWh⁻¹), (B) water scarcity intensity (m³ US-eq MWh⁻¹), and (C) GHG emissions intensity (tons CO₂-eq MWh⁻¹) of a hypothetical 1 MW data center placed in each of the 2110 subbasins of the continental United States.

Figure 5 depicts subbasins in the top quartile of environmental performance as it relates to water footprint (5(A)), WSF (5(B)), and carbon footprint (5(C)) per MWh of electricity used by a hypothetical data center located within each subbasin. Less than 5% of subbasins are in the top quartile of environmental performance for both WSF and carbon footprint (hatched areas in figures 5(B) and (C)), meaning that 40% of subbasins will require a trade-off between reducing WSFs and carbon footprints. The remaining 55% of subbasins (white areas shared by figures 5(B) and (C)) are not among the best locations to place a data center for either water or GHG reduction. Though the water footprint and WSF are related concepts, we show that nearly one-fifth of subbasins that were in the top quartile with respect to the water footprint are in the bottom quartile for WSF. In other words, a data center placed in these basins would use less water than 75% of potential sites, but it would draw that water from subbasins facing higher levels of water scarcity. In general, locating a data center within the Northeast, Northwest, and Southwest will reduce the facility's carbon footprint, while locating a data center in the Midwest and portions of the Southeast, Northeast, and Northwest will reduce its WSF.


Figure 5.  The (A) water footprint, (B) WSF, and (C) carbon footprint of data centers can be reduced by placing them in subbasins with the smallest footprint (top quartile of all subbasins), as denoted by the shaded subbasins in each panel. The bar graphs represent the percent reduction/increase of each environmental footprint within the shaded subbasins compared to the national average data center environmental footprint. Hatched areas indicate subbasins that are among the most (top quartile) environmentally favorable locations for both water scarcity and GHG emissions.

In the coming years, cloud and hyperscale data centers will replace many smaller data centers [ 3 ]. This shift will lower the environmental footprint in some instances but introduce new environmental stress in other areas. Assuming added servers employ similar technology to existing servers and are placed in cloud and hyperscale data centers in proportion to the current spatial distribution of data centers (i.e. a business-as-usual scenario), these new data center servers will have a collective water footprint of 77.77 × 10⁶ m³ (15% of the current industry total), a WSF of 170.56 × 10⁶ m³ US-eq (9%), and 4.36 × 10⁶ tons CO₂-eq (14%). However, if these new servers are strategically placed in areas identified as having a lower environmental footprint, their water and carbon burden could be significantly reduced.
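The strategic-placement idea can be sketched as a simple quartile screen, mirroring the hatched areas of figure 5 (subbasin names and intensities are invented):

```python
def top_quartile(values):
    """Return the subbasins whose footprint intensity falls in the lowest
    quartile (best environmental performance) of the given dict."""
    ranked = sorted(values, key=values.get)          # ascending by intensity
    return set(ranked[: max(1, len(ranked) // 4)])   # keep the best quarter

# Hypothetical per-MWh intensities for four subbasins:
carbon = {"b1": 0.05, "b2": 0.9, "b3": 0.2, "b4": 0.03}   # t CO2-eq/MWh
wsf = {"b1": 2.0, "b2": 7.0, "b3": 250.0, "b4": 1.0}      # m3 US-eq/MWh

# Subbasins favorable for BOTH carbon and water scarcity (the "hatched" set):
both = top_quartile(carbon) & top_quartile(wsf)
print(both)  # → {'b4'}
```

Optimizing on only one of the two sets reproduces the trade-off discussed in the text: a subbasin in the best carbon quartile may sit far outside the best water scarcity quartile.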

The WSF and carbon footprint of new data centers can be reduced by 153.00 × 10⁶ m³ US-eq (90% less than business-as-usual expansion) and 2.34 × 10⁶ tons CO₂-eq (55%), respectively (figure 6(A)), if they are placed in areas with the lowest carbon and WSFs (hatched areas in figure 5). However, placing all new data centers within a small area may strain local energy and water infrastructure due to their collective water and energy demands. Data centers can instead be dispersed more broadly across areas that are favorable with respect to water footprint (figure 5(A)), WSF (figure 5(B)), or carbon footprint (figure 5(C)). However, considering only one environmental characteristic can lead to environmental trade-offs (figure 6).


Figure 6.  Percent change in environmental footprints associated with new data center servers compared to the 'business-as-usual' scenario. While the business-as-usual scenario assumes new servers will be placed in proportion to historical server locations, alternative scenarios explicitly consider the environmental implications of data center placement. Scenario A places data center servers in subbasins within the top quartile of all subbasins in environmental performance for both carbon (CF) and water scarcity (WSF) footprints. Scenario B represents server placement within subbasins in the top quartile for carbon footprints, while scenarios C and D represent the best (top 25%) subbasins in which to place data center servers with respect to minimizing WSFs and water footprints (WF), respectively.

4. Discussion and conclusion

The amount of data created and stored globally is expected to reach 175 zettabytes by 2025, nearly a six-fold increase from 2018 [ 51 ]. The role of data centers in storing, managing, and distributing data has remained largely out of view of those dependent on their services. Similarly, the environmental implications of data centers have been obscured from public view. Here, for the first time, we estimate the water and carbon footprints of the US data center industry using infrastructure and facility-level data. Data centers' heavy reliance on water-scarce basins to supply their direct and indirect water requirements not only highlights the industry's role in local water scarcity but also exposes it to risk, since water stress is expected to increase in many watersheds due to rising water demands and more intense, prolonged droughts driven by climate change [ 52 – 54 ]. For these reasons, environmental considerations may warrant attention alongside typical infrastructure, regulatory, workforce, customer/client proximity, economic, and tax considerations when locating new data centers.

The data center industry can take several measures to reduce its environmental footprint and minimize its water scarcity risks. First, the industry can continue its energy efficiency improvements. The ongoing shift to more efficient hyperscale and co-location data centers will lower the energy requirements per compute instance. Software and hardware advances, as well as further PUE improvements, can continue to reduce energy requirements, and thus environmental externalities. For instance, Google has reported quarterly PUE values as low as 1.07 for some of its data centers [ 55 ]. Liquid immersion cooling technologies show promise of further reductions in PUE, with one study reporting a PUE below 1.04 [ 56 ]. The prospect of recovering low-grade heat (i.e. a low-temperature or unstable heat source) from data centers for space or water heating is limited; however, approaches such as absorption cooling and the organic Rankine cycle are promising technologies for generating electricity from waste heat [ 57 ].

Second, the data center industry can make investments in solar and wind energy. Directly connecting data center facilities to wind and solar energy sources ensures that water and carbon footprints are minimized. Purchasing renewable energy certificates from electricity providers does not necessarily reduce the water or carbon footprints of a data center. However, these investments gradually shift the electrical grid toward renewable energy sources, thus lowering the overall environmental impact of all energy users. Data center workloads can be migrated between data centers to align with the portion of the grid where renewable electricity supplies exceed instantaneous demand [ 58 ].

Third, as we show in this study, strategically locating new data centers can significantly reduce their environmental footprint. Climatic factors can make some areas more favorable due to lower ambient temperatures, thereby reducing cooling requirements. Lower cooling requirements reduce both the direct and indirect water consumption, as well as the GHG emissions, associated with data center operation. Since most data centers meet their electricity demands from the grid, the composition of power plants supplying electricity to a data center plays a significant role in its environmental footprint. For an industry centered on technological innovation, we show that real estate decisions may play as large a role as technological advances in reducing the environmental footprint of data centers.

Acknowledgments

L M acknowledges support by the National Science Foundation Grant No. ACI-1639529 (INFEWS/T1: Mesoscale Data Fusion to Map and Model the US Food, Energy, and Water (FEW) system). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. Lawrence Berkeley National Laboratory is supported by the Office of Science of the United States Department of Energy and operated under Contract Grant No. DE-AC02-05CH11231.

Data availability statement

Data center locations come from [ 4 , 19 – 21 ]. Power plant electricity generation, water consumption, and GHG emission data come from [ 35 , 59 , 60 ]. Location of public water utility and wastewater treatment data comes from [ 40 , 41 ]. Study data and code can be found in the Supporting Information, as well as at https://doi.org/10.7294/14504913 . The DOI contains relevant shapefiles, tabular data, and scripts to help replicate and extend our work. All data that support the findings of this study are included within the article (and any supplementary files).

Author contributions

L M conceived and designed the study. M A B S conducted the analysis. A S provided data and fundamental concepts regarding the analysis. All authors contributed to the writing of the manuscript.

Conflict of interest

The authors declare that they have no competing financial interests.

Supplementary data

Data Center to the Cloud

Introduction

Data centers form the backbone of the modern organization, storing and processing its valuable data and the applications that depend on it. In today's digitization environment, effective data center management has become a central concern, and many organizations are choosing to move away from traditional on-premise arrangements toward cloud-based systems (Helali & Omri, 2021). Compared with a conventional on-site data center, cloud computing offers increased scalability, cost-effectiveness, and reduced hardware maintenance, and stronger security measures can also be implemented from a cloud-based platform. Migrating from an on-premise data center to the cloud therefore presents a viable option for organizations seeking to improve productivity and streamline their workflows (Helali & Omri, 2021). This paper explores appropriate reference architectures for transitioning from an on-premise data center to the cloud while incorporating an additional Software-as-a-Service (SaaS) tool. It also examines the challenges such a migration can encounter, the security considerations involved, and the design of these architectures.

Reference Architecture

When organizations decide to transition away from an on-site data center, they must choose reference architectures that let them fully leverage this shift in IT architecture. These architectures should guarantee continuity of the services, and the security of the data, that the organization previously provided from its existing systems (Van Geest et al., 2021). One of the most popular reference models, the hybrid cloud architecture, suits organizations that wish to migrate their existing infrastructure to the cloud while adopting additional SaaS tools. This architecture comprises a "mesh" of on-premise and cloud resources integrated to extend the current data center. The cloud resources are incorporated into the existing infrastructure, and SaaS tools are used alongside them to provide additional capabilities such as scalability, automation, security, analytics, and monitoring (Van Geest et al., 2021). The design also gives the organization a pathway to move components and services to the cloud gradually, based on demand and need, allowing an incremental transition rather than an abrupt full-scale migration.

As with any significant IT project, adopting a hybrid cloud infrastructure brings unique challenges. One of the main challenges organizations face is ensuring seamless integration of the two platforms. Because the cloud portion of the architecture is largely a virtual platform, compatibility between the cloud resources and the existing data center environment is fundamental (Vinoth et al., 2022). This may involve modifying the current infrastructure to guarantee compatibility and avoid conflicts between the existing environment and the cloud; for example, firewalls in both the cloud and the existing environment may need to be reconfigured to properly secure the hybrid architecture.

Another challenge facing organizations migrating to the cloud is the need for high-quality cloud resources. For organizations looking to exploit the scalability and agility of cloud services, the resources they select must meet or exceed their performance expectations. For instance, if the organization uses a virtual private cloud, the resources must be sized for the application load it intends to run there, as an under-provisioned cloud environment is prone to latency issues (Vinoth et al., 2022). Organizations must also ensure they can implement distributed storage strategies with sufficient capacity for the large data sets they use. Isolation of cloud resources is another key challenge, from a performance perspective but also from a security standpoint: organizations must be able to separate and isolate their cloud resources so that data is not exposed to unauthorized users or systems.

Security Considerations

When transitioning from an on-premise data center to the cloud, adequate security measures must be taken to ensure the integrity of the system and protect the privacy of the organization's sensitive data (Lisdorf & Lisdorf, 2021). The security of both the cloud infrastructure and the SaaS applications must therefore be considered when moving to a hybrid cloud framework. First, to secure the cloud infrastructure, organizations must be able to properly authenticate users and manage their access. This includes setting up access policies that allow only authenticated users to reach specific resources and data, and that prevent unauthorized users from accessing or manipulating sensitive information (Lisdorf & Lisdorf, 2021). Organizations may also choose to implement two-factor authentication, requiring the user to complete two separate steps, for example a username and password combined with a code delivered by SMS or generated by an authenticator application.
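As an illustration of the second factor mentioned above, the following stdlib-only sketch implements RFC 6238 time-based one-time passwords (the kind generated by an authenticator application); it is a minimal example, not production guidance:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """RFC 6238 time-based one-time password using HMAC-SHA1.
    The shared secret is base32-encoded, as in typical authenticator apps."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T = 59 s
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, for_time=59))  # → 287082
```

The server stores the shared secret, computes the same code for the current time step, and compares it (allowing a small window of adjacent steps for clock drift) with the code the user submits.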

Organizations must likewise consider the security of the SaaS applications, which often depends on users' habits. The SaaS environment should be assessed routinely to ensure it contains no malicious code, and adequate safeguards should be in place to protect sensitive data (Lisdorf & Lisdorf, 2021). This can be achieved by applying standards such as encryption or by using additional security measures such as tokens and digital signatures.

Once the security considerations have been addressed and the appropriate reference models chosen, organizations can begin to develop the designs for their hybrid cloud solutions. From a design perspective, the hybrid cloud architecture should integrate the existing on-site data center with the cloud resources while making use of the SaaS tools (Van Geest et al., 2021). A fully designed hybrid cloud architecture should include an authentication layer covering both the on-site data center and the cloud-based resources. In addition, the cloud layer should be able to allocate resources and applications dedicated to the organization's needs, and a firewall should be implemented between the two environments to prevent malicious traffic flows or data manipulation.

As for the SaaS tools, they should provide a platform for the organization's operations, such as records, user and employee management, and analytics. The SaaS layer should also provide additional security and automation measures, for example multi-factor authentication, encryption, monitoring, and alerting (Van Geest et al., 2021), and it should be designed to integrate with both the on-site data center and the cloud-based resources. The ability to integrate the hybrid cloud architecture with the on-site data center lets the organization relocate jobs and applications gradually and without disruption.

Data centers, and the need for effective data center management, are essential for businesses in the current climate of digitization. The move from an on-site data center to the cloud delivers benefits such as increased scalability, cost-effectiveness, and reduced hardware maintenance, and is becoming increasingly popular among organizations. A hybrid cloud architecture is one of the most common choices for organizations moving to the cloud, as it lets the organization use cloud infrastructure while also incorporating SaaS tools. However, implementing such an architecture poses many challenges that must be addressed for the system to be effective. Organizations must focus on the seamless integration of the cloud with the on-site data center, along with proper segmentation and security measures for the cloud resources and SaaS applications. By following suitable reference frameworks and designs, and by weighing the potential challenges and security considerations, organizations can successfully move from an on-site data center to the cloud while taking advantage of the features and benefits of such a transition.

Helali, L., & Omri, M. N. (2021). A survey of data center consolidation in cloud computing systems. Computer Science Review, 39, 100366. https://www.sciencedirect.com/science/article/abs/pii/S157401372100006X

Lisdorf, A., & Lisdorf, A. (2021). Securing the Cloud. Cloud Computing Basics: A Non-Technical Introduction, 131-143. https://link.springer.com/chapter/10.1007/978-1-4842-6921-3_11

Van Geest, M., Tekinerdogan, B., & Catal, C. (2021). Design of a reference architecture for developing smart warehouses in Industry 4.0. Computers in Industry, 124, 103343. https://www.sciencedirect.com/science/article/pii/S0166361520305777

Vinoth, S., Vemula, H. L., Haralayya, B., Mamgain, P., Hasan, M. F., & Naved, M. (2022). Application of cloud computing in banking and e-commerce and related security threats. Materials Today: Proceedings, 51, 2172-2175. https://www.sciencedirect.com/science/article/abs/pii/S2214785321071285


5 Assumptions About How Data Centers of the Future Will Probably Look


Changes around the data center are coming thick and fast. The industry has experienced serious and significant shifts in recent years, and constant development means the evolution of data center tasks and concepts will only continue as we head into the future. There are already many expectations for future data centers. The feeling among experts is that large operators such as Facebook, Google, and Microsoft will scale out their data centers with a mindset of supporting Information Technology (IT) workloads in many new ways.

Organizations need technology and business processes to perform at their best, which explains why data centers are so essential to an organization's IT. But as technologies develop and advance, the way data centers are used and implemented will change as well.

Because of our enthusiasm for what future data centers hold, we've studied patterns in data center development and arrived at some assumptions about where it is headed. Here are five:

  • Smaller data centers. A survey of IT experts by Emerson Network Power found that 58% of respondents believe the data centers of the future will be much smaller than today's, 27% believe they will be bigger, and 14% expect them to stay about the same. Since data centers need ever more storage, you might expect larger facilities with more power and more space. But density is not the same as area; the two are quite different, and here density plays the decisive role. Dense data centers are easier to construct and manage than large ones. Many futuristic innovations can come from working with small, high-density data centers: the layout can be turned inside-out, with IT equipment and server racks pulled out quickly and placed around the center perimeter.
  • Power generation and cooling options. Because these future data centers are expected to have higher density, they will also need greater cooling. Our experts predict that the data centers of the future will be built so that they can provide power and cooling for themselves. Data centers are predicted to be built close to power generation, allowing them to generate the power they consume. They will likely also be built in areas with environmental cooling options, meaning future data centers will not only generate their own power but also be able to dump the heat they create into polar installations, freshwater containment, or geothermal heat pumps. Several factors will determine how the data centers of the future are powered: Climate: whether the data centers are built in regions with cool climates. Economics: whether they are developed in areas with a lower cost of energy. Demand: changes in the demand for storage and computing. Evolving technology: whether loads can shift between locations with high power capacity depending on the time of day or year.
  • Reduced energy consumption. One of the major predictions for future data centers is that they will consume far less power than they currently do. Many experts believe that to produce the same output as today's data centers, future data centers will not need as much energy. The consensus among experts is that by 2025, about one-third of the power data centers receive will come from solar energy. Considering the development of solar power technology today, this makes sense, and the fact that many data centers are tall buildings with nothing to block the sun only reinforces it. Natural gas, wind, and nuclear energy will supply much of the remaining two-thirds.
  • Private clouds. The shift to the cloud is becoming common among organizations, even though they would like to remain in control rather than give it up. To use private cloud infrastructure, organizations need data centers. An IDC survey found that over 28% of total cloud spending went to private cloud. Businesses already use software solutions for private cloud services while keeping their information intact in a corporate-controlled environment, and public cloud infrastructure is used wherever it is required. Many clients do not care where their service comes from, but for organizations, data centers will facilitate private cloud services and ensure workloads are performed in the most secure and effective way.
  • High-performance computing. High-performance computing (HPC) is now very accessible since it became a service in the public cloud. The growth of artificial intelligence (AI) and machine learning (ML) means more applications are built on these crucial technologies. As these applications expand, accessible HPC will become a crucial strategy for businesses and organizations looking to hold on to their cutting-edge advantage. Although prototypes and trials are done within a public cloud framework, the need for large organizations to retain end-to-end control is what turns AI- and ML-based applications into a business differentiator for them, and this is only effective if it happens in the corporate data center.

A few years from now, data centers will play an even more prominent role in our technology landscape. These are just a few of the changes we expect in the near future.


Author Bio: Michael Gorman is a freelance writer and proofreader from the UK. Interested in everyday development, he writes various blog posts and discovers new aspects of human existence every day.

Michael is a guest blogger, all opinions are his own.


Cities and regions with the highest concentration of data centers


by Dgtl Infra

Surging demand for data consumption and storage is driving a rapid expansion of data centers in the United States. These U.S. data centers are located in areas with abundant electricity for their intense power demands, copious amounts of water for cooling, access to fiber connectivity, affordable real estate, and tax incentives, and away from regions prone to natural disasters.

U.S. data centers are primarily located in Northern Virginia, Dallas, Silicon Valley, Phoenix, Chicago, Atlanta, Portland, the New York/New Jersey area, Seattle, and Los Angeles.

Data centers are highly specialized buildings equipped with power and cooling infrastructure that house computer servers and network equipment. Common questions about data centers are as follows.

Where are U.S. data centers more specifically located? The primary data center markets in the United States are located in Northern Virginia, Dallas, Silicon Valley, Phoenix, Chicago, Atlanta, Portland, New York/New Jersey, Seattle and Los Angeles. Secondary markets for data centers include Austin, Boston, Charlotte, Columbus, Denver, Houston, Kansas City, Las Vegas, Miami, Minneapolis, Salt Lake City and San Antonio.

How many data centers are located in the United States? There are more than 2,500 data centers, and of this total, about 50 percent are located in the primary data center markets, while the remaining 50 percent are scattered throughout many smaller secondary data center markets.

Who owns the most data centers? Digital Realty owns the most U.S. data centers with 132 facilities, comprising 1,226 megawatts of white space IT load across 23.4 million net rentable square feet. Equinix operates the second-most data centers with 75 facilities, but only about 60 percent of these are owned, with the remaining roughly 40 percent being leased.

What is the largest data center in the United States? The largest U.S. data center is owned by Meta Platforms (Facebook), spanning 4.6 million square feet in Prineville, Ore., a city situated about 150 miles southeast of Portland. Meta Platforms broke ground on its Prineville data center campus in 2010, with its initial buildings coming online in 2011. Cumulatively, the company is building 11 data centers at this campus as part of a more than $2 billion investment.

The average full-scale data center is 100,000 square feet in size and runs around 100,000 servers, which are essentially powerful computers. Servers are often stored in racks, cabinet-like enclosures that each hold multiple servers.

Though Amazon — both a giant online retailer and the largest cloud computing company in the United States — doesn't disclose details of its infrastructure, including how many servers it uses, research estimates conclude that Amazon Web Services uses at least 454,400 servers in seven data center hubs around the world.

Northern Virginia is the largest data center market in the United States and comprises several counties located 20 to 40 miles west of Washington, D.C. Specifically, Northern Virginia includes Loudoun County (Ashburn, Sterling, Leesburg, Arcola), Prince William County (Manassas, Gainesville, Haymarket), and Fairfax County (Reston, Herndon, Chantilly, Vienna, McLean, Tysons), among others. These data centers are powered by Dominion Energy, the largest electric utility serving Northern Virginia; NOVEC (Northern Virginia Electric Cooperative) is another important power provider. Cloud computing services operating in the area include Amazon Web Services (AWS), Microsoft Azure, Google Cloud, and Meta Platforms (Facebook).

Northern Virginia is followed by Dallas and its neighboring suburbs, including Allen, Carrollton, Fort Worth, Frisco, Garland, Irving, Lewisville, Plano, and Richardson. In total, the area is home to more than 150 data centers and more than 650 megawatts of multi-tenant commissioned power. Its wholesale data center operators include Digital Realty, CyrusOne, QTS Data Centers, and STACK Infrastructure, and retail colocation players such as Equinix, DataBank, Flexential, and Cyxtera can be found there.

Silicon Valley, the country’s third-largest data center market, has more than 160 data centers with more than 625 megawatts of power supplied primarily by Pacific Gas & Electric and Silicon Valley Power. Operating from those facilities is a veritable who’s who of retail colocation and wholesale data center players, as well as cloud computing giants, including Alibaba Cloud and Oracle Cloud.

The greater Phoenix area, with its surrounding communities of Chandler, Mesa, Tempe, Scottsdale, and Goodyear, represents the fourth-ranked data center market, with more than 90 facilities and more than 600 megawatts of multi-tenant commissioned power. Digital Realty is a player there, as are Cyxtera, Flexential, CyrusOne, Iron Mountain, Aligned Data Centers, and Compass Datacenters.

To spotlight one of the data center giants, consider Facebook, which owns and operates 18 data center campuses globally, encompassing 40 million square feet and investment in excess of $20 billion. In the United States, Facebook owns and operates 14 data center campuses spanning 34.2 million square feet at a cost of more than $16 billion, while in Europe and Asia Pacific, the company owns and operates four data center campuses covering 5.4 million square feet at an investment of more than $4 billion.

Not all data centers serve the same function. There are four common types of data centers: onsite, colocation facilities, hyperscale, and edge data centers.

An onsite data center is sited at a company’s headquarters or corporate campus and is relatively easy to maintain and access, according to Lightyear. Its proximity to company operations simplifies network troubleshooting, and it can readily be scaled up or down as needed.

Colocation facilities are defined by CoreSite as a data center offering space to host businesses' computing hardware and servers.

Hyperscale data centers, according to Vertiv, are massive business-critical facilities designed to efficiently support scalable applications and are often associated with big data-producing companies such as Google, Amazon, Facebook, IBM, and Microsoft.

TechTarget reports that an edge data center is located between connected IoT devices and the public cloud or a centralized data center. In an edge computing architecture, time-sensitive data may be processed at the point of origin by an intermediary server that is located in close geographical proximity to the client. The facilities enable new applications by reducing latency and optimizing bandwidth.

Excerpted from a report by Dgtl Infra, with sourcing from other firms. Read the full report here .


DINT #60 - Essay: Data Centers on Hallowed Ground of My Ancestors

In today's issue I share an essay on a data center development story that I covered during my senior editor days at an industry publication. Here, I reveal my opinion on PW Gateway.

Amazon Annual Meeting Fallout: No Pay Gap Numbers on Race and Gender


In April we provided a preview of important initiatives up for vote at Amazon’s 2023 Annual Meeting. Two items caught our interest in the DEI space. Here’s the tally: Shareholders said ‘no’ to Item 13, proposed by shareholder Anne Bartol Butterfield via Arjuna Capital, calling for annual reporting on pay gaps based on race and gender. Shareholders also voted ‘no’ on Item 14, which centered on discrediting Amazon’s efforts toward a more equitable and diverse workplace. So far, Microsoft remains the only large tech firm willing to share pay gap information, a highly regarded data point for setting baselines for improvement and pay equity among workers.

AI, Ain’t I A Woman by Dr. Joy

Check out this amazing piece by Dr. Joy Buolamwini of the Algorithmic Justice League.

Data Center Exec Calls Historic Black Schoolhouse ‘bunch of glass in somebody’s back yard’


As Dice, the two-day event celebrating and educating data center denizens, wrapped up, roughly 50 Virginia residents protested the industry’s use of vital resources in their communities. This is near and dear to my heart because I covered data centers for several months as Senior Editor of an industry publication.


Some data centers do guzzle resources. Some of them are quite noisy and near residences or schools where quiet is sometimes very necessary. Yet, without data centers, how will you stream Netflix or send that 5MB file of photos or get your lab results back in a matter of days? All of these everyday occurrences happen because of the efficient operation of data centers.

While there’s no solution imminent, some data centers seek to make a difference. For instance, Compass Datacenters is standing up a data center campus next to Manassas National Battlefield Park. Sadly, the plot of land has several unmarked burial grounds of Black and Native American people going back generations. I spoke with Compass representatives as one of my last acts as Senior Editor of Data Center Knowledge (DCK). For the first time, I’m writing about the interview with three of Compass’s representatives, one of whom is the co-founder and top executive at the company.

I’d been reporting aggressively for the entire six months I’d been at DCK. I reported on secret firings, network outages costing customers millions in lost revenue, and the social impact of data center expansions. In one article, I wrote passionately about the unmarked graves near Manassas National Battlefield Park. You see, five generations ago my family lived in Virginia. The thought that one of my ancestors is buried in the very field Compass is digging up chilled me. I had a very hard time writing anything about the planned data center development and I believe some of that came through in my writing. I applaud the Compass exec, lawyer, and public relations representative who agreed to speak with me. They allowed me to go on record about everything we discussed that day.

More DINT coverage of Data Center Alley

Are You Moving to the Commonwealth of Amazon?

County Near Data Center Alley Curbs Cooling Noise

Atlanta Captures Overflow from Virginia as Data Center Hot Spot

One statement, made by a Compass exec, chilled me to the core. I had to go off-camera for a few seconds to gather myself as I held my hand over my mouth, shut my eyes tight, and hoped the moment would pass quickly. I wanted to do my job. I wanted to persevere. I wanted to be unbiased and objective while still reporting the truth.

There’s a schoolhouse for Black children that once stood on the grounds of what will soon be Compass’s data center campus. The media rep and attorney shared insights into the development. It’s about as bucolic as a data center campus can get, which really isn’t saying much about the preservation of the wild/natural feel of pre-developed areas data center operators choose for their sites.


The Thornton School, though, has seen better times. Built in 1923, the one-room schoolhouse was a place where the children of the formerly enslaved could learn to read. It was also the last one-room schoolhouse built in Prince William County, according to Historic Prince William, a group dedicated to preserving the region’s rich history. It was one of the few places Black children were welcome in Prince William County at that time.


That’s why during the information session with Compass, I was surprised to hear a statement that may reveal a different side of the company’s intentions. The exec had been reticent during the entire information session. I had to specifically ask him what his impression was of the new development.

During his comments he said to me, “People have to understand, that ‘schoolhouse’ is nothing but a bunch of glass in somebody’s back yard.”

The media representative stepped in at that point and the exec couldn’t finish his thought. He did well up until the ‘bunch of glass’ comment. To call it a bunch of glass in someone’s back yard broke my heart. I didn’t go off screen, though. I was too shocked, too hurt, too dismayed.

His statement was a disregard for the meaning behind even the smallest artifact, the tiniest indication that at one time something great happened there or that someone’s ancestors fought to stay alive there.

I believe the exec’s comment was meant to say no one cares about the schoolhouse these days because it’s in pieces in a back yard.

In the plans for the development is a structure to honor the schoolhouse. The agreement with Prince William County, Virginia, includes assurances that the developers will abide by state laws regarding exhumed human remains.

The gray circles on this map represent known burial sites within and around the borders of the proposed Compass data center campus.


This is where Compass plans to place the Thornton School Interpretive Center to showcase artifacts unearthed during construction relating to a settlement created by the formerly enslaved and the school their children attended, The Thornton School, a one-room schoolhouse.

A closeup of the plans for the interpretive center proposed by Compass.


I’m still conflicted. We need data centers. And the new development addresses all of the sound complaints, plus the water- and energy-consumption issues so detrimental to everyday life for many Virginians living near data centers.

Still, my heart aches for the history of my people. The thought of bulldozers rolling over their remains, well, it’s indescribably difficult to bear. For descendants of Africans in the U.S., knowing the details of our past pulses in our souls with a deep yearning. We smile, we dance, we go about our lives but the desire to know endures.


The author isn’t the only one feeling conflicted. Residents also have many takes on the development of what is known to be burial grounds.

More coverage on Virginia burials for the enslaved and formerly enslaved:

Black Advocates Take Ownership of Ancestors’ History; Legislation Could Help (Stateline / Aallyah Wright)

The race to save African-American cemeteries from being ‘erased’ (Thomson Reuters Foundation / Carey L. Biron)

Long-lost slave cemetery discovered and preserved in rural Virginia (The Washington Post / Linda Wheeler)

In Virginia, A Family Tragedy Stirs New Life in a Burial Ground for the Enslaved (NPR – WAMU 88.5 / Daniella Cheslow)


Schneider Electric Blog


On-premise vs. cloud: Navigating the data center workload dilemma

March 22, 2024

5 min read |  Joe Kramer


A critical success factor in a data center strategy is proper planning. Data center decisions shouldn’t be reactive; they should weigh the longer-term implications for the business. Deciding whether to place data center workloads in the cloud, on-premise, or at the edge requires an assessment of needs, space utilization, and other factors such as deployment speed, infrastructure, and cybersecurity. Companies have become better at making these decisions, thanks to lessons learned in recent years.


It’s one of the topics that I discussed with Thomas Humphrey, North American Business Development Manager for Schneider Electric Modular Data Centers, and Todd Boucher, Founder of Leading Edge Design Group. If you’re curious, check out Part 1, which delves into why the public cloud might not always deliver the expected results and reveals how market trends have shifted strategies.

This blog post covers Part 2 of the series, which discusses what factors should be considered in reviewing a cloud vs on-premise strategy.  

Joe: What drives organizations to keep compute on premise or move it to the cloud?

Thomas:  In data center planning, organizations have to identify their primary business goal by looking at data center capacity, speed of deployment, cost structure, and infrastructure. If there’s free space to use, it could be leveraged for a data center. If they don’t have the space, then how quickly do they need the additional capacity? This is where speed of deployment comes into play. Do you build the data center or use the cloud? There are questions about CapEx vs. OpEx. How do you control costs over time? Also, do you have to beef up infrastructure with fiber routes, for instance? What about cybersecurity? It comes down to assessing the business processes and picking the best solution.

Todd:  With on-premise production environments, oversizing has been an issue. Compute capacity isn’t fully utilized. Virtualization helped but only partially solved the issue. Then, the cloud emerged as a way to right-size the environment. But you also have to oversize environments in the cloud for scalability. Some customers tried to move whole production environments to the cloud, leading to cost overruns. And these are the types of things you need to take into consideration.

Joe: Is right-sizing still a challenge, and how do you solve it?

Todd:  Forecasting capacity is always challenging, especially given how dynamic business is today. Modularity helps address this with prefabricated data centers that you can quickly deploy where needed. Now, owners can make more thoughtful decisions: “I know I will need an asset here. What should I do with it? What should that asset be? What do I already own? What’s its lifecycle?” If customers have too much infrastructure, they realize they can downsize, become more agile, and become more modular. 

Joe: As customers repatriate some assets, what lessons have they learned from their cloud experience?

Todd:  When something is transformational – like the cloud becoming commercially available – the pressure on IT is to deliver more services and more resiliency. But there were a lot of unknowns. When you have a new technology, you don’t know what the experience will be like. And one organization’s experience is different than another’s, even in the same industry. You have varying resources, different business cultures, and unique drivers for the technology. It’s not a homogeneous adoption strategy, and that’s why these lessons are so important.

Thomas:  There’s also a huge variability in how companies perceive their IT. For some, like e-commerce providers, it’s core to their business; it is their lifeblood. They rely on it 100%. Other companies see it as a support function for their core business. So, you have different schools of thought. An automaker will move slower to adopt technology than a highly transactional, web-based company. This is where factors like speed of deployment and data center space availability come into play. And they’re better understood now after the initial cloud experience.

Joe: What impact do hybrid environments that combine on-premise assets with cloud and SaaS applications have on Chief Information Officers (CIOs)?

Todd:  The CIO has to work with other business leaders to determine how an application can support and scale with the business. An IT organization has to be really responsive to business capacity needs, which are difficult to forecast. Having competency and workloads in both on-premise and cloud is what makes the organization responsive. They can make quick choices about where to put resources, where to develop and test applications, and then deploy them in the right environment.

Joe: How important is the role of modularity, especially at the edge?

Thomas:  When you talk about modularity and scalability, in order to be agile, you have to consider the edge. Some data must be processed close to the source, and some can be pushed to the cloud for higher-level analytics. Modularity delivered by prefab data centers helps bring it all together. You plug in the prefab data center and get increments of multimegawatts of power as you scale. It sits at the edge, providing compute where data centers typically weren’t deployed.

Stay tuned for Part 3

In Part 3 of our blog series on cloud vs. on-premise, we will wrap up the discussion by reviewing specific solution options and tying them back to the business challenges. We will address questions such as: How does an edge computing solution fit into current IT infrastructure? How should sustainability be addressed? Which business justifications or use cases should we focus on? And more.

In the meantime, visit our website to access additional data center planning resources.

Tags: Cloud , data center capacity , Data Center Planning , modular data center , on-premise data center





Here’s Why Data Center Cooling Is the Hottest Innovation in the Sector

Forbes Technology Council


Andrew Schaap is CEO and Board Member of Aligned Data Centers .

A team of scientists uses a supercomputer to zero in on the most effective Covid vaccine . A busy executive saves time by prompting ChatGPT to draft an email. Taking stock of its contents, your smart fridge tells you what to pick up at the grocery store.

What do those three things have in common? They’re all powered by data centers, the backbone of the digital world .

They also generate heat, and lots of it.

As a CEO in the rapidly growing data center industry, I’m excited by the massive demand that AI, machine learning and supercomputing are creating for our business. Besides making people’s lives easier and helping companies run better, there are many examples of how these advanced technologies are doing good for humanity.

But for data centers, they also pose a big challenge: how to keep all that hardware cool.

What those unfamiliar with this space may not know is that traditional cooling methods have started to outlive their usefulness. That's putting data center providers in a race to create a cooling technology for the modern world, one that not only turns down the heat for high-performance computing but also uses energy more efficiently.


The sooner we can do that, the better. Demand for data centers will only keep growing, with their power consumption in the U.S. projected to double between 2022 and 2030.

Data Center Cooling 101

For data center providers, keeping things cool for customers is a priority. Doing that means responding to the laws of physics.

When electricity travels through a semiconductor, it creates heat. The more powerful the chip, the hotter it runs. Nvidia’s latest graphics processing unit (GPU)—which underpins many cutting-edge AI applications—contains a staggering 80 billion transistors that need cooling. It also uses as much energy as the typical resident of a U.S. household.

Pulling the resulting heat from a circuit board and its wiring is crucial to stop it from malfunctioning or even breaking down.

So far, the data center industry has mostly relied on air cooling to get the job done. This technology uses two sets of air ducts, bringing in air from outside and pushing it across the front of the equipment, then sending the warmed air to the back so it can be expelled.

The downsides? Besides consuming energy and water—needed to chill the machinery—the typical air-cooling system is bulky, taking up precious real estate. With its many moving parts, it’s also prone to breakdowns.

For data center providers, those sustainability and reliability concerns have prompted a race to develop alternatives to air cooling. The rise of GPUs and other hot-running hardware has only upped the ante over the past couple of years.

Enter Liquid Cooling

Increasingly, data centers are turning to liquid cooling to meet their needs.

Before I continue, let’s be clear: In the data center world, there is no flux capacitor that will solve everyone’s cooling problems overnight. Until innovation in electricity takes a quantum leap, with room-temperature superconductors that let computers run without shedding heat, we’ll have to make do with incremental improvements to existing technologies.

My money is on liquid cooling for one simple reason: Water is a better heat rejector than air, making it ideal for GPUs and other high-density computing hardware. Historically, the argument against liquid cooling has been that water shouldn’t be anywhere near electronic equipment. But the huge demand for Nvidia and other AI chips has brushed those concerns aside.

Liquid cooling systems also use less power—and, perhaps surprisingly, less water—than their air-driven rivals. They have fewer moving parts, too. From a maintenance point of view, it’s like the difference between an electric vehicle and one with an internal combustion engine.

But no matter what happens, some air must flow at data centers. Most facilities will keep using air to cool equipment running at lower computing densities, and not all customers are keen to make dramatic changes.

Why Liquid Is The Cooling Tech Of Tomorrow

Although the jury is still out, I believe liquid cooling is the future for data centers and will ultimately become the status quo.

Why? The business case is compelling. For customers and investors, liquid cooling means less capital expenditure for the same output. The payoff: Companies that depend on data centers will be able to crush the competition by making higher margins.

Switching to liquid cooling also means better water and power usage effectiveness (WUE and PUE), two key metrics in our industry. Compared to air cooling alone, it can shrink facility power by almost 20% and total data center power by 10%, a recent study found.
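Both metrics are simple ratios: PUE divides total facility power by IT equipment power (an ideal score is 1.0), and WUE divides annual site water use by IT energy in kWh. A minimal sketch of the arithmetic, with illustrative numbers that are not drawn from the study cited above:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power (ideal = 1.0)."""
    return total_facility_kw / it_load_kw

def wue(annual_water_liters: float, annual_it_kwh: float) -> float:
    """Water Usage Effectiveness: liters of site water per kWh of IT energy."""
    return annual_water_liters / annual_it_kwh

# Illustrative comparison: the same 1,000 kW IT load with different cooling overheads.
it_kw = 1_000
air_pue = pue(it_kw + 500, it_kw)     # 500 kW of cooling and other overhead -> 1.5
liquid_pue = pue(it_kw + 200, it_kw)  # 200 kW of overhead -> 1.2

# A hypothetical site using 2,000,000 L of water against 8,760,000 kWh of annual IT energy.
site_wue = wue(2_000_000, 8_760_000)  # ~0.23 L/kWh

print(f"air-cooled PUE: {air_pue:.2f}, liquid-cooled PUE: {liquid_pue:.2f}, WUE: {site_wue:.2f} L/kWh")
```

Cutting cooling overhead from 500 kW to 200 kW on the same IT load moves PUE from 1.5 to 1.2, which is the kind of shift the facility-power reductions quoted above imply.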

The leap to liquid cooling will happen without anyone really noticing. As innovation sweeps the data center industry, legacy cooling equipment will inevitably be replaced. To me, it’s like the gradual shift from the old MD-80 passenger plane to the Dreamliner that is now an airline industry standard.

Helping drive that change is an enormous amount of infrastructure investment. From $250 billion in 2022, annual capital spending on data centers is forecast to surge to more than $500 billion by 2027. With investment in high-performance servers for AI set to grow at five times the outlay on their general-purpose counterparts, expect liquid cooling to take center stage.

Of course, for some, the transition to liquid cooling will mean overcoming a few hurdles. It can be capital-intensive, and there’s still a lack of standardization for this relatively new technology, along with a learning curve to make the shift.

But looking ahead, one thing’s for sure: Because AI and so many other industries rely on data centers, innovations that serve customers better will only keep gathering pace.


Andrew Schaap



New Super-Powered Data Center Built in Moscow


In February 2019, the leading data center operator IXcellerate announced its strategic plans to expand data center network in Russia.

Six months later, the company launched its second data center, Moscow Two. IXcellerate spent a total of $80 million to build the two facilities and plans to open another two, worth some $230 million, in the Moscow Region within the next five years. The company’s experts estimate that the move will allow the operator to capture about 25% of the local market. Today, Moscow Two is one of Russia’s largest commercial data centers built almost entirely with financing from foreign investors, which is particularly significant at a time when most foreign investors are winding down their operations in Russia. The investors in IXcellerate, which operates data centers in Moscow and has its head office in London, include financial corporations such as Goldman Sachs, IFC and Sumitomo Corporation.


The company’s top officials are confident that they will succeed in implementing their plans. By the new data center’s launch, current and new clients had already reserved over 85% of its capacity, or 1,480 rack units located in a 3,000 sq. m. hall. IXcellerate provides secure equipment hosting and connects its clients to the IT infrastructure network. The operator’s clients include over 100 international and Russian companies, such as Nestle, Thomson Reuters, Orange Business Services, Softline, Agricultural Bank of China (Moscow), Huawei, Tencent Cloud, Zenlayer, and others. Over 50% of the company’s revenues come from Asia-based corporations. Last year’s revenues doubled compared with 2017, and this year’s growth rate is expected to remain the same.

An increase in demand for data centers on the Russian and foreign markets is driven by several factors, chief among them the ever-growing need to store data. The snowballing growth of big data has created the need not only to store and manage data but also to provide quick access to it. Russia illustrates the gap well: Singapore has 31 data centers for 6 million citizens, while Russia has just over 50 data centers for 140 million citizens. A large population, a high level of internet connectivity and vast engineering and intellectual potential are the key drivers of the growing demand for data centers in Russia.

Companies use data centers to ensure business continuity and improve the efficiency of business processes. Businesses run on IT operations, and a reliable infrastructure is required to maintain them. Building and supporting this infrastructure in-house is difficult and expensive, so the alternative is to outsource it. One of the main objectives of a data center is to create suitable conditions for hosting its clients’ computing infrastructure. IXcellerate, for instance, guarantees 99.999% service availability, meaning it allows no more than 5 minutes and 16 seconds of downtime per year.
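The downtime figure follows from simple arithmetic: allowed downtime equals the number of minutes in a year multiplied by one minus the availability fraction. A quick sketch (using a 365.25-day year):

```python
def annual_downtime_minutes(availability: float, days: float = 365.25) -> float:
    """Allowed downtime per year for a given availability fraction."""
    minutes_per_year = days * 24 * 60
    return minutes_per_year * (1.0 - availability)

# "Five nines" (99.999%) allows roughly 5 minutes 16 seconds of downtime per year;
# "four nines" (99.99%) allows about 53 minutes.
for availability in (0.9999, 0.99999):
    print(f"{availability:.3%}: {annual_downtime_minutes(availability):.2f} min/yr")
```

Note that a figure of 5 minutes 16 seconds corresponds to five nines of availability; at four nines the budget is roughly ten times larger.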

The so-called Digital Silk Road, or the expansion of Chinese IT giants into Russia, is another reason data centers are becoming popular. Moscow is a hub on the way to Europe for many Asian corporations, such as Huawei, Tencent, Alibaba and others, which need reliable platforms for data storage and processing. According to the iKS-Consulting agency, the data center market in Russia grows by 20% every year.


Why AI Is So Thirsty: Data Centers Use Massive Amounts of Water

With its affordable land and modest electricity rates, Iowa has become a magnet for data centers for some of the world's tech giants.

Tech companies also like Iowa's wind power, which gives it the country's highest rate of renewable energy. Some 60 percent of Iowa's electricity comes from renewable sources, so tech companies can power their data centers there while also working toward ambitious climate goals for low-carbon power.

While Iowa is rich with green energy, there's another crucial resource that data centers need but Iowa often lacks: water.

As data center operators power up the servers that keep the internet humming and make artificial intelligence possible, they also need large volumes of water to cool those servers down, to keep them from overheating. The growing water consumption by data centers is becoming a challenge for some host communities.

"There's definitely parts of Iowa that are starting to feel the squeeze on water," Iowa Environmental Council Energy Program Director Kerri Johannsen told Newsweek.

Johannsen's group is tracking the increasing water usage by the state's data centers, most of them clustered in suburbs around Des Moines.

"There's a data center in a suburb called Altoona where they're using up to about one-fifth of the water that the city is using, and that's really significant," she said.

Iowa is in the midst of one of its most prolonged droughts in decades. The National Oceanic and Atmospheric Administration reported that by mid-March about 85 percent of the state was in drought condition, and about 56 percent of the state was in extreme or severe drought.

State Geologist Keith Schilling warned earlier this year that groundwater levels are declining in some of the state's aquifers and that the state "needs a plan to safeguard groundwater reserves." Along with the traditional water uses the state must balance, such as agriculture and residential systems, Schilling also listed "data centers requiring vast quantities of cooling water."

The boom in AI is adding to the water demand. Recent research shows that the enormous computing power, larger chips and additional servers required for AI not only add significantly to electricity demands, but also make many of those data centers much thirstier.

AI is likely already driving an increase in water use by some tech companies, researchers say, and their projections for the coming years show global AI growth could require more water than some small nations consume.

As many major tech companies position themselves to rapidly scale up operations to support AI, they must also reckon with their growing water demand. And some data center host communities in water-scarce regions could face difficult choices amid a dearth of solid information about how much water they might be giving up.

What Makes AI So Thirsty?

Shaolei Ren, an associate professor in the Electrical & Computer Engineering Department at the University of California, Riverside, has been researching big tech's water use for about a decade.

"People were talking about carbon emissions when they study sustainable computing, but I believe that water is also a very important metric," Ren told Newsweek.

Ren's most recent work focuses precisely on how AI is increasing water use. A large language model like the one behind OpenAI's popular ChatGPT must first be trained, a data- and energy-intensive process that can also boost water use. Ren found that training GPT-3 in Microsoft's high-end data centers can directly evaporate 700,000 liters, or about 185,000 gallons, of water.

Once the AI model is in use, each inference, or response to queries, also requires energy and cooling, and that, too, is thirsty work. Ren and his colleagues estimate that GPT-3 needs to "drink" a 16-ounce bottle of water for roughly every 10-50 responses it makes, and when the model is fielding billions of queries, that adds up.
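Those per-response figures scale up with straightforward arithmetic. A sketch using the estimates above (one roughly 500 ml bottle per 10 to 50 responses; the billion-response volume is a hypothetical chosen for illustration, and the function name is mine):

```python
BOTTLE_LITERS = 0.5  # roughly a 16-ounce bottle, per Ren's estimate

def inference_water_liters(responses: int, responses_per_bottle: int) -> float:
    """Estimated cooling-related water use for a given number of model responses."""
    return responses / responses_per_bottle * BOTTLE_LITERS

responses = 1_000_000_000  # hypothetical: one billion responses
low = inference_water_liters(responses, 50)   # optimistic end of the 10-50 range
high = inference_water_liters(responses, 10)  # pessimistic end
print(f"{low:,.0f} to {high:,.0f} liters per billion responses")
```

Even at the optimistic end, a billion responses would evaporate on the order of ten million liters, which is why the totals add up so quickly at global scale.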

To get the full measure of water use, Ren looked at both the water directly used in cooling servers in a data center and the water used in the generation of electricity the data centers require.

"When we generate electricity using coal-based power plants, using nuclear power plants, we are actually also consuming a lot of water," he explained, and AI is sharply boosting the energy that data centers need.

Add all that water up and apply it to the projected growth in AI in the next few years and the total quickly reaches a staggering level. Ren concluded that global AI demand could result in as much as 6.6 billion cubic meters, or 8.6 billion cubic yards, of water withdrawal by 2027. To put that in perspective, he compared it to how much water some countries use.

"This will be roughly equivalent to four to six Denmarks of national water withdrawal," Ren said. "That's quite a lot."

Microsoft's AI Training

Ren said Microsoft's total water use grew by 34 percent in 2022, the most recent year for which data is available, and that increase is likely due at least in part to AI's demands. That has major implications for places like Iowa where Microsoft has used its data centers to train OpenAI models and where water use is becoming a concern.

Microsoft opened its first Iowa data center in West Des Moines in 2012 and had built four more in the area by the end of 2023. Those facilities include the Azure supercomputer Microsoft built for OpenAI to train its AI models, making "a small city in America's heartland an unlikely epicenter for the AI revolution," the company wrote in a press release last year.

A Microsoft spokesperson declined to comment on Ren's calculations about its water use but pointed out that the company has set a goal to become "water positive" by 2030, meaning it will return more water to the environment than it uses. Microsoft said it has also met an interim goal to reduce water waste by 95 percent.

Microsoft ranks 34th on Newsweek's 2024 list of America's Most Responsible Companies, and fifth among companies in software and communications.

Microsoft's Vice President of Energy Bobby Hollis said energy and water needs are closely intertwined and must be considered together to optimize a data center's sustainability performance.

"You've got sort of the yin and the yang, the counterbalance of the two, figuring out which one you're going to focus more of your effort on," Hollis told Newsweek.

Where clean energy is abundant and water is scarce, he said, the company uses more air-cooling methods to reduce water impacts, and closed-loop systems prevent loss due to evaporation.

"And it depends, of course, on geography and what you're looking at for that location: how dry the place is, how cool it is," Hollis said.

Iron Mountain's Deep Thinking

At a data center just north of Pittsburgh in Boyers, Pennsylvania, the company Iron Mountain dug deep for sustainability solutions—literally. The company's facility there is more than 200 feet underground in a former limestone quarry that keeps servers naturally cool and offers a water source that doesn't draw from the neighboring community's system.

"We utilize a large underground lake to cool the data center, and regularly monitor the water level and temperature of the lake," Iron Mountain Director of Energy & Sustainability Chris Pennington told Newsweek in an email exchange. "It works extremely well and is a feature that helped the site become the first underground facility to receive Energy Star certification."

(The U.S. Environmental Protection Agency's Energy Star certification recognizes facilities that meet its criteria for energy efficiency.)

A pump lifts the water from the lake to a heat exchanger where the cool water carries heat away from the closed-loop cooling system that circulates among the servers. That water is then returned to the lake. Being underground also makes the site highly secure, a selling point for some Iron Mountain clients who handle sensitive data.

Innovations like that helped Iron Mountain earn four stars on Newsweek's 2024 ranking of America's Greenest Companies.

"Not every data center can be underground with its own lake, however," Pennington said, and the main takeaway from Iron Mountain's underground example may be this: Think deeply about local characteristics when building data centers.

Equinix is one of the world's biggest data infrastructure providers and occupies slot No. 181 among the 600 companies on Newsweek's ranking of America's Most Responsible Companies.

When Equinix Vice President of Global Sustainability Christopher Wellise spoke to Newsweek , he happened to be in northern Europe, and he offered some regional examples of how the company is cleverly adapting its data center operations to local conditions and needs.

"Here in the Nordics, we can use lots of free air cooling," Wellise said. "But then you've potentially got some valuable heat that can be used for other purposes."

In most U.S. settings, unfortunately, there's no easy way to put that heat to use. But in European cities with district heating systems, Wellise said, the "waste" heat from data servers can be sent through a central network of pipes and ducts to heat homes and buildings.

"In Helsinki, we heat thousands of homes," he said. "You have to think about these things in terms of full life-cycle management, not just the electrons that you're using and how green or brown are they."

That holistic thinking will have a highlight moment in July when athletes gather in Paris for the Summer Olympics. The new Olympic aquatic center's pool will be connected to an Equinix data center, and diving and swimming competitors will be kept comfortable using the excess heat produced by the data servers. It's an intriguing example of the possibilities available to us in this new AI age.

Related Articles

  • How AI Generates Both Climate Pollution and Solutions
  • MethaneSAT: A Space Mission to Cut Emissions of a Powerful Greenhouse Gas
  • How AI Will Help the World's Top Hospital CEOs Transform Health Care


A river with low levels of water flows under Highway 65 near Bondurant, Iowa. Iowa is suffering through a prolonged drought that heightens concerns about the water use by the state's data centers.


Essay; Bailing Out Moscow

By William Safire

  • Feb. 25, 1988


We have just been told by a well-placed informant inside the Kremlin that the Soviet Union is not the economic power our intelligence analysts have long thought it was.

Throughout the Reagan years, our experts have assumed that Soviet growth averaged slightly over 3 percent yearly. That is a vital statistic: we then put a price each year on what we know the Soviet military machine cost, and get what we hope is a clear idea of what percentage of its economy Moscow is devoting to armament.

That's just about the most important intelligence number of all. In the 70's, a "Team B" of outsiders was brought in by the C.I.A. to challenge the conventional wisdom, and doubled the previous estimate of the share of Soviet output devoted to the military, to 13 percent. That laid the basis for our own increased defense spending, which now amounts to 6 percent of our gross national product.

In a little-noted passage of his long speech last week to his Central Committee, Mikhail Gorbachev made a stunning revelation that knocks our estimates into a cocked hat.

He pointed out that during the Brezhnev years, economic growth had been artificially hiked by the sale of oil at high prices (the U.S.S.R. is the world's largest producer) and the accelerated sale of vodka (Soviet spending on alcohol may have reached 10 percent of total output, compared with less than 2 percent of ours).

"If we purge economic growth indicators of the influence of these factors," said Mr. Gorbachev, "it turns out that, basically, for four five-year periods there was no increase in the absolute growth of the national income and, at the beginning of the 80's, it had even begun to fall. That is the real picture, comrades!"

No doubt the current Kremlin leader is trying to make the present bad economic picture look better by saying the old days under his predecessor were really much worse. But we should allow for the possibility that, concerning the 80's at least, Mr. Gorbachev may be telling the truth.

If that is the real picture, comrades, we have to do some fast reassessing of our own. During the 80's, as the price of oil has been cut in half, and the Soviet gulping of booze has been restricted, the total Soviet output is not likely to have risen much, if at all, from what Mr. Gorbachev says was its falling state in 1980.

Here is what that new assessment leads us to deduce: the Soviet economy has been stagnant (or possibly declining) for seven years - most definitely not growing steadily at the over-3-percent rate per year our analysts had been assuming. That means our assessment of total growth of about one-fourth in this decade has been egregiously mistaken. That supposedly seven-foot giant turns out to be closer to five feet tall, same as he was in the Brezhnev years.

Apply that new assessment to arms control. The way we estimate Soviet arms expenditures is by simple bean-counting, mainly from satellites, and that total is not affected. What does change is the percentage of the output devoted to arms; if it was 14 percent by the old assessment, it must be an unbearable 20 percent in the new reality Mr. Gorbachev reveals.

Thus, under pressure to reduce arms spending, he seeks treaties; forced to cut losses, he announces withdrawal from Afghanistan and may offer to reduce subsidies in Central America; faced with the prospect of having to match serious Star Wars spending, he rails at the idea of strategic defense.

Apply that no-growth, one-fourth-smaller fact to economic diplomacy. It explains why the Russians finally settled the old Czarist debt for a dime on the dollar, paving the way for a recent $77 million Soviet bond issue. That's also why the Kremlin will be seeking entry into the International Monetary Fund, GATT and the World Bank at the next meetings (in West Berlin) this fall. Soviet Communism is starving for capital.

Our European allies are rushing to lend Moscow money and to subsidize pipelines, while accommodationists here want to offer the Russians most-favored-nation status on trade. Commerce and State Department detenteniks await only vague "economic reforms" to end our opposition to Soviet entry into Western credit markets.

Here is a genuine issue to toss at the candidates in our election. In light of what the Soviet leader admits is "a very serious financial problem," should U.S. policy seek to finance our adversary? Or should we "stress" Moscow now, as it surely would do to us if the roles were reversed?

Or should we use this moment of admitted Soviet economic weakness to put an irrevocable, verifiable, behavior-modifying price on every concession we confer?


COMMENTS

  1. The Good, the Bad, & the Ugly of Data Centers

    Reports also show that data centers across the globe consume thirty billion watts of power, comparable to the output of thirty nuclear power plants. US data centers account for 25-34% of this load. Peter Gross, who has designed many data centers, noted, "A single data center can take more power than a medium-size town."

  2. The Future of the Data Center

    The qualitative future of data centers lies in their conceptual evolution. While there will always need to be a room full of computers, a data center is now available on demand, as a service, for rent and use to anyone with a credit card and the technical knowledge to leverage it. The magic and power of the new data center is that it is an ...

  3. Data Center Essay

    Satisfactory Essays. 1544 Words. 7 Pages. A data center is a complicated facility with extremely sophisticated and powerful I.T. equipment intricately paired with cooling systems inside a specifically designed building environment. This dynamic setting makes the objective of energy efficiency in such facilities more complicated ...
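
The energy-efficiency objective mentioned in this snippet is usually quantified with Power Usage Effectiveness (PUE): total facility power divided by the power delivered to IT equipment. A minimal sketch, with illustrative numbers that are not taken from any of the cited essays:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt reaches the IT equipment; real
    facilities run higher because cooling, power distribution losses,
    and lighting all draw power on top of the IT load.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative example: a 1.5 MW facility powering a 1 MW IT load
print(pue(1500.0, 1000.0))  # 1.5
```

A lower PUE means more of the facility's power goes to actual computing rather than to overhead.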

  4. Four reasons why data centers matter, five implications of their social

    Data centers have a high demand for energy and water, competing with local residents for these resources. The data center industry is a state-led niche economy. The uneven distribution of data centers can invoke inter-county competition for tax revenue, in addition to access to the water, power, and land resources that data centers require.

  5. The role of data centers in building a sustainable future

    With NTT being one of the world's largest data center companies, with more than 100 data centers in over 20 countries around the world, we recognize the important responsibility to help create a ...

  6. What Is a Data Center?

    A data center is a physical room, building or facility that houses IT infrastructure for building, running, and delivering applications and services, and for storing and managing the data associated with those applications and services. Data centers have evolved in recent years from privately owned, tightly controlled on-premises facilities ...

  7. Data Centers Essay Examples

    Data Centers Essays. Data Center to the Cloud. Introduction: Data centers form the backbone of a modern organization, storing and processing the company's valuable data and associated applications. The need for powerful data center management has become central in the ongoing digitization environment. As a result, many ...

  8. The environmental footprint of data centers in the United States

    The global electricity demand of data centers was 205 TWh in 2018, which represents about 1% of total global electricity demand [3]. The United States houses nearly 30% of data center servers, more than any other country [3-5]. In 2014, 1.8% of US electricity consumption was attributable to data centers, roughly equivalent to the ...
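
The figures quoted in this snippet can be sanity-checked with simple arithmetic. A quick sketch using only the numbers cited above (205 TWh of data center demand in 2018, about 1% of global demand, with the US housing roughly 30% of servers):

```python
# All inputs are taken from the cited snippet; nothing here is measured.
data_center_twh = 205      # global data center electricity demand, 2018
global_share = 0.01        # ~1% of total global electricity demand
us_server_share = 0.30     # US share of data center servers

# Implied total global electricity demand
global_twh = data_center_twh / global_share
print(f"Implied global demand: {global_twh:,.0f} TWh")   # 20,500 TWh

# Crude US estimate, assuming demand scales with server count
us_twh = data_center_twh * us_server_share
print(f"Rough US data center demand: {us_twh:.1f} TWh")  # 61.5 TWh
```

The server-count scaling is a simplification; per-server power draw varies widely between facilities.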

  9. Data Center to the Cloud

    Introduction: Data centers form the backbone of a modern organization, storing and processing the company's valuable data and associated applications. The need for powerful data center management has become central in the ongoing digitization environment. As a result, many organizations are deciding to move away from their conventional on-premise arrangements ...

  10. Essay On Google Data Center

    Essay On Google Data Center. 1135 Words. 5 Pages. Google Data Centers. Google data centers are the large facilities Google uses to provide its services, combining large amounts of digital storage, compute nodes organized in aisles of racks, internal and external networking, environmental controls, and operations software. [1 ...

  11. 5 Assumptions on How Data Centers of the Future Will Probably Look Like

    Smaller-sized data centers: A quick survey by essay writers and IT experts at Emerson Network Power showed that 58% of respondents believed that data centers of the future will be a lot smaller than what we have now. 27% of the participants believe that these future data centers will be bigger than today's. While 14% of the ...

  12. Data Centers Waste Vast Amounts of Energy, Belying Industry Image

    Nationwide, data centers used about 76 billion kilowatt-hours in 2010, or roughly 2 percent of all electricity used in the country that year, based on an analysis by Jonathan G. Koomey, a research ...

  13. Data center

    A data center network (DCN) is the interconnection between data centers. There are several existing architectures for the interconnections between these data centers. Several aspects are taken into consideration while designing data center network architecture. This paper presents an analysis of several data center network architectures using ...

  14. Cities and regions with the highest concentration of data centers

    Surging demand for data consumption and storage is driving a rapid expansion of data centers in the United States. These U.S. data centers are located in areas with abundant electricity for their intense power demands, copious amounts of water for cooling, access to fiber connectivity, affordable real estate, tax incentives and away from regions that are prone to natural disasters.

  15. Essay: Data Centers on Hallowed Ground of My Ancestors

    DINT #60 - Essay: Data Centers on Hallowed Ground of My Ancestors. In today's issue I share an essay on a data center development story that I covered during my Senior Editor days at an industry publication. Here, I reveal my opinion on PW Gateway. May 26, 2023.

  16. Neighbor vs. Neighbor: A Community Debates Data Center Growth

    Neighbor vs. Neighbor: A Community Debates Data Center Growth. Aug. 8, 2022. The Prince William Digital Gateway in Manassas is prompting heated debate among groups of residents about the role of data centers in their community. The project was proposed by local landowners seeking to capitalize on rising valuations for data center land.

  17. Energy Efficiency in Data Center: [Essay Example], 1685 words

    No, quite the opposite. The power consumption of data centers in Germany rose sharply again in 2016, by 4.2% to 12.4 billion kWh. This means data centers in Germany require as much power as four medium-sized coal-fired power stations generate in a year.
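
Growth figures like these compose multiplicatively, so the prior-year figure can be backed out from the numbers the snippet quotes (12.4 billion kWh after a 4.2% rise). A small sketch using only those numbers:

```python
# 2016 consumption rose 4.2% to 12.4 billion kWh (figures from the snippet),
# so the implied 2015 figure is 12.4 / 1.042.
consumption_2016 = 12.4          # billion kWh
growth_rate = 0.042
consumption_2015 = consumption_2016 / (1 + growth_rate)
print(f"Implied 2015 consumption: {consumption_2015:.1f} billion kWh")  # 11.9
```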

  18. Why Data Centers Are Important for the Health Care Industry

    We begin with the digital transformation of health care. The entire health care industry is getting a top-to-bottom overhaul thanks to modern technology, timely legislation and a changing demographic landscape. The health care industry is expected to spend close to $2.7 trillion per year on IT infrastructure, including data centers, by 2020.

  19. Solving the Data Center Dilemma: On-Premise or Cloud?

    Data center decisions shouldn't be reactive but thoughtful of the longer-term implications for the business. Deciding whether to place data center workloads in the cloud, on-premise, or at the edge requires an assessment of needs, space utilization, and other factors such as deployment speed, infrastructure, and cybersecurity.

  20. Data center consolidation projects: [Essay Example], 506 words

    Data center consolidation is an IT strategy brought on by the changing IT climate to combine large amounts of servers into a compact, cost-efficient system. A consolidation strategy can take years to plan and deploy, but has ultimately been shown to reduce costs, increase the business value of IT, and make a realistically manageable footprint.

  21. Here's Why Data Center Cooling Is The Hottest Innovation In ...

    For data center providers, keeping things cool for customers is a priority. Doing that means responding to the laws of physics. When electricity travels through a semiconductor, it creates heat.

  22. New super-powered data center built in Moscow

    In February 2019, the leading data center operator IXcellerate announced its strategic plans to expand its data center network in Russia. Six months later, the company launched its second data center, Moscow Two. IXcellerate spent a total of $80 million to build the two facilities and plans to open another two worth some $230 million in the Moscow Region within ...

  23. Why AI Is So Thirsty: Data Centers Use Massive Amounts of Water

    With its affordable land and modest electricity rates, Iowa has become a magnet for data centers for some of the world's tech giants. Tech companies also like Iowa's wind power, which gives it ...

  24. Data Collection in the Moscow Metro

    They can also rate places and leave tips for others based on their experience. These data show, for example, that stations in Moscow's outlying residential districts have surprisingly higher rates of check-in per user than those in the city center. Biological data can tell us about human-environment interactions at local and global scales.

  25. PDF: Building smart transport in Moscow

    Maksim Liksutov: Data protection is a primary concern in the management of any smart city. The introduction of smart technologies involves many risks, and we want to provide the most reliable protection available. This month, the Moscow Center for Traffic Management set up a new protective barrier for the virtual infrastructure of the ITS,

  26. Opinion

    Apply that new assessment to arms control. The way we estimate Soviet arms expenditures is by simple bean-counting, mainly from satellites, and that total is not affected.

  27. Examining Racial Identity Responses Among People with Middle Eastern

    In processing survey data for public release, the Census Bureau classifies these responses as White in accordance with Federal guidance set by the U.S. Office of Management and Budget. Research that uses these edited public data relies on limited information on MENA people's racial identification.