==Energy use== [[File:Google Data Center, The Dalles.jpg|thumb|[[Google Data Centers|Google Data Center]], [[The Dalles, Oregon]]]] {{main|IT energy management}} Energy use is a central issue for data centers. Power draw ranges from a few kW for a rack of servers in a closet to several tens of MW for large facilities. Some facilities have power densities more than 100 times that of a typical office building.<ref>{{cite web|url=http://www1.eere.energy.gov/femp/program/dc_energy_consumption.html|title=Data Center Energy Consumption Trends|publisher=U.S. Department of Energy|access-date=2010-06-10|archive-date=2010-06-05|archive-url=https://web.archive.org/web/20100605052937/http://www1.eere.energy.gov/femp/program/dc_energy_consumption.html|url-status=live}}</ref> For higher power density facilities, electricity costs are a dominant [[operating expense]] and account for over 10% of the [[total cost of ownership]] (TCO) of a data center.<ref>[http://www.intel.com/assets/pdf/general/servertrendsreleasecomplete-v25.pdf J. Koomey, C. Belady, M. Patterson, A. Santos, K.D. Lange: Assessing Trends Over Time in Performance, Costs, and Energy Use for Servers] Released on the web August 17th, 2009.</ref> ===Greenhouse gas emissions=== In 2020, data centers (excluding cryptocurrency mining) and data transmission each used about 1% of world electricity.<ref name=":0">{{Cite web |title=Data Centres and Data Transmission Networks – Analysis |url=https://www.iea.org/reports/data-centres-and-data-transmission-networks |access-date=2022-03-06 |website=IEA |language=en-GB |archive-date=2023-07-05 |archive-url=https://web.archive.org/web/20230705054839/https://www.iea.org/reports/data-centres-and-data-transmission-networks |url-status=live }}</ref> Although some of this electricity was low carbon, the [[International Energy Agency|IEA]] called for more "government and industry efforts on energy efficiency, renewables procurement and RD&D",<ref name=":0" /> as some data centers still use electricity generated by fossil fuels.<ref>{{Cite news |last=Kantor |first=Alice |date=2021-05-18 |title=Big Tech races to clean up act as cloud energy use grows |work=Financial Times |url=https://www.ft.com/content/c719f655-149c-4ce0-a7a5-18527c7776cf |archive-url=https://ghostarchive.org/archive/20221210/https://www.ft.com/content/c719f655-149c-4ce0-a7a5-18527c7776cf |archive-date=2022-12-10 |url-access=subscription |url-status=live |access-date=2022-03-06}}</ref> They also said that lifecycle emissions should be considered, that is including ''embodied'' emissions, such as in buildings.<ref name=":0" /> Data centers are estimated to have been responsible for 0.5% of US greenhouse gas emissions in 2018.<ref>{{Cite journal |last1=Siddik |first1=Md Abu Bakar |last2=Shehabi |first2=Arman |last3=Marston |first3=Landon |date=2021-05-21 |title=The environmental footprint of data centers in the United States |journal=Environmental Research Letters |language=en |volume=16 |issue=6 |pages=064017 |doi=10.1088/1748-9326/abfba1 |bibcode=2021ERL....16f4017S |s2cid=235282419 |issn=1748-9326|doi-access=free |hdl=10919/109747 |hdl-access=free }}</ref> Some Chinese companies, such as [[Tencent]], have pledged to be carbon neutral by 2030, while others such as [[Alibaba Group|Alibaba]] have been criticized by [[Greenpeace]] for not committing to become carbon neutral.<ref>{{Cite web |last=James |first=Greg |date=2022-03-01 |title=Tencent pledges to achieve carbon neutrality by 2030 
|url=https://supchina.com/2022/03/01/tencent-pledges-to-achieve-carbon-neutrality-by-2030/ |access-date=2022-03-06 |website=SupChina |language=en-US |archive-date=2022-07-11 |archive-url=https://web.archive.org/web/20220711133323/https://supchina.com/2022/03/01/tencent-pledges-to-achieve-carbon-neutrality-by-2030/ |url-status=dead }}</ref> As of 2024, Google and Microsoft each consume more electricity than over 100 individual countries do.<ref>{{Cite web |author1=Craig Hale |date=2024-07-15 |title=Google and Microsoft now each consume more power than some fairly big countries |url=https://www.techradar.com/pro/google-and-microsoft-now-each-consume-more-power-than-some-fairly-big-countries |access-date=2024-07-18 |website=TechRadar |language=en}}</ref>

===Energy efficiency and overhead===
The most commonly used energy efficiency metric for data centers is [[power usage effectiveness]] (PUE), calculated as the ratio of total power entering the data center divided by the power used by IT equipment.

:<math> \mathrm{PUE} = {\mbox{Total Facility Power} \over \mbox{IT Equipment Power}} </math>

The amount by which PUE exceeds 1.0 reflects the power consumed by overhead devices (cooling, lighting, etc.). The average U.S. data center has a PUE of 2.0,<ref name="energystar1">{{cite web |title=Report to Congress on Server and Data Center Energy Efficiency |url=http://www.energystar.gov/ia/partners/prod_development/downloads/EPA_Datacenter_Report_Congress_Final1.pdf |publisher=U.S. Environmental Protection Agency ENERGY STAR Program}}</ref> meaning two watts of total power (overhead + IT equipment) for every watt delivered to IT equipment. State-of-the-art data centers are estimated to have a PUE of roughly 1.2.<ref>{{cite web|url=https://microsite.accenture.com/svlgreport/Documents/pdf/SVLG_Report.pdf|title=Data Center Energy Forecast|publisher=Silicon Valley Leadership Group|access-date=2010-06-10|archive-url=https://web.archive.org/web/20110707080409/https://microsite.accenture.com/svlgreport/Documents/pdf/SVLG_Report.pdf|archive-date=2011-07-07|url-status=dead }}</ref> [[Google]] publishes quarterly efficiency metrics from its data centers in operation.<ref>{{cite web|url=https://www.google.com/about/datacenters/efficiency/internal/|title=Efficiency: How we do it – Data centers|access-date=2015-01-19|archive-date=2019-09-30|archive-url=https://web.archive.org/web/20190930070102/http://www.google.com/about/datacenters/efficiency/internal/|url-status=live}}</ref> PUEs as low as 1.01 have been achieved with two-phase immersion cooling.<ref>{{cite web | url=https://www.networkworld.com/article/969739/immersion-cooling-firm-liquidstack-launches-as-a-stand-alone-company.html | title=Immersion cooling firm LiquidStack launches as a stand-alone company | access-date=2024-02-28 | archive-date=2024-02-28 | archive-url=https://web.archive.org/web/20240228183457/https://www.networkworld.com/article/969739/immersion-cooling-firm-liquidstack-launches-as-a-stand-alone-company.html | url-status=live }}</ref>
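As an illustration, the share of total facility power consumed by overhead equipment follows directly from the definition above:

:<math> \mbox{Overhead share} = 1 - {1 \over \mathrm{PUE}} </math>

so a PUE of 2.0 corresponds to 50% overhead, a PUE of 1.2 to roughly 17%, and a PUE of 1.01 to about 1%.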
The [[U.S. Environmental Protection Agency]] has an [[Energy Star]] rating for standalone or large data centers. To qualify for the ecolabel, a data center must be within the top quartile in energy efficiency of all reported facilities.<ref>Commentary on introduction of Energy Star for Data Centers {{cite web|title=Introducing EPA ENERGY STAR for Data Centers|url=http://www.emerson.com/edc/post/2010/06/15/Introducing-EPA-ENERGY-STARc2ae-for-Data-Centers.aspx|format=Web site|publisher=Jack Pouchet|access-date=2010-09-27|date=2010-09-27|url-status=dead|archive-url=https://web.archive.org/web/20100925210539/http://emerson.com/edc/post/2010/06/15/Introducing-EPA-ENERGY-STARc2ae-for-Data-Centers.aspx|archive-date=2010-09-25 }}</ref> The Energy Efficiency Improvement Act of 2015 (United States) requires federal facilities, including data centers, to operate more efficiently. California's [[California Energy Code|Title 24]] (2014) of the California Code of Regulations mandates that every newly constructed data center must have some form of airflow containment in place to optimize energy efficiency. The European Union has a similar initiative: the EU Code of Conduct for Data Centres.<ref>{{cite web|url=http://iet.jrc.ec.europa.eu/energyefficiency/ict-codes-conduct/data-centres-energy-efficiency|title=EU Code of Conduct for Data Centres|publisher=iet.jrc.ec.europa.eu|access-date=2013-08-30|archive-date=2013-08-11|archive-url=https://web.archive.org/web/20130811202339/http://iet.jrc.ec.europa.eu/energyefficiency/ict-codes-conduct/data-centres-energy-efficiency|url-status=dead}}</ref>

===Energy use analysis and projects===
The focus of measuring and analyzing energy use goes beyond what is used by IT equipment; facility support hardware such as chillers and fans also use energy.<ref>{{cite web|url=http://www.gtsi.com/cms/documents/white-papers/green-it.pdf|title=Reducing Data Center Power and Energy Consumption: Saving Money and "Going Green"|website=www.gtsi.com|access-date=2012-02-08|archive-date=2012-12-03|archive-url=https://web.archive.org/web/20121203013823/http://www.gtsi.com/cms/documents/white-papers/green-it.pdf|url-status=dead}}</ref> In 2011, server racks in data centers were designed for more than 25 kW, and the typical server was estimated to waste about 30% of the electricity it consumed. The energy demand for information storage systems is also rising. A high-availability data center is estimated to have a 1 megawatt (MW) demand and consume $20,000,000 in electricity over its [[Product lifecycle|lifetime]], with cooling representing 35% to 45% of the data center's [[total cost of ownership]]. Calculations show that in two years, the cost of powering and cooling a server could equal the cost of purchasing the server hardware.<ref>{{Cite book|title=Designing Green Networks and Network Operations: Saving Run-the-Engine Costs|author=Daniel Minoli|publisher=CRC Press|year=2011|isbn= 9781439816394|pages=5}}</ref> Research in 2018 showed that a substantial amount of energy could still be conserved by optimizing IT refresh rates and increasing server utilization.<ref>{{cite journal|title=A Comprehensive Reasoning Framework for Hardware Refresh in Data Centres|author=Rabih Bashroush|journal=IEEE Transactions on Sustainable Computing|volume=3|issue=4|pages=209–220|date=2018|doi=10.1109/TSUSC.2018.2795465|s2cid=54462006|doi-access=free}}</ref> Research into optimizing task scheduling is also underway, with researchers developing energy-efficient scheduling algorithms that could reduce energy consumption by between 6% and 44%.<ref>{{Cite journal |last1=Kaur |first1=Rupinder |last2=Kaur |first2=Gurjinder |last3=Goraya |first3=Major Singh |date=2025-05-05 |title=EESF: Energy-Efficient Scheduling Framework for Deadline-Constrained Workflows with Computation Speed Estimation Method in Cloud |url=https://linkinghub.elsevier.com/retrieve/pii/S0167819125000158 |journal=Parallel Computing |volume=124 |pages=103139 |doi=10.1016/j.parco.2025.103139 |issn=0167-8191}}</ref>
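The sketch below illustrates, in simplified form, the consolidation idea behind much of this scheduling research: packing tasks onto as few servers as possible so that idle machines can be powered down. It is not the algorithm from the cited study; the power model and task sizes are invented for illustration.

<syntaxhighlight lang="python">
# Toy illustration of energy-aware task consolidation (not the cited algorithm).
# Assumes a linear server power model: P = P_idle + (P_peak - P_idle) * utilization.
P_IDLE, P_PEAK = 100.0, 300.0           # watts per server (assumed values)

def server_power(utilization):
    """Power draw of one server at a given CPU utilization (0..1)."""
    return P_IDLE + (P_PEAK - P_IDLE) * utilization

def consolidate(demands, capacity=1.0):
    """First-fit-decreasing: pack task CPU demands onto as few servers as possible."""
    loads = []                          # summed utilization per active server
    for demand in sorted(demands, reverse=True):
        for i, load in enumerate(loads):
            if load + demand <= capacity:
                loads[i] += demand
                break
        else:
            loads.append(demand)        # no server has room; power on another one
    return loads

def total_power(loads):
    return sum(server_power(u) for u in loads)

tasks = [0.15, 0.30, 0.25, 0.10, 0.40, 0.20, 0.35, 0.05]   # CPU demand per task
spread = tasks                          # naive placement: one task per server, 8 machines on
packed = consolidate(tasks)             # consolidated: unused servers can be switched off

print(f"spread across 8 servers: {total_power(spread):.0f} W")
print(f"packed onto {len(packed)} servers: {total_power(packed):.0f} W")
</syntaxhighlight>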
In 2011, [[Facebook]], [[Rackspace]] and others founded the [[Open Compute Project]] (OCP) to develop and publish open standards for greener data center computing technologies. As part of the project, Facebook published the designs of its server, which it had built for its first dedicated data center in Prineville. Making servers taller left space for more effective [[heat sinks]] and enabled the use of fans that moved more air with less energy. By not buying [[commercial off-the-shelf]] servers, Facebook also avoided the energy consumed by unnecessary expansion slots on the [[motherboard]] and unneeded components such as a [[graphics card]].<ref name=PSayer>{{cite web|date=March 28, 2018|title=What is the Open Compute Project?|author=Peter Sayer|publisher=NetworkWorld|url=https://www.networkworld.com/article/965446/what-is-the-open-compute-project.html|access-date=February 3, 2019|archive-date=November 29, 2023|archive-url=https://web.archive.org/web/20231129200453/https://www.networkworld.com/article/965446/what-is-the-open-compute-project.html|url-status=live}}</ref> In 2016, Google joined the project and published the designs of its 48V DC shallow data center rack. This design had long been part of [[Google data centers]]. By eliminating the multiple [[transformer]]s usually deployed in data centers, Google achieved a 30% increase in energy efficiency.<ref>{{cite web|date=March 9, 2016|title=OCP Summit: Google joins and shares 48V tech|author=Peter Judge|publisher=DCD Data center Dynamics|url=https://www.datacenterdynamics.com/news/ocp-summit-google-joins-and-shares-48v-tech/|access-date=February 3, 2019|archive-date=February 3, 2019|archive-url=https://web.archive.org/web/20190203201732/https://www.datacenterdynamics.com/news/ocp-summit-google-joins-and-shares-48v-tech/|url-status=live}}</ref> In 2017, sales of data center hardware built to OCP designs topped $1.2 billion and were expected to reach $6 billion by 2021.<ref name=PSayer/>

===Power and cooling analysis===
[[File:Cern datacenter.jpg|thumb|300px|Data center at [[CERN]] (2010)]]
Power is the largest recurring cost to the user of a data center.<ref name=DRJ_Choosing>{{Citation|title = Choosing a Data Center|url = http://www.atlantic.net/images/pdf/choosing_a_data_center.pdf|publisher = Disaster Recovery Journal|year = 2009|author = Joe Cosmano|access-date = 2012-07-21}}{{Dead link|date=March 2023 |bot=InternetArchiveBot |fix-attempted=yes }}</ref> Cooling at or below {{convert|70|F|C}} wastes money and energy.<ref name=DRJ_Choosing/> Furthermore, overcooling equipment in environments with a high relative humidity can expose equipment to a high amount of moisture that facilitates the growth of salt deposits on conductive filaments in the circuitry.<ref name=Processor>{{Citation|title=Heat Of The Moment|url=http://www.processor.com/editorial/article.asp?article=articles%2Fp2628%2F21p28%2F21p28.asp|archive-url=https://archive.today/20130131214828/http://www.processor.com/editorial/article.asp?article=articles/p2628/21p28/21p28.asp|url-status=dead|archive-date=2013-01-31|journal=Processor
|volume=26 |issue=28 |date=July 9, 2004 |author=David Garrett|access-date=2012-07-21}}</ref> A '''power and cooling analysis''', also referred to as a thermal assessment, measures the relative temperatures in specific areas as well as the capacity of the cooling systems to handle specific ambient temperatures.<ref>{{cite web|url=http://www.internetnews.com/xSP/article.php/3690651/HPs+Green+Data+Center+Portfolio+Keeps+Growing.htm|title=HP's Green Data Center Portfolio Keeps Growing |website=InternetNews |first1=David |last1=Needle |date=25 July 2007 |url-status=live |archive-url=https://web.archive.org/web/20201025201651/http://www.internetnews.com/xSP/article.php/3690651/HPs+Green+Data+Center+Portfolio+Keeps+Growing.htm |archive-date= Oct 25, 2020 }}</ref> A power and cooling analysis can help to identify hot spots, over-cooled areas that can handle greater power use density, the breakpoint of equipment loading, the effectiveness of a raised-floor strategy, and optimal equipment positioning (such as AC units) to balance temperatures across the data center. Power cooling density is a measure of how much square footage the center can cool at maximum capacity.<ref name=Inc_Howtochoose>{{Citation|title=How to Choose a Data Center|url=http://www.inc.com/guides/2010/11/how-to-choose-a-data-center_pagen_2.html|date=Nov 29, 2010 |website=Inc. |access-date=2012-07-21 |url-status=dead |archive-url=https://web.archive.org/web/20130308163559/http://www.inc.com/guides/2010/11/how-to-choose-a-data-center_pagen_2.html |archive-date= Mar 8, 2013 }}</ref> The cooling of data centers is the second largest power consumer after servers. The cooling energy varies from 10% of the total energy consumption in the most efficient data centers and goes up to 45% in standard air-cooled data centers. ===Energy efficiency analysis=== An energy efficiency analysis measures the energy use of data center IT and facilities equipment. A typical energy efficiency analysis measures factors such as a data center's Power Use Effectiveness (PUE) against industry standards, identifies mechanical and electrical sources of inefficiency, and identifies air-management metrics.<ref>{{cite web|url=http://www.triplepundit.com/2011/04/hp-launches-program-companies-integrate-manage-energy-carbon-reduction-strategies|title=HP Shows Companies How to Integrate Energy Management and Carbon Reduction |first1=Siranosian |last1=Kathryn |website=TriplePundit |date=April 5, 2011 |access-date=February 8, 2012|archive-date=August 22, 2018|archive-url=https://web.archive.org/web/20180822151326/https://www.triplepundit.com/2011/04/hp-launches-program-companies-integrate-manage-energy-carbon-reduction-strategies/|url-status=dead}}</ref> However, the limitation of most current metrics and approaches is that they do not include IT in the analysis. 
Case studies have shown that by addressing energy efficiency holistically in a data center, major efficiencies can be achieved that are not possible otherwise.<ref>{{cite journal|author1=Rabih Bashroush|author2=Eoin Woods|title=Architectural Principles for Energy-Aware Internet-Scale Applications|journal=IEEE Software|volume=34|issue=3|pages=14–17| date=2017|doi=10.1109/MS.2017.60|s2cid=8984662}}</ref>

===Computational Fluid Dynamics (CFD) analysis===
{{main|Computational fluid dynamics}}
This type of analysis uses sophisticated tools and techniques to understand the unique thermal conditions present in each data center, using numerical modeling to predict the temperature, [[airflow]], and pressure behavior of a data center and to assess its performance and energy consumption.<ref>[http://blog.transitionaldata.com/aggregate/bid/37840/Seeing-the-Invisible-Data-Center-with-CFD-Modeling-Software Bullock, Michael. "Computation Fluid Dynamics - Hot topic at Data Center World," Transitional Data Services, March 18, 2010.] {{webarchive|url=https://web.archive.org/web/20120103183406/http://blog.transitionaldata.com/aggregate/bid/37840/Seeing-the-Invisible-Data-Center-with-CFD-Modeling-Software|date=January 3, 2012}}</ref> By predicting the effects of these environmental conditions, CFD analysis can be used to assess the impact of mixing high-density racks with low-density racks<ref>{{cite web|url=http://www.thegreengrid.org/~/media/WhitePapers/White_Paper_27_Impact_of_Virtualization_Data_On_Center_Physical_Infrastructure_020210.pdf?lang=en|title=Bouley, Dennis (editor). "Impact of Virtualization on Data Center Physical Infrastructure," The Green grid, 2010.|access-date=2012-02-08|archive-url=https://web.archive.org/web/20140429215242/http://www.thegreengrid.org/~/media/WhitePapers/White_Paper_27_Impact_of_Virtualization_Data_On_Center_Physical_Infrastructure_020210.pdf?lang=en|archive-date=2014-04-29|url-status=dead}}</ref> and the onward impact on cooling resources, as well as the consequences of poor infrastructure management practices and of AC failure or AC shutdown for scheduled maintenance.

===Thermal zone mapping===
Thermal zone mapping uses sensors and computer modeling to create a three-dimensional image of the hot and cool zones in a data center.<ref>{{cite web|url=http://searchdatacenter.techtarget.com/news/1265634/HP-Thermal-Zone-Mapping-plots-data-center-hot-spots|title=HP Thermal Zone Mapping plots data center hot spots|access-date=2012-02-08|archive-date=2021-01-26|archive-url=https://web.archive.org/web/20210126131207/https://searchdatacenter.techtarget.com/news/1265634/HP-Thermal-Zone-Mapping-plots-data-center-hot-spots|url-status=dead}}</ref> This information can help to identify optimal positioning of data center equipment. For example, critical servers might be placed in a cool zone that is serviced by redundant AC units.
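As a simplified illustration of the mapping idea (a 2-D toy with invented sensor readings; real thermal zone mapping builds a 3-D model from many more sensors), scattered temperature measurements can be interpolated onto a grid and cells above a threshold flagged as hot zones:

<syntaxhighlight lang="python">
# Toy 2-D thermal map from a handful of invented sensor readings.
import math

# (x, y) sensor positions in metres and temperatures in degrees C (assumed values)
sensors = [(0, 0, 21.0), (8, 0, 24.5), (0, 6, 22.0), (8, 6, 29.0), (4, 3, 26.0)]
HOT_THRESHOLD = 27.0   # degrees C above which a grid cell is flagged as hot (assumed)

def interpolate(x, y, power=2.0):
    """Inverse-distance-weighted temperature estimate at grid point (x, y)."""
    num = den = 0.0
    for sx, sy, t in sensors:
        d = math.hypot(x - sx, y - sy)
        if d < 1e-9:
            return t            # the point coincides with a sensor
        w = 1.0 / d ** power
        num += w * t
        den += w
    return num / den

# Map a 9 x 7 grid of 1 m cells and print hot cells as 'H', others as '.'
for y in range(7):
    print("".join("H" if interpolate(x, y) >= HOT_THRESHOLD else "." for x in range(9)))
</syntaxhighlight>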
===Green data centers===
{{Main|Green data center}}
[[File:Magazin Vauban E.jpg|thumb| This water-cooled data center in the [[Independent Port of Strasbourg|Port of Strasbourg]], France claims the attribute ''green''.]]
Data centers use a lot of power, split between two main uses: the power required to run the actual equipment and the power required to cool that equipment. Power efficiency reduces the first category. Cooling costs can be reduced by natural means, which makes siting an important decision: when proximity to good fiber connectivity, power grid connections, and concentrations of people to manage the equipment is not required, a data center can be located miles away from its users. Mass data centers such as those of Google or Facebook do not need to be near population centers. Arctic locations, where outside air can be used for cooling, are becoming more popular.<ref>{{cite web|url=http://www.gizmag.com/fjord-cooled-data-center/20938/|title=Fjord-cooled DC in Norway claims to be greenest|date=23 December 2011 |access-date=23 December 2011}}</ref> Renewable electricity sources are another plus. Thus countries with favorable conditions, such as Canada,<ref>[https://www.theglobeandmail.com/report-on-business/canada-called-prime-real-estate-for-massive-data-computers/article2071677/ Canada Called Prime Real Estate for Massive Data Computers - Globe & Mail] Retrieved June 29, 2011.</ref> Finland,<ref>[http://datacenter-siting.weebly.com/ Finland - First Choice for Siting Your Cloud Computing Data Center.]. Retrieved 4 August 2010.</ref> Sweden,<ref>{{cite web|url=http://www.stockholmbusinessregion.se/templates/page____41724.aspx?epslanguage=EN|title=Stockholm sets sights on data center customers|access-date=4 August 2010|archive-url=https://web.archive.org/web/20100819190918/http://www.stockholmbusinessregion.se/templates/page____41724.aspx?epslanguage=EN|archive-date=19 August 2010}}</ref> Norway,<ref>[http://www.innovasjonnorge.no/en/start-page/invest-in-norway/industries/datacenters/ In a world of rapidly increasing carbon emissions from the ICT industry, Norway offers a sustainable solution] {{Webarchive|url=https://web.archive.org/web/20201029174125/https://www.innovasjonnorge.no/en/start-page/invest-in-norway/industries/datacenters/ |date=2020-10-29 }} Retrieved 1 March 2016.</ref> and Switzerland<ref>[http://www.greenbiz.com/news/2010/06/30/swiss-carbon-neutral-servers-hit-cloud Swiss Carbon-Neutral Servers Hit the Cloud.] {{Webarchive|url=https://web.archive.org/web/20170703113445/https://www.greenbiz.com/news/2010/06/30/swiss-carbon-neutral-servers-hit-cloud |date=2017-07-03 }}. Retrieved 4 August 2010.</ref> are trying to attract cloud computing data centers.

Singapore, a major data center hub for the Asia-Pacific region,<ref>{{Cite web |last=Baxtel |title=Singapore Data Centers & Colocation |url=https://baxtel.com/data-center/singapore |access-date=2024-09-18 |website=baxtel.com |language=en}}</ref> lifted its three-year moratorium on new data center projects in April 2022, granting four new projects but rejecting more than 16 of the over 20 applications received.
Singapore's new data centers must meet strict green technology criteria, including a water usage effectiveness (WUE) of 2.0/MWh, a power usage effectiveness (PUE) of less than 1.3, and a Platinum certification under Singapore's BCA-IMDA Green Mark for Data Centres scheme, with criteria that clearly address decarbonization and the use of hydrogen fuel cells or solar panels.<ref>{{Cite web |title=Singapore authorities invite applications for new data centers |date=20 July 2022 |url=https://www.datacenterdynamics.com/en/news/singapore-authorities-invite-applications-for-new-data-centers/}}</ref><ref>{{Cite web |title=BCA-IMDA Green Mark for Data Centres Scheme |url=https://www.imda.gov.sg/how-we-can-help/bca-imda-green-mark-for-data-centres-scheme |access-date=2024-09-18 |website=Infocomm Media Development Authority |language=en}}</ref><ref>{{Cite web |date=2024-05-30 |title=Singapore to free up 300MW for data centres |url=https://www.capacitymedia.com/article/2davfx8ylwhh1au8329kw/news/singapore-to-free-up-300mw-for-data-centres |access-date=2024-09-18 |website=Capacity Media |language=en}}</ref><ref>{{Cite web |title=4 Proposals Selected from Data Centre Application |url=https://www.imda.gov.sg/resources/press-releases-factsheets-and-speeches/press-releases/2023/four-data-centre-proposals-selected-as-part-of-pilot-data-centre-call-for-application |access-date=2024-09-18 |website=Infocomm Media Development Authority |language=en}}</ref>

====Direct current data centers====
{{See also|Power distribution unit|Inspur Server Series|Switched-mode power supply|Off-the-grid}}
Direct current data centers are data centers that produce [[direct current]] on site with [[solar panel]]s and store the electricity on site in a [[battery storage power station]]. Because computers run on direct current, the need to convert the [[Alternating current|AC]] [[Electrical grid|power from the grid]] to DC would be eliminated. The data center site could still use AC grid power as a backup. DC data centers could be 10% more [[Electrical efficiency|efficient]] and use less [[floor space]] for power conversion components such as [[Power inverter|inverters]].<ref>[https://www.datacenterdynamics.com/en/opinions/could-dc-win-the-new-data-center-war-of-the-currents/ Could DC win the new data center War of the Currents?]</ref><ref>{{cite web | url=https://datacenters.lbl.gov/direct-current-dc-power | title=Direct Current (DC) Power | Center of Expertise for Energy Efficiency in Data Centers }}</ref>
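A back-of-the-envelope sketch of where such a gain could come from; the stage efficiencies below are illustrative assumptions, not figures from the cited sources:

<syntaxhighlight lang="python">
# Illustrative arithmetic only: compare cumulative conversion losses on an AC
# distribution path versus a DC path. All stage efficiencies are assumed values.
ac_chain = [0.97, 0.94, 0.94]   # e.g. transformer, double-conversion UPS, server PSU (AC to DC)
dc_chain = [0.97, 0.98]         # e.g. DC bus from solar/battery, single DC-DC stage at the server

def chain_efficiency(stages):
    eff = 1.0
    for s in stages:
        eff *= s                # losses multiply through successive conversion stages
    return eff

ac = chain_efficiency(ac_chain)
dc = chain_efficiency(dc_chain)
print(f"AC path: {ac:.1%}, DC path: {dc:.1%}, relative gain: {dc / ac - 1:.1%}")
# With these assumed numbers, the DC path delivers roughly 10% more of the input energy.
</syntaxhighlight>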
===Energy reuse===
The heat from air-cooled data centers is very difficult to reuse. For this reason, data center infrastructures are increasingly equipped with heat pumps.<ref>{{cite web|url=https://stockholmdataparks.com/wp-content/uploads/white-paper-cost-efficient-and-sustainable-data-center-cooling-2017-01-23-1.pdf|title=Data Center Cooling with Heat Recovery|website=StockholmDataParks.com|date=January 23, 2017|access-date=2018-11-16|archive-date=2017-03-25|archive-url=https://web.archive.org/web/20170325212318/https://stockholmdataparks.com/wp-content/uploads/white-paper-cost-efficient-and-sustainable-data-center-cooling-2017-01-23-1.pdf|url-status=live}}</ref> An alternative to heat pumps is the adoption of liquid cooling throughout a data center. Different liquid cooling techniques can be mixed and matched to create a fully liquid-cooled infrastructure that captures all heat in water. Liquid cooling technologies fall into three main groups: indirect liquid cooling (water-cooled racks), direct liquid cooling (direct-to-chip cooling), and total liquid cooling (complete immersion in liquid; see [[server immersion cooling]]). This combination of technologies allows the creation of a [[thermal cascade]] as part of [[temperature chaining]] scenarios, producing high-temperature water outputs from the data center.{{citation needed|date=February 2025}}

=== Impact on electricity prices ===
[[Cryptomining]] and the [[artificial intelligence]] boom of the 2020s have also led to increased demand for electricity,<ref>{{Cite news |last=Halper |first=Evan |date=2024-03-07 |title=Amid explosive demand, America is running out of power |url=https://www.washingtonpost.com/business/2024/03/07/ai-data-centers-power/ |access-date=2024-08-19 |newspaper=Washington Post |language=en}}</ref><ref>{{Cite magazine |last=Rogers |first=Reece |date=July 11, 2024 |title=AI's Energy Demands Are Out of Control. Welcome to the Internet's Hyper-Consumption Era |url=https://www.wired.com/story/ai-energy-demands-water-impact-internet-hyper-consumption-era/ |access-date=2024-08-19 |magazine=Wired |language=en-US |issn=1059-1028}}</ref> which the [[International Energy Agency|IEA]] expects could double global data center demand for electricity between 2022 and 2026.<ref name=":1">{{Cite web |last=Calma |first=Justine |date=2024-01-24 |title=AI and crypto mining are driving up data centers' energy use |url=https://www.theverge.com/2024/1/24/24049047/data-center-ai-crypto-bitcoin-mining-electricity-report-iea |access-date=2024-08-21 |website=The Verge |language=en}}</ref> The share of US electricity consumed by data centers could increase from 4% to 6% over those four years.<ref name=":1" /> [[Bitcoin]] mining accounted for 2% of US electricity use in 2023.<ref name=":3">{{Cite magazine |last=Chow |first=Andrew R.
|date=2024-06-12 |title=How AI Is Fueling a Boom in Data Centers and Energy Demand |url=https://time.com/6987773/ai-data-centers-energy-usage-climate-change/ |access-date=2024-08-21 |magazine=TIME |language=en}}</ref> This has led to increased electricity prices in some regions,<ref>{{Cite news |last1=Halper |first1=Evan |last2=O'Donovan |first2=Caroline |date=November 1, 2024 |title=As data centers for AI strain the power grid, bills rise for everyday customers |url=https://www.washingtonpost.com/business/2024/11/01/ai-data-centers-electricity-bills-google-amazon/ |newspaper=Washington Post |archive-date=November 12, 2024 |access-date=November 1, 2024 |archive-url=https://web.archive.org/web/20241112002908/https://www.washingtonpost.com/business/2024/11/01/ai-data-centers-electricity-bills-google-amazon/ |url-status=live }}</ref> particularly in regions with lots of data centers like [[Santa Clara, California]]<ref>{{Cite web |last=Petersen |first=Melody |date=2024-08-12 |title=Power-hungry AI data centers are raising electric bills and blackout risk |url=https://www.latimes.com/environment/story/2024-08-12/california-data-centers-could-derail-clean-energy-goals |access-date=2024-08-19 |website=Los Angeles Times |language=en-US}}</ref> and [[upstate New York]].<ref>{{Cite web |last1=Benetton |first1=Matteo |last2=Compiani |first2=Giovanni |last3=Morse |first3=Adair |date=2023-08-12 |title=When cryptomining comes to town: High electricity use spillovers to the local economy |url=https://cepr.org/voxeu/columns/when-cryptomining-comes-town-high-electricity-use-spillovers-local-economy |access-date=2024-08-20 |website=[[VoxEU]] |language=en}}</ref> Data centers have also generated concerns in [[Northern Virginia]] about whether residents will have to foot the bill for future power lines.<ref name=":3" /> It has also made it harder to develop housing in London.<ref>{{Cite web |last=Vincent |first=James |date=2022-07-28 |title=The electricity demands of data centers are making it harder to build new homes in London |url=https://www.theverge.com/2022/7/28/23282007/electricity-demands-london-data-centers-blocking-new-housing-development |access-date=2024-08-21 |website=The Verge |language=en}}</ref> A Bank of America Institute report in July 2024 found that the increase in demand for electricity due in part to AI has been pushing electricity prices higher and is a significant contributor to electricity [[inflation]].<ref>{{Cite web |last=Walton |first=Robert |date=July 8, 2024 |title=US electricity prices rise again as AI, onshoring may mean decades of power demand growth: BofA |url=https://www.utilitydive.com/news/electricity-price-inflation-rising-again-BofA/720673/ |access-date=2024-08-19 |website=[[Utility Dive]] |language=en-US}}</ref><ref>{{Cite web |last=Mott |first=Filip De |title=Utility bills are getting cheaper, but AI could spoil the party |url=https://markets.businessinsider.com/news/commodities/ai-power-grid-demand-utility-bill-outlook-electricity-ev-2024-7 |access-date=2024-08-19 |website=[[Markets Insider]] |language=en-US}}</ref><ref>{{Cite web |date=May 17, 2024 |title=A.I. Power Shortage. Plus, Oil & Gas Stock Picks - Barron's Streetwise Barron's Podcasts |url=https://www.barrons.com/podcasts/streetwise/ai-power-shortage-plus-oil-gas-stock-picks/7ec80059-8d8c-4913-bc15-f5f80a50be1a?page=1& |access-date=2024-08-21 |website=[[Barron's]] |language=en}}</ref>