AI Data Center Boom and Renaissance of Sustainability Tech
Exploring the surge in data center growth, its environmental impact, and innovative solutions for a sustainable AI-powered future
This article was first published in , one of Substack's top AI newsletters, where I contributed as a guest writer. Thank you, , for inviting me! You can also check it out directly here.
The first data center was created in 1945, and in less than a century it has become infrastructure that quietly shapes many industries behind the scenes and has opened up entirely new ones.
From humble beginnings to today's digital palaces - data centers have gone from 'nerdy closet' to 'coolest kid on the block' faster than you can say 'upload complete'.
In the coming years, data centers will be something we all hear about far more often, shaping how we think about technology. Let's dive deep to understand them better.
1. The Data Center Boom
1.1 The Digital Backbone of AI
Data centers are the physical backbone of our digital world, housing thousands of computer servers that run continuously to support everything from cloud storage and financial transactions to social media and government operations. When we talk about data being stored in the "cloud," it's actually being housed in these massive facilities.
Just a few decades ago, data centers were simple server warehouses, often tucked away in remote areas. The internet boom of the late '90s changed that, with companies like Exodus Communications leading the charge. But by the early 2000s, overinvestment led to a crash, leaving many centers underused.
Now, in 2024, data centers are booming again—this time fueled by a 24/7 digital world and the skyrocketing demand for AI-driven computing power. The rapid advancement of AI is reshaping our world in countless ways, yet it's also having a profound and often overlooked impact: an unprecedented surge in energy consumption.
1.2 AI's Insatiable Appetite for Energy
The rise of generative AI and other advanced technologies has led to a dramatic increase in data center construction and expansion. According to the International Energy Agency (IEA), global data center electricity demand is projected to more than double between 2022 and 2026, with AI playing a major role in this increase.
Research has shown that the cost of computational power required to train AI models is doubling every nine months, with no signs of slowing down. This exponential growth in AI capabilities comes at a significant energy cost, as these models require substantially more processing power and energy compared to traditional computing tasks.
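To make that concrete, here's a rough back-of-envelope sketch (mine, not from the cited research) of what "doubling every nine months" implies over a few years:

```python
# Back-of-envelope: what "doubling every nine months" implies over time.
# Assumes a clean exponential trend; only the 9-month doubling period comes
# from the article, the rest is illustrative.

DOUBLING_PERIOD_MONTHS = 9

def growth_factor(months: float) -> float:
    """Multiplicative growth in training compute cost after `months` months."""
    return 2 ** (months / DOUBLING_PERIOD_MONTHS)

for years in (1, 2, 5):
    print(f"After {years} year(s): ~{growth_factor(12 * years):.0f}x the compute cost")
# After 1 year: ~3x, after 2 years: ~6x, after 5 years: ~102x
```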
For example, ChatGPT requires 15 times more energy than a traditional web search, according to an Arm executive. As Harry Handlin, U.S. Data Segment Leader at ABB, noted, "A decade ago, a 30-megawatt data center was considered huge. Now when a company announces plans to build a 300-megawatt data center, no one blinks an eye. And with AI, 1-gigawatt data centers are the norm." It's like we went from 'large fries' to 'would you like a server farm with that?'
Leading tech giants, other companies, and utilities are expected to spend an estimated ~$1tn on capex in the coming years, including significant investments in data centers, chips, other AI infrastructure, and the power grid. This aggressive expansion is fueling an unprecedented surge in demand for electricity.
2. The Environmental Cost
2.1 Electricity Consumption
Data centers now account for more than 1% of global electricity use, according to the IEA. This figure is expected to rise dramatically, with predictions suggesting that by 2026, data centers could consume as much energy as entire countries like Sweden or Germany.
To put this into perspective, in 2022, data centers used more than 4% of all electricity in the U.S., with 40% of that energy being spent to keep equipment cool. As demand on data centers increases, even more energy will be required to maintain their operations.
2.2 Water Usage
The environmental impact extends beyond electricity. Researchers at UC Riverside estimated that global AI demand could cause data centers to consume over 1 trillion gallons of fresh water by 2027, enough to fill approximately 1.5 million Olympic-sized swimming pools. This staggering figure highlights the often-overlooked water consumption associated with cooling these massive facilities.
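For the curious, here's the quick arithmetic behind that comparison, assuming the standard 2.5-million-liter (roughly 660,000-gallon) Olympic pool:

```python
# Quick arithmetic behind the swimming-pool comparison.
# Assumption: a standard Olympic pool holds 2.5 million liters (~660,000 US gallons).

LITERS_PER_GALLON = 3.785
OLYMPIC_POOL_GALLONS = 2_500_000 / LITERS_PER_GALLON  # ~660,500 gallons

projected_gallons = 1e12  # 1 trillion gallons by 2027 (UC Riverside estimate)
pools = projected_gallons / OLYMPIC_POOL_GALLONS
print(f"~{pools / 1e6:.1f} million Olympic-sized pools")  # ~1.5 million
```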
Of the 8,000 data centers that exist globally, about a third are in the U.S., compared to 16% in Europe and almost 10% in China. The Hong Kong-based think tank China Water Risk estimates that data centers in China consume 1.3 billion cubic meters of water per year—nearly double the volume that the city of Tianjin, home to 13.7 million people, uses for households and services.
2.3 Carbon Footprint
The global data center industry is projected to emit 2.5 billion tons of CO2 through 2030, according to Morgan Stanley, as rising energy demand drives carbon emissions sharply upward. That total is equivalent to roughly 40% of the U.S.'s current annual emissions.
For instance, Microsoft, a major player in AI development, has seen its greenhouse gas emissions rise by roughly 30% since 2020, primarily due to its ambitious AI pursuits. This trend isn't unique to Microsoft; other tech giants are also grappling with similar challenges in balancing their AI ambitions with their sustainability goals.
3. Challenges and Concerns
3.1 Strain on Infrastructure
The explosive growth of data centers is putting enormous pressure on local infrastructure, especially power grids and water resources. In regions like Northern Virginia's "Data Center Alley," concerns are growing about potential brownouts as data center expansion threatens to outpace power generation capacity. Similar issues are seen in hubs such as Santa Clara, California, and Phoenix, where delays in securing new power connections are becoming common. Internationally, cities like Amsterdam, Dublin, and Singapore have imposed moratoriums on new data center construction due to power infrastructure limitations.
The industry is also grappling with constraints across the power value chain, including shortages and bottlenecks that slow progress. With the industry approaching physical limits on node sizes and transistor densities, coupled with long lead times for new connections, scaling efforts have become increasingly difficult.
3.2 Local Opposition
Local opposition to data center construction is intensifying, particularly in places like Dublin, where data centers already consume nearly 20% of Ireland's electricity. In Northern Virginia, historically residential areas are being rezoned for industrial use to accommodate new data centers, raising concerns about quality of life and infrastructure strain. Virginia state delegate Ian Lovejoy highlights that data centers have become the top issue for constituents, with concerns about threats to electricity and water access, and fears that taxpayers may be left to foot the bill for future infrastructure upgrades.
3.3 Competition
The race for clean energy is turning into a high-stakes game of musical chairs. When the music stops, let's hope we're not the ones left standing with a coal-powered server.
For example, CarbonCapture, a California-based climate tech company, had to cancel plans for its Project Bison, a direct air capture (DAC) facility in Wyoming, due to a lack of sufficient access to clean energy. Launched in 2022, Project Bison aimed to capture 5 million tons of carbon annually by 2030, but the competition for clean energy with industries like data centers and cryptocurrency mining hindered progress.
At the same time, OpenAI has pitched the Biden administration, highlighting the need for massive data centers that could each consume as much power as entire cities. OpenAI argues that this unprecedented expansion is crucial for developing advanced artificial intelligence models and staying competitive with China. It's like an arms race, but instead of missiles, we're stockpiling electricity bills.
3.4 Transparency Issues
The exact energy consumption of many AI models remains opaque. Major tech companies have become increasingly secretive about data sources, training time, hardware, and energy usage, especially since the release of ChatGPT. This lack of transparency makes it difficult to accurately assess and address the environmental impact of AI development.
Adding to this concern are questions about the influence of major tech players on organizations that set corporate climate standards. For instance, there are worries about the impact of Amazon and the Bezos Earth Fund on entities like the Science Based Targets initiative (SBTi). The Bezos Earth Fund is a significant funder of SBTi, and Amazon's separate climate initiative allows for unrestricted use of carbon credits. This has led to fears that SBTi's rules might be swayed to favor more lenient carbon offsetting practices, a particularly troubling prospect as companies like Amazon face increased emissions from their expanding data center operations.
3.5 Moore's Law and Its Limitations
Moore's Law, which posited that the number of transistors in computer chips doubles roughly every two years, has been a driving force in the tech industry for decades. However, we may be reaching the physical limitations of silicon-based CPUs. Without a practical alternative, engineers can no longer increase the computing power of chips as rapidly or as cheaply as they did in years past, potentially impacting the efficiency gains that have helped offset increased energy demands.
4. Sustainable Future Solutions
A 2023 McKinsey & Company study projects that about 35 GW of cloud data center power capacity will be needed by 2030 to meet AI demand, roughly double the 17 GW required at the end of 2022. That means many more data centers must be built and brought online in a short timeframe, creating a need for innovative solutions to establish data centers quickly, keep them sustainable, and optimize their operation.
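To get a feel for the scale, here's a rough conversion of that 35 GW figure into annual energy, under the simplifying assumption that the capacity runs around the clock (my own back-of-envelope, not McKinsey's):

```python
# Rough sense of scale for 35 GW of cloud data center capacity.
# Simplifying assumption (mine): the capacity runs continuously, which overstates
# real-world utilization but gives an upper-bound annual figure.

CAPACITY_GW = 35
HOURS_PER_YEAR = 8_760

annual_twh = CAPACITY_GW * HOURS_PER_YEAR / 1_000  # GW * h = GWh, then -> TWh
print(f"~{annual_twh:.0f} TWh per year")
# ~307 TWh/year, i.e. more than double Sweden's annual electricity use (~130-140 TWh)
```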
4.1 Beating the Heat: Cool Smarter
In the world of data centers, heat is the enemy. Traditional cooling methods devour energy, often accounting for a substantial portion of a facility's power consumption. Currently, data centers are generally cooled either with air-moving fans or with liquid systems that carry heat away from computer racks. Here are a few solutions that stand out:
Futuristic Cooling Systems: University of Missouri researchers, led by Professor Chanwoo Park, are developing a two-phase cooling system that efficiently dissipates heat using phase change, requiring little to no energy.

Ultra-Energy-Efficient Temperature Control: Oregon State and Baylor University researchers have developed a method to reduce energy for photonic chip cooling by a factor of more than 1 million, using gate voltage to control temperature with almost no electric current.
Higher Operating Temperatures: A study published in Cell Reports Physical Science suggests that keeping data centers at 41°C (105°F) could save up to 56% in cooling costs worldwide. This approach challenges the conventional wisdom of keeping data centers cold and proposes new temperature guidelines for more efficient operations.
Metal foam technology: Swiss company Apheros secured $1.9 million to advance its metal foam technology, boosting cooling efficiency by 90% and addressing data centers' growing energy consumption. Apheros aims to provide a more sustainable thermal management solution.
So, cooling data centers is becoming an exercise in thermodynamic acrobatics.
4.2 Leaving Carbon-Heavy Power
Nuclear Power Options: A Controversial Comeback
As data centers' appetite for energy grows, nuclear power offers strong potential. Here are some notable developments:
On a life-cycle basis, nuclear power emits just a few grams of CO2 equivalent per kWh of electricity produced. Whilst estimates vary, the United Nations (UN) Intergovernmental Panel on Climate Change (IPCC) has provided a median value among peer-reviewed studies of 12g CO2 equivalent/kWh for nuclear, similar to wind, and lower than all types of solar. (source)
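As a rough illustration of what that 12 g CO2e/kWh figure means in practice, here's a back-of-envelope comparison for a hypothetical 300-megawatt facility; the ~400 g CO2e/kWh grid-average intensity is my own ballpark assumption, not a figure from the sources above:

```python
# Back-of-envelope: the 12 g CO2e/kWh IPCC median for nuclear vs. a typical grid mix.
# The 300 MW facility size echoes the article; the ~400 g CO2e/kWh grid-average
# intensity is an assumption for illustration, not a figure from the sources above.

FACILITY_MW = 300
HOURS_PER_YEAR = 8_760
annual_kwh = FACILITY_MW * 1_000 * HOURS_PER_YEAR  # ~2.6 billion kWh

INTENSITY_G_PER_KWH = {"nuclear (IPCC median)": 12, "grid average (assumed)": 400}

for source, grams in INTENSITY_G_PER_KWH.items():
    tons = annual_kwh * grams / 1e6  # grams -> metric tons
    print(f"{source}: ~{tons:,.0f} t CO2e per year")
# nuclear: ~31,500 t CO2e per year vs. grid average: ~1,050,000 t CO2e per year
```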
Nuclear Advocacy from Tech Titans: For example, Microsoft and Constellation Energy signed a 20-year power purchase agreement to restart Pennsylvania's Three Mile Island Unit 1 nuclear reactor. Microsoft will use energy from the plant to power its data centers with carbon-free electricity, supporting its goal to be carbon-negative by 2030 and to match 100% of its electricity consumption with clean energy.
🟢 A relaunch of Three Mile Island, which had a separate unit suffer a partial meltdown in 1979 in one of the biggest industrial accidents in the country's history, still requires federal, state, and local approvals.

Major tech executives, including ChatGPT developer OpenAI CEO Sam Altman and Microsoft co-founder Bill Gates, have touted nuclear energy as a solution to the growing power needs of data centers.
Altman has backed and chairs nuclear power startup Oklo (OKLO.N), which went public through a blank-check merger in May, while TerraPower, a startup Gates co-founded, broke ground on a nuclear facility in June.
Small Reactors, Big Impact: Micro-nuclear developer Last Energy is taking a new approach with its 20 MWe microreactors. After raising $40 million in Series B funding, the D.C.-based company aims to create modular, mass-manufacturable reactors that can be deployed in just 24 months.
Clean Energy Options: Harnessing Nature's Power
While nuclear energy draws attention, renewable sources remain vital for sustainable data center power. Key developments include:
Google's Renewable Energy Addendum: To address a 48% rise in GHG emissions over five years from AI-driven data centers, Google launched this initiative, urging major suppliers to shift to 100% renewable energy by 2029.
Solar Power Partnership: Google partnered with BlackRock to develop solar energy in Taiwan, investing in New Green Power (NGP) to supply up to 300 MW of solar energy for its data centers and suppliers.
Power Solutions Acquisition: Blackstone Energy Transition Partners acquired Trystar to support its growth amid the energy transition and AI-driven data center expansion. Blackstone plans to invest $100 billion in energy transition projects over the next decade.
Solar Thermal Storage System: Exowatt, backed by Andreessen Horowitz, Sam Altman, and others, raised $20 million to launch a modular solar energy platform for data centers. The Exowatt P3 system offers electricity at $0.01 per kWh, providing a cost-effective alternative to both fossil fuels and other renewable options.
4.3 Maximize Data Storage
Innovative approaches to data storage not only allow us to accommodate the ever-growing volume of information but also promise to dramatically reduce the physical footprint and energy consumption of data centers. Here are some groundbreaking approaches that could revolutionize how we store and manage data:
World's Smallest Electro-Optic Modulator: Oregon State University researchers have designed and fabricated the world's smallest electro-optic modulator, which could mean major reductions in energy used by data centers and supercomputers. This new modulator is 10 times smaller and potentially 100 times more energy efficient than the best previous devices. It is about the size of a bacterium, measuring 0.6 by 8 microns.
An electro-optic modulator controls light with electrical signals, crucial for modern communication systems. In data centers, where data moves via fiber-optic cables, these modulators convert electrical signals into optical ones (light), allowing fast and efficient data transmission over long distances, improving both speed and energy efficiency.
DNA Data Storage: Storing data in DNA sounds like science fiction, yet it may lie in the near future. Professor Tom de Greef expects the first DNA data center to be up and running within five to ten years. Data won't be stored as zeros and ones in a hard drive but in the base pairs that make up DNA: AT and CG. Such a data center would take the form of a lab, many times smaller than the ones today.
De Greef can already picture it all. In one part of the building, new files will be encoded via DNA synthesis. Another part will contain large fields of capsules, each capsule packed with a file. A robotic arm will remove a capsule, read its contents and place it back.
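To see why base pairs can stand in for zeros and ones, here's a toy encoding sketch; real DNA storage codecs layer on error correction and avoid troublesome sequences, so treat this purely as an illustration of the idea:

```python
# Toy sketch: two bits per DNA base, so any file can be written as a strand of A/T/C/G.
# Real DNA storage codecs add error correction and avoid problematic sequences;
# this only illustrates why base pairs can replace zeros and ones.

BITS_TO_BASE = {"00": "A", "01": "T", "10": "C", "11": "G"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"hi")
print(strand)                 # TCCATCCT
assert decode(strand) == b"hi"
```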
🟢 The concept of DNA data storage is a prime example of biomimicry, where human technology imitates solutions found in nature. I wonder whether, just as we assume the world will become increasingly inorganic, this is actually a transition toward everything becoming organic.

4.4 Energy-Efficient Computing
Innovative computing paradigms are emerging as game-changers in the quest for sustainability. Making computers more energy efficient is crucial: if on-chip electronics were a country, their worldwide energy consumption would rank fourth among nations, and it is increasing exponentially each year, fueled by applications such as artificial intelligence. Let's explore some of the most promising innovations in this space:
Quantum Computing: Quantum computers have the potential to perform complex calculations far more efficiently than classical computers. Current exascale and petascale supercomputers typically require about 15 to 25 MW to operate, compared with the 25-kW typical energy consumption of quantum computers.
Neuromorphic Computing: The human brain remains unrivaled in one crucial area: energy efficiency. Even the most advanced computers today require about 10,000 times more energy than the brain to perform tasks such as image processing and recognition. Unlike traditional computers that keep memory and processing separate, neuromorphic systems combine data storage and computation, similar to how brain cells function. This "processing in memory" design reduces energy consumption and enhances performance for AI and machine learning tasks.
Green AI: AI itself can be used to optimize the energy consumption of data centers and computational processes, potentially offsetting its own energy costs and more. For example:
In 2016, Google's DeepMind AI helped reduce the company's data center cooling energy usage by up to 40%.
Machine learning algorithms can study vast amounts of data, uncovering patterns that are usually too complex for human operators to spot, leading to more efficient energy management.
🟢 AI Model Optimization: The Scaling Laws for Neural Language Models paper from OpenAI demonstrates that by optimizing factors like model size, dataset size, and compute usage in line with power-law scaling, it's possible to enhance the efficiency of these models. Specifically, larger models can be more sample-efficient, requiring fewer optimization steps and less data to achieve comparable performance, which ultimately reduces the overall computational load and energy use.
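For readers who want to see the shape of those scaling laws, here's a small sketch of the parameter-count law from the paper; the constants are the values reported by OpenAI (quoted from memory), and the point is only to show how loss falls smoothly as models grow:

```python
# Sketch of the parameter-count scaling law from "Scaling Laws for Neural Language Models":
# loss L(N) ≈ (N_c / N) ** alpha_N for models trained with enough data and compute.
# Constants below are the paper's reported values (quoted from memory), used only
# to show the shape of the curve.

ALPHA_N = 0.076
N_C = 8.8e13  # scale constant for non-embedding parameters

def predicted_loss(n_params: float) -> float:
    return (N_C / n_params) ** ALPHA_N

for n in (1e8, 1e9, 1e10, 1e11):
    print(f"{n:.0e} params -> predicted loss ~{predicted_loss(n):.2f}")
# Each 10x jump in parameters trims a roughly constant fraction off the loss, which is
# why larger models can hit a target loss with fewer tokens and optimization steps.
```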
5. The Road Ahead
As we stand at the threshold of an AI-driven era, our path forward demands a reimagining of the relationship between technological advancement and environmental stewardship. The road ahead is both challenging and promising, requiring us to understand and effectively manage what comes with it.
This technological leap forward, however, comes with significant challenges in energy consumption and environmental impact. Yet, these very challenges are catalyzing a unique phenomenon: A Renaissance in Sustainable Solutions. Unlike previous waves of environmental innovation, this renaissance is characterized not only by the rapid development of new technologies but also by an unprecedented urgency in their execution and implementation.
The digital revolution need not come at the cost of our environment; instead, it can be the catalyst for a more sustainable, hyper-efficient, and intelligently managed world. The synergy between tech innovators, environmental scientists, policymakers, and communities will be crucial in crafting a digital ecosystem that enhances rather than depletes our planet's resources.
It's time to embrace the seeds of change, learning not only from the past but also from the future that is eager to emerge.
Thanks :)
P.S. If this post sparked your curiosity, hit the ❤️ button—it helps more curious minds discover this journey!
Enjoying my writing? Wanna join me on the road and support my work?
And if these ideas got you thinking, consider sharing or recommending the journey to others.
Wanna read more?
Stay tuned for the next blog post coming!
Until then, take care!