
Artificial Intelligence Becomes a Major Consumer of Energy and Water: How the Growth of Neural Networks Affects the Climate and What Risks and Opportunities It Creates for Investors and the Global Economy
The Booming Growth of AI and Its Appetite for Energy
The demand for computational power in AI has skyrocketed in recent years. Since the launch of public neural networks like ChatGPT at the end of 2022, businesses worldwide have been rapidly adopting AI models, which requires processing vast volumes of data. Industry estimates suggest that by 2024, AI could account for about 15–20% of total energy consumption across global data centers. The power required to run AI systems may reach 23 GW by 2025, comparable to the average electricity demand of a country like the United Kingdom. This figure also exceeds the estimated power draw of the entire Bitcoin mining network, making AI one of the most energy-intensive forms of computation.
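The comparison with the United Kingdom can be sanity-checked with simple arithmetic. The sketch below converts a constant 23 GW draw into annual energy and compares it with an assumed UK annual consumption of roughly 260 TWh (an illustrative round figure, not an official statistic); real hardware does not run at full draw year-round, so this is an upper bound:

```python
# Back-of-envelope check with illustrative assumptions.
ai_power_gw = 23          # projected AI power draw (from the text)
hours_per_year = 8_760    # 24 h * 365 days
uk_annual_twh = 260       # assumed UK consumption, order of magnitude only

# A constant draw of 23 GW for a year, converted from GWh to TWh.
ai_annual_twh = ai_power_gw * hours_per_year / 1_000
print(f"AI annual energy: ~{ai_annual_twh:.0f} TWh")
print(f"Share of assumed UK total: {ai_annual_twh / uk_annual_twh:.0%}")
# ~201 TWh, i.e. a figure on the same order as the assumed UK total
```

Even if actual utilization is well below 100%, the result lands in the same order of magnitude as a mid-size industrialized country's grid, which is the point of the comparison.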
This exponential trajectory is driven by massive infrastructure investments from tech companies: new data centers open almost weekly, and specialized machine-learning chips launch every few months. The expansion of this infrastructure directly increases the electricity needed to power and cool the thousands of servers that run modern neural networks.
Emissions Equivalent to a Metropolis
Such high energy consumption inevitably produces significant greenhouse gas emissions as long as part of that energy comes from fossil fuels. According to a recent study, AI could be responsible for 32–80 million metric tons of carbon dioxide (CO2) per year by 2025. This puts the "carbon footprint" of AI on the scale of an entire city: New York's annual emissions, for example, are about 50 million tons of CO2. For the first time, a technology that seemed purely digital is demonstrating an environmental impact comparable to that of large industrial sectors.
It is important to note that these estimates are considered conservative. They primarily count emissions from generating the electricity that powers servers, while the full lifecycle of AI, from manufacturing servers and chips to their disposal, adds a further carbon footprint. If the AI boom continues at its current pace, associated emissions will rise rapidly. This complicates global efforts to cut greenhouse gases and confronts tech companies with the question of how to reconcile the explosive growth of AI with their commitments to carbon neutrality.
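A range like 32–80 Mt is roughly what simple arithmetic yields when annual energy use is multiplied by a grid carbon intensity. The intensities below (0.16 and 0.40 kg CO2 per kWh) are assumed illustrative values chosen to bracket cleaner and world-average grids, not figures taken from the cited study:

```python
# Illustrative reconstruction: emissions = energy (kWh) * intensity (kg CO2/kWh).
ai_annual_kwh = 23 * 8_760 * 1e6  # 23 GW running year-round, in kWh

for label, intensity in [("clean-leaning grid", 0.16),
                         ("world-average grid", 0.40)]:
    mt_co2 = ai_annual_kwh * intensity / 1e9  # kg -> million metric tons
    print(f"{label}: ~{mt_co2:.0f} Mt CO2/year")
# The two assumed intensities reproduce roughly the 32-80 Mt span
```

The exercise also shows why the range is so wide: the same computing load can emit more than twice as much CO2 depending on where the electricity comes from.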
The Water Footprint of Neural Networks
Another hidden resource "appetite" of AI is water. Data centers consume vast amounts of water to cool servers and equipment; evaporative cooling and air conditioning cannot operate without it. Beyond direct consumption, significant amounts of water are used indirectly, at power plants cooling turbines and reactors while generating the very electricity that computing clusters consume. According to experts, AI systems alone could consume between 312 and 765 billion liters of water per year by 2025, comparable to the total volume of bottled water consumed by humanity annually. Neural networks thus create a colossal water footprint that had largely gone unnoticed by the general public.
Official estimates often do not reflect the full picture. For instance, the International Energy Agency cited approximately 560 billion liters of water consumed by all data centers globally in 2023, but this statistic did not include water used at power stations. The actual water footprint of AI could be several times higher than formal estimates suggest. Major players in the industry are currently reluctant to disclose details: in a recent report about its AI system, Google explicitly stated that it does not consider water consumption at external power plants in its metrics. Such an approach has faced criticism, as a significant amount of water is consumed specifically to meet the electrical needs of AI.
Already, the scale of water consumption is raising concerns in several regions. In arid areas of the U.S. and Europe, communities oppose the construction of new data centers, fearing they would siphon scarce water from local sources. Corporations themselves are noting the growing thirst of their server farms: Microsoft reported that global water consumption at its data centers surged by 34% in 2022 (to 6.4 billion liters), largely due to increased loads associated with AI model training. These facts underscore that water is rapidly moving to the forefront of assessments of the environmental risks of digital infrastructure.
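The gap between on-site accounting and the full footprint can be illustrated with a rough calculation. Both water intensities below (liters per kWh for on-site cooling and for off-site power generation) are assumed values for the sketch, not measured figures:

```python
# Rough illustration: total water = energy * (on-site + off-site water per kWh).
ai_annual_kwh = 23 * 8_760 * 1e6  # 23 GW year-round, in kWh

onsite_l_per_kwh = 0.5   # assumed evaporative-cooling draw at the data center
offsite_l_per_kwh = 2.0  # assumed withdrawal at power plants per kWh generated

total_billion_l = ai_annual_kwh * (onsite_l_per_kwh + offsite_l_per_kwh) / 1e9
print(f"~{total_billion_l:.0f} billion liters/year")
# Falls inside the 312-765 billion liter range quoted above
```

Note that under these assumptions the off-site share dominates, which is exactly the component that, as described above, corporate metrics often exclude.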
Lack of Transparency Among Tech Giants
Paradoxically, despite the scale of the impact, very little data on the energy and water consumption of AI is publicly available. Large tech companies typically report aggregated figures on emissions and resource use in their sustainability reports without separately disclosing the share attributable to AI. Detailed information about data center operations, such as how much energy or water goes specifically to neural network computations, usually stays inside the companies. There is almost no information on "indirect" consumption, such as the water used to generate the electricity that data centers draw.
Consequently, researchers and analysts often have to act like detectives, piecing together the picture from fragmentary data: snippets from corporate presentations, estimates of the number of server chips sold for AI, data from energy companies, and other indirect indicators. This lack of transparency hampers the understanding of the full scale of AI's environmental footprint. Experts are calling for the introduction of strict disclosure standards: companies should report on the energy consumption and water use of their data centers, breaking it down by key areas, including AI. Such transparency would allow society and investors to objectively evaluate the impact of new technologies and encourage the industry to seek ways to reduce the environmental load.
Impending Environmental Risks
If current trends continue, the growing "appetite" of AI could exacerbate existing environmental problems. Tens of millions of additional tons of greenhouse gas emissions annually would complicate achieving the goals of the Paris Agreement on climate. The consumption of hundreds of billions of liters of fresh water would come against the backdrop of a global freshwater deficit in which demand is projected to exceed supply by 56% by 2030. In other words, without sustainable development measures, the expansion of AI risks colliding with the ecological limits of the planet.
If nothing changes, such trends could lead to the following negative consequences:
- Acceleration of global warming due to increased greenhouse gas emissions.
- Worsening freshwater scarcity in already arid regions.
- Increased strain on energy systems and socio-environmental conflicts surrounding limited resources.
Local communities and authorities are already responding to these challenges. Some countries have imposed restrictions on the construction of energy-guzzling data centers, requiring water recycling systems or the purchase of renewable energy. Experts note that without significant changes, the AI industry risks turning from a purely digital domain into a source of tangible environmental crises, from droughts to derailed climate plans.
Investor Perspective: The ESG Factor
Environmental aspects of the rapid development of AI are becoming increasingly important for investors. In an era where ESG (Environmental, Social, and Governance) principles are coming to the forefront, the carbon and water footprint of technologies directly influences company valuations. Investors are questioning whether a "green" shift in policy will lead to increased costs for companies betting on AI. For example, stricter carbon regulations or the introduction of a fee for water usage could raise expenses for those companies whose neural network services consume substantial amounts of energy and water.
On the other hand, companies that invest now in mitigating the environmental impact of AI may gain an advantage. Transitioning data centers to renewable energy, optimizing chips and software for enhanced energy efficiency, and implementing water reuse systems reduce risks and improve reputation. The market highly values progress in sustainability: investors worldwide are increasingly incorporating environmental metrics into their business evaluation models. Therefore, for tech leaders, the question is acute: how to continue increasing AI capacity while simultaneously meeting societal expectations for sustainability? Those who find a balance between innovation and responsible resource management will ultimately win—both in terms of image and business value.
The Path to Sustainable AI
Despite the scale of the problem, the industry has opportunities to steer AI growth towards sustainability. Global tech companies and researchers are already working on solutions that can minimize the environmental footprint of AI without stifling innovation. Key strategies include:
- Enhancing energy efficiency of models and equipment. Developing optimized algorithms and specialized chips (ASIC, TPU, etc.) that perform machine learning tasks with lower energy consumption.
- Transitioning to clean energy sources. Powering data centers with electricity from low-carbon sources (solar, wind, hydro, and nuclear power) can cut the carbon emissions of AI operations close to zero. Many IT giants already sign "green" contracts, procuring clean energy for their needs.
- Reducing and recycling water consumption. Implementing new cooling systems (liquid or immersion cooling) that require significantly less water, as well as reusing technical water. Location selection for data centers should consider local water conditions: preference for areas with cold climates or sufficient water resources. Research shows that intelligent location choices and cooling technologies can reduce the water and carbon footprint of data centers by 70–85%.
- Transparency and accountability. Introducing mandatory monitoring and data disclosure on energy consumption and water use associated with AI infrastructure. Public accounting encourages companies to manage resources more efficiently and enables investors to track progress in reducing ecosystem impact.
- Applying AI to manage resources. Ironically, AI itself can help address this problem. Machine learning algorithms are already being used to optimize cooling in data centers, predict loads, and distribute tasks to minimize peak demands on networks and enhance server utilization efficiency.
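The leverage of location choice on both footprints can be sketched with toy numbers. Every figure below (grid intensities, WUE values, facility size) is assumed for illustration only, chosen to land near the 70–85% reduction the research cites:

```python
# Toy comparison of two hypothetical data-center sites, combining the local
# grid's carbon intensity with water usage effectiveness (WUE), i.e. liters
# of cooling water per kWh of IT energy. All numbers are assumed.
annual_kwh = 50e6  # assumed mid-size facility, 50 GWh/year

sites = {
    "hot climate, fossil-heavy grid": {"co2": 0.60, "wue": 1.8},  # kg/kWh, L/kWh
    "cool climate, low-carbon grid":  {"co2": 0.12, "wue": 0.3},
}
footprints = {
    name: (annual_kwh * s["co2"] / 1_000,  # metric tons of CO2 per year
           annual_kwh * s["wue"] / 1e6)    # megaliters of water per year
    for name, s in sites.items()
}
for name, (co2_t, water_ml) in footprints.items():
    print(f"{name}: {co2_t:,.0f} t CO2, {water_ml:.0f} ML water")

base, best = footprints.values()
print(f"CO2 cut: {1 - best[0] / base[0]:.0%}, water cut: {1 - best[1] / base[1]:.0%}")
# With these assumed inputs, both cuts fall in the cited 70-85% band
```

The point of the sketch is that siting and cooling decisions multiply: a cooler climate lowers the WUE term while a cleaner grid lowers the carbon term, so the combined reduction can be large even though no single factor changes by that much.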
The next few years will be crucial for integrating sustainability principles into the core of the rapidly growing AI sector. The industry stands at a crossroads: either continue on inertia, risking environmental barriers, or turn the problem into an impetus for new technologies and business models. If transparency, innovation, and responsible resource management become integral parts of AI strategies, the "digital mind" may evolve hand in hand with care for the planet. Such a balance is what investors and society as a whole expect from this new technological era.