Using AI to Answer Your Questions May Be Costlier than You Realize, Especially for the Environment
Since ChatGPT burst onto the scene, the adoption of Large Language Models (LLMs) has grown rapidly.
For example, OpenAI’s ChatGPT is believed to have 200 million weekly active users, while Google is adding AI-generated summaries to its traditional search results – broadening AI’s reach to its entire user base.
Add to this Meta’s user base, as the company rolls out AI features across its Facebook, Instagram, and Threads applications.
And let’s not forget generative AI image tools, such as Midjourney and Stable Diffusion, which are now expanding to produce video imagery as well.
But what impact is this rapid adoption of AI having on the environment?
You may not realize it, but each LLM query (such as asking ChatGPT or Google for an AI-based answer) uses significantly more energy than you might expect.
How much more?
According to John Hennessy, the Chairman of Alphabet (Google’s parent company), each query sent to a large language model (LLM) currently uses roughly 10 times more energy than a traditional search query. (Hennessy does believe the cost will come down over time, a prediction we will look at in more depth below.)
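To get a feel for what that 10x multiplier means at scale, here is a rough back-of-envelope sketch in Python. The 0.3 watt-hours per traditional search is Google’s often-cited estimate; the daily query volume is an illustrative assumption, not an official figure.

```python
# Back-of-envelope: what does a 10x per-query energy multiplier mean at scale?
# Assumptions (illustrative, not official figures):
#   - ~0.3 Wh per traditional search (Google's often-cited estimate)
#   - an LLM query uses ~10x that, per Hennessy's comparison
#   - ~9 billion searches per day as a rough global volume

WH_PER_SEARCH = 0.3          # watt-hours per traditional search (assumed)
LLM_MULTIPLIER = 10          # Hennessy's rough 10x figure
SEARCHES_PER_DAY = 9e9       # assumed global daily query volume

llm_wh = WH_PER_SEARCH * LLM_MULTIPLIER            # ~3 Wh per LLM query
extra_wh_per_day = (llm_wh - WH_PER_SEARCH) * SEARCHES_PER_DAY
extra_gwh_per_day = extra_wh_per_day / 1e9         # Wh -> GWh

print(f"Energy per LLM query: {llm_wh:.1f} Wh")
print(f"Extra demand if all searches became LLM queries: {extra_gwh_per_day:,.0f} GWh/day")
```

Under these assumptions, shifting all search traffic to LLM queries would add roughly 24 gigawatt-hours of demand per day – comparable to the output of a one-gigawatt power plant running around the clock.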
As companies race to implement LLMs into every imaginable application, this order-of-magnitude jump in energy usage will create shockwaves through the energy markets – and make it that much harder to meet the goals of reducing fossil fuel emissions.
Rising Demand for AI-Based Queries Is Putting Increased Pressure on Freshwater Resources Used by Data Centers
The rapid adoption of LLMs is not just driving up demand for increased energy production, it’s also having a major impact on water demand – water that is used to cool all the new data centers being constructed around the world.
For example, researchers at UC Riverside and UT Arlington (doi.org/10.48550/arXiv.2304.03271) report that by 2027, global AI use could require between 4.2 and 6.6 billion cubic meters of water.
To put that number in context, it’s roughly 1 to 1.5% of US annual water withdrawal (about 444 billion cubic meters as of 2020). That may seem like a relatively small share today, but given that many data centers are being built in places with rapidly depleting groundwater, it’s putting additional stress on available freshwater resources – and there is valid concern that this number could increase significantly in the coming years as AI adoption grows.
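For readers who want to check that comparison, here is the arithmetic, using the study’s 2027 range and the US withdrawal figure cited above:

```python
# Sanity-check the water comparison: projected global AI water use (2027)
# vs. total US annual water withdrawal.
AI_WATER_LOW, AI_WATER_HIGH = 4.2e9, 6.6e9   # cubic meters (study's 2027 range)
US_ANNUAL_WITHDRAWAL = 444e9                  # cubic meters (~2020 figure)

low_pct = AI_WATER_LOW / US_ANNUAL_WITHDRAWAL * 100
high_pct = AI_WATER_HIGH / US_ANNUAL_WITHDRAWAL * 100
print(f"Projected AI water use: {low_pct:.1f}%-{high_pct:.1f}% of US annual withdrawal")
# -> roughly 0.9%-1.5%
```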
Big Tech Companies, Such as Google, Are Seeking to Restore Operations at Mothballed Fossil Fuel Power Plants to Power AI Data Centers
These growing energy and water demands are putting high-tech companies such as Google, Microsoft, and Meta in a difficult spot. Many of these companies had publicly pledged to reduce their overall carbon emissions to net zero – in the case of Google by 2030 and Microsoft by 2050.
That doesn’t seem so plausible now.
Instead, AI tech giants are seeking new sources of energy to feed the demands of their growing data centers.
At the CERAWeek conference in Houston (hosted by S&P Global in March 2024), executives ranging from Bill Vass, vice president of engineering at Amazon Web Services, to Microsoft co-founder Bill Gates warned that AI will create massive demand for power.
A recent Goldman Sachs study backs this up, estimating that the power used by data centers will jump 160% by 2030.
Where will we get this power from?
Unfortunately, it now looks like tech companies are going to turn to fossil fuel power plants that were either already decommissioned or scheduled to be retired to make up the shortfall.
For example, Google and Meta are planning to source electricity for their Nebraska data center operations from a large coal-burning power plant in Omaha, which had been slated for retirement to reduce air pollution and carbon emissions.
Other coal-fired power plants, such as those located around Morgantown, West Virginia, are now expected to continue operating to provide electricity for the growing “data center alley” centered in Loudoun County, Virginia, just outside of Washington D.C.
Three Mile Island Nuclear Plant, the Site of the US’s Worst Nuclear Accident, May Be Recommissioned Thanks to AI Energy Demand
Coal-fired power plants are not the only energy source experiencing a revival thanks to the explosive demand for power by AI data centers.
Thanks to AI, nuclear power is making a comeback in the USA, starting with the infamous Three Mile Island nuclear plant near Harrisburg, Pennsylvania, the site of the nation’s worst commercial nuclear power plant accident (in 1979). Microsoft has reportedly inked a 20-year energy supply contract with Constellation Energy, which plans to invest $1.6 billion to restart the Three Mile Island reactor that was mothballed in 2019.
Amazon is also reportedly pursuing energy contracts with Constellation Energy to use nuclear power-generated electricity for data centers located in Pennsylvania.
Meanwhile, Google is investing in seven new small nuclear-power reactors in the U.S. built by nuclear-energy startup Kairos Power. These small modular nuclear reactors are expected to come online starting in 2030, ultimately delivering a total of 500 megawatts of nuclear power. (For more information, see our recent article on small modular nuclear reactors.)
Could Redesigning Large Language Model (LLM) Algorithms and Implementations Help Make AI More Energy Efficient?
Unless the industry changes direction, the cost of current LLM implementations will remain prohibitively expensive, not only for the environment but for tech companies themselves, who are expected to splash out billions, if not trillions, on AI initiatives in the coming decade. Already, investors are coming to terms with the reality that tools such as ChatGPT lose money every time a user asks a question.
Fortunately, this means that there is a significant incentive for AI companies to improve the efficiency of their AI offerings, which could also help reduce the need for increased power generation and water use.
How can we achieve this so-called Greener AI?
One way is to vet whether it makes sense to use energy-intensive general knowledge LLMs to solve problems when alternatives that use much less energy, such as traditional machine-learning algorithms, can deliver similar results.
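As a hypothetical illustration of this kind of vetting, consider routing support emails by topic – a task often handed to an LLM that a small classical model can frequently handle. The sketch below uses scikit-learn; the training examples and labels are stand-ins, not real data.

```python
# A lightweight classical-ML alternative to calling an LLM for simple
# text classification: TF-IDF features + logistic regression (scikit-learn).
# Runs on a CPU in milliseconds per query -- orders of magnitude less
# compute than an LLM inference call.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "My invoice total looks wrong",        # billing
    "I was charged twice this month",      # billing
    "The app crashes when I log in",       # technical
    "Error message on the checkout page",  # technical
]
train_labels = ["billing", "billing", "technical", "technical"]

# Pipeline: sparse TF-IDF features feeding a linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

print(model.predict(["Why did my bill go up?"]))  # -> ['billing']
```

A pipeline like this answers a query on a single CPU core, while routing the same question through a general-purpose LLM would consume vastly more energy for a comparable result on such a narrow task.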
In use cases where LLMs are the most suitable choice, there is a lot of opportunity for energy use optimization at each stage of developing LLM solutions.
For example, data scientists are looking at ways to minimize energy usage when creating and training large language models – as well as once they are fully deployed for use by consumers or other software (known as the “inference” stage).
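One widely used inference-stage technique is quantization: storing and computing a model’s weights at lower numeric precision so that each query moves less data and burns less energy. Below is a minimal sketch using PyTorch’s dynamic quantization; the tiny model is a placeholder standing in for a real LLM, which would be quantized the same way layer by layer.

```python
# Dynamic quantization with PyTorch: convert a model's Linear layers
# from 32-bit floats to 8-bit integers, cutting memory traffic and
# compute per inference call. The tiny model here is a placeholder.
import torch
import torch.nn as nn

class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)
        )

    def forward(self, x):
        return self.net(x)

model = TinyModel().eval()

# Replace float32 Linear layers with int8 equivalents.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)
print(quantized(x).shape)  # same interface, roughly 4x smaller weights
```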
Optimizing AI hardware is another major initiative. Not only are AI chipsets and boards (primarily sourced from industry leader Nvidia) costly to purchase, but they are also expensive to operate due to the huge amount of electricity required to run them – and the cost to keep them cool, which, as we’ve discussed above, drives up water usage for cooling.
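Step one in any such hardware effort is measuring what the chips actually draw. The sketch below samples a GPU’s instantaneous power consumption using NVIDIA’s management library via the pynvml package; it assumes an NVIDIA GPU and driver are installed.

```python
# Sample a GPU's power draw via NVIDIA's management library (NVML).
# Requires an NVIDIA GPU/driver and the nvidia-ml-py (pynvml) package.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

for _ in range(5):
    milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)
    print(f"GPU power draw: {milliwatts / 1000:.1f} W")
    time.sleep(1)

pynvml.nvmlShutdown()
```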
Despite all these efforts, LLMs may continue to be costly to use well into the future, leading some AI energy and environmental activists to push for limiting LLMs to applications that provide the greatest social benefits (such as accurate AI-powered weather forecasting) while discouraging frivolous applications that produce “unnecessary” work, such as funny cat memes.
Time will tell how all of this pans out.
Formaspace is Your Laboratory Research Partner
Evolving Workspaces. It’s in our DNA.
Talk to your Formaspace Sales Representative or Strategic Dealer Partner today to learn more about how we can work together to make your next construction project or remodel a success.