
ChatGPT’s resource demands are getting out of control


It’s no secret that the growth of generative AI has demanded ever-increasing amounts of water and electricity, but a new study from The Washington Post and researchers at the University of California, Riverside shows just how many resources OpenAI’s chatbot needs to perform even its most basic functions.

In terms of water usage, the amount ChatGPT needs to write a 100-word email depends on the state and the user’s proximity to OpenAI’s nearest data center. The scarcer water is in a given region, and the cheaper its electricity, the more likely the data center is to rely on electrically powered air conditioning units instead. In Texas, for example, the chatbot consumes an estimated 235 milliliters of water to generate a single 100-word email. That same email drafted in Washington, on the other hand, would require 1,408 milliliters (nearly a liter and a half).
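To put those per-email figures in perspective, here is a rough back-of-envelope sketch in Python. The 235-milliliter and 1,408-milliliter estimates come from the study as reported; the one-million-emails-per-day volume is a purely hypothetical assumption used only for illustration.

```python
# Back-of-envelope: how the per-email water estimates scale with volume.
# The per-email figures (235 mL in Texas, 1,408 mL in Washington) are from
# the reported study; the daily email volume below is an arbitrary assumption.

ML_PER_EMAIL = {"Texas": 235, "Washington": 1408}  # milliliters per 100-word email
EMAILS_PER_DAY = 1_000_000  # hypothetical volume, for illustration only

for state, ml in ML_PER_EMAIL.items():
    liters_per_day = ml * EMAILS_PER_DAY / 1000
    print(f"{state}: {liters_per_day:,.0f} liters/day "
          f"({liters_per_day * 365 / 1e6:,.1f} million liters/year)")
```

At that hypothetical volume, the Texas estimate works out to roughly 86 million liters a year, while the Washington estimate exceeds 500 million liters a year.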


Data centers have grown larger and more densely packed with the rise of generative AI, to the point that air-based cooling systems struggle to keep up. That is why many AI data centers have switched to liquid-cooling schemes that pump huge amounts of water past the server stacks to draw off thermal energy, then out to a cooling tower where the collected heat dissipates.

ChatGPT’s electrical requirements are nothing to sneeze at either. According to The Washington Post, using ChatGPT to write that 100-word email draws enough electricity to run more than a dozen LED lightbulbs for an hour. If even one-tenth of Americans used ChatGPT to write that email once a week for a year, the process would consume as much electricity as every Washington, D.C., household uses in 20 days. D.C. is home to roughly 670,000 people.
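That comparison can be sanity-checked with a similar back-of-envelope calculation. Only the bulbs-per-email framing comes from the article; the bulb wattage, U.S. population, D.C. household count, and per-household consumption below are outside assumptions, so the result is approximate, but it lands in the same ballpark as the 20-day figure.

```python
# Rough check of the "one-tenth of Americans, once a week for a year" comparison.
# Only the bulbs-per-email framing comes from the article; every other number
# below is an outside assumption and should be treated as approximate.

LED_WATTS = 10                # assumed wattage of one LED bulb
BULBS_PER_EMAIL = 14          # "more than a dozen" bulbs, running for one hour
KWH_PER_EMAIL = LED_WATTS * BULBS_PER_EMAIL / 1000  # ~0.14 kWh per email

US_POPULATION = 335_000_000   # assumed U.S. population
writers = US_POPULATION * 0.10          # one-tenth of Americans
emails_per_year = writers * 52          # one email per week for a year

total_kwh = emails_per_year * KWH_PER_EMAIL
print(f"Total: {total_kwh / 1e6:,.0f} GWh per year")

# Compare against Washington, D.C. households (assumed figures).
DC_HOUSEHOLDS = 320_000            # assumed number of D.C. households
KWH_PER_HOUSEHOLD_DAY = 30         # assumed average daily household use
dc_days = total_kwh / (DC_HOUSEHOLDS * KWH_PER_HOUSEHOLD_DAY)
print(f"Equivalent to ~{dc_days:,.0f} days of all D.C. household electricity")
```

With these rough inputs the total comes out to roughly 240 GWh a year, or a few weeks of household electricity for the entire district, on the same order as the article’s 20-day comparison.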

This is not an issue that will be resolved any time soon, and it will likely get worse before it gets better. Meta, for example, needed 22 million liters of water to train its latest Llama 3.1 models. Google’s data centers in The Dalles, Oregon, were found to consume nearly a quarter of all the water available in the town, according to court records, while xAI’s new Memphis supercluster is already demanding 150 MW of electricity (enough to power as many as 30,000 homes) from the local utility, Memphis Light, Gas and Water.
