
Your politeness toward ChatGPT is increasing OpenAI’s energy costs

ChatGPT's Advanced Voice Mode on a smartphone. (Image: OpenAI)

Everyone’s heard the expression, “Politeness costs nothing,” but with the advent of AI chatbots, it may have to be revised.

Just recently, someone on X wondered how much OpenAI spends on electricity at its data centers to process polite terms like “please” and “thank you” when people engage with its ChatGPT chatbot.


To the poster’s likely surprise, OpenAI CEO Sam Altman actually responded, saying: “Tens of millions of dollars well spent,” before adding: “You never know.”

Many folks who engage with AI chatbots — whether via text or speech — find the conversational experience so realistic that it just feels natural to phrase requests and responses politely. But as Altman confirmed, those little extras still need to be processed by OpenAI’s power-hungry AI tools, which means higher costs for the company, and for the environment too, as most data centers are still powered by electricity generated from fossil fuels.

Think about it: each polite phrase adds to the processing burden, and when that’s multiplied across billions of queries, it results in significant additional energy use.

A survey carried out in the U.S. last year found that 67% of respondents reported being polite to AI chatbots, suggesting that 33% like to skip the niceties and get straight to the point.

So, should we drop the manners and be less courteous in our exchanges with ChatGPT and other AI chatbots? Or just continue being polite, despite the drawbacks?

Research conducted last year found that the level of politeness in a prompt may well affect the quality of the responses delivered by the large language model (LLM) that powers a chatbot.

“Impolite prompts may lead to a deterioration in model performance, including generations containing mistakes, stronger biases, and omission of information,” the researchers concluded.

On the same issue, a TechRadar reporter who recently experimented by conversing with ChatGPT in a less courteous manner found that the responses “seemed less helpful.”

For many, being less polite toward AI chatbots may be a challenge, and dropping the manners could have consequences that go well beyond lowering OpenAI’s energy costs and easing the burden on the environment. The fear among some studying the matter is that if it becomes socially acceptable to be blunt toward AI chatbots, such behavior could begin to seep into interpersonal interactions, potentially making human exchanges less courteous over time.

Trevor Mogg
Contributing Editor
Not so many moons ago, Trevor moved from one tea-loving island nation that drives on the left (Britain) to another (Japan)…