Next time you say “please” or “thank you” to ChatGPT, remember that your good manners are burning electricity. A report from Futurism shows that these polite words are costing millions of dollars in computing power. Sam Altman, who runs OpenAI (the company behind ChatGPT), and Kurtis Beavers, a design manager at Microsoft, have both pointed out this surprising energy drain.
When you add polite words to your AI questions, you’re making the computer work harder. This extra work uses more electricity, which costs money and affects our environment. It’s like asking someone to carry your bag and then adding a small rock to it – the bag gets a tiny bit heavier, but when millions of people do this, it adds up.
“tens of millions of dollars well spent–you never know,” Sam Altman (@sama) wrote on April 16, 2025.
Why Are We So Nice to Robots?
A late 2024 survey found that 67% of Americans regularly say “please” and “thank you” to AI chatbots. But why be polite to a computer program? The reasons include:
- 55% of people believe being polite to AI is simply the right thing to do, like saying “thank you” to a store clerk
- 12% of users worry that AI might judge them negatively in the future (or even rebel!)
- Many think polite questions get better answers from the AI
- Most people just treat AI like they would treat humans, carrying over their everyday habits
“It feels weird to be rude to something that talks back to you,” explained one survey respondent. Others believe that being nice to AI helps it learn to be nicer in its responses. This human tendency to treat computers like people has been noticed by researchers for years.
How Your “Please” and “Thank You” Use Extra Energy
When you type words to an AI like ChatGPT, each word is broken into tokens – these are the small pieces of text (words or parts of words) that the AI processes. Adding polite phrases means more tokens to process, which requires more computing power.
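To see this concretely, here is a small Python sketch using tiktoken, OpenAI’s open-source tokenizer library. The encoding name and the example prompts are illustrative assumptions; different models split text into tokens slightly differently, but the pattern is the same: polite wording adds a handful of extra tokens to every request.

```python
# Illustrative sketch: count how many extra tokens polite phrasing adds.
# Uses tiktoken (OpenAI's open-source tokenizer). The "cl100k_base" encoding
# is an assumption for illustration; different models use different encodings.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

plain = "Summarize this article in three bullet points."
polite = "Hi! Could you please summarize this article in three bullet points? Thank you!"

plain_tokens = len(enc.encode(plain))
polite_tokens = len(enc.encode(polite))

print(f"Plain prompt:  {plain_tokens} tokens")
print(f"Polite prompt: {polite_tokens} tokens")
print(f"Extra tokens from politeness: {polite_tokens - plain_tokens}")
```

Each of those extra tokens means a little more work for the model, multiplied across every polite user, every day.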
This extra politeness increases the computational load – how hard the computer has to work – by about 0.3% per question. While that might sound small, it adds up to tens of millions of dollars in extra costs yearly for OpenAI.
Asking an AI to write a single email uses about 0.14 kilowatt-hours (kWh) of electricity. To understand what a kilowatt-hour means: it’s the energy needed to keep a 100-watt light bulb burning for 10 hours, or to run a 1,000-watt appliance (like a small microwave) for one hour. It’s the standard unit that appears on your electricity bill.
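If you want to check the arithmetic yourself, the short sketch below converts the figures above; the only input it takes from the article is the 0.14 kWh-per-email figure already cited.

```python
# Quick arithmetic check of the kilowatt-hour examples in the text.
# A kilowatt-hour is power (in watts) times time (in hours), divided by 1,000.

def kwh(watts: float, hours: float) -> float:
    return watts * hours / 1000

bulb = kwh(watts=100, hours=10)        # 100 W bulb for 10 hours -> 1.0 kWh
microwave = kwh(watts=1000, hours=1)   # 1,000 W appliance for 1 hour -> 1.0 kWh

email_kwh = 0.14                       # figure cited above for one AI-written email
bulb_hours = email_kwh / (100 / 1000)  # how long that energy runs a 100 W bulb

print(f"Light bulb example: {bulb} kWh")
print(f"Microwave example:  {microwave} kWh")
print(f"One AI-written email (~{email_kwh} kWh) runs a 100 W bulb for {bulb_hours:.1f} hours")
```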
The Growing Energy Appetite of AI
Data centers – the massive buildings filled with powerful computers that run AI systems – already use about 2% of all electricity in the world. In 2024, they consumed about 415 terawatt-hours (TWh) of energy. A terawatt-hour is a very large unit of electricity – one TWh could power about 10 crore (100 million) Indian homes for a day.
| Time Period | Data Center Energy Consumption | Percentage of Global Electricity |
|---|---|---|
| 2024 | 415 TWh | 1.5-2% |
| 2030 (Projected) | 945 TWh | 3-4% |
By 2030, experts predict that consumption will more than double, reaching about 945 terawatt-hours. That is roughly as much electricity as Japan, a country with 126 million people, uses in a year.
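To put those terawatt-hour figures in perspective, here is a rough conversion sketch. The assumption of about 10 kWh of daily use per home is implied by the comparison above (one TWh powering roughly 100 million homes for a day) and will vary widely from country to country.

```python
# Rough unit-conversion sketch for the data-center figures above.
# Assumption: an average home uses about 10 kWh per day (implied by the
# "1 TWh = ~100 million homes for a day" comparison; real usage varies widely).
KWH_PER_TWH = 1_000_000_000   # 1 TWh = one billion kWh
HOME_KWH_PER_DAY = 10

def home_days(twh: float) -> float:
    """How many homes this much energy could power for a single day."""
    return twh * KWH_PER_TWH / HOME_KWH_PER_DAY

print(f"1 TWh   -> {home_days(1):,.0f} homes for a day")    # ~100 million
print(f"415 TWh -> {home_days(415):,.0f} homes for a day")  # 2024 data-center total
print(f"945 TWh -> {home_days(945):,.0f} homes for a day")  # 2030 projection
```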
The Hidden Cost of Every AI Chat
A single question to ChatGPT uses 2.9 watt-hours of electricity – about 10 times more energy than a Google search. When you add polite words, you’re making each interaction slightly more energy-hungry.
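As a back-of-envelope illustration, the sketch below combines the 2.9 watt-hour figure with the roughly 0.3% politeness overhead mentioned earlier. The Google-search figure and the one-billion-queries-a-day volume are assumptions chosen for scale, not measured values.

```python
# Back-of-envelope sketch: energy per query and the extra cost of polite wording.
# The 2.9 Wh figure and the ~0.3% overhead come from the text; the Google figure
# and the daily query volume are illustrative assumptions.
CHATGPT_WH_PER_QUERY = 2.9
GOOGLE_WH_PER_SEARCH = 0.3       # assumed: roughly one tenth of a ChatGPT query
POLITENESS_OVERHEAD = 0.003      # the ~0.3% extra load mentioned earlier

queries_per_day = 1_000_000_000  # hypothetical one billion daily queries

base_kwh = queries_per_day * CHATGPT_WH_PER_QUERY / 1000
extra_kwh = base_kwh * POLITENESS_OVERHEAD

print(f"Energy for {queries_per_day:,} queries: {base_kwh:,.0f} kWh per day")
print(f"Extra energy from polite wording: {extra_kwh:,.0f} kWh per day")
print(f"One ChatGPT query ~= {CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_SEARCH:.0f} Google searches")
```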
The relationship between politeness and energy use is small but real: every extra word adds tokens for the model to process, and every token takes a little more computation. Being polite isn’t the biggest part of AI’s energy problem, but it does add to the technology’s overall environmental footprint.
Should We Stop Being Nice to AI?
No one is suggesting we should be rude to AI chatbots. However, understanding that every word has an energy cost might help us be more thoughtful about how we interact with technology.
As AI becomes more common in our daily lives, these small energy costs will grow. Between 2025 and 2030, the energy used by AI-focused data centers could quadruple, putting pressure on our electricity grids and the environment.
Finding the Balance
The challenge is finding a balance. Being mindful of unnecessary words in AI prompts might help save energy. But many experts believe that politeness helps create better AI systems that understand human social norms.
“We’re teaching these systems how humans interact,” said one AI researcher not involved in the study. “The energy cost might be worth it if it helps create more helpful and respectful AI.”
As AI continues to grow, the question of how we talk to our digital assistants will become more important – both for how well they serve us and for the planet we all share.