How Kind Words Are Racking Up Electricity Bills
It turns out that our online manners might be costing more than we think, especially if you’re someone who says “please” and “thank you” to ChatGPT. Recently, OpenAI CEO Sam Altman revealed that these tiny courtesies are adding up to tens of millions of dollars in energy costs.
And yes, he’s not playing around. This is serious business.
It All Started with a Tweet
This revelation came after a curious user on X asked how much money OpenAI has lost in electricity costs from people saying “please” and “thank you” to its models. Altman replied that it was “tens of millions of dollars well spent,” adding, “you never know.”
Why Politeness Comes at a Price
Let me explain it in simple terms: every time you send a message to ChatGPT, even a tiny one such as “thanks,” it activates a large language model (LLM) running in powerful data centers. These data centers rely on thousands of high-performance GPUs, which require massive amounts of electricity to operate and stay cool.
Even small, simple phrases trigger a fresh round of inference, with the model generating a response in real time. Multiply that by millions of users around the globe, and the cost of being polite becomes surprisingly high.
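To make that concrete, here is a minimal sketch using OpenAI’s official Python SDK (the model name and the one-word “thanks!” prompt are illustrative choices, not anything OpenAI prescribes). Even this tiny message is a complete request that runs inference on data-center hardware and consumes tokens:

```python
# pip install openai
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Even a one-word pleasantry is a full chat-completion request: the text is
# tokenized, a forward pass runs on data-center GPUs, and a reply is
# generated token by token.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[{"role": "user", "content": "thanks!"}],
)

print(response.choices[0].message.content)
# The usage stats show that even this exchange consumed real compute.
print(response.usage)  # prompt_tokens, completion_tokens, total_tokens
```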
While OpenAI grapples with the energy cost of every polite message, some companies have been working on solutions. Firmus, for instance, has been making headlines by revolutionizing AI data centers with a focus on energy efficiency.
“Thank You” Isn’t Free
According to a Goldman Sachs Research report, a single ChatGPT query consumes about 2.9 watt-hours of electricity, nearly ten times more than a standard Google search. With ChatGPT handling over 1 billion queries per day, OpenAI burns through an estimated 2.9 million kilowatt-hours daily! That’s enough energy to power thousands of homes.
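A quick back-of-envelope calculation shows how those figures compound. In the sketch below, the per-query energy comes from the Goldman Sachs figure, but the electricity price and the share of purely polite queries are assumptions picked for illustration:

```python
# Back-of-envelope: scale the per-query figure up to daily and yearly totals.
WH_PER_QUERY = 2.9      # watt-hours per ChatGPT query (Goldman Sachs Research)
QUERIES_PER_DAY = 1e9   # roughly 1 billion queries per day

daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1000
print(f"{daily_kwh:,.0f} kWh per day")  # 2,900,000 kWh, matching the estimate above

# The two values below are illustrative assumptions, not reported figures.
USD_PER_KWH = 0.10      # assumed electricity price
POLITE_FRACTION = 0.10  # assumed share of queries that are pure pleasantries

polite_cost_per_year = daily_kwh * POLITE_FRACTION * USD_PER_KWH * 365
print(f"~${polite_cost_per_year:,.0f} per year on politeness alone")
# ~$10,585,000: tweak the assumptions slightly and you land in the
# "tens of millions of dollars" range Altman quoted.
```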
Goldman Sachs also predicts that by 2028, AI will account for around 19% of global data center power demand. It is a staggering figure that shows how resource-heavy AI models have already become, and it hints at what is yet to come.
Sam Altman’s Longstanding Warnings
At the World Economic Forum in Davos in January 2024, Sam Altman made it clear: AI is going to need a lot more energy. He emphasized that the next wave of generative AI will use far more power than most people realize, and he warned that our current energy systems just won’t be able to keep up, as reported by The Guardian.
To make AI’s future possible, he said, we need real breakthroughs in energy, such as nuclear fusion. Without them, scaling AI could hit a wall.
Why This Actually Matters
This might seem like just another funny “tech moment,” but it’s actually a glimpse into a much bigger picture:
AI runs on real energy. Real resources. Real costs.
Moreover, as AI becomes more ingrained in our lives, the smallest things, like how we speak to it, will start to matter at scale (in fact, they already do).
So… Should We Stop Saying “Thank You”?
No! Be kind. Be human. But at the same time, just be aware that every little interaction with AI is part of a much bigger ecosystem.
Next time you thank ChatGPT, know that it costs OpenAI, and the planet, a little something. But also know that being polite never really goes out of style.
Even if the servers feel it.