OpenAI CEO Sam Altman recently shared that users being polite to ChatGPT is costing the company “tens of millions of dollars” in processing power and electricity. When asked on social media to estimate the cost, Altman responded, “Tens of millions of dollars well spent — you never know.”
The revelation has sparked discussions about the real-world costs of AI interactions that many users take for granted. When people type phrases like “please” and “thank you” to ChatGPT, each of these inputs must be processed by the system, consuming computing resources and electricity.
A December 2024 survey by Future found that 67% of American users are polite to AI assistants. Of those, 55% do so because they feel it's the right thing to do, while another 12% are polite out of concern that mistreating AI could have future consequences.
Some users believe in treating AI models with courtesy as a moral practice. Engineer Carl Youngblood explained his perspective: “Treating AIs with courtesy is a moral imperative for me. I do it out of self-interest. Callousness in our daily interactions causes our interpersonal skills to atrophy.”
The Environmental Impact
The electricity consumption of AI systems like ChatGPT has become a topic of debate among researchers. A September 2023 paper by Digiconomist founder Alex de Vries claimed that a single ChatGPT query requires approximately three watt-hours of electricity.
However, data analyst Josh You of the AI research institute Epoch AI argues this figure is an overestimate. He puts the actual figure closer to 0.3 watt-hours, citing more efficient models and hardware than were available in 2023.
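To put the gap between the two estimates in perspective, here is a quick back-of-envelope calculation. The per-query energy figures are the two estimates cited above; the daily volume of courtesy-only messages is a made-up assumption for illustration, not a number from OpenAI:

```python
# Back-of-envelope: annual electricity consumed by courtesy-only queries.
# Per-query figures are the two published estimates; the query volume
# is a purely hypothetical assumption for illustration.

WH_HIGH = 3.0   # de Vries (September 2023) estimate, watt-hours per query
WH_LOW = 0.3    # Epoch AI (Josh You) estimate, watt-hours per query
QUERIES_PER_DAY = 10_000_000  # hypothetical "please"/"thank you" messages

def annual_kwh(wh_per_query: float, queries_per_day: int) -> float:
    """Convert per-query watt-hours into kilowatt-hours per year."""
    return wh_per_query * queries_per_day * 365 / 1000

print(f"High estimate: {annual_kwh(WH_HIGH, QUERIES_PER_DAY):,.0f} kWh/year")
print(f"Low estimate:  {annual_kwh(WH_LOW, QUERIES_PER_DAY):,.0f} kWh/year")
```

Even at the lower figure, courtesy traffic at that hypothetical scale would run to millions of kilowatt-hours a year, which is why the estimates matter.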
One report even suggested that a short three-word response like “You are welcome” from a large language model uses roughly 40-50 milliliters of water (likely in cooling systems for the servers).
Despite these costs, Altman has stated that the price of AI output has been falling tenfold every year as models become more efficient. This improvement helps offset the growing number of interactions as more users adopt the technology.
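A tenfold annual decline compounds dramatically. The sketch below simply applies the rate Altman cites to an arbitrary starting price (the $1.00 figure is illustrative, not a real OpenAI price):

```python
# Price of a fixed unit of AI output under a 10x-per-year cost decline.
# The $1.00 starting price is an arbitrary illustration.
price = 1.00
for year in range(4):
    print(f"Year {year}: ${price:.6f}")
    price /= 10  # tenfold cheaper each year
```

At that rate, output that costs a dollar today would cost a hundredth of a cent within four years.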
Some observers have wondered why ChatGPT doesn't simply filter out courtesy words to save on electricity. However, programming models to special-case common phrases could be more complex than it appears.
Financial Outlook
While spending millions on polite phrases might seem wasteful, OpenAI is focusing on long-term growth. The company expects to more than triple its revenue this year to $12.7 billion, despite increasing competition from rivals like China’s DeepSeek.
OpenAI does not anticipate becoming cash-flow positive until 2029, when it projects its revenue will exceed $125 billion. This long-term outlook suggests the company is prioritizing growth and user experience over immediate profitability.
Researchers at OpenAI and MIT have suggested that some people may become emotionally dependent or even addicted to AI chatbots. This behavior is expected to increase as AI conversations become more human-like.
For premium users billed on a per-token basis, phrases like "thank you" are simply part of the usage they have already paid for. Free users, by contrast, add to OpenAI's costs without paying directly.
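Under per-token billing, the marginal cost of a courtesy phrase is easy to estimate. Both numbers in the sketch below are hypothetical assumptions for illustration, not OpenAI's actual token counts or rates:

```python
# Marginal cost of a polite exchange under per-token billing.
# Both figures are illustrative assumptions, not real OpenAI pricing.
TOKENS = 4                      # rough token count for "Thank you!" and a short reply
USD_PER_MILLION_TOKENS = 2.00   # hypothetical blended per-token rate

cost = TOKENS * USD_PER_MILLION_TOKENS / 1_000_000
print(f"~${cost:.8f} per exchange")
```

Individually the cost is a tiny fraction of a cent; the "tens of millions of dollars" only emerges when multiplied across hundreds of millions of users.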
The discussion around the cost of politeness highlights the complex economics of operating large AI systems. While these courtesy phrases come with a price tag, they also help create a more natural interaction between humans and AI.
As Altman’s comment suggests, OpenAI views these expenses as an investment in user experience rather than wasteful spending. The future of AI-human interaction may well be shaped by these seemingly small but costly exchanges.
Source: https://blockonomi.com/your-please-to-chatgpt-is-more-expensive-than-you-think/