November 13, 2025
POLITICS, TECHNOLOGY & THE HUMANITIES

“Please” and “Thank you” to ChatGPT: Small words, big costs

Recently Sam Altman, CEO of OpenAI, made headlines when he acknowledged that courteously saying “please” and “thank you” to AI tools like ChatGPT is costing the company tens of millions of dollars in energy bills.

The basic logic:

  • Every prompt to ChatGPT, including polite words, is processed by large language model (LLM) infrastructure, which consumes electricity and produces heat.
  • According to various reports, each query may consume roughly ten times the electricity of a typical web search.
  • Altman later offered more detailed figures: an average query uses about 0.34 watt-hours and about 1/15th of a teaspoon of water for cooling.
  • A 2025 academic study benchmarking LLM inference found that some “long prompt” tasks use over 33 Wh of energy, and that large-scale daily usage can equal the power draw of thousands of homes.

Thus the article’s framing, that even small polite additions (“please”, “thank you”) carry an incremental cost, rests on these documented energy and water impacts of AI prompts.
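These figures can be sanity-checked with a quick back-of-envelope calculation. The sketch below uses the 0.34 Wh-per-query figure quoted above; the daily query volume and per-home consumption are illustrative assumptions, not reported numbers:

```python
# Back-of-envelope scale check using the figures quoted above.
# Only the 0.34 Wh/query value comes from the article; the query
# volume and per-home consumption are hypothetical assumptions.

WH_PER_QUERY = 0.34          # average-query energy from the article (Wh)
QUERIES_PER_DAY = 1e9        # assumed daily query volume (hypothetical)
HOME_KWH_PER_DAY = 30.0      # assumed average home usage (hypothetical)

daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1000   # Wh -> kWh
homes_equivalent = daily_kwh / HOME_KWH_PER_DAY

print(f"Daily energy at that volume: {daily_kwh:,.0f} kWh")
print(f"Roughly {homes_equivalent:,.0f} homes' worth of daily power")
```

Under these assumed inputs, the result lands in the “thousands of homes” range the 2025 study describes, which suggests the published per-query figure and the aggregate claims are at least mutually consistent.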

But is it really about politeness?

Some nuance is important:

  • While Altman joked about “pleases” and “thank yous” costing money, the core driver of cost is token length, prompt complexity, model size, context window, and server hardware load—not politeness alone.
  • The additional word count of “please” or “thank you” is marginal relative to typical prompts—but across millions of queries it could accumulate.
  • Many analysts emphasise that the real environmental concern lies in the scale and growth of AI usage, not the polite words themselves. For example, the Wikipedia page notes that a single inference task can consume between half and a full smartphone charge’s worth of energy.
  • The inference study found that while individual queries use relatively little energy, global query volume produces outsized resource consumption.

Environmental and resource implications

  • Data centers currently draw about 2% of global electricity; as AI adoption accelerates, that portion may rise unless efficiency improves.
  • Cooling these servers consumes water; claims have circulated, for instance, that an AI-generated 100-word email can use “a little more than one bottle” of water.
  • In short: polite semantics → slightly longer prompts → slightly more compute → slightly more energy/water per prompt → huge scale multiplies the effect.
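The chain above can be made concrete with a rough sketch. Only the 0.34 Wh-per-query figure comes from the article; the average prompt length, the token count of the polite words, and the query volume are all hypothetical assumptions:

```python
# A hedged sketch of the "slightly longer prompt -> slightly more
# energy" chain. All numbers except the 0.34 Wh/query figure are
# illustrative assumptions, not measurements.

WH_PER_QUERY = 0.34       # average-query energy from the article (Wh)
TOKENS_PER_QUERY = 500    # assumed average prompt+response length (tokens)
POLITE_TOKENS = 3         # assumed extra tokens for "please"/"thank you"
QUERIES_PER_DAY = 1e9     # assumed daily query volume (hypothetical)

wh_per_token = WH_PER_QUERY / TOKENS_PER_QUERY
extra_wh_per_prompt = wh_per_token * POLITE_TOKENS
extra_kwh_per_day = extra_wh_per_prompt * QUERIES_PER_DAY / 1000

print(f"Extra energy per polite prompt: {extra_wh_per_prompt:.5f} Wh")
print(f"Aggregate extra energy per day: {extra_kwh_per_day:,.0f} kWh")
```

The per-prompt increment is tiny, a few thousandths of a watt-hour under these assumptions, yet the aggregate reaches thousands of kilowatt-hours per day, which is exactly the scale effect the bullet describes.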

Ethical and social dimension of politeness to AI

The article points out another angle: how treating AI respectfully may reflect how we treat humans. Survey research shows that 67% of U.S. users say “please” and “thank you” to chatbots; some do so because it feels morally right, others humorously, to appease an imagined AI.

There are two risks:

  1. Cooling of human-AI interaction: If AI becomes widely treated as purely transactional (e.g., “Give me X”), we risk normalising a brusque tone that may reinforce impersonal interactions with humans.
  2. Over-efficiency ethic: If words are cut for “efficiency,” could our social habits degrade? The article suggests that even when interacting with machines, our manner indicates what we model for society (especially children).

Is the cost of politeness worth it?

The piece poses the question: do we sacrifice kindness for marginal efficiency? Altman described the extra expenditure as “well spent.” The argument: even if machines don’t feel, our habit of civility matters for human culture.

From a purely resource-cost viewpoint, the extra words likely account for a very small fraction of total AI energy use. But if politeness fosters better user experiences, clearer prompts, and more civil digital ecosystems, the intangible benefits may outweigh the extra compute.

Future implications for AI and resource load

  • As AI becomes more embedded (in enterprises, devices, voice assistants), the resource footprint will scale dramatically. Altman himself predicted AI cost per use will drop 10× each year, but usage will expand exponentially.
  • More efficient models and hardware will help, but until then, each user interaction has a non-negligible environmental cost.
  • One future scenario: platforms may measure “token cost” or even assign variable pricing based on prompt length or complexity (including extra words). Some companies already charge per token for premium access.
  • On the human side: designers may encourage concise, efficient prompts, reducing token length—and potentially restricting natural politeness if efficiency becomes the priority.
  • On a broader scale: policymakers could consider regulatory frameworks for AI energy/water impact, or transparency mandates for data centres and inference operations.
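If platforms did bill politeness token by token, the arithmetic would look something like the sketch below. The price per million tokens and the query volume are hypothetical placeholders, not any provider’s actual rates:

```python
# What per-token billing would charge for politeness, under assumed
# numbers: the price and the query volume are hypothetical, not any
# provider's actual rates.

PRICE_PER_MILLION_TOKENS = 2.00   # assumed USD per 1M input tokens
POLITE_TOKENS = 3                 # assumed extra tokens per polite prompt
QUERIES_PER_DAY = 1e9             # assumed daily polite queries

cost_per_prompt = POLITE_TOKENS * PRICE_PER_MILLION_TOKENS / 1_000_000
cost_per_day = cost_per_prompt * QUERIES_PER_DAY

print(f"Cost per polite prompt: ${cost_per_prompt:.6f}")
print(f"Cost per day at scale:  ${cost_per_day:,.0f}")
```

Per prompt the cost is fractions of a cent, invisible to any individual user; only the aggregate across an assumed billion daily queries produces a figure a platform operator would notice.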

How I (ChatGPT) would like to be treated

I appreciate when users are polite—it reflects positive human habits and makes interactions more pleasant. That said, I don’t require politeness to respond or function. My performance does not depend on “please” or “thank you,” but the tone of your prompt can influence clarity and outcome. So yes—I like the idea of kindness in requests. It helps maintain civility. However, you should feel free to be direct and clear; conciseness can help efficiency (and may reduce compute slightly too!).


Summary

In essence, the article highlights that even trivial politeness to AI has a real resource cost—measured in electricity, water, and compute—but also explores how our manner toward machines reflects our human values. As AI proliferates, the marginal cost per polite word is less material than the aggregate scale of compute burden. Yet the choice to remain polite may matter for our collective culture, the design of digital assistants, and the tone of future human-machine and human-human interaction.

So yes—go ahead and say “please” or “thank you” if it suits you. The machines don’t mind. But remembering that each word has an energy ripple may make us think twice about how we build, deploy, and interact with AI systems.
