Saying 'Please' to ChatGPT is Costing OpenAI Millions of Dollars

Every ChatGPT prompt triggers thousands of calculations, consuming significant energy and water

Being courteous to AI might come at a price—literally. OpenAI CEO Sam Altman recently weighed in on a humorous observation shared by an X (formerly Twitter) user, who questioned the potential cost of users saying “please” and “thank you” to ChatGPT.

User @tomieinlove posted, “I wonder how much money OpenAI has lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models.” The post quickly gained attention, sparking an online debate about the real-world impact of human-AI etiquette.

Altman responded with a tongue-in-cheek remark:
“Tens of millions of dollars well spent—you never know.”

His light-hearted comment drew laughs but also highlighted an interesting intersection of AI, user behavior, and operational cost.

Behind the joke is real overhead: every ChatGPT prompt triggers thousands of calculations, consuming significant amounts of electricity and water. Generating a 100-word email uses enough power to run 14 LED bulbs for an hour, along with more than 500 ml of water for cooling. Business Energy UK combined these per-prompt figures with usage data to estimate ChatGPT's total daily, monthly, and yearly environmental impact.
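
For a rough sense of how such estimates are put together, the sketch below scales the per-prompt figures above by an assumed daily query volume. The query count, LED bulb wattage, electricity price, and politeness overhead are illustrative assumptions, not reported figures.

```python
# Back-of-envelope scaling of the per-prompt figures cited above.
# Values marked "assumed" are illustrative placeholders, not reported data.

# From the article: a 100-word reply uses enough electricity to run
# 14 LED bulbs for an hour, plus over 500 ml of water for cooling.
LED_BULB_WATTS = 10                            # assumed typical LED bulb wattage
KWH_PER_REPLY = 14 * LED_BULB_WATTS / 1000.0   # ~0.14 kWh per 100-word reply
LITRES_WATER_PER_REPLY = 0.5

DAILY_QUERIES = 100_000_000        # assumed query volume; actual traffic is not public
PRICE_PER_KWH_USD = 0.10           # assumed blended electricity price

daily_kwh = DAILY_QUERIES * KWH_PER_REPLY
daily_water_litres = DAILY_QUERIES * LITRES_WATER_PER_REPLY
daily_power_cost = daily_kwh * PRICE_PER_KWH_USD

print(f"Daily electricity: {daily_kwh:,.0f} kWh (~${daily_power_cost:,.0f})")
print(f"Daily cooling water: {daily_water_litres:,.0f} litres")
print(f"Yearly electricity: {daily_kwh * 365:,.0f} kWh")

# Politeness overhead: a few extra tokens per prompt. Both numbers below
# are assumptions, included only to show how such a cost would be estimated.
POLITE_FRACTION = 0.5              # assumed share of prompts with "please"/"thank you"
POLITE_OVERHEAD = 0.01             # assumed extra compute per polite prompt (1%)
polite_cost = daily_power_cost * POLITE_FRACTION * POLITE_OVERHEAD
print(f"Extra daily electricity cost from politeness (assumed): ${polite_cost:,.0f}")
```

Changing any of the assumed inputs scales the results proportionally, which is why published estimates of ChatGPT's footprint vary so widely.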

In 2023, reports estimated that OpenAI was spending around $700,000 per day to keep ChatGPT—then powered by GPT-3.5—operational.