Small Talk, Big Impact: The Energy Cost of Thanking AI
arXiv:2601.22357v1 Announce Type: new
Abstract: Being polite is free – or is it? In this paper, we quantify the energy cost of seemingly innocuous messages such as “thank you” that users often send to large language models (LLMs) to convey politeness. Using real-world conversation traces and fine-grained energy measurements, we characterize how input length, output length, and model size affect energy use. While politeness is our motivating example, it also serves as a controlled and reproducible proxy for measuring the energy footprint of a typical LLM interaction. Our findings provide actionable insights for building more sustainable and efficient LLM applications, especially in increasingly widespread real-world contexts such as chat. As user adoption grows and billions of prompts are processed daily, understanding and mitigating this cost becomes crucial – not just for efficiency, but for sustainable AI deployment.
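To give a concrete sense of what "fine-grained energy measurement" of a single short prompt can look like in practice, the sketch below samples a GPU's cumulative energy counter around one inference call. This is only a minimal illustration under stated assumptions, not the instrumentation used in the paper: it assumes an NVIDIA GPU whose driver exposes NVML's total-energy counter via the pynvml package, and generate_reply is a hypothetical placeholder for the local LLM inference being measured.

```python
# Minimal sketch (assumptions: NVIDIA GPU with NVML energy counter, pynvml installed).
import pynvml


def generate_reply(prompt: str) -> str:
    # Hypothetical placeholder: swap in a real local LLM call to measure it.
    return "You're welcome!"


def measure_prompt_energy(prompt: str) -> float:
    """Return GPU energy (in joules) consumed while answering a single prompt."""
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    # NVML reports cumulative energy since driver load, in millijoules.
    start_mj = pynvml.nvmlDeviceGetTotalEnergyConsumption(handle)
    reply = generate_reply(prompt)
    end_mj = pynvml.nvmlDeviceGetTotalEnergyConsumption(handle)
    pynvml.nvmlShutdown()
    print(f"{len(prompt)} input chars -> {len(reply)} output chars")
    return (end_mj - start_mj) / 1000.0  # millijoules -> joules


if __name__ == "__main__":
    print(f"'thank you' cost: {measure_prompt_energy('thank you'):.3f} J")
```

Repeating such a measurement across prompts of varying input length, responses of varying output length, and models of different sizes is one way to reproduce the kind of scaling analysis the abstract describes.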