Training and running large language models like ChatGPT consumes significant energy and water resources.
Key Data Points
- Training GPT-3 consumed ~1,287 MWh of electricity
- Each ChatGPT query is estimated to use roughly 10x the energy of a Google search
- Data centers rely on large volumes of water for cooling
- OpenAI and Microsoft have committed to renewable energy goals
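To put the training figure above in perspective, a quick back-of-envelope conversion is possible. The household consumption figure used here is an assumption (a common US ballpark of ~10,500 kWh per year), not from the source:

```python
# Back-of-envelope scale check for the GPT-3 training figure above.
TRAINING_MWH = 1_287              # GPT-3 training estimate cited in the text
HOUSEHOLD_KWH_PER_YEAR = 10_500   # assumed average annual US household use

training_kwh = TRAINING_MWH * 1_000                     # MWh -> kWh
household_years = training_kwh / HOUSEHOLD_KWH_PER_YEAR

print(f"{training_kwh:,.0f} kWh is about {household_years:.0f} "
      "household-years of electricity")
```

Under that assumption, training GPT-3 used on the order of 120+ household-years of electricity, which helps explain why per-model energy accounting has become a policy question.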
As AI scales, its environmental footprint is a growing area of policy and research concern.