MIT Report Reveals Shocking Energy Costs Behind AI Queries and Video Generation

Published on 12 March 2026

A comprehensive investigation by the MIT Technology Review has provided new clarity on the energy demands of the artificial intelligence industry, moving beyond general statistics to reveal the specific power costs of generative models. The report indicates that the environmental footprint of AI is substantial and escalating, particularly as the technology integrates into everyday services.


Quantifying the Cost of Intelligence


A popular analogy holds that a single ChatGPT query consumes a bottle of water, but such one-size-fits-all comparisons mask how widely the energy cost varies with a model's complexity. Researchers found that generating a text response can require anywhere from 114 to 6,706 joules. To visualize this, the lower end is comparable to running a microwave for one-tenth of a second, while the upper end equates to running the same appliance for about eight seconds.
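The microwave comparisons are simple energy-to-runtime conversions: joules divided by the appliance's power draw in watts gives seconds. A minimal sketch; the 1,000 W rating is our assumption, since the report's round figures do not pin down an exact wattage:

```python
# Convert an energy cost in joules into equivalent microwave runtime.
# The 1,000 W power draw is an illustrative assumption, not a figure
# from the MIT Technology Review report.
MICROWAVE_WATTS = 1000

def microwave_seconds(joules: float) -> float:
    """Seconds a microwave would need to run to use the given energy."""
    return joules / MICROWAVE_WATTS

# The report's quoted range for a single text response:
print(microwave_seconds(114))    # cheapest model, roughly a tenth of a second
print(microwave_seconds(6706))   # heaviest model, several seconds
```

At this assumed wattage the upper end works out to just under seven seconds; the report's eight-second comparison implies a slightly lower-powered appliance.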


The study notes a clear trade-off between efficiency and capability. Models that consume less energy typically rely on fewer parameters, resulting in reduced accuracy. Conversely, more resource-intensive models offer better results but demand significantly more power.


The High Price of Video Generation


The energy requirements increase drastically when moving from text to video content. According to the report, creating a five-second video using a newer AI model consumes approximately 3.4 million joules. This figure is roughly 700 times the energy needed to produce a high-quality image and is equivalent to running a microwave for over an hour.
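The hour-plus comparison follows directly from the joule count. A quick back-of-envelope check, again assuming a roughly 800-watt microwave (the wattage is our assumption):

```python
VIDEO_JOULES = 3_400_000   # energy for one five-second AI video, per the report
MICROWAVE_WATTS = 800      # assumed power draw of a typical microwave

runtime_min = VIDEO_JOULES / MICROWAVE_WATTS / 60
print(f"{runtime_min:.0f} min")   # prints "71 min", i.e. just over an hour
```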


To demonstrate the cumulative impact of daily usage, researchers calculated the energy cost of a hypothetical session involving 15 text questions, 10 image generations, and three five-second videos. This specific workload would consume about 2.9 kilowatt-hours of electricity—comparable to running a microwave continuously for three and a half hours.
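The session total can be reconstructed from the figures above. The per-video cost is the report's 3.4 million joules; the per-text and per-image costs below are our assumptions, taking the top of the quoted text range and deriving the image cost from the stated 700-fold ratio:

```python
# Energy budget for the hypothetical session: 15 text questions,
# 10 image generations, and three five-second videos.
TEXT_J = 6_706            # assumed: top of the quoted per-response text range
VIDEO_J = 3_400_000       # per the report, one five-second video
IMAGE_J = VIDEO_J / 700   # derived from the stated 700x ratio, about 4,860 J

session_j = 15 * TEXT_J + 10 * IMAGE_J + 3 * VIDEO_J
session_kwh = session_j / 3_600_000   # 1 kWh = 3.6 million joules

print(f"{session_kwh:.2f} kWh")   # prints "2.87 kWh", near the report's 2.9
```

Under these assumptions the three videos account for almost the entire total, which is why the report singles out video generation as the dominant cost.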


Data Center Strain and Future Projections


The surge in AI adoption has ended a long plateau in data center energy usage. Prior to the current AI boom, electricity consumption in US data centers had remained roughly flat, as efficiency gains offset growing demand. Since 2017, however, the report confirms that usage has doubled.


Looking ahead, government data suggests that by 2028, AI tools will account for half of all electricity used by data centers. This comes at a time when tech giants like Google are aggressively integrating AI into core products such as Search, Gmail, and Docs. As generative AI becomes ubiquitous across various sectors, the findings underscore the urgent need to address the growing energy footprint of digital infrastructure.
