What is AI really costing the planet?
Briefly

"In 2023, data centres consumed 4.4% of all US electricity, a figure that could triple by 2028. The recent surge in AI workloads is something categorically different."
"A 2021 paper from Google and UC Berkeley estimated that training GPT-3 alone consumed around 1,287 megawatt hours of electricity, enough to power roughly 120 average American homes for a year."
"Deploying models in real-world applications and fine-tuning them for better performance draws large amounts of energy long after the original development phase is complete."
"A ChatGPT text search was estimated to use nearly ten times as much electricity as a standard Google search, multiplying the impact by billions of queries per day."
AI interactions, such as chatbots and image generation, rely on data centers that consume substantial electricity. In 2023, these centers accounted for 4.4% of US electricity usage, a figure projected to triple by 2028. Training models like GPT-3 requires immense energy, with estimates putting it at 1,287 megawatt-hours, enough to power roughly 120 average American homes for a year. The environmental impact extends beyond initial training, as deploying and fine-tuning models also demands significant energy, highlighting the hidden costs of AI technology.
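The "120 homes for a year" comparison can be checked with a quick back-of-the-envelope calculation. A minimal sketch, assuming an average US household uses roughly 10,700 kWh per year (a figure in line with EIA estimates, not stated in the article):

```python
# Sanity check of the article's figure: 1,287 MWh of GPT-3 training energy
# versus average US household consumption.
TRAINING_KWH = 1_287 * 1_000       # 1,287 megawatt-hours expressed in kWh
AVG_HOME_KWH_PER_YEAR = 10_700     # assumed average US household usage (EIA ballpark)

homes_powered = TRAINING_KWH / AVG_HOME_KWH_PER_YEAR
print(f"{homes_powered:.0f} homes for a year")  # roughly 120
```

The result lands close to 120 homes, consistent with the estimate quoted above; the exact figure shifts with the household-consumption assumption.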
Read at Medium