
    GPT-5 drives AI resource surge, but data centers show surprising efficiency

    AI is everywhere now, with topics like GPT-5 and Grok’s recent open-source announcement dominating headlines. One of the primary (and most valid) criticisms of the technology is how energy-hungry it is; some studies suggest GPT-5 burns through enough electricity to power 1.5 million homes every day, to say nothing of the water it takes to keep those data centers cool.

    But a new report indicates there could be another option, one that doesn’t require scorching the planet in order to use AI technology. The UK Environment Agency and techUK worked together to survey 73 data centers around England, and the results are both surprising and encouraging.

    According to techUK, 51 percent of the data centers surveyed used waterless cooling systems, and 64 percent used less than 10,000 m³ of water per year. For reference, that’s less than the average recreation center consumes. And 89 percent of the data centers monitored their water usage in order to find more climate-conscious practices.


    The findings are contrary to the commonly held belief that AI data centers burn through tremendous amounts of water. Richard Thompson, Deputy Director for Water Resources at the UK Environment Agency, said “I am encouraged by the work techUK have undertaken to better understand water usage – the findings suggest UK data centres are utilising a range of cooling technologies and becoming more water conscious. Advancements in technology must go hand-in-hand with protecting public water supplies, food security and the environment. It is vital the sector puts sustainability at its heart, and minimises water use in line with evolving standards. We are working with industry and other regulators to raise these to secure the best outcomes for our environment and our water supply for future generations.”

    There’s still the issue of power

    While it’s encouraging to see that data centers use less water than previously thought, no one can deny that they consume an absurd amount of power. Although the situation has improved since 2023, the planet is still facing an energy crisis, and the ever-growing need for power for artificial intelligence is throwing gasoline on an already out-of-control flame.

    The problem is that the specific amount of energy required isn’t clear. A 2024 study from Berkeley Lab shows an increase in energy usage that corresponds to the rise in popularity of AI assistants, but exact numbers are hard to pinpoint because so few AI companies publicize their usage data.


    According to OpenAI CEO Sam Altman, the average ChatGPT query uses about 0.34 watt-hours of energy, roughly what an LED lightbulb consumes in two minutes. That might not seem like much, but it adds up quickly when you consider the millions of queries that pass through the service each day.
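    The bulb comparison is easy to sanity-check. Here's a minimal back-of-the-envelope sketch, assuming a typical ~10 W LED bulb and an illustrative (not reported) volume of 100 million queries per day:

```python
# Back-of-the-envelope check of the per-query figure quoted above.
# Assumptions (not from the article): an LED bulb draws ~10 W, and a
# hypothetical load of 100 million queries per day for illustration.
PER_QUERY_WH = 0.34       # Altman's figure, watt-hours per query
LED_WATTS = 10            # assumed LED bulb power draw

# Energy (Wh) / power (W) = hours; convert to minutes.
minutes_of_led = PER_QUERY_WH / LED_WATTS * 60
print(f"One query ≈ an LED bulb running for {minutes_of_led:.1f} minutes")

QUERIES_PER_DAY = 100_000_000  # assumed volume, for scale only
daily_mwh = PER_QUERY_WH * QUERIES_PER_DAY / 1_000_000  # Wh → MWh
print(f"At {QUERIES_PER_DAY:,} queries/day: {daily_mwh:,.0f} MWh per day")
```

    At those assumed numbers, the per-query figure works out to about two minutes of LED runtime, and the aggregate to tens of megawatt-hours a day.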

    Following that, Google released its own usage numbers: an average of 0.10 watt-hours of energy and 0.12 mL of water per query. However, the search giant warns that these figures “substantially” underestimate Gemini’s overall footprint.

    As for Grok, it isn’t clear how much energy the platform uses, though the chatbot itself has suggested it might consume 1 to 2 watt-hours per query.

    AI energy requirements are dropping as the technology matures. Google reported that “over a recent 12-month period, the energy and total carbon footprint of the median Gemini Apps text prompt dropped by 33x and 44x, respectively, all while delivering higher quality responses.” (As reported by ZDNet.)
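    That reported reduction also implies something about where per-prompt energy stood a year earlier. A rough sketch, assuming (and this is an assumption, since the figures come from separate disclosures) that the 0.10 watt-hour number above is today’s median Gemini prompt:

```python
# Rough implication of Google's reported 33x energy drop.
# Assumption: the 0.10 Wh/query figure quoted earlier in the article
# is today's median Gemini text prompt.
today_wh = 0.10
energy_drop_factor = 33            # Google's reported reduction
a_year_ago_wh = today_wh * energy_drop_factor
print(f"Implied median a year earlier: ~{a_year_ago_wh:.1f} Wh/query")
```

    Under that assumption, the median prompt a year earlier would have cost around 3.3 watt-hours, nearly ten times Altman’s figure for ChatGPT today.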

    AI might be the future, but it can’t cost the planet

    The environmental concerns surrounding artificial intelligence haven’t gone unnoticed. Google recently proposed a “full-stack” sustainability solution that would address the issue from multiple angles. The company promised to scale back usage during peak hours to prevent blackouts, as well as to look into maximizing performance and implementing techniques like speculative decoding that reduce the workload and, hence, the energy demand.


    However, Google is only one company, and even though its numbers are improving, the company’s overall energy usage has doubled over the past four years. A 2023 study showed that nearly 30 percent of Americans use artificial intelligence on a daily basis (a number that has surely grown since then), while a recent Reuters/Ipsos poll shows that 61 percent are concerned about the energy costs.

    These numbers make it clear that companies like OpenAI and Google have to find solutions that reduce the impact of artificial intelligence. The technology is new, and some waste is expected with any cutting-edge tech — but it has been around for long enough now that the effect data centers have on local communities and the planet at large is clear.

    When computers first entered the scene, the average PC was the size of an entire room. Now they fit in our pockets. AI tech will need to follow that same pattern, but on a larger scale and with a focus on the resources it demands. I don’t see any chance of the world putting this genie back in its bottle, but at a time when we’re already experiencing a climate crisis, we have to be more ethical and considerate with our approach.

    The discovery that some data centers are using far less water than expected is encouraging, but it’s just a small part of a much larger endeavor.


