Claim: "AI wastes water and is harmful to the environment."
Verdict: Partially true, but the claim oversimplifies a complex, evolving picture.
There’s no denying that AI systems, particularly the large-scale data centers that power them, consume substantial amounts of water, primarily to cool server hardware. According to the UN Environment Programme, “AI-related infrastructure may soon consume six times more water than Denmark... when a quarter of humanity already lacks access to clean water and sanitation” (UNEP, 2024). The sheer volume of water use raises concern, but calling it “waste” glosses over a more complex reality.
Water use is not inherently “wasteful”; whether it becomes waste depends on context. The answer hinges on factors like regulatory oversight, the ability to recycle or replenish water, and where the data centers are located. In some regions, “training... a single AI model... can lead to the evaporation of an astonishing amount of fresh water” (Harvard Business Review, 2024), making the environmental impact deeply local. Siting AI facilities in drought-prone regions, for example, can exacerbate water scarcity and disproportionately affect vulnerable communities.
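For a sense of how such estimates are typically built, water attributed to a training run is often approximated from its electricity use and the facility’s Water Usage Effectiveness (WUE, litres per kWh), plus the water consumed upstream in power generation. The sketch below is purely illustrative; the energy and WUE figures are placeholder assumptions, not measurements from any source cited here.

```python
# Illustrative back-of-envelope estimate of a training run's water footprint.
# All numeric inputs are placeholder assumptions for demonstration only,
# not figures taken from the cited sources.

def training_water_footprint(energy_kwh: float,
                             onsite_wue_l_per_kwh: float,
                             grid_water_l_per_kwh: float) -> float:
    """Return estimated litres of water consumed.

    energy_kwh            -- electricity consumed by the training run
    onsite_wue_l_per_kwh  -- litres evaporated per kWh for on-site cooling (WUE)
    grid_water_l_per_kwh  -- litres consumed per kWh by electricity generation
    """
    return energy_kwh * (onsite_wue_l_per_kwh + grid_water_l_per_kwh)


# Hypothetical inputs: a 1 GWh training run, 1.8 L/kWh for on-site cooling,
# 3.0 L/kWh of upstream water use for power generation.
litres = training_water_footprint(1_000_000, 1.8, 3.0)
print(f"Estimated water footprint: {litres / 1000:.0f} m³")
```

The point of the exercise is not the exact number but the structure: the same workload run in a facility with a lower WUE, or on a less water-intensive grid, carries a very different local footprint.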
It is also worth noting that many companies are actively pursuing “water positive” goals by 2030, and that distributing computational loads more equitably across global data centers can mitigate local harm (Harvard Business Review, 2024).
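To picture the load-shifting idea concretely, imagine a scheduler that routes flexible batch jobs to the data center region with spare capacity and the lowest water stress. The sketch below is a minimal illustration; the region names, stress scores, and capacities are invented, and a real scheduler would draw on live water-stress and utilization data.

```python
# Minimal sketch of water-aware job placement. Regions, stress scores,
# and capacities are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Region:
    name: str
    water_stress: float   # 0.0 (low stress) .. 1.0 (severe stress)
    free_capacity: int    # spare job slots

def place_job(regions: list[Region]) -> Region:
    """Pick the region with spare capacity and the lowest water stress."""
    candidates = [r for r in regions if r.free_capacity > 0]
    if not candidates:
        raise RuntimeError("no spare capacity anywhere; job must wait")
    return min(candidates, key=lambda r: r.water_stress)

regions = [
    Region("arid-west", water_stress=0.9, free_capacity=120),
    Region("temperate-north", water_stress=0.2, free_capacity=40),
    Region("coastal-east", water_stress=0.5, free_capacity=0),
]

chosen = place_job(regions)
print(f"Routing batch job to {chosen.name}")  # -> temperate-north
```

The design choice here mirrors the argument in the text: the same computation is not equally harmful everywhere, so where it runs is itself an environmental decision.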
As for broader environmental impact, the energy footprint is significant: Ireland’s Central Statistics Office reported a 400% rise in data center electricity usage from 2015 to 2022. Yet regulation hasn’t kept pace with either footprint, and “water usage receives even less regulatory attention” than emissions (MIT News, 2024).
In sum, AI's environmental toll is real, but whether it’s wasteful or net-harmful is still a matter of governance, technology design, and where we choose to draw the line.
Sources:
- UNEP, 2024
- Harvard Business Review, 2024
- MIT News, 2024