Net-zero emission goals went out the window with AI.

  • I’m not 100% sold on these numbers. The Verge has a breakdown of energy usage for generation and training, and you could argue that demand is what’s responsible for the training.

    I would also argue that energy usage is directly related to water usage. Unless there is passive cooling unrelated to the energy generation, evaporation should be directly related to the energy cost.

    I didn’t collect sources while I was putting this together, but I found that it takes about 0.3 kWh to generate an AI image - about the same as fully charging a smartphone. Working through the math (sketched in code below):

    - 1 kWh is ~860 kcal (one Calorie = 1 kcal = 1,000 calories), so 1 image is ~260 Calories.
    - 1 Calorie heats 1 liter of water by 1 °C, and it takes ~540 Calories to vaporize 1 liter, so roughly 2 images vaporize a liter of water.
    - There are ~30k liters in an 18-foot above-ground pool with 4 ft of water.
    - As of August 2023, people had generated almost 15.5 billion AI images in total, with approximately 34 million new images each day.
    - That’s ~17 million liters vaporized daily (about 500 swimming pools), which puts my numbers at ~250k swimming pools vaporized so far.
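    A minimal sketch of that arithmetic in Python, using the rounded figures above (0.3 kWh per image, 34 million images per day, 15.5 billion images total); every constant is this comment’s estimate, not a measured value:

    ```python
    # Back-of-the-envelope version of the arithmetic above. Every constant
    # is the comment's rounded estimate, not measured data.

    KWH_PER_IMAGE = 0.3             # assumed energy to generate one AI image
    KCAL_PER_KWH = 860              # 1 kWh ~= 860 kcal
    KCAL_PER_LITER_VAPORIZED = 540  # latent heat of vaporization of water
    LITERS_PER_POOL = 30_000        # ~18 ft above-ground pool, 4 ft of water

    IMAGES_PER_DAY = 34_000_000     # ~34M new images per day (Aug 2023)
    IMAGES_TOTAL = 15_500_000_000   # ~15.5B images total (Aug 2023)

    kcal_per_image = KWH_PER_IMAGE * KCAL_PER_KWH                  # ~258 kcal
    liters_per_image = kcal_per_image / KCAL_PER_LITER_VAPORIZED   # ~0.48 L

    daily_liters = IMAGES_PER_DAY * liters_per_image
    daily_pools = daily_liters / LITERS_PER_POOL
    total_pools = IMAGES_TOTAL * liters_per_image / LITERS_PER_POOL

    print(f"~{kcal_per_image:.0f} kcal per image, ~{liters_per_image:.2f} L vaporized")
    print(f"~{daily_liters / 1e6:.0f} million L/day (~{daily_pools:.0f} pools/day)")
    print(f"~{total_pools / 1e3:.0f}k pools vaporized to date")
    ```

    Everything scales linearly, so if the 0.3 kWh-per-image estimate is off by some factor, the pool counts shift by the same factor.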

    •  BrikoX   ( @BrikoX@lemmy.zip ) OP, 11 hours ago

      <…> I found that it takes about 0.3 kWh to generate an AI image <…>

      There isn’t really a set power usage per image, since different models will take different amounts of time. There are hundreds of different factors and optional toggles that can increase or reduce the time needed.

    •  Umbrias   ( @Umbrias@beehaw.org ), 25 hours ago

      Water supply is a limited resource; everyone here appears to be focusing on the wrong thing. When a data center uses water in its cooling loops, that water is made inaccessible anywhere else, such as for agriculture, natural habitats, or drinking. It does not matter (directly) whether the water is technically potable after use. Very little water ever leaves the Earth system, yet drought exists.

    • Data centers don’t have “water cooling loops” that are anything like the ones in consumer PCs. To maximize cooling capacity, a lot of the systems use some sort of evaporative cooling, where part of the water simply drifts away into the atmosphere as vapor (after which point it would need to be purified again before it could be used for human consumption). A rough bound on how much water that works out to per unit of energy is sketched below.

      From what I can find, it also seems like some data centers just pipe in clean ambient-temperature water, use it to cool the servers, and then pipe it right back out into the municipal sewer system. Which is even more stupid, because you’re taking potable water, sending it through systems that should be pretty clean, and then mixing it with waste water. If anything, that should be considered “gray water”, which is still fine to use for things like flushing toilets.
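      To put a rough number on the evaporative case, here’s a minimal sketch assuming every bit of rejected heat leaves as latent heat of evaporation; real cooling towers also shed some heat sensibly and discharge blowdown water, so actual consumption varies around this figure:

      ```python
      # Idealized estimate: if all server heat left the cooling tower as
      # water vapor, how many liters disappear per unit of energy?
      KCAL_PER_KWH = 860              # 1 kWh ~= 860 kcal
      KCAL_PER_LITER_VAPORIZED = 540  # latent heat of vaporization of water

      liters_per_kwh = KCAL_PER_KWH / KCAL_PER_LITER_VAPORIZED
      print(f"~{liters_per_kwh:.1f} L evaporated per kWh of heat rejected")
      print(f"~{liters_per_kwh * 1000:.0f} L per MWh")
      ```

      That lines up with the per-image numbers earlier in the thread: at ~1.6 L per kWh, a 0.3 kWh image accounts for roughly half a liter.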

      • I would be really surprised if anyone is cooling data centres with city water except in an emergency; that’s so unbelievably expensive. (I could see water drawn directly from a lake, though that has its own issues too.) I recall saving millions just by adjusting a fill target on an evaporative cooling tower so it wouldn’t overfill (levels were really cyclic, and the targets weren’t tuned for them), and that was only a fraction of what it would have cost if we’d used pure city water.

        • This is correct. You don’t need potable water for cooling systems. Releasing vapor returns the water to where it came from, without adding any more heat to the environment than you already were.

          The environmental cost of AI needs to be measured in gigawatt hours, distributed over different energy generation methods.

          Adding heat to the system isn’t a big deal if you’re powered by solar energy, for example.