Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the coolant stays inside the loop, except in exceptional circumstances.

Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.

How am I running circles around the water efficiency of a huge AI data center, with what looks like net-negative water consumption?
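For scale, here's a back-of-the-envelope sketch of what a gaming session would "consume" at that quoted rate (300 W and 4 hours are just my illustrative stand-ins for "hundreds of watts" and "several hours"):

```python
# What would one gaming session "use" at the data-center rate of ~1.9 L/kWh?
WATER_PER_KWH_L = 1.9    # L/kWh, the figure quoted above
GPU_POWER_W = 300        # assumed GPU draw in watts (illustrative)
SESSION_HOURS = 4        # assumed session length (illustrative)

energy_kwh = GPU_POWER_W / 1000 * SESSION_HOURS
water_l = energy_kwh * WATER_PER_KWH_L

print(f"Energy used: {energy_kwh:.2f} kWh")            # -> 1.20 kWh
print(f"Water at data-center rate: {water_l:.2f} L")    # -> 2.28 L
```

So if my PC were cooled the way a data center is, that session would cost a couple of liters of water.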

  • BussyCat@lemmy.world · 8 hours ago

    Because a closed loop only moves heat around; it doesn’t get rid of it, so at some point the heat has to be rejected to the environment. Your options for actually getting the heat out are: evaporating water (cooling towers), dry air coolers (not efficient at large scale or in warm climates), or once-through water cooling (rough numbers on the evaporation option below).

    Once-through water cooling sounds good, but then you are heating up a local water supply, which can kill a bunch of the local wildlife (thermal pollution).
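
    A rough sketch of why evaporation costs so much water, assuming all the waste heat is rejected by evaporating water at roughly 2,400 kJ/kg latent heat near ambient temperature (real cooling towers also lose water to drift and blowdown):

    ```python
    # How much water does evaporating away 1 kWh of waste heat cost?
    LATENT_HEAT_KJ_PER_KG = 2400   # approx. latent heat of vaporization near ambient temp
    KJ_PER_KWH = 3600              # 1 kWh = 3,600 kJ

    water_kg_per_kwh = KJ_PER_KWH / LATENT_HEAT_KJ_PER_KG
    print(f"~{water_kg_per_kwh:.1f} kg (~{water_kg_per_kwh:.1f} L) of water evaporated per kWh of heat")
    # -> ~1.5 L/kWh from evaporation alone, which is in the same ballpark
    #    as the ~1.9 L/kWh figure once other losses are included.
    ```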