'Thermodynamic computing' could slash energy use of AI image generation by a factor of ten billion, study claims — prototypes show promise, but building hardware that can rival current models remains a huge task

Google Nano Banana Pro
(Image credit: Google)

A mind-bending new report claims that 'thermodynamic computing' could, in theory, drastically reduce the energy consumed by AI to generate images, using just one ten-billionth of the energy of current popular tools. As reported by IEEE Spectrum, two recent studies hint at the potential of this burgeoning technology, but its proponents admit the solution is rudimentary.

According to the report, Lawrence Berkeley National Laboratory staff scientist Stephen Whitelam claims thermodynamic computing could be used for AI image generation "with a much lower energy cost than current digital hardware can." In a January 10 article published by Whitelam and Corneel Casert, also of Berkeley, the pair outlined how "it was possible to create a thermodynamic version of a neural network," laying the foundations for generating images using thermodynamic computing.
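Broadly speaking, the appeal of thermodynamic computing is that the random thermal fluctuations of physical components can perform the kind of noisy sampling that image generators currently simulate digitally. As a purely illustrative sketch of that general principle — not the method in Whitelam and Casert's paper, nor anything a thermodynamic chip literally runs — the toy Python simulation below lets Langevin-style noisy updates settle into the low-energy states of a made-up double-well landscape. The potential, step sizes, and function names are all assumptions chosen for illustration.

```python
# Illustrative sketch only: a digital simulation of overdamped Langevin
# dynamics sampling a toy two-well energy landscape. Thermodynamic
# hardware aims to let physical noise do this kind of sampling natively;
# the landscape and parameters here are made-up values, not anything
# from the papers discussed above.
import numpy as np

rng = np.random.default_rng(0)

def grad_energy(x):
    # Gradient of a toy double-well potential U(x) = (x^2 - 1)^2
    return 4 * x * (x**2 - 1)

def langevin_sample(n_steps=10_000, dt=1e-3, temperature=0.2):
    x = rng.normal()                 # random starting point
    samples = np.empty(n_steps)
    for i in range(n_steps):
        noise = rng.normal(scale=np.sqrt(2 * temperature * dt))
        x = x - grad_energy(x) * dt + noise   # Euler-Maruyama update
        samples[i] = x
    return samples

samples = langevin_sample()
# Samples should cluster around the two wells at x = -1 and x = +1.
print(samples[-10:])
```

In a digital simulation like this, every pseudo-random draw and multiply costs energy; the claim behind thermodynamic hardware is that the underlying physics supplies the equivalent noise and relaxation essentially for free.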


Stephen Warwick
News Editor
  • DS426
    Major datacenters should also be their own power plants, essentially using heat pumps to turn steam turbines to turn generators. It boggles my mind that the wealthiest and most hi-tech companies in the world throw away heat (and often) water with little to no recapture.
    Reply
  • JRStern
    DS426 said:
    Major datacenters should also be their own power plants, essentially using heat pumps to turn steam turbines to turn generators. It boggles my mind that the wealthiest and most hi-tech companies in the world throw away heat (and often) water with little to no recapture.
    They throw away a lot of heat but it's at low temperatures and that's very hard to recapture efficiently, you can't really boil water with it, you might boil some ammonia if you can cool the ammonia enough in the first place, or design some big honkin' Stirling engine with a boosted cooling cycle... That would be awesome. Would it ever pay for itself? Dubious.
    Reply
  • DS426
    JRStern said:
    They throw away a lot of heat but it's at low temperatures and that's very hard to recapture efficiently, you can't really boil water with it, you might boil some ammonia if you can cool the ammonia enough in the first place, or design some big honkin' Stirling engine with a boosted cooling cycle... That would be awesome. Would it ever pay for itself? Dubious.
    That's true, and it would take additional input (heat pump, etc.) to concentrate that thermal energy into higher usable temps, destroying efficiency and overall value of the system.

    Still, it's hard for me to imagine that we can throw away 300 MW or even more (besides heating the DC itself, which we know is rarely the case given that most DCs are located in warmer climates) from a single DC.

    Our ancestors from long ago did really clever things to survive and thrive. We're not so clever today; if the economics work out, that's generally good enough.
    Reply
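On the waste-heat question raised in the thread above, a quick back-of-envelope number shows why low-grade datacenter heat is so hard to turn back into electricity: the Carnot limit at typical coolant temperatures is already in the single digits before any real-world losses. The temperatures below are assumed, illustrative values, not measurements from any actual facility.

```python
# Back-of-envelope sketch of why low-grade datacenter heat is hard to
# reuse, as the comment thread above discusses. Temperatures are
# assumed illustrative values.
def carnot_efficiency(t_hot_c, t_cold_c):
    """Maximum theoretical fraction of heat convertible to work."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1 - t_cold_k / t_hot_k

# Assumed: ~45 C coolant exhaust vs ~25 C ambient
print(f"{carnot_efficiency(45, 25):.1%}")   # ~6% upper bound, before losses
```

Even a perfect heat engine working between roughly 45 °C exhaust and 25 °C ambient could recover only about 6% of that energy as work, before accounting for any practical inefficiencies.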