
"There's little transparency in AI energy consumption"

  • Writer: Matthias Haymoz
  • Jul 25
  • 1 min read

In a recent Inside AI podcast episode, Marcel Salathé sat down with SDEA president Babak Falsafi to unpack the growing energy footprint of artificial intelligence. As AI capabilities scale rapidly, so do the demands on computing infrastructure, and with them come significant sustainability implications.


Marcel Salathé and Babak Falsafi

From high-performance GPUs to cooling systems and electricity sourcing, Falsafi highlights the opaque and underreported reality of AI’s operational impact. He explains how much of the public debate misses the nuanced differences between datacenters, supercomputers, and cloud platforms – distinctions that fundamentally shape the environmental consequences of digital systems.


The conversation also dives into policy gaps, common misconceptions about AI emissions, and why efficiency must go beyond simplistic indicators like PUE (power usage effectiveness). Transparency, Falsafi argues, is essential if we are to build systems that are not just powerful, but also responsible.
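As a rough point of reference (a standard industry definition, not a quote from the episode), PUE compares a facility's total energy draw with the energy consumed by its IT equipment alone:

    PUE = total facility energy / IT equipment energy

A PUE close to 1.0 means little energy is lost to cooling and power distribution, but the metric says nothing about how efficiently the IT equipment itself turns that energy into useful computation, which is one reason it can be misleading on its own.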

