As the environmental costs of tools like ChatGPT and DALL-E mount, governments are demanding more clarity from tech companies.
The explosion of AI-powered chatbots and image generators, like ChatGPT and DALL-E, over the past two years is changing the way we interact with technology. Their impressive ability to generate lifelike images from written prompts or write an essay on a topic of your choosing can seem a bit like magic.
But that “magic” comes at a steep environmental cost, researchers are learning. The data centers that power these models consume enormous amounts of not only electricity but also fresh water, much of it used to keep the hardware cool. And the industry shows no signs of slowing down. Earlier this month, it was reported that Sam Altman, the CEO of leading AI company OpenAI, is seeking to raise about $7 trillion to reshape the global semiconductor industry for AI chip production.
Ira Flatow is joined by Dr. Jesse Dodge, research scientist at the Allen Institute for AI, to talk about why these models use so much energy, why the placement of their data centers matters, and what regulations these companies could face.
Transcripts for this segment will be available the week after the show airs on sciencefriday.com.