Mitigating the environmental impact of machine learning

Laura Marks


Information and communication technologies (ICT) produce more greenhouse gas emissions than the aviation industry, driven by high-resolution streaming media, cryptocurrency, machine learning, and other applications. Machine learning is often touted as improving the efficiency of data centers, but that small gain is overwhelmed by the enormous carbon, water, and land footprint of new data centers. The radical expansion of ICT infrastructure demonstrates the rebound effect (also known as the Jevons paradox): more efficient technologies encourage greater use of a resource, reducing or eliminating the savings.
The electricity consumption of large language models (LLMs) stems from several sources: the power demands of graphics processing units (GPUs); the well-known electrical draw of training; the less-known footprint of inference, that is, of individual uses; and electricity-intensive applications such as image generation.
In the context of the Computing within Limits movement, my team of computer engineers, a media scholar, and an AI artist is exploring solutions including approximate or inexact computing; small language models; low-precision hardware architectures; anticipating and mitigating energy demands at the design phase; and my favorite, true-cost accounting. I will also survey ethical issues inherent in the shareholder-capitalist setting of LLMs and ICT.

Bio
Laura Marks is a University Professor at Simon Fraser University (SFU) in the School for the Contemporary Arts and a Fellow of the Royal Society of Canada. In addition to her wide-ranging work on media art and philosophy, she co-directs a multidisciplinary research team studying the environmental impact of machine learning and ways to mitigate it, such as by developing models that use much less electricity.