Dr. Tushar Sharma aims to reduce emissions by ensuring that artificial intelligence tools run as cleanly and efficiently as possible. Credit: Danny Abuel
Artificial intelligence (AI) has changed the world as we know it. It’s used for everything from health care monitoring to speech writing. However, the environmental impact of this technology has become a serious concern.
ChatGPT, one of the best-known AI models, is a form of generative AI that uses natural language processing to respond to user queries in a chatbot-style web interface.
When OpenAI, the company that created ChatGPT, was training its third-generation model (that is, teaching it what content to generate in response to user questions), it used enough electricity to power 120 households for a year.
And training is just one aspect of an AI model’s emissions. The biggest impact over time comes from model inference, the process of running the model live: large language models like ChatGPT are constantly running, waiting for user questions.
The data centers needed to power these models currently account for 3% of global energy consumption, with little use of renewable energy sources, and, according to Forbes, they produce a comparable share of global CO2 emissions.
Enter Dr. Tushar Sharma, Assistant Professor in the Faculty of Computer Science at Dalhousie University.
Dr. Sharma’s research focuses on sustainable AI and software engineering. In other words, he makes sure the source code that builds and runs these models is as clean and efficient as possible, and when it isn’t, he identifies the problem and fixes it.
Dr. Sharma’s SMART Lab recently published a paper in ACM Transactions on Software Engineering and Methodology detailing how to measure the energy consumption of AI models at a granular level, identifying which parts of the code consume the most power. (Think of your home’s electricity bill: it gives a good indication of your overall energy consumption, but it usually doesn’t show which appliances are drawing the most power.)
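The paper’s actual instrumentation is not described here, but the core idea of fine-grained measurement, attributing energy to individual code regions rather than to a whole program, can be illustrated with a toy sketch. Everything below is hypothetical: the `POWER_WATTS` figure stands in for readings a real tool would take from hardware counters, and the region names are invented.

```python
import time
from collections import defaultdict

# Hypothetical average power draw while the machine is busy (watts).
# A real measurement tool would read hardware energy counters instead
# of assuming a constant figure like this.
POWER_WATTS = 65.0

energy_log = defaultdict(float)  # joules attributed to each code region


class measure:
    """Context manager that attributes energy to a named code region."""

    def __init__(self, region):
        self.region = region

    def __enter__(self):
        self.start = time.perf_counter()
        return self

    def __exit__(self, *exc):
        elapsed = time.perf_counter() - self.start
        energy_log[self.region] += elapsed * POWER_WATTS  # E = P * t
        return False


# Wrap the phases of a toy workload to see which region dominates.
with measure("data_loading"):
    data = [x * 0.5 for x in range(100_000)]

with measure("forward_pass"):
    total = sum(d * d for d in data)

for region, joules in sorted(energy_log.items(), key=lambda kv: -kv[1]):
    print(f"{region}: {joules:.4f} J")
```

The granular breakdown, rather than a single program-wide total, is what lets a developer target the most power-hungry parts of the code, much like an appliance-level electricity bill would.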
In another study, his lab sifted through dozens of layers within an AI model to “prune” tokens that were no longer relevant, useful, or effective.
“We strategically move through each layer of these large models to reduce the computation required internally,” he explains.
The idea is to train models more efficiently, reducing power consumption and subsequent emissions. “We’re trying to make sure we don’t have to use as much electricity or time, which reduces our energy and carbon footprint,” he says. “The ideal scenario would be to reduce the energy required to train and operate these systems without sacrificing any benefits.”
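This is not the lab’s actual algorithm, but the general idea of layer-wise token pruning can be sketched in a few lines: at each layer, drop the tokens with the lowest importance scores so that every subsequent layer has less to compute. The scoring values, keep ratio, and layer count below are all invented for illustration.

```python
# Toy sketch of layer-wise token pruning: at each layer, keep only the
# most important tokens so later layers do less computation.

def prune_tokens(tokens, scores, keep_ratio=0.75):
    """Keep the top `keep_ratio` fraction of tokens by importance score."""
    keep = max(1, int(len(tokens) * keep_ratio))
    ranked = sorted(range(len(tokens)), key=lambda i: scores[i], reverse=True)
    kept = sorted(ranked[:keep])  # preserve the original token order
    return [tokens[i] for i in kept], [scores[i] for i in kept]


def run_model(tokens, scores, num_layers=4, keep_ratio=0.75):
    """Simulate passing tokens through layers, pruning after each one."""
    cost = 0
    for _ in range(num_layers):
        cost += len(tokens)  # per-layer compute scales with token count
        tokens, scores = prune_tokens(tokens, scores, keep_ratio)
    return tokens, cost


tokens = ["The", "quick", "brown", "fox", "jumps", "over", "the", "dog"]
scores = [0.9, 0.2, 0.1, 0.8, 0.7, 0.3, 0.4, 0.6]

kept, pruned_cost = run_model(tokens, scores)
_, full_cost = run_model(tokens, scores, keep_ratio=1.0)
print(f"compute with pruning: {pruned_cost} vs without: {full_cost}")
```

Because each pruned layer hands fewer tokens to the next one, the total compute, and with it the energy drawn, shrinks without touching the tokens the model actually relies on.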
So, is AI worth it?
Dr. Christian Blouin, acting dean of the Faculty of Computer Science at Dal, says AI will change the world as we know it, whether or not the technology becomes more environmentally friendly.
“We have a responsibility to find better ways to tackle important problems with fewer resources,” he says. “As people discover new ways to leverage AI, it’s important to develop computer science to make it more sustainable.”
This balance is especially important for those working in the climate sector. Dr. Anya Waite is the CEO and Scientific Director of Ocean Frontier Institute (OFI), a research institute at Dal. OFI studies the ocean’s changing role in the climate system and provides solutions to mitigate climate change.
Dr. Waite said that while AI is an important tool for managing data and improving efficiency and accuracy, it becomes unsustainable if it consumes more energy than it saves.
“Dr. Sharma’s research is critical because it supports the efficiency of AI, reducing costs and carbon emissions,” she says. “Ultimately, without research like Dr. Sharma’s, we risk losing our ability to launch new innovations and potentially miss out on the huge benefits they bring.”
A difficult balance
Dr. Michael Freund, director of Dal’s Clean Technology Research Institute (CTRI), says users are not always aware of the infrastructure and operations required to support the technology they use.
“Responsible growth of AI requires consideration of environmental factors,” says Dr. Freund. “Efficient operations are needed, including more efficient code, responsible use, and connecting data centers to green energy sources.”
Like OFI, CTRI frequently uses AI to improve operational efficiency, which he admits is a difficult balance to strike.
“Work by researchers like Dr. Sharma will highlight the true value of AI and inform decisions about how it is developed and used,” he says.
The future of green AI
Converting data centers to renewable energy sources is another big hurdle, but Dr. Sharma believes that research like his, combined with solar, wind, and hydropower, will make AI more environmentally friendly.
“All of these technologies will ultimately help us reach our goal of green AI and help us find ways to continue using these machine learning models while keeping energy costs down.”
Further information: Saurabhsingh Rajput et al, Enhancing Energy-Awareness in Deep Learning through Fine-Grained Energy Measurement, ACM Transactions on Software Engineering and Methodology (2024). DOI: 10.1145/3680470
Provided by Dalhousie University
Citation: Researchers want to make sure AI doesn’t destroy the environment (2024, September 30), retrieved September 30, 2024 from https://techxplore.com/news/2024-09-ai-doesnt-environment.html
This document is subject to copyright. No part may be reproduced without written permission, except in fair dealing for personal study or research purposes. Content is provided for informational purposes only.