Your AI use could have a hidden environmental cost

TruthLens AI Suggested Headline:

"Research Highlights Environmental Costs of Generative AI Usage"

AI Analysis Average Score: 9.0
These scores (0-10 scale) are generated by TruthLens AI's analysis, assessing the article's objectivity, accuracy, and transparency. Higher scores indicate better alignment with journalistic standards.

TruthLens AI Summary

Generative artificial intelligence (AI) tools have become increasingly prevalent in everyday tasks, from drafting emails to aiding in complex problem-solving. However, recent research indicates that the environmental costs associated with AI usage can be significant. Each prompt given to an AI system is processed through large data centers that consume substantial amounts of energy, often sourced from fossil fuels like coal and natural gas. Studies have shown that generating responses from AI can use up to ten times more energy than a standard Google search. Furthermore, the complexity of the questions posed to AI systems can dramatically increase their carbon emissions, with complex queries generating up to six times more CO2 emissions than simpler ones. More advanced models, which require greater energy to process due to their larger number of parameters, can produce carbon emissions that are fifty times higher than simpler models for the same question. This highlights a critical trade-off between the energy consumption of AI models and their performance accuracy, suggesting that users should consider the environmental impact of their AI interactions.

To mitigate the environmental costs of using AI, experts recommend that users adopt more straightforward communication styles when interacting with these systems. By specifying concise answers and limiting unnecessary explanations, users can significantly reduce the energy consumption associated with AI-generated responses. Additionally, the choice of AI model can greatly influence energy efficiency; smaller, task-specific models can often perform just as well for particular applications while consuming less energy. However, the lack of transparency from AI companies regarding their energy consumption and emissions complicates the ability to assess the environmental impact of AI accurately. As AI technology continues to be integrated into various platforms, there is a growing concern about its unchecked proliferation and the associated costs to the environment. Experts emphasize the need for more informed usage and transparency from AI providers to help consumers make better choices regarding their interactions with AI tools.


Unanalyzed Article Content


Whether it’s answering work emails or drafting wedding vows, generative artificial intelligence tools have become a trusty copilot in many people’s lives. But a growing body of research shows that for every problem AI solves, hidden environmental costs are racking up.

Each word in an AI prompt is broken down into clusters of numbers called “token IDs” and sent to massive data centers — some larger than football fields — powered by coal or natural gas plants. There, stacks of large computers generate responses through dozens of rapid calculations.
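To make that first step concrete, here is a minimal sketch of tokenization using the open-source GPT-2 tokenizer from Hugging Face's transformers library. The choice of tokenizer and the example prompt are illustrative assumptions, not details from the study; commercial chatbots use their own tokenizers, but the principle — text in, numbered tokens out — is the same.

```python
# Minimal tokenization sketch (illustrative; assumes the "transformers" package).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

prompt = "Please draft a short, polite reply to this work email."
token_ids = tokenizer.encode(prompt)

print(tokenizer.convert_ids_to_tokens(token_ids))  # the text pieces behind each ID
print(token_ids)                                   # the numbers a data center actually receives
print(f"{len(token_ids)} tokens for this prompt")
```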

The whole process can take up to 10 times more energy to complete than a regular Google search, according to a frequently cited estimate by the Electric Power Research Institute.
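As a rough back-of-the-envelope illustration of what that ratio could mean for a heavy user, the figures below are assumptions — roughly 0.3 watt-hours per conventional web search, and the article's "up to 10 times" upper bound for an AI prompt — not measurements from the study.

```python
# Illustrative arithmetic only; both energy figures are assumptions.
SEARCH_WH = 0.3                  # assumed energy per conventional web search (Wh)
AI_PROMPT_WH = 10 * SEARCH_WH    # "up to 10 times more" upper bound from the article

prompts_per_day = 20
extra_wh = prompts_per_day * (AI_PROMPT_WH - SEARCH_WH)
print(f"Extra energy vs. searching: {extra_wh:.0f} Wh/day, "
      f"about {extra_wh * 365 / 1000:.0f} kWh/year")
```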

So, for each prompt you give AI, what’s the damage? To find out, researchers in Germany tested 14 large language model (LLM) AI systems by asking them both free-response and multiple-choice questions. Complex questions produced up to six times more carbon dioxide emissions than questions with concise answers.

In addition, “smarter” LLMs with more reasoning abilities produced up to 50 times more carbon emissions than simpler systems to answer the same question, the study reported.

“This shows us the tradeoff between energy consumption and the accuracy of model performance,” said Maximilian Dauner, a doctoral student at Hochschule München University of Applied Sciences and first author of the Frontiers in Communication study published Wednesday.

Typically, these smarter, more energy-intensive LLMs have tens of billions more parameters — the learned weights and biases used to process token IDs — than smaller, more concise models.

“You can think of it like a neural network in the brain. The more neuron connections, the more thinking you can do to answer a question,” Dauner said.
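For a sense of scale, a parameter count can be read directly off any open model. The sketch below uses distilgpt2, a small open model from Hugging Face, purely as an illustration; the "smarter" systems discussed in the study are orders of magnitude larger.

```python
# Count the parameters of a small open model (illustrative; requires
# the "transformers" package and a PyTorch backend).
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("distilgpt2")
n_params = sum(p.numel() for p in model.parameters())
print(f"distilgpt2 has about {n_params / 1e6:.0f} million parameters")
```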

Complex questions require more energy in part because of the lengthy explanations many AI models are trained to provide, Dauner said. If you ask an AI chatbot to solve an algebra question for you, it may take you through the steps it took to find the answer, he said.

“AI expends a lot of energy being polite, especially if the user is polite, saying ‘please’ and ‘thank you,’” Dauner explained. “But this just makes their responses even longer, expending more energy to generate each word.”

For this reason, Dauner suggests users be more straightforward when communicating with AI models. Specify the length of the answer you want and limit it to one or two sentences, or say you don’t need an explanation at all.
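In code, that advice amounts to two things: tell the model up front how short the answer should be, and cap how much it is allowed to generate. The sketch below uses a small open model through Hugging Face's pipeline API purely for illustration; the same pattern applies to any chatbot or API that exposes a response-length limit.

```python
# Prompt for brevity and cap the response length (illustrative sketch).
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

question = "What is the capital of France?"
prompt = f"{question} Answer in one short sentence, with no explanation."

# max_new_tokens limits how many tokens — and therefore how much compute
# and energy — the response can consume.
result = generator(prompt, max_new_tokens=20)
print(result[0]["generated_text"])
```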

Most important, Dauner’s study highlights that not all AI models are created equal, said Sasha Luccioni, the climate lead at AI company Hugging Face, in an email. Users looking to reduce their carbon footprint can be more intentional about which model they choose for which task.

“Task-specific models are often much smaller and more efficient, and just as good at any context-specific task,” Luccioni explained.
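As a concrete example of that idea, a narrow job such as judging the tone of a sentence can be handled by a small dedicated classifier rather than a general-purpose chatbot. The sketch below uses the default sentiment-analysis pipeline from Hugging Face's transformers library; the specific task and model are illustrative choices, not recommendations from the study.

```python
# A small task-specific model handling a narrow job (illustrative sketch).
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a compact DistilBERT-based model
print(classifier("The new schedule works great for me, thanks!"))
# -> e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```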

If you are a software engineer who solves complex coding problems every day, an AI model suited for coding may be necessary. But for the average high school student who wants help with homework, relying on powerful AI tools is like using a nuclear-powered digital calculator.

Even within the same AI company, different model offerings can vary in their reasoning power, so research what capabilities best suit your needs, Dauner said.

When possible, Luccioni recommends going back to basic sources — online encyclopedias and phone calculators — to accomplish simple tasks.

Putting a number on the environmental impact of AI has proved challenging.

The study noted that energy consumption can vary based on the user’s proximity to local energy grids and the hardware used to run AI models. That’s partly why the researchers chose to represent carbon emissions within a range, Dauner said.

Furthermore, many AI companies don’t share information about their energy consumption — or details like server size or optimization techniques that could help researchers estimate energy consumption, said Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside who studies AI’s water consumption.

“You can’t really say AI consumes this much energy or water on average — that’s just not meaningful. We need to look at each individual model and then (examine what it uses) for each task,” Ren said.

One way AI companies could be more transparent is by disclosing the amount of carbon emissions associated with each prompt, Dauner suggested.
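What a per-prompt disclosure could look like is simple in principle: the energy a response consumed multiplied by the carbon intensity of the grid that powered it. Both figures in the sketch below are placeholder assumptions, not values reported by any company or by the study.

```python
# Per-prompt carbon estimate (illustrative; both inputs are assumptions).
ENERGY_PER_PROMPT_WH = 3.0     # assumed energy for one response, in watt-hours
GRID_G_CO2_PER_KWH = 400.0     # assumed grid carbon intensity, in g CO2 per kWh

g_co2 = (ENERGY_PER_PROMPT_WH / 1000) * GRID_G_CO2_PER_KWH
print(f"~{g_co2:.1f} g CO2 for this prompt")   # ~1.2 g under these assumptions
```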

“Generally, if people were more informed about the average (environmental) cost of generating a response, people would maybe start thinking, ‘Is it really necessary to turn myself into an action figure just because I’m bored?’ Or ‘do I have to tell ChatGPT jokes because I have nothing to do?’” Dauner said.

Additionally, as more companies push to add generative AI tools to their systems, people may not have much choice in how or when they use the technology, Luccioni said.

“We don’t need generative AI in web search. Nobody asked for AI chatbots in (messaging apps) or on social media,” Luccioni said. “This race to stuff them into every single existing technology is truly infuriating, since it comes with real consequences to our planet.”

With less available information about AI’s resource usage, consumers have less choice, Ren said, adding that regulatory pressures for more transparency are unlikely to come to the United States anytime soon. Instead, the best hope for more energy-efficient AI may lie in the cost savings of using less energy.

“Overall, I’m still positive about (the future). There are many software engineers working hard to improve resource efficiency,” Ren said. “Other industries consume a lot of energy too, but it’s not a reason to suggest AI’s environmental impact is not a problem. We should definitely pay attention.”

Source: CNN