How CIOs can reduce AI's negative environmental impact

AI is a power-hungry beast -- here's how to tame it.

AI's energy consumption is skyrocketing, and mounting research shows that large-scale AI models consume immense amounts of power.

A ChatGPT query needs almost 10 times more electricity than a Google search, according to Goldman Sachs research, which also projects that AI will drive a 160% increase in data center power demand by 2030.

A 2023 study from Hugging Face and Carnegie Mellon University estimated that generating a single AI image can use as much electricity as charging a smartphone.

These are important considerations since the AI adoption curve is only beginning. As adoption accelerates and AI models grow in complexity, CIOs and other tech leaders must proactively mitigate AI's environmental consequences.

This article examines the growing carbon footprint of AI, the encouraging trends that offset it, how IT leaders can foster AI energy efficiency, and what sustainability means for AI's future.

The growing carbon footprint of AI

Several signs point to AI's aggregate carbon footprint growing ever larger.

AI models are exploding in size as measured by the number of parameters, and AI features and apps are becoming a part of enterprise systems, software and mobile devices. In particular, generative AI significantly increases energy consumption -- not just during training, but also during inference. Unlike traditional machine learning models, which consume substantial energy only while being trained, generative AI models require high compute power for inference, meaning their operational energy consumption remains high over time.

Adding to this challenge, AI models are being released at an increasingly rapid pace; today's state-of-the-art models may be obsolete tomorrow. If models had longer shelf lives, their training energy costs could be amortized over a longer period. New advances, such as reasoning models, demand even more compute for inference, further increasing energy consumption. Emerging use cases, such as AI-generated video, will be more energy-hungry still because of their higher GPU requirements.

Big technology companies have plans to transition their data centers to renewable energy. But the rapid expansion of AI has outpaced any such efforts. In the rush to dominate the AI market, companies are scaling up data center capacity faster than they can secure sustainable energy sources. As a result, while data center power consumption grows, a smaller proportion of it is coming from clean energy, at least in the short term.

Encouraging trends that offset AI's carbon footprint

Some trends in the AI technology landscape might help counterbalance the rising negative environmental impacts of AI.

The rise of small language models (SLMs), which typically have 2 billion to 7 billion parameters -- versus 70 billion to 1 trillion-plus parameters for large language models (LLMs) -- offers a more energy-efficient alternative. However, the widespread deployment of SLMs across hundreds of millions or even billions of end-user devices can still contribute significantly to overall energy consumption.

On the infrastructure front, companies are investing in scaling up data centers, many of which are designed to be powered by renewable energy or even next-generation nuclear fusion. Over time, this shift toward greener energy resources should help mitigate AI's environmental impact.

Most encouragingly, reducing energy usage is in the financial interests of AI companies. Energy costs constitute a significant portion -- often 15% to 25% -- of a data center's total operating expenses. As a result, companies have strong incentives to optimize efficiency, both to reduce costs and to maintain competitiveness.

How IT leaders can foster AI energy efficiency

While macro-level solutions, such as green data centers and infrastructure investments, will take time, CIOs can implement several practical measures within their control to reduce AI's environmental impact today. Here are six strategies.

1. Provide AI upskilling, especially in prompt engineering

Minimizing trial-and-error use is a cornerstone of reducing AI emissions -- and saving money. Much generative AI usage involves iterating back and forth with the tool: users revise their prompts many times, unwittingly consuming significant amounts of compute power. CIOs should train teams on prompt engineering best practices to reduce unnecessary iterations, making AI usage more efficient and cost-effective.

2. Use simpler, traditional alternatives to AI

Organizations should educate employees on AI's carbon footprint and train them on when to use AI-powered tools versus traditional software. Not all queries require an LLM-powered response: spreadsheets, calculators, simple scripts and conventional web search engines often suffice for many tasks.

3. Use the smallest model possible

Optimizing model selection is another key to reducing AI's carbon emissions. AI models come in various sizes -- small, medium and large -- each with increasing energy requirements. Don't default to the most powerful model for every task. Instead, implement a model-routing system that selects the smallest model most suited for a given task. This not only reduces carbon emissions, but also cuts cloud and AI usage costs.
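A model-routing policy like the one described above can be sketched in a few lines. This is an illustrative sketch only: the model names, capability tiers and complexity heuristic below are made-up assumptions, not any vendor's actual API or a production-grade classifier.

```python
# Minimal model-routing sketch: send each request to the smallest model
# whose capability tier covers the estimated difficulty of the task.
# Model names and tiers are hypothetical.
MODEL_TIERS = [
    ("small-2b", 1),   # classification, extraction, short rewrites
    ("medium-8b", 2),  # summarization, routine Q&A
    ("large-70b", 3),  # multi-step reasoning, long-form generation
]

def estimate_complexity(prompt: str) -> int:
    """Crude heuristic: longer prompts and reasoning-style keywords
    suggest a harder task. Real routers would use a trained classifier."""
    score = 1
    if len(prompt.split()) > 200:
        score += 1
    if any(kw in prompt.lower() for kw in ("analyze", "step by step", "compare")):
        score += 1
    return min(score, 3)

def route(prompt: str) -> str:
    """Return the name of the smallest model rated for the task."""
    needed = estimate_complexity(prompt)
    for name, tier in MODEL_TIERS:
        if tier >= needed:
            return name
    return MODEL_TIERS[-1][0]
```

In practice, the routing signal might come from the application context (a help-desk macro versus an analyst workbench) rather than the prompt text, but the principle is the same: reserve the largest models for the tasks that genuinely need them.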

4. Don't undercharge for AI

Underpricing a resource, including AI, encourages overconsumption. Incentivize responsible AI use through internal pricing mechanisms. Today, many AI companies are subsidizing generative AI services, making usage artificially cheap. However, as AI providers eventually raise prices to reflect their true costs, enterprises will need to recalibrate their AI consumption. CIOs can proactively develop frameworks, such as metering mechanisms, to optimize AI usage.
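A metering mechanism can start very simply: attribute token usage to teams and apply a notional internal rate so consumption shows up as a cost. The sketch below is illustrative; the chargeback rate and team names are made-up assumptions, and a real system would pull usage from provider billing APIs.

```python
# Illustrative internal AI metering sketch: track token usage per team
# and convert it into a notional chargeback figure.
from collections import defaultdict

INTERNAL_RATE_PER_1K_TOKENS = 0.02  # hypothetical internal rate, USD


class UsageMeter:
    def __init__(self):
        self.tokens = defaultdict(int)

    def record(self, team: str, tokens_used: int) -> None:
        """Attribute a request's token count to the team that made it."""
        self.tokens[team] += tokens_used

    def chargeback(self, team: str) -> float:
        """Internal cost attributed to a team, in USD."""
        return self.tokens[team] / 1000 * INTERNAL_RATE_PER_1K_TOKENS


meter = UsageMeter()
meter.record("marketing", 250_000)
meter.record("marketing", 50_000)
print(round(meter.chargeback("marketing"), 2))  # 300k tokens -> 6.0
```

Even if no money actually changes hands internally, surfacing a dollar figure per team makes overconsumption visible and prepares the organization for the day providers raise prices.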

5. Incorporate green AI practices in procurement policies

Procurement teams should include sustainability as a criterion when selecting AI and cloud providers. CIOs should engage vendors to disclose their environmental commitments, sustainability roadmaps and measurable progress. Organizations must also be vigilant against greenwashing -- superficial sustainability claims without tangible execution. If enough enterprises prioritize green AI, market forces will compel AI vendors toward cleaner, more energy-efficient operations.

6. Consider open source and decentralized AI models

CIOs concerned about the carbon footprint of major cloud-based AI providers can explore open source models that can be hosted locally or on cloud platforms with cleaner energy sources. This approach reduces reliance on large-scale data centers running on fossil fuels.

AI's sustainability and future

AI technologies are evolving at a breakneck pace, underscoring the need for greater compute efficiency -- and raising the question of how the technology itself might support carbon emission reductions.

Regarding the latter, despite its growing carbon footprint, AI also has the potential to contribute to sustainability in positive ways. AI tools can reduce business travel by improving virtual collaboration through enhanced video conferencing, transcription and meeting summarization. AI can optimize logistics and help reduce fuel consumption in supply chains. And AI can drive efficiency improvements in manufacturing and energy management, which can lead to substantial emission reductions.

On the compute efficiency front, AI technologies themselves have the potential to become more sustainable.

Competitive dynamics are driving technology companies to seek innovations in energy efficiency. For example, DeepSeek claims that its engineering innovations have made both training and inference significantly more compute-efficient.

As AI vendors compete on cost and efficiency, innovations related to efficiency will likely become industry-wide standards.

The AI industry is beginning to measure efficiency in terms of tokens per watt per dollar -- a promising step toward a more sustainable AI future. Reducing AI's carbon footprint is a shared responsibility between AI companies and users. Through thoughtful policies and prioritizing sustainability, CIOs can ensure their organizations reap the benefits of AI, while minimizing its environmental impact.
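The tokens-per-watt-per-dollar idea mentioned above is straightforward to compute. The sketch below shows the arithmetic; the deployment figures are illustrative placeholders, not measured values from any real system.

```python
# Sketch of the "tokens per watt per dollar" efficiency metric:
# useful output normalized by both power draw and cost. Higher is better.
def tokens_per_watt_dollar(tokens: int, watts: float, cost_usd: float) -> float:
    """Tokens generated per watt of power per dollar spent."""
    return tokens / (watts * cost_usd)


# Comparing two hypothetical deployments serving the same workload:
a = tokens_per_watt_dollar(tokens=1_000_000, watts=700, cost_usd=2.0)  # ~714
b = tokens_per_watt_dollar(tokens=1_000_000, watts=350, cost_usd=1.5)  # ~1905
```

A composite metric like this keeps efficiency comparisons honest: a deployment that halves power draw but doubles cost has not actually improved.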

Kashyap Kompella is an industry analyst, author, educator and AI adviser to leading companies and startups across the U.S., Europe and the Asia-Pacific regions. Currently, he is CEO of RPA2AI Research, a global technology industry analyst firm.
