

Balancing Innovation and Impact: A Realistic Look at AI’s Environmental Footprint
Feb 26
4 min read
As artificial intelligence becomes increasingly central to business, daily life, science, and other applications, it's worth being conscious of the environmental impact the technology can have. You may have seen some alarming statements and demonstrations of AI's power consumption. While some of those are misinformed or sensationalistic, the reality is that AI does, in fact, require massive amounts of energy and water.
Comparing AI Power Consumption to Other Sectors
AI is notorious for its high energy demands, but most of that consumption doesn't happen while you're interacting with the AI. A single AI query uses about 3 watt-hours of electricity, roughly 10 times as much as a Google search. That's about the amount of power it takes to run a high-efficiency LED lightbulb for 18 minutes. That's certainly a lot of power to use in a few seconds, but it means that if you ask an AI a question every two minutes, you're drawing about the same power as an average TV.
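The comparisons above are easy to sanity-check. This quick back-of-the-envelope calculation uses the rough figures from the paragraph (3 Wh per query, a 10 W LED bulb); these are illustrative estimates, not measurements:

```python
# Illustrative estimates from the article, not measured values.
QUERY_WH = 3.0    # rough energy for one AI query, in watt-hours
LED_WATTS = 10.0  # a typical high-efficiency LED bulb

# How long could the LED run on one query's worth of energy?
led_minutes = QUERY_WH / LED_WATTS * 60
print(f"One query ~ {led_minutes:.0f} minutes of LED light")  # 18 minutes

# One query every two minutes, expressed as a continuous draw in watts
queries_per_hour = 60 / 2
avg_watts = QUERY_WH * queries_per_hour
print(f"A query every 2 minutes ~ {avg_watts:.0f} W continuous draw")  # 90 W
```

A 90 W continuous draw is in the same ballpark as many televisions, which is where the TV comparison comes from.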
The more concerning power consumption figures in AI come from training the models. GPT-3, already much smaller than most current models, is estimated to have consumed about 1,300 MWh (megawatt-hours) of electricity during training. That means the training took more power than it takes to respond to 400 million queries.
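The 400-million-query comparison follows directly from the two figures above, as this short calculation shows (again using the article's rough estimates):

```python
# Rough estimates from the article: ~1,300 MWh for GPT-3 training,
# ~3 Wh per query.
TRAINING_MWH = 1300
QUERY_WH = 3.0

training_wh = TRAINING_MWH * 1_000_000  # convert MWh to Wh
equivalent_queries = training_wh / QUERY_WH
print(f"Training ~ {equivalent_queries:,.0f} queries")  # about 433 million
```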
Reducing the Environmental Impact of AI Training
Given the high energy costs associated with training, AI companies have a responsibility—and an opportunity—to implement sustainable practices. Some strategies include:
Optimizing Hardware and Algorithms:
Employing more energy-efficient processors and refining training algorithms can reduce energy usage without compromising model performance.
Leveraging Renewable Energy:
Transitioning data centers to renewable energy sources (like wind or solar) can drastically cut the carbon footprint associated with heavy computations. Innovative cooling solutions and energy reuse strategies can further enhance efficiency.
Collaborative Approaches:
By pooling resources or sharing best practices, companies can collectively lower the environmental burden. In many cases, sustainable practices not only reduce costs but also build a positive brand reputation.
(For further insights on practical solutions, see MIT Sloan's exploration of energy-efficient data center practices.)
More Efficient Training Processes:
While GPT-3 is a much smaller model than GPT-4 (possibly only 10% of the size), we really don't know what advances in the training process companies like OpenAI may have achieved in the time between training the two models. While the datasets are growing exponentially, we can hope that process improvements mean that the power requirements don't scale up at the same rate.
Training Smaller Models:
This may seem counterintuitive, since the number of parameters in a model is generally treated as a reasonable proxy for the model's capability, but training smaller, more specialized models may reduce the overall power needed for training as well as for queries. This is pure speculation on my part, but I suspect that a "base knowledge" layer shared across those models could reduce the power consumption even further.
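One way to reason about these training strategies is with a simple energy model. A common rough approach is to multiply GPU count, per-GPU power draw, and training hours, then scale by the data center's power usage effectiveness (PUE) to account for cooling and other overhead. The numbers below are entirely hypothetical, chosen only to show the shape of the estimate:

```python
def training_energy_mwh(num_gpus: int, gpu_watts: float,
                        hours: float, pue: float = 1.2) -> float:
    """Rough training-energy estimate in MWh.

    PUE (power usage effectiveness) scales raw GPU draw up to cover
    cooling and facility overhead. All inputs here are assumptions,
    not vendor or published figures.
    """
    return num_gpus * gpu_watts * hours * pue / 1_000_000  # Wh -> MWh

# Hypothetical run: 1,000 GPUs at 400 W each for 30 days
print(f"{training_energy_mwh(1000, 400, 30 * 24):.1f} MWh")
```

Because the estimate is a simple product, halving any one factor (fewer GPUs for a smaller model, more efficient chips, a shorter run from a better algorithm, or a lower PUE from better cooling) halves the total, which is why each strategy above can pay off independently.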
Environmentally Conscious AI Deployment for Businesses
Businesses integrating AI into their operations can take several steps to ensure that the technology is “worth it” from an environmental standpoint:
Opt for Retrieval-Augmented Generation (RAG) Over Full Fine-Tuning:
Fine-tuning (taking an existing model and training it further on your own data) takes a lot of power and processing time. Opting for some basic settings and giving the model access to your existing documents and internet search functions may eliminate the need for a fully custom build. As an aside, this can be set up for a fraction of the cost of a fine-tuned model.
Utilize Pre-Trained Models:
Rather than training models from scratch, leveraging pre-trained models (and adapting them with minimal fine-tuning) can conserve energy. This not only reduces power consumption but also accelerates deployment timelines.
Evaluate the True Cost of Customization:
Before opting for extensive model customization, businesses should weigh the environmental and financial costs against the potential benefits. In many cases, incremental adjustments or hybrid approaches may be more sustainable.
Monitor and Optimize Usage:
Continuously track energy consumption across AI deployments and adjust operations based on efficiency metrics. This proactive approach can help businesses maintain a balance between innovation and sustainability.
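To make the RAG option from the first point concrete, here is a minimal, illustrative sketch: instead of fine-tuning, relevant passages are retrieved at query time and prepended to the prompt for a stock model. The keyword-overlap retriever and the document list are toy placeholders of my own; a real deployment would use an embedding index and whatever hosted model the business already uses:

```python
import re

def tokens(text: str) -> set[str]:
    """Lowercase word tokens; a toy stand-in for real embeddings."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by keyword overlap with the query (illustrative only)."""
    q = tokens(query)
    return sorted(documents, key=lambda d: -len(q & tokens(d)))[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Prepend retrieved context so a pre-trained model can answer
    company-specific questions without any retraining."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

# Hypothetical company documents
docs = [
    "Our return policy allows refunds within 30 days.",
    "Support hours are 9am to 5pm on weekdays.",
    "Shipping is free on orders over $50.",
]
print(build_prompt("What is the return policy?", docs))
```

The prompt produced here would go to whatever pre-trained model is already in use, so the only energy cost is the per-query cost, not a training-scale one.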
The Inequitable Impact of AI
It’s important to recognize that the environmental impact of AI is not evenly distributed. Some regions and companies bear a disproportionate share of the energy burden, a challenge that demands thoughtful policy and industry collaboration.
As with so many other major environmental issues, the people in areas already facing the greatest hardships are often the most heavily affected. Areas that rely more heavily on fossil fuels for their power grid will see more pollution if a data center creates a huge increase in power consumption. In many cases, data centers are placed strategically in places with low power costs. That's a perfectly understandable strategy, but in some cases it means that poorly regulated power grids become attractive locations for data centers.
(For a closer look at the uneven distribution of these impacts, see Harvard Business Review.)
Conclusion
The conversation about AI’s environmental impact is complex and multifaceted. While the energy demands of AI—especially during the training phase—are significant, there are clear pathways to mitigating these effects.
AI can, and should, be a positive development for the world. We use power for other activities and devices that improve our lives, and AI could help us create efficiencies elsewhere. By adopting more efficient practices, leveraging renewable energy, and choosing smarter deployment strategies like RAG, both AI companies and businesses can ensure that the benefits of AI are truly “worth it.”
Acknowledging the problem is the first step toward sustainable innovation. As we continue to push the boundaries of what AI can do, let’s also commit to making its footprint as light as possible on our planet. Let’s not repeat the mistakes made with so many previous technological innovations. Virtually every work of futuristic fiction for decades has included some form of AI, and those works have been divided on whether AI would be good, evil, or neutral. We should all be pushing toward good.