🤖 Day 26: Investigate strategies to minimise the carbon footprint of AI in testing

As we progress through our 30 Days of AI in Testing Challenge, today's focus shifts towards an important yet often overlooked aspect of AI adoption: the environmental impact.

AI's rapid adoption has brought invaluable benefits to many industries. However, the training and deployment of AI models can have a significant environmental impact due to the energy consumption and carbon emissions associated with them. As responsible professionals, it's essential that we understand and mitigate AI's carbon footprint and adopt more sustainable practices.

Task Steps:

  • Research the Carbon Footprint of AI: Find resources that discuss the energy consumption and carbon emissions associated with training LLMs, running AI assistants, and storing huge amounts of data. Look into factors that contribute to the carbon footprint of AI, such as hardware requirements, data centre operations, and model optimisation methods.
  • Explore Reduction Strategies: Explore how to make AI in Testing more sustainable, such as energy-efficient hardware, green data centre practices, AI model optimisation techniques to reduce computational needs, as well as carbon offsetting programs to mitigate the carbon footprint of AI implementation.
  • Evaluate Applicability in Your Context: Identify areas where the strategies you've discovered could be implemented to reduce the carbon footprint of your AI testing activities. What are the potential challenges and benefits of adopting these strategies?
  • Share Your Findings: Reply to this post with a summary of the strategies you've identified for minimising AI's environmental impact. Discuss the feasibility and potential impact of implementing these strategies in your testing context. Where possible, share any useful resources you discovered during your research.

Why Take Part

  • Discover More Sustainable Solutions: Identify practical strategies and solutions for reducing the carbon footprint of AI in testing.
  • Contribute to Ethical AI Use: Your research and shared insights contribute to a larger discussion about responsible AI adoption.


4 Likes

Hello all,

The articles Sarah provided show how AI can use a lot of energy, which isn't great for the environment. To summarise:

  • LLMs require vast amounts of energy to train/run, leading to a significant carbon footprint. Hugging Face figured out a new way to measure how much energy these models use throughout their entire lifespan. They found that their own model, called BLOOM, was actually better for the environment than others because they used special computers powered by nuclear energy to train it, which doesn't create as much carbon dioxide.

  • The Forbes article proposes several suggestions for mitigating the environmental impact of AI, including improved carbon accounting, geographically optimized data storage, and following Google's "4M" best practices for reducing energy consumption in machine learning.

In conclusion, following the approaches above can help reduce AI's harm to the environment.

10 Likes

Hello @sarahk
While AI presents remarkable opportunities, its training and deployment process can significantly contribute to carbon emissions and energy consumption, posing environmental challenges that definitely require our attention.

To gain insights into AI's environmental impact, I explored reputable sources such as MIT Tech Review's article and Kate Crawford's Atlas of AI. These resources provided a comprehensive understanding of various factors influencing AI's environmental footprint, including:

  • Hardware requirements

  • Data Center operation

  • Model Optimization methods

Exploring Reduction Strategies:

  1. Energy-efficient hardware: Investing in energy-efficient hardware can significantly reduce power consumption during AI model training and deployment, leading to lower carbon emissions.

  2. Green data-center practices: Adopting green data center practices, like leveraging renewable energy sources and optimizing cooling systems, reduces the footprint of the infrastructure our models depend on.

  3. AI-model optimization: Employing techniques like model distillation can decrease energy consumption.

  4. Carbon offsetting programs: These allow organizations to balance the emissions from their AI activities by investing in projects that reduce or remove carbon.

Assessing the feasibility of implementing these strategies in our testing activities is crucial.

While challenges such as -

  • Initial investment costs

  • Technical compatibility

  • Organizational alignment

may arise, the potential benefits of adopting sustainable AI practices, such as reduced operational costs, improved brand reputation, and long-term environmental stewardship, make it worthwhile.

5 Likes

Hi, my fellow testers. For today's challenge I researched and read the following article: How much electricity do AI generators consume? - The Verge, on calculating the energy consumption of AI, which was a really interesting read.

Research the Carbon Footprint of AI:

The article has some estimates of the energy consumption of AI models but states that they are only estimates, as the companies behind them currently aren't sharing the data. Here are the figures the article presents:

  • Training a large language model like GPT-3, for example, is estimated to use just under 1,300 megawatt hours (MWh) of electricity; about as much power as consumed annually by 130 US homes

  • de Vries calculates that by 2027 the AI sector could consume between 85 to 134 terawatt hours each year. That's about the same as the annual energy demand of de Vries' home country, the Netherlands

  • The agency says current data center energy usage stands at around 460 terawatt hours in 2022 and could increase to between 620 and 1,050 TWh in 2026 - equivalent to the energy demands of Sweden or Germany, respectively
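
As a quick sanity check of those figures, here is a rough back-of-the-envelope sketch in Python. The per-household consumption of roughly 10.5 MWh/year is my own assumption (an EIA-style average), not a number from the article:

```python
# Back-of-the-envelope check of the figures quoted above.
GPT3_TRAINING_MWH = 1_300            # ~1,300 MWh to train a GPT-3-scale model (article's estimate)
US_HOME_MWH_PER_YEAR = 10.5          # assumed average annual US household consumption (EIA ballpark)

homes_equivalent = GPT3_TRAINING_MWH / US_HOME_MWH_PER_YEAR
print(f"GPT-3 training ~= {homes_equivalent:.0f} US homes for a year")  # ~124, close to the ~130 quoted

AI_SECTOR_TWH_LOW, AI_SECTOR_TWH_HIGH = 85, 134   # de Vries' 2027 projection
DATA_CENTRE_TWH_2022 = 460                        # IEA estimate for 2022
share_low = AI_SECTOR_TWH_LOW / DATA_CENTRE_TWH_2022
share_high = AI_SECTOR_TWH_HIGH / DATA_CENTRE_TWH_2022
print(f"AI alone could equal {share_low:.0%}-{share_high:.0%} of 2022's total data-centre demand")
```

Running the numbers like this makes the scale of the projections much easier to grasp.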

Explore Reduction Strategies:

  • On offsetting the energy cost, the article states:

  • that Microsoft claims AI "will be a powerful tool for advancing sustainability solutions," and emphasized that Microsoft was working to reach "sustainability goals of being carbon negative, water positive and zero waste by 2030."

  • Luccioni says that she'd like to see companies introduce energy star ratings for AI models, allowing consumers to compare energy efficiency the same way they might for appliances

  • For de Vries, our approach should be more fundamental: do we even need to use AI for particular tasks at all? "Because considering all the limitations AI has, it's probably not going to be the right solution in a lot of places, and we're going to be wasting a lot of time and resources figuring that out the hard way."

Evaluate Applicability in Your Context:

In my context, I think up front there would be a decision-making process about whether AI is needed as part of a workflow, as the best energy reduction is not to need to use it in the first place. Then, if we did decide that AI was appropriate, I would hope that we could make an informed decision on its energy impact. That would involve evaluating the energy efficiency of a model (if that's even possible), evaluating the offsetting claims of the company behind the model, or potentially looking into offsetting the energy use directly as a company ourselves.

Ultimately, at a higher level, it comes down to the companies creating these AI models being upfront about the costs involved and doing their own work to genuinely offset the energy cost. Then, at a company level, we can take the decisions I mentioned above. Finally, at an individual level it can be a simple question: do I need to use AI for this task?

7 Likes

Being a typical product of the Scandinavian social democratic "people's home" (folkhemmet) paradigm - the welfare society that, during the 1900s, also gave people with roots among the rural, poor 90% of Scandinavia the chance to study and to make a class journey - I have a somewhat different view on environmental issues than progressive people of more prosperous backgrounds.

I see the equation as maybe a little more complex than "strategies to minimise carbon footprint". I want to add "while at the same time striving for a society where the grass roots still enjoy a prosperous lifestyle".

And I've tried to work ChatGPT-3 a little in that direction, but it's a bit hard. When you ask ChatGPT for reasonable solutions to questions that are controversial at the moment (in Scandinavia, global warming IS a controversial issue, and the dialogue is currently worse than ever imho), the suggestions from my AI aren't overly impressive.

I mean, if a majority of the population cares more about driving the latest SUV than actually reducing carbon footprints… And I, born in a family where das Fressen was more important than die Moral, but now living in a community where family histories have allowed people to disregard the Fressen and to follow a green agenda with good reason, see the problem as political rather than physical.

8 Likes

Hi All

Researching the Carbon Footprint of AI

I researched the environmental impact of AI, particularly the energy consumption and carbon emissions associated with training large language models (LLMs), running AI assistants, and storing vast amounts of data²³⁵. I found that the AI industry is heading for an energy crisis, with large AI systems likely to need as much energy as entire nations within years². The carbon footprint of AI is projected to grow at a CAGR of nearly 44% globally through 2025³.

Exploring Reduction Strategies:

Next, I explored strategies to make AI in Testing more sustainable. This involved looking into energy-efficient hardware, green data centre practices, AI model optimisation techniques, and carbon offsetting programs.

  1. Energy-Efficient Hardware: I found that designing hardware specifically tailored to AI workloads can significantly improve energy efficiency⁷. For instance, IBM has developed prototype analog AI chips that are more energy-efficient for speech recognition and transcription⁶.

  2. Green Data Centre Practices: I discovered that green data centres are becoming the norm in enterprise data centre portfolios²⁴. They feature technology and design strategies that lessen negative impacts on the environment from data centre operations²⁵.

  3. AI Model Optimisation Techniques: I learned about various techniques for AI model optimisation, such as model distillation and post-training quantisation¹⁶. These techniques can balance accuracy and speed, unlocking the full potential of generative AI and LLMs without excessive energy consumption¹⁶ (a minimal quantisation sketch follows after this list).

  4. Carbon Offsetting Programs: I found that carbon offset programs allow individuals or companies to invest in environmental projects to balance their carbon footprint. These programs support projects that reduce, avoid, or remove emissions¹⁴.
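
To make the model optimisation point (3) a little more concrete, here is a minimal sketch of post-training dynamic quantisation with PyTorch. The toy model is purely illustrative; real savings would need to be measured on your own models:

```python
import torch
import torch.nn as nn

# Toy model standing in for a much larger network (illustrative only).
model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))

# Post-training dynamic quantisation: weights of Linear layers are stored
# as 8-bit integers instead of 32-bit floats, reducing memory and compute.
quantised = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Inference works the same way; the quantised model is smaller and cheaper to run.
x = torch.randn(1, 512)
print(model(x).shape, quantised(x).shape)
```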

Evaluating Applicability in the Context:

I identified areas where these strategies could be implemented to reduce the carbon footprint of AI testing activities. The potential challenges and benefits of adopting these strategies were considered. For instance, while energy-efficient hardware and AI model optimisation techniques can reduce energy consumption, they may require significant investment and expertise to implement. Green data centre practices and carbon offsetting programs, on the other hand, can be more straightforward to adopt but may have varying effectiveness depending on the specific context.

Findings:

In conclusion, reducing the carbon footprint of AI in testing involves a combination of strategies, including using energy-efficient hardware, adopting green data centre practices, optimising AI models, and participating in carbon offsetting programs. The feasibility and potential impact of these strategies will depend on the specific testing context. The resources I discovered during my research provide valuable insights into these strategies and their implementation. By adopting these strategies, we can contribute to a larger discussion about responsible AI adoption and help pave the way for a more sustainable future.

References:

(1) Generative AI's environmental costs are soaring - and mostly secret.
(2) The Carbon Footprint Of AI - Sustainable Software.
(3) AI has a large and growing carbon footprint, but there are potential solutions on the horizon.
(4) Energy Efficiency and AI Hardware - aha.stanford.edu. https://aha.stanford.edu/sites/g/files/sbiybj20066/files/media/file/aha-retreat-2023_dally_keynote_en_eff_ai_hw_0.pdf
(5) New analog AI chip design uses much less power for AI tasks | IBM Research Blog.
(6) What is a green data center and why are they attracting investment? | ITPro.
(7) Green data centers: towards a sustainable digital transformation - A practitioner's guide.
(8) What are the Techniques to Optimize AI Model Size and Performance? - e42.ai.
(9) The Best Carbon Offset Programs for 2024 - Investopedia.
(10) Carbon offsetting: reviewing the evidence - Creating a better place.
(11) A complete guide to carbon offsetting - The Guardian.
(12) Carbon Offsets | MIT Climate Portal.
(13) Carbon offsets and credits - Wikipedia.
(14) AI's carbon footprint is bigger than you think | MIT Technology Review.
(15) Reduce Carbon and Costs with the Power of AI - BCG. https://www.bcg.com/publications/2021/ai-to-reduce-carbon-emissions
(16) CNN Hardware Accelerator Architecture Design for Energy-Efficient AI | SpringerLink.
(17) AI hardware has an energy problem - Nature. https://www.nature.com/articles/s41928-023-01014-x.pdf
(18) How to Choose an Optimisation Algorithm. https://machinelearningmastery.com/tour-of-optimization-algorithms/
(19) Top 7 AI Improvement Recommendations & Techniques in 2024 - AIMultiple.
(20) Artificial Neural Networks Based Optimization Techniques: A Review - MDPI Electronics.
(21) 4 ways AI can super-charge sustainable development. https://www.weforum.org/agenda/2023/11/ai-sustainable-development/
(22) Sustainable AI: AI for sustainability and the sustainability of AI | AI and Ethics.
(23) The Role of AI in Sustainable Business Practices.
(24) Eco-Innovation: How AI is Shaping Sustainable Practices | PECB.
(25) Accelerating AI with Sustainability - A Playbook. https://blogs.microsoft.com/wp-content/uploads/prod/sites/5/2023/11/Microsoft_Accelerating-Sustainability-with-AI-A-Playbook-1.pdf
(26) Data Center Sustainability: 5 Steps to a Green Data Center | Enterprise Networking Planet.
(27) Green Data Centres: A Path Toward Sustainable Practices. https://www.iamrenew.com/sustainability/green-data-centers-a-path-toward-sustainable-practices/
(28) Developing Green Data Centres for a Sustainable Future - G2. https://www.g2.com/articles/green-data-centers-for-a-sustainable-future

Thank you

7 Likes

This is a head-spinning question.

The MIT Technology Review article points out that training a model uses much more resources than using it afterwards. Based on their data for BLOOM, training took as many resources as running it for a bit over 7 years straight. So the basic strategies are obvious: limit the number of models that have to be trained, share already trained models, use them for as long as possible, and always prefer using an already trained model to training a new one.
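
As a tiny illustration of that "reuse rather than retrain" strategy, here is a sketch of pulling an already fine-tuned model off the Hugging Face Hub instead of training one ourselves (the model name is just a common example, not a recommendation):

```python
from transformers import pipeline

# Reuse an existing fine-tuned model rather than training a new one:
# the large, one-off training cost has already been paid by someone else.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # example pretrained model
)

print(classifier("The checkout flow failed twice during the regression run."))
```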

This article also points out the difference between energy usage for training and the total cost of training, which would also include the costs of manufacturing equipment and the costs of infrastructure. This is where things start to become complex, because it's only fair to assume that some of these costs would be paid anyway - some of the data centers, servers, graphics cards, routers, electricity, etc. would be produced anyway; they would just be used for something else. So you would need to compare actual AI usage with some hypothetical other usage, including the real and predicted benefits of both. I don't even have a good idea where to begin.

We can't talk about minimising resource usage without mentioning the Jevons paradox - this is when developments in efficiency cause an increase in demand so large that eventually total usage is increased instead of reduced. This is what we observed with coal usage in the 19th century. It is not inevitable, and I don't think there are any hard rules that allow us to predict whether it's going to happen or not, but there is a risk that any efforts put into minimising AI's natural resource usage will actually only increase it.

Finally, I don't think the discussion should be limited to carbon footprint. Data centers are known to use large amounts of water, and big tech companies are securing tax benefits disproportionate to the number of jobs created. Sure, these data centers are not used for AI exclusively, and both of these were problems before the current surge in AI usage. But it can't be denied that AI is a contributing factor now.

8 Likes

I read the documentation and then asked Co-Pilot:
"As an automation engineer I want to reduce the carbon footprint of using AI in Automation Tests. Give me a strategy and practices to minimise this impact"

Reducing the carbon footprint in AI automation testing is a commendable goal. Here's a strategy and some practices to help minimize this impact:

  1. Optimize AI Models: Use efficient algorithms that require less computational power. Consider techniques like pruning, quantization, and knowledge distillation to reduce the size and complexity of your models.
  2. Energy-Efficient Hardware: Invest in modern servers and storage devices that are designed for energy efficiency. Look for hardware with a high performance-per-watt ratio.
  3. Green Data Centers: Utilize data centers powered by renewable energy sources. Additionally, implement environmentally responsible cooling methods to reduce energy consumption.
  4. Sustainable Development Practices: Adopt development practices that prioritize sustainability, such as using automated tools to optimize code for performance and energy consumption.
  5. Monitoring and Analytics: Regularly monitor the energy usage of your AI systems and use analytics to identify areas where improvements can be made.
  6. Reuse and Recycle: Where possible, reuse existing models and datasets instead of creating new ones from scratch. Also, recycle or repurpose old hardware responsibly.
  7. Stakeholder Engagement: Work with all stakeholders, including management and clients, to ensure they understand and support your sustainability goals.
  8. Continuous Learning: Stay informed about new technologies and methods that can help reduce the carbon footprint of AI systems.

By implementing these strategies and practices, you can help ensure that your work in automation testing contributes to a more sustainable future. Remember, every small step towards efficiency can lead to significant reductions in carbon emissions over time. :seedling:
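
Point 5 in Copilot's list (monitoring) is something we can start doing today. Here is a minimal sketch using the open-source codecarbon library to estimate the emissions of a run; the wrapped function is a hypothetical stand-in for whatever AI-assisted step you actually execute:

```python
from codecarbon import EmissionsTracker

def run_ai_assisted_test_generation():
    # Hypothetical placeholder for the AI-heavy step you want to measure.
    return sum(i * i for i in range(10_000_000))

tracker = EmissionsTracker(project_name="ai-test-generation")  # writes results to emissions.csv by default
tracker.start()
try:
    run_ai_assisted_test_generation()
finally:
    emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent for this run

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2e")
```

Even rough numbers like these give you a baseline to compare against after applying the other optimisation ideas.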

Working for a company that produces software in the Energy Trade and Risk sphere, I am proud to say our CEO pushes for "greener" and has introduced an audit of our carbon footprint.

Some of what is stated in the documentation and returned from Co-Pilot above sounds great, and in a perfect world we should strive for this, but I think we have to be realistic that business leaders will look at the balance sheets. That being said, we shouldn't bury our collective heads in the sand and hope it passes us by.

But, cost free, we can educate ourselves about what, how, and where we can improve our everyday practices.
As a small to medium sized software company, I would say we will use a third party for this. So in deciding on that company, and having a green-aware CEO, I would think the carbon footprint and practices of a potential partner would be high on the decision tree.

One thing I noticed that wasn't made obvious is time of day. I think a good practice would be to do all this heavy processing and learning at off-peak times. Remember, energy is a use-it-or-lose-it commodity.

6 Likes

1. Energy Consumption and Carbon Emissions:

  • Training LLMs (Large Language Models):
    • Training large AI models, such as LLMs, consumes significant energy and contributes to carbon emissions.
    • This energy usage primarily comes from computations on GPUs and TPUs.
    • The carbon footprint depends on the duration and intensity of training, as well as the energy sources of the data centers.
  • Running AI Assistants:
    • AI assistants, like chatbots and virtual assistants, require continuous processing power.
    • The carbon footprint here is associated with the servers that host these systems and the energy they consume to remain operational.
  • Storing Large Amounts of Data:
    • Data storage, especially for large datasets used in AI, contributes to the carbon footprint.
    • This includes both the energy used to store data and the cooling systems required to maintain server temperatures.

2. Factors Contributing to Carbon Footprint:

  • Hardware Requirements:
    • More powerful hardware (GPUs, TPUs) consumes more energy.
    • Using energy-efficient hardware can help reduce carbon emissions.
  • Data Center Operations:
    • Data centers consume a substantial amount of energy for cooling and running servers.
    • Green data center practices, such as using renewable energy sources, can mitigate this impact.
  • Model Optimization:
    • Optimizing AI models to be more efficient can reduce computational needs.
    • Techniques like pruning, quantization, and knowledge distillation can help in this regard.

Reduction Strategies:

1. Energy-Efficient Hardware:

  • Using energy-efficient processors and GPUs can significantly reduce energy consumption during AI training and operations.

2. Green Data Center Practices:

  • Opting for data centers powered by renewable energy sources, such as solar or wind, can greatly mitigate carbon emissions.

3. AI Model Optimization:

  • Employing techniques like model pruning, quantization, and distillation can reduce the computational requirements of AI models, thus lowering energy consumption (a small pruning sketch follows after this list).

4. Carbon Offsetting Programs:

  • Investing in carbon offsetting programs can help balance out the carbon emissions produced by AI activities.
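
As a small illustration of the optimisation point above, here is a sketch of magnitude-based pruning using PyTorch's built-in pruning utilities. The layer size and the 30% sparsity are arbitrary choices for the example:

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy layer standing in for part of a larger model (illustrative only).
layer = nn.Linear(256, 128)

# L1 unstructured pruning: zero out the 30% of weights with the smallest magnitude.
prune.l1_unstructured(layer, name="weight", amount=0.3)
prune.remove(layer, "weight")  # make the pruning permanent

sparsity = (layer.weight == 0).float().mean().item()
print(f"Weight sparsity after pruning: {sparsity:.0%}")  # ~30%
```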

Feasibility and Potential Impact:

In a testing context, implementing these strategies can have several benefits:

  • Feasibility:
    • Energy-efficient hardware: This can be feasible when upgrading or purchasing new hardware for testing.
    • Green data center practices: Choosing cloud providers that prioritize renewable energy can be a straightforward option.
    • AI model optimization: This requires expertise but can lead to long-term efficiency gains.
    • Carbon offsetting: Companies can participate in existing programs to offset their AI-related emissions.
  • Potential Impact:
    • Reduced energy costs: Energy-efficient practices can lead to cost savings.
    • Environmental benefits: Lowering carbon emissions contributes to sustainability goals.
    • Reputation and branding: Demonstrating a commitment to green practices can improve public perception.

Challenges:

  • Cost: Some energy-efficient hardware might have higher upfront costs.
  • Expertise: Implementing AI model optimization techniques requires specialized knowledge.
  • Data Center Contracts: Companies might already have contracts with data centers that do not prioritize renewable energy.

Resources:

  1. MIT Tech Review - AI's True Carbon Footprint
  2. Atlas of AI by Kate Crawford
  3. Forbes - Green Intelligence: Why Data And AI Must Become More Sustainable

By adopting these strategies, your AI testing activities can become more sustainable, aligning with environmental goals while also potentially reducing costs in the long run.

5 Likes

Thanks so much for this topic!!!

I found a helpful article on the carbon footprint of ML, "Large Language Models: AI's growing environmental cost" by Archana Vaidheeswaran, The Carbon Impact of Large Language Models: AI's Growing Environmental Cost. She draws from several sources, including the research paper "Power Hungry Processing: 'Watts' Driving the Cost of AI Deployment?" by Alexandra Sasha Luccioni, Yacine Jernite, and Emma Strubell. (I am delighted to find so much work by women scientists!) https://arxiv.org/pdf/2311.16863.pdf

Electric power may be generated by renewable sources, or by burning coal. Location makes a big difference. ML carbon footprint is influenced by hardware, training data, model architecture, training duration, and location of data centers.

One benchmark to think about: Generating 1,000 images with an advanced AI model equates to the carbon footprint of a car driven 4.1 miles. Now think of how many zillions of images are being generated every day…
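
A rough sketch of what that benchmark implies per image; the ~400 g CO2 per mile figure is my own assumption (roughly the average for a passenger car), not a number from the article, and the daily volume is invented just to show the scaling:

```python
# "1,000 images ~= a car driven 4.1 miles" - what does that mean per image?
MILES_PER_1000_IMAGES = 4.1
G_CO2_PER_MILE = 400            # assumed average passenger-car emissions per mile

g_per_image = MILES_PER_1000_IMAGES * G_CO2_PER_MILE / 1000
print(f"~{g_per_image:.1f} g CO2e per generated image")          # ~1.6 g

images_per_day = 10_000_000     # hypothetical daily volume, purely illustrative
tonnes_per_day = images_per_day * g_per_image / 1_000_000
print(f"At {images_per_day:,} images/day: ~{tonnes_per_day:.0f} tonnes CO2e per day")
```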

Hardware considerations

Newer hardware may be more efficient. Data centers also need electricity for lighting and cooling. Manufacturing and disposal also consumes energy.

Training Data

Huge training datasets elevate the carbon footprint. Need to optimize the size and complexity for sustainability.

Model Architecture

Simpler model architecture can mean significant energy savings. Optimizing the time taken to do the training also affects energy use hugely.

Geographic location

Geographic location of data centers also contributes. Areas using renewable sources have a much lower carbon footprint.

Takeaways

This article gives some case studies on GPT-4, Llama 2, and Stable Diffusion v1.

LLMs can drive innovation that could help reduce the carbon footprint in general, with new technology and so on. They can potentially help solve complex global problems. But it's urgent that we address the environmental impact.

Concerns about sustainability related to our software industry include more than AI and LLMs, of course. But the amount of resources used to train LLMs is simply mind-boggling.

I've learned a lot about sustainability in general from Jutta Eckstein, Ines Garcia and others who are leading the Agile Alliance sustainability initiative: Agile Sustainability Initiative | Agile Alliance

This is scary stuff, folks. Even doing small things, like getting rid of data stored in the cloud that you aren't using anymore, can help save the planet.

8 Likes

This topic is interesting, as it ties into a wider discussion I am having around Green IT and how testing can assist with this. AI, cryptocurrencies and large "cloud-based" infrastructures all have significant power requirements. If we look at IT through an ecological lens, it is not a pretty picture. However, we need IT and, as mentioned by others in posts above, software and particularly AI can also help provide solutions to minimise impact.
There are some groups now forming to look at how to make IT more sustainable, and there are specific focus groups looking at testing and what can be done. Some typical examples are:

How we as testers operate can make a difference.

6 Likes

Hi, everyone,

today's challenge provides an opportunity to discuss the negative environmental impacts of AI and the possibilities for reducing them. In the public sphere there is quite an active discussion about it, and a lot of information on this topic can be found, which shows its relevance :green_book: :earth_africa:

It was useful and interesting to look at the environmental impact of AI. On the one hand, it helps in the fight against climate change, but at the same time AI use also has a negative impact on the environment.

Carbon Footprint of AI:

Large-scale data collection

AI provides a way to make sense of massive amounts of data, but the current state-of-the-art requires a massive amount of data for training & validation. The more weights a model has, the more data it needs.

Large energy resources

ML systems learn to perform a specific task by observing lots of examples. This requires staggering amounts of energy (data & compute resources) to perform pattern matching & superhuman statistical analysis. Many experimentation paths are dead-ends with a corresponding carbon footprint.

Training and running AI models needs large amounts of energy. This increase in energy use directly affects greenhouse gas emissions, aggravating climate change. It was estimated that training can produce about 626,000 pounds of carbon dioxide, or the equivalent of around 300 round-trip flights between New York and San Francisco - nearly 5 times the lifetime emissions of the average car.
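
To put that 626,000-pound figure into more familiar units, here is a quick conversion sketch; the per-trip value is simply implied by the comparison itself rather than taken from a separate source:

```python
# Convert the widely quoted training-emissions estimate into metric tonnes.
POUNDS_CO2 = 626_000
KG_PER_POUND = 0.4536

tonnes = POUNDS_CO2 * KG_PER_POUND / 1000
print(f"~{tonnes:.0f} tonnes CO2e")                    # ~284 tonnes

round_trips = 300                                      # NY-SF round trips in the comparison
print(f"~{tonnes / round_trips:.1f} tonnes per round trip per passenger")  # ~0.9 t, a plausible figure
```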

Natural resources

Generative AI systems need enormous amounts of fresh water to cool their processors and generate electricity.

Reduction Strategies:

Few-shot learning (a minimal prompt sketch follows after this list)
Less than zero-shot learning
E-waste management
Recycling of AI-related electronic waste
More stringent laws and ethical disposal practices
Transparency, like regular environmental audits, reports, and accountability
Practical industry framework and guidelines
Build more efficient AI models
Develop energy-efficient hardware, algorithms, and data centers
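
To illustrate the few-shot idea from the list above: instead of fine-tuning (and therefore retraining) a model for a narrow testing task, you adapt a general model by putting a handful of examples in the prompt. A minimal, model-agnostic sketch; the task and examples are invented for illustration:

```python
# Few-shot prompting: adapt a general-purpose model with examples in the prompt
# instead of paying the energy cost of fine-tuning a new model.
examples = [
    ("Login fails with valid credentials", "defect"),
    ("Add dark mode to settings page", "enhancement"),
]

new_item = "Checkout button unresponsive on mobile"

prompt = "Classify each item as 'defect' or 'enhancement'.\n\n"
for text, label in examples:
    prompt += f"Item: {text}\nLabel: {label}\n\n"
prompt += f"Item: {new_item}\nLabel:"

print(prompt)  # send this to whichever LLM you already have access to
```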

Resources:

AI's Carbon Footprint Problem (stanford.edu)

The Carbon Footprint Of AI - Sustainable Software (microsoft.com)

The carbon impact of artificial intelligence | Nature Machine Intelligence

The Real Environmental Impact of AI | Earth.Org

The environmental impact of the AI revolution is starting to come into focus - The Verge

5 Likes

Thanks for these links! It is heartening to see people are actively thinking about this and taking concrete steps to sustainability.

1 Like

LLMs require a large amount of resources - leaving a big carbon footprint - during training and while running the AI.
Fossil fuels are limited and cause the most pollution: while countries like France continue to count on nuclear energy, other countries like Germany, which are concerned about the safety of nuclear reactors, want to switch to all-renewable energy. But are wind and solar energy enough to power all the AI tools in the world?
LLM tools may sound like humans but they do not care about our environment:
You
Are you - ChatGPT - sustainable?
ChatGPT
As an AI language model, I don't have physical sustainability concerns like those associated with environmental impact or resource consumption.

2 Likes

Day 26

The Carbon Footprint of AI

This always reminds me of the paperclip game, where you end up consuming all resources in the universe to make paperclips. When we create a new technology (like Bitcoin mining, for example), we end up firing up old coal power stations to feed the hunger. I assume generative AI will be the same: as models become more energy efficient, we will just ask them to do more and more. Perhaps a large language model will eventually come up with a plan to address climate change that everyone will be on board with.

I'll have a look at this today I think, and try to list some strategies for reducing that footprint:

  • Interesting that they compare training a model to the lifetime output of five cars. We could do with a lot fewer cars, to be fair, before worrying about AI's carbon footprint.
  • I like some of the ideas about where to do your processing, looking for areas that have a lot of hydroelectric power generation for example.
  • I'm wary of calculators provided by cloud providers - if they are as accurate as the cost calculators, then we are in environmental trouble. However, they might serve as a starting point.
  • The 4M best practices are a good guide to bring the other points in the article together, although some who are large enough are returning to on prem, as the cloud cost gets higher and higher.

In terms of AI in testing, it would be lovely to have an AI capability which:

  • Powers down unused infrastructure when it's not in use (see the sketch after this list).
  • Limits test environments to specific purposes (or types of testing), rather than spinning them up because the previous test strategy said so.
  • Runs tests when it makes sense to do so, overnight for longer test runs for example.
  • Helps to filter our automation for repetition and needless tests that haven't failed for eons.
  • Models which ask for better prompts, instead of just responding. :slight_smile:
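
The first point doesn't even need AI to get started. Here is a minimal sketch of powering down idle test environments on a schedule, assuming AWS EC2 and an invented Purpose=test-env tag (adapt the region and tagging convention to your own setup):

```python
import boto3

# Stop running EC2 instances tagged as test environments, e.g. from a nightly scheduled job.
# The region and the Purpose=test-env tag are assumptions for this example.
ec2 = boto3.client("ec2", region_name="eu-west-1")

response = ec2.describe_instances(
    Filters=[
        {"Name": "tag:Purpose", "Values": ["test-env"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)

instance_ids = [
    instance["InstanceId"]
    for reservation in response["Reservations"]
    for instance in reservation["Instances"]
]

if instance_ids:
    ec2.stop_instances(InstanceIds=instance_ids)
    print(f"Stopped {len(instance_ids)} idle test instances")
```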
3 Likes

I went to a talk by a chap who used load testing to give more accurate cloud spend estimations. The calculators provided by big cloud providers were out by orders of magnitude!

It's a good shout to think about how testing can contribute in this regard.

Hi There

A lot of innovation is happening in the AI world to reduce its carbon footprint.

Tech giants are working towards a greener environment for AI and, hopefully, a better tomorrow in terms of energy consumption :crossed_fingers:

Thanks
Vishnu

2 Likes

Hello @sarahk and fellow learners!

Thanks for this challenge. It opened some new areas of reality for me. I wish everyone could be made to learn about these aspects of AI.

Here is a summary mindmap with all my learnings done as part of this task:

I have also done a video blog explaining the dark reality and hidden sides of using AI. Check it out and feel free to share it with your network as we need to educate more and more people about this:

Do share your thoughts & feedback!

Thanks,
Rahul

2 Likes

My Day 26 Task

About Research the Carbon Footprint of AI

I quickly read the article We're getting a better idea of AI's true carbon footprint, recommended in the challenge assignment.

  • Outline of the article:

Hugging Face estimates AI modelā€™s carbon footprint

  • :bar_chart: Emissions Estimate: BLOOM's training led to 25 metric tons of CO2 emissions.

  • :earth_africa: Real-world Impact: AI modelsā€™ environmental impact needs further understanding.

  • Call to Action: Encouraging more efficient AI research and development.

  • Summary of the article

The article from MIT Technology Review discusses Hugging Face's initiative to more accurately calculate the carbon footprint of large language models (LLMs) by considering their entire lifecycle, not just the training phase. Hugging Face applied this methodology to its own model, BLOOM, finding its carbon emissions were significantly lower compared to other LLMs, partly due to using nuclear energy for training. The research emphasizes the importance of considering the broader impact of AI on the environment, including hardware manufacture and operational emissions, and suggests shifting towards more efficient research practices to reduce carbon footprints.

About Explore Reduction Strategies

I quickly read the article Why Data And AI Must Become More Sustainable, recommended in the challenge assignment.

  • Outline of the article:

Data and AIā€™s environmental impact is a growing concern

  • :exclamation: Problem: Data and AIā€™s carbon footprint is a major concern.

  • :bulb: Suggestions: Tools and best practices to mitigate environmental impact.

  • :bar_chart: Impact: AIā€™s energy consumption threatens climate change progress.

  • Summary of the article

The article "Green Intelligence: Why Data And AI Must Become More Sustainable" emphasizes the environmental impact of data and AI technologies, highlighting the growing concerns about their carbon footprint and energy consumption. It discusses the exponential growth in data and AI deployment, exacerbated by the COVID-19 pandemic, leading to significant energy demands and environmental costs. The piece stresses the need for enterprises to address the contribution of data storage and AI to greenhouse gas emissions. To tackle AI's sustainability impact, the article suggests measures like improving carbon accounting, estimating carbon footprints of AI models, optimizing data storage locations, increasing transparency, and following energy-efficient practices like Google's "4M" best practices. It also calls for a shift towards new AI paradigms that prioritize environmental sustainability to combat climate change effectively. The article underscores the importance of reforming AI research agendas and enhancing transparency to mitigate the environmental impact of AI and data technologies.

About Share Your Findings

I think the development of AI definitely has a large impact on the environment, and there is a carbon footprint associated with both the research and use of AI.

The Carbon Footprint of AI

The development and application of AI technology are rapidly expanding on a global scale. With the continuous advancement of AI technology, especially the emergence of large language models (LLMs) and other complex AI systems, there is growing concern about their energy consumption and carbon emissions. The training and operation of these systems require substantial computational resources, which directly relate to energy consumption and associated carbon emissions.

Discussion and Training of LLMs

The training of AI systems like large language models typically requires significant electrical power and hardware resources. The consumption of these resources comes not only from the energy required for computation itself but also from the cooling systems needed to keep the hardware running normally. Researchers are exploring ways to reduce energy consumption in these processes, such as by optimizing algorithms, using more efficient hardware, and improving data center energy management.

Operating AI Assistants

The operation of AI assistants also requires energy support, especially in cloud services. These services typically rely on data centers, whose energy consumption and carbon emissions are an integral part of studying the carbon footprint of AI. To reduce these impacts, data centers are adopting renewable energy sources and more efficient cooling technologies.

Storing Large Amounts of Data

Data storage is also an essential component of the carbon footprint of AI systems. As the volume of data increases, the energy required to store and back up this data is also rising. Researchers are looking for more efficient data storage solutions and ways to reduce energy consumption by minimizing data redundancy and optimizing storage structures.

Influencing Factors
  • Hardware Requirements: The hardware requirements of AI systems directly affect their energy consumption. Using more energy-efficient processors and storage devices can reduce the carbon footprint.

  • Data Center Operations: The energy efficiency of data centers, the proportion of renewable energy used, and the efficiency of cooling systems are all critical factors affecting the carbon footprint of AI.

  • Model Optimization Methods: By optimizing AI models, such as reducing the number of parameters and using techniques like knowledge distillation, it is possible to maintain performance while reducing energy consumption during training and inference processes.
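
As a sketch of the knowledge distillation idea mentioned in that last point: a small "student" model is trained to match a large "teacher" model's softened outputs, so the cheaper model can be used at inference time. A minimal PyTorch loss function, with the temperature and weighting chosen arbitrarily for illustration:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    """Blend the usual supervised loss with a term matching the teacher's soft predictions."""
    soft_targets = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    hard_targets = F.cross_entropy(student_logits, labels)
    return alpha * soft_targets + (1 - alpha) * hard_targets

# Toy example: 4 samples, 10 classes.
student = torch.randn(4, 10)
teacher = torch.randn(4, 10)
labels = torch.randint(0, 10, (4,))
print(distillation_loss(student, teacher, labels))
```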

Conclusion

The carbon footprint of artificial intelligence is a multidimensional issue that needs to be considered comprehensively from various aspects, including hardware, data center operations, data storage, and model optimization. As awareness of the environmental impact of AI technology grows, future developments will place greater emphasis on sustainability and efficiency to reduce the negative impact on the environment.

2 Likes

Greetings Everyone!
This is an exciting yet serious topic which needs much more attention than it is given; in my opinion, development should not come at the cost of nature.
After reading this blog: We're getting a better idea of AI's true carbon footprint | MIT Technology Review
Summary:
1. A company named 'Hugging Face' tried estimating the overall emissions for its large language model, BLOOM, and tracked the carbon dioxide emissions BLOOM was producing in real time over 18 days.
2. The choices we make about which AI algorithms to use, and how we use them, need to be thought through before actually using them.
To reduce AI's harm to the environment:

  1. Measure the carbon footprint of the algorithms in use accurately.
  2. Follow Google's '4M' best practices (Model, Machine, Mechanization, Map).

Recommended reading:

  1. The Real Environmental Impact of AI | Earth.Org
  2. AI's Climate Impact Goes beyond Its Emissions | Scientific American
  3. https://www.gartner.com/en/articles/keep-ai-from-doing-more-climate-harm-than-good
  4. AI and the Environment: How Artificial Intelligence is Helping to Save the Planet

I urge everyone to read these blogs once and to try to use AI thoughtfully.

Thanks!

1 Like