How can AI and AI developers help reduce the energy usage of AI?

A N3XTCODER series

Implementing AI for Social Innovation

Welcome to the N3xtcoder series on Implementing AI for Social Innovation.

In this series we are looking at ways in which Artificial Intelligence can be used to benefit society and our planet - in particular the practical use of AI for Social Innovation projects.

In this instalment, we look at the energy usage of AI, its carbon footprint, and what can be done to optimise it.

We cover:

  • What can AI developers do? We look at how developers can reduce energy usage when programming AI, and also when setting up AI systems. 
  • What can AI do? We explore how AI applications and tools can also help reduce energy consumption of AI and other digital systems.

If you have missed it, read part 1:

In part 3, we will look at:

  • What can Governments and Regulators do? Finally, we look at what policymakers can do to help lower the energy consumption of AI.

Let’s dive in:

1. What can AI developers do?

If you develop or work with AI models, there are a number of ways you could significantly reduce your AI energy use.

Firstly, at the start of every project it’s important to consider your use case or use cases. Do you really need to use AI for a given task or tool? Perhaps a less energy-intensive solution will do. And if you do need AI, are there trained, specialised AI tools already available that can do the task? Simply keeping in mind at the planning stage that every use of AI has an energy cost may help you find energy savings, as well as streamline your work and reduce your costs.

If you plan on working with existing AI models, check whether there are existing carbon calculations of their lifecycle and the cost of inference.

Within every AI development process there are four clear areas where you can minimise energy use and carbon emissions: hardware, software, data management, and finally supporting the Green AI community.

Hardware

Most AI developers use cloud-based services and hardware for the bulk of their work, so your relationship with your cloud provider is crucial for making the best use of the most energy-efficient hardware and servers for your projects. Your setup should take into account:

  • Data centres powered with renewable energy:
    The most obvious way to reduce the footprint of AI. https://energydigital.com/top10/top-10-green-energy-data-centres 
  • Specialised hardware for AI:
    Tensor processing units (TPUs), graphics processing units (GPUs), and field-programmable gate arrays (FPGAs) are examples of specialised hardware for artificial intelligence. Compared to general-purpose CPUs, they are better suited to AI and machine learning workloads, which results in quicker processing and lower energy usage.
  • Edge and IoT devices:
    You can deploy AI models on edge devices or IoT devices whenever possible to reduce the need for data transmission and cloud-based processing, both of which can be energy-intensive (a minimal export sketch follows this list).
  • Matching hardware to specific tasks:
    Customise your hardware setups to meet the demands of different AI tasks. For instance, while faster processor cores may be needed for certain jobs, high memory bandwidth may be more advantageous for others.
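
To illustrate the edge deployment point above, here is a minimal sketch of exporting a small PyTorch model to ONNX so it can run locally under a lightweight runtime such as ONNX Runtime. The tiny model and file name are placeholders, and this assumes PyTorch is installed; it is a sketch of the technique, not a recommended production setup.

```python
# Minimal sketch: export a placeholder PyTorch model to ONNX for edge use.
# Running inference on-device with the exported file avoids repeated
# round-trips to energy-hungry cloud infrastructure.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 2)).eval()
dummy_input = torch.randn(1, 16)  # example input shape the model expects

torch.onnx.export(model, dummy_input, "tiny_model.onnx")
print("exported tiny_model.onnx")
```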

If you run your own hardware, you could look at using low-energy components. In the future, more experimental hardware approaches such as neuromorphic computing should become more widely available and affordable; currently this novel technology is only accessible to researchers and scientists. Neuromorphic chips mimic the structure of the human brain and are far more energy efficient than conventional chips, which separate memory and processing.

Software

For many AI developers, major energy savings can be made by using software frameworks and libraries that are specifically designed to minimise AI energy consumption. Implementing techniques such as optimised runtime scheduling, task parallelisation, and resource-aware programming can maximise software performance while minimising energy requirements. As always, optimisations such as these not only help the environment, but also result in more cost-effective and scalable AI solutions.
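
As one small illustration of task parallelisation, here is a hedged Python sketch that fans a placeholder preprocessing step out across worker processes. The documents and the preprocess() body are hypothetical, and the explicit worker cap is simply an example of resource-aware tuning rather than a recommended value.

```python
# A minimal sketch of task parallelisation: fan independent preprocessing
# tasks out across worker processes so cores are busy for less wall-clock
# time instead of idling through a sequential loop.
from concurrent.futures import ProcessPoolExecutor

def preprocess(text: str) -> int:
    # Placeholder step: normalise the text and count tokens.
    return len(text.lower().split())

def run(documents: list[str], max_workers: int = 4) -> list[int]:
    # Cap worker count explicitly: a small example of resource-aware tuning.
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(preprocess, documents))

if __name__ == "__main__":
    print(run(["Green AI", "Lean energy models", "Efficient inference"]))
```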

AI life cycle analysis

Monitoring the entire life cycle is a complex task, albeit a necessary one if we want to understand the whole picture of AI carbon emissions for a specific model. A paper by AI researchers Alexandra Sasha Luccioni, Sylvain Viguier and Anne-Laure Ligozat details a calculation method for measuring the carbon footprint of BLOOM, a 176 billion parameter language model; there is no universally accepted approach for assessing the environmental impacts of ML models. The researchers started their analysis at the stage of equipment manufacturing and followed it through model training to model deployment. This leaves out two emissions-producing steps that happen even earlier in the lifecycle: raw material extraction and materials manufacturing. Also left out is disposal/end of life, which has not yet occurred for this model, as it is still in deployment; a calculation of this last step would therefore only be theoretical right now, and a complete cradle-to-grave assessment is not feasible.
Nevertheless, the scope of this assessment is much more thorough than assessments that focus only on the training of AI models, which is the easiest part of the process to calculate. It is also worth mentioning that the calculation covers not only carbon dioxide but also other greenhouse gases, such as methane and nitrous oxide, leading to a figure in carbon dioxide equivalents (CO2eq). More AI models will surely be the subject of investigations like this in the near future and will be required to make their lifecycle energy consumption more transparent.
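
To make the CO2eq idea concrete, here is a tiny worked sketch that aggregates different gases using 100-year global warming potential (GWP100) factors. The GWP values are the commonly cited IPCC AR5 figures, and the emission amounts are made-up placeholders, not numbers from the BLOOM study.

```python
# Minimal sketch: converting mixed greenhouse-gas emissions into CO2
# equivalents (CO2eq) using 100-year global warming potentials (GWP100).
# GWP values are the commonly cited IPCC AR5 figures; the emission amounts
# below are placeholders, not figures from any real study.
GWP100 = {"co2": 1.0, "ch4": 28.0, "n2o": 265.0}

def co2eq(emissions_kg: dict[str, float]) -> float:
    """Return total emissions in kg CO2eq."""
    return sum(GWP100[gas] * kg for gas, kg in emissions_kg.items())

print(co2eq({"co2": 1000.0, "ch4": 2.0, "n2o": 0.5}))  # -> 1188.5 kg CO2eq
```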


Advanced energy saving techniques

One of the main contributors to AI energy consumption is the complexity of model architectures. Traditional AI models often require extensive computational power, leading to energy-intensive processes. AI developers should aim to build models that require less computational power and energy for training and operation. Methods for this include model pruning, quantisation, and transfer learning (see also the recommendations below).

All of these methods have shown promise in creating lean models that deliver comparable results with significantly lower energy consumption. A minimal pruning and quantisation sketch follows.
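
Below is a minimal, hedged sketch of two of these techniques in PyTorch (assumed to be installed). The tiny untrained model is a placeholder; in a real project you would prune and quantise a trained model and re-validate its accuracy afterwards.

```python
# A minimal sketch of pruning and dynamic quantisation with PyTorch.
# The tiny model is a placeholder; real projects would apply this to a
# trained model and check that accuracy is still acceptable.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

# 1. Pruning: zero out the 50% smallest weights of each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # make the pruning permanent

# 2. Dynamic quantisation: store Linear weights as 8-bit integers,
#    reducing memory traffic and typically energy per inference.
quantised = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

print(quantised(torch.randn(1, 128)).shape)  # torch.Size([1, 10])
```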

Data Management

There are methods to make data storage and processing more efficient. The carbon footprint of large data sets used in AI can be reduced by minimising unnecessary data redundancy and optimising data processing pipelines for lower energy consumption. Two key areas are:

  • Utilising distributed computing frameworks to distribute AI workloads across multiple machines. This approach enables parallel processing, speeding up training times and making more efficient use of resources (although it may increase the total amount of computation rather than reduce CO2 emissions).
  • Cloud Resource Management. Cloud providers offer tools and services for efficient resource allocation, autoscaling, and monitoring. Properly managing cloud resources helps prevent overprovisioning and waste.
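
Returning to the data redundancy point above, here is a minimal sketch of hash-based deduplication before training. The corpus and the normalisation rule are placeholders; large pipelines would typically do this in a streaming or distributed fashion.

```python
# Minimal sketch: drop duplicate training records so storage, transfer and
# training compute are not spent on redundant data. Hashing a normalised
# form of each record keeps memory use low for large corpora.
import hashlib

def deduplicate(samples):
    seen, unique = set(), []
    for text in samples:
        digest = hashlib.sha256(text.strip().lower().encode("utf-8")).hexdigest()
        if digest not in seen:          # keep only the first occurrence
            seen.add(digest)
            unique.append(text)
    return unique

corpus = ["Solar power", "solar power ", "Wind power"]
print(deduplicate(corpus))  # ['Solar power', 'Wind power']
```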

Supporting the green AI community

One of the most important pathways to make AI more energy efficient is by supporting other projects and networks that are also working on improving the energy usage and carbon footprint of AI. There are a number of avenues to do this:

  • Open-Source Collaboration
    Working with others in the open source AI community can help share knowledge and best practices in resource-efficient AI development. Collaborative efforts can lead to the creation of standardised frameworks and libraries that prioritise efficiency.

  • Algorithmic Efficiency Research
    You could also invest in research to improve the efficiency of AI algorithms. Continuous innovation in algorithmic design can lead to breakthroughs that require fewer resources without compromising performance.

  • Monitor your CO2 footprint and make it public
    There are many tools available that offer reliable assessments of your carbon footprint, for example: 

These tools require some time to set up but give you a credible overview of your compute and data carbon emissions. If you publish the results on your website and in your marketing materials, you set an example and encourage others to do the same.
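
As one hedged illustration of what such monitoring can look like in code, the sketch below uses the open-source CodeCarbon package. This is an assumption about your tooling rather than a tool the article specifically names, and the workload is a placeholder for a real training run.

```python
# Minimal sketch: estimate the emissions of a workload with the open-source
# CodeCarbon package (pip install codecarbon). The "training" loop is a
# placeholder; wrap your real training or inference code the same way.
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="demo-training-run")
tracker.start()
try:
    total = sum(i * i for i in range(10_000_000))   # placeholder workload
finally:
    emissions_kg = tracker.stop()                   # kg CO2eq estimate

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```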

2. What can AI do?

There are lots of practical examples of AI helping reduce energy usage. For example, Google reported that one of their DeepMind AI models has helped reduce the energy usage of cooling in data centres by up to 40%. They achieved this by using historical data from thousands of data centre sensors so their neural networks could predict server temperatures in the next hour. AI tools can be used to reduce energy usage in the following ways:

  1. Energy-aware scheduling (see the Google example above)
    Use AI algorithms for dynamic workload scheduling to optimise the utilisation of resources and reduce energy consumption in data centres and cloud infrastructure (a minimal sketch follows this list).
  2. Monitoring and feedback (see the Google example above)
    Implement AI-based monitoring systems to continuously track energy consumption and make real-time adjustments to optimise energy usage.
  3. Energy-aware training algorithms
    Researchers are actively exploring the development of energy-aware training algorithms. These algorithms dynamically adjust the training process based on the energy efficiency of the underlying hardware. By incorporating energy consumption metrics into the training process, AI systems can optimise their learning strategies, striking a balance between performance and energy efficiency.
  4. Dynamic Resource Allocation
    Implement dynamic resource allocation mechanisms to enable AI systems to adjust their computing resources based on the current workload. This ensures that computational power is utilised efficiently, leading to reduced energy consumption during periods of lower demand. Techniques like reinforcement learning can be employed to enable AI systems to learn and adapt their resource allocation strategies over time.
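
As a minimal sketch of the energy-aware scheduling idea from point 1, the snippet below delays a flexible batch job until grid carbon intensity falls below a threshold. The get_carbon_intensity() function is a hypothetical stub for a real grid-data API, and the threshold and polling values are purely illustrative.

```python
# Minimal sketch of energy-aware scheduling: run a deferrable job only when
# the grid's carbon intensity is low. get_carbon_intensity() is a
# hypothetical stub; a real system would query a grid operator or data feed.
import time

def get_carbon_intensity() -> float:
    """Hypothetical placeholder returning gCO2/kWh for the local grid."""
    return 250.0

def run_when_green(job, threshold=200.0, poll_seconds=1, max_polls=3):
    for _ in range(max_polls):
        if get_carbon_intensity() <= threshold:
            return job()                 # grid is green enough: run now
        time.sleep(poll_seconds)         # otherwise wait and re-check
    return job()                         # deadline reached: run anyway

run_when_green(lambda: print("starting deferred training batch"))
```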

Recommendations:

  1. Be aware of your CO2 footprint in your development work and take steps to measure it.
    Communicate your CO2 footprint.
  2. Be part of the green AI community and support other projects in it, make your own projects open source.
  3. Whether using AI or not, make sure your website and the third party services it uses are run on green energy. This is one of the easiest ways to have an impact.
  4. When developing your own AI models, weigh up the energy saving potential of optimal hardware against what you might have already in house.
  5. Energy-efficiency considerations during software development offer some of the biggest potential for optimisation, but these approaches require research. Bigger is not always better: methods like model pruning and quantisation are not overly complex and do have an impact. Leapfrog development by utilising transfer learning in new models.
  6. When writing AI code, program the AI to consider energy efficiency. 

Additional resources we read or looked at while writing this article:



Join us in the conversation on various social channels. We discuss the latest developments in technology as they happen!

THIS ARTICLE HAS BEEN REALISED WITH THE HELP OF
Bundesministerium für Wirtschaft und Klimaschutz
NextGenerationEU