Top AI Experts Create CodeCarbon, A Tool to Track and Reduce Computing’s CO2 Emissions

In a major validation of their commitment to responsible technology, Mila, BCG GAMMA, Haverford College, and Comet.ml today released CodeCarbon, an open source software package to estimate the location-dependent CO2 footprint of computing. AI can benefit society in many ways, but the amount of energy needed to support the massive computing behind it can come at a high cost to the environment.


Jointly developed by Mila, a world leader in AI research based in Montreal; GAMMA, BCG’s global data science and AI team; Haverford College in Pennsylvania; and Comet.ml, a leading MLOps solution provider, CodeCarbon is a lightweight software package that integrates seamlessly into a Python codebase. It estimates the amount of carbon dioxide (CO2) produced by the computing resources used to execute the code, giving developers an incentive to write more efficient code. It also advises developers on how they can reduce emissions by placing their cloud infrastructure in regions that use lower-carbon energy sources.
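Integration can be as light as wrapping a training run in a tracker, as in this minimal sketch (it assumes the package is installed via pip install codecarbon, and that the project_name parameter and placeholder train_model function stand in for your own code; exact parameter names may vary by version):

```python
from codecarbon import EmissionsTracker


def train_model():
    # Placeholder for your actual training code.
    pass


tracker = EmissionsTracker(project_name="my_experiment")  # hypothetical project name
tracker.start()
train_model()
emissions = tracker.stop()  # estimated kgCO2eq for the tracked code
print(f"Estimated emissions: {emissions} kgCO2eq")
```

The package also provides a decorator-based interface (track_emissions) for wrapping a single function, so developers can choose whichever style fits their codebase.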

Yoshua Bengio, Mila founder and Turing Award recipient, said of the software, “AI is a powerful technology and a force for good, but it’s important to be conscious of its growing environmental impact. The CodeCarbon project aims to do just that, and I hope that it will inspire the AI community to calculate, disclose, and reduce its carbon footprint.” Sylvain Duranton, a managing director and senior partner at Boston Consulting Group (BCG) and global head of BCG GAMMA, said, “If recent history is any indicator, the use of computing in general, and AI computing in particular, will continue to expand exponentially around the world. As this happens, CodeCarbon can help organizations make sure their collective carbon footprint increases as little as possible.”


Why Organizations Need This Tool Now

Training a powerful machine-learning algorithm can require running multiple computing machines for days or weeks. The fine-tuning required to improve an algorithm by searching through different parameters can be especially intensive. For recent state-of-the-art architectures like VGG, BERT, and GPT-3, which have millions or even billions of parameters and are trained on multiple GPUs (graphics processing units) for several weeks, this can mean a difference of hundreds of kilograms of CO₂eq.
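A back-of-the-envelope calculation shows how multi-week, multi-GPU training reaches hundreds of kilograms of CO₂eq. All figures below are illustrative assumptions, not measurements from any specific model or from CodeCarbon itself:

```python
# Rough energy and emissions estimate for a hypothetical training run.
NUM_GPUS = 8            # GPUs running in parallel (assumption)
GPU_POWER_KW = 0.3      # ~300 W draw per GPU under load (assumption)
TRAINING_DAYS = 14      # "several weeks" of training (assumption)
GRID_INTENSITY = 0.4    # kgCO2eq per kWh for a mid-range grid (assumption)

# Energy drawn = GPUs x power per GPU x hours of training.
energy_kwh = NUM_GPUS * GPU_POWER_KW * 24 * TRAINING_DAYS

# Emissions = energy drawn x carbon intensity of the local grid.
emissions_kg = energy_kwh * GRID_INTENSITY

print(f"Energy: {energy_kwh:.0f} kWh, emissions: {emissions_kg:.0f} kgCO2eq")
```

Under these assumptions the run draws roughly 800 kWh and emits on the order of 320 kgCO₂eq, consistent with the "hundreds of kilograms" scale described above.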

Helping Organizations Live Up to Their Carbon Promises

The tracker records the amount of power used by the underlying infrastructure, whether from major cloud providers or privately hosted on-premises data centers. Based on publicly available data sources, it estimates CO2 emissions by referring to the carbon intensity of the energy mix of the electric grid to which the hardware is connected. The tracker logs the estimated CO₂ equivalent produced by each experiment and stores the emissions across projects and at an organizational level. This gives developers greater visibility into the emissions generated by training their models, and a user-friendly dashboard makes those emissions tangible by showing equivalents in easily understood terms, such as automobile miles driven, hours of TV watched, and the daily energy consumption of an average US household.


