Is the carbon footprint of AI too big?

It’s no surprise that AI has a carbon footprint, which refers to the amount of greenhouse gases (primarily carbon dioxide and methane) that producing and consuming AI releases into the atmosphere. In fact, training AI models requires so much computing power that some researchers have argued the environmental costs outweigh the benefits. However, I believe they’ve not only underestimated the benefits of AI but also overlooked the many ways that model training is becoming more efficient.

Greenhouse gases are what economists call an “externality”: a cost, such as the adverse impact of global warming, that is borne by society at large but inflicted by private actors who have little incentive to refrain from the offending activity. Typically, public utilities emit these gases when they burn fossil fuels to generate the electricity that powers the data centers, server farms, and other computing platforms on which AI runs.

Consider the downstream carbon offsets realized by AI apps

During the past few years, AI has been unfairly stigmatized as a major contributor to global warming, owing to what some observers regard as the inordinate amount of energy consumed by model training.

Unfortunately, many AI industry observers contribute to this stigma by using an imbalanced formula to calculate AI’s overall carbon footprint. For example, MIT Technology Review published an article a year ago in which University of Massachusetts researchers reported that training a single machine learning model could emit nearly five times as much carbon dioxide as the average American car does over its lifetime.
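
For context, headline figures like this typically come from a simple energy-accounting formula: hardware power draw multiplied by training time, data-center overhead, and the carbon intensity of the local electricity grid. The sketch below illustrates that arithmetic only; the function name and every numeric input are assumptions for illustration, not figures from the UMass study.

```python
# Back-of-the-envelope estimate of training emissions: energy drawn by the
# hardware, scaled by data-center overhead (PUE) and the carbon intensity
# of the electricity grid. All numeric defaults are illustrative assumptions.

def training_co2_kg(
    gpu_count: int,
    avg_power_watts: float,          # average draw per accelerator
    hours: float,                    # wall-clock training time
    pue: float = 1.6,                # assumed power usage effectiveness
    grid_kg_per_kwh: float = 0.43,   # assumed grid intensity, kg CO2e per kWh
) -> float:
    """Estimate CO2-equivalent emissions (kg) for one training run."""
    energy_kwh = gpu_count * avg_power_watts * hours / 1000.0
    return energy_kwh * pue * grid_kg_per_kwh


if __name__ == "__main__":
    # Hypothetical run: 8 accelerators at 300 W for two weeks.
    kg = training_co2_kg(gpu_count=8, avg_power_watts=300.0, hours=14 * 24)
    print(f"Estimated emissions: {kg:,.0f} kg CO2e")
```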

This manner of calculating AI’s carbon footprint does the technology a huge disservice. At the risk of sounding pretentious, the discussion brings to mind Oscar Wilde’s remark that a cynic is someone who “knows the price of everything and the value of nothing.” I’m not taking issue with the UMass researchers’ finding on the carbon cost of AI training, or with the need to calculate and reduce that cost for this and other human activities. I simply wonder why the researchers didn’t also discuss the value that AI provides downstream, often indirectly, in reducing human-generated greenhouse gas emissions.

If an AI model delivers a steady stream of genuinely actionable inferences over an application’s life, it should generate beneficial, real-world outcomes. In practice, many AI apps help people and systems take more optimal actions across myriad scenarios. Many of these AI-driven benefits may be carbon-offsetting, such as reducing the need for people to get in their cars, take business trips, occupy expensive office space, and otherwise engage in activities that consume fossil fuels.
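
A more balanced formula would net those downstream offsets against the upstream training and serving costs over the application’s lifetime. The sketch below is a hypothetical illustration of that accounting; it is not a methodology from any cited study, and every name and figure in it is an assumption.

```python
# Hypothetical net-footprint accounting for an AI application over its life:
# upstream emissions (training plus inference) minus the downstream emissions
# it helps avoid (trips not driven, flights not taken, offices not heated).
# Every figure below is an assumed placeholder for illustration only.

from dataclasses import dataclass


@dataclass
class AppFootprint:
    training_kg: float             # one-time training emissions, kg CO2e
    inference_kg_per_year: float   # serving emissions per year, kg CO2e
    avoided_kg_per_year: float     # downstream emissions offset per year, kg CO2e
    lifetime_years: float

    def net_kg(self) -> float:
        upstream = self.training_kg + self.inference_kg_per_year * self.lifetime_years
        downstream = self.avoided_kg_per_year * self.lifetime_years
        return upstream - downstream  # negative means a net reduction


if __name__ == "__main__":
    # Assumed example: a routing/telework app that trims commuting emissions.
    app = AppFootprint(
        training_kg=280_000.0,
        inference_kg_per_year=20_000.0,
        avoided_kg_per_year=150_000.0,
        lifetime_years=5.0,
    )
    print(f"Net lifetime footprint: {app.net_kg():,.0f} kg CO2e")
```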
