The Hidden Face of Artificial Intelligence: Pollution, Energy, and Silence

This article is sponsored by Phicus

The advancement of artificial intelligence seduces the world with promises of efficiency, automated creativity, and solutions to complex problems. But while public attention fixes on innovation, a heavy shadow grows on the sidelines: the carbon footprint of AI.

This technology, which seems intangible, leaves a tangible mark on the planet. And yet, few talk about it. The environmental cost of artificial intelligence deserves to be an urgent part of the public debate.

“The future is not what it used to be” — Paul Valéry

By: Gabriel E. Levy B.

When Paul Valéry wrote that sentence, he did not imagine a future controlled by algorithms and by servers that devour energy with insatiable hunger. Yet his reflection remains useful for understanding what is happening today with the development of artificial intelligence.

From its first experiments in university labs to its current massive expansion, AI evolved with a promise of automation and efficiency that, paradoxically, demands more and more natural resources.

For years, the focus was on technical advances, from deep learning to generative language models.

What was left out of the conversation was the environmental price involved in training and maintaining these systems. Researchers such as Kate Crawford, author of Atlas of AI, denounce this omission: “AI is not just a mathematical creation; it is a physical network, with material impacts on the environment, on human bodies and on the global economy.”

Training a single large-scale language model can emit more than 300 tons of carbon dioxide, according to a 2019 study led by Emma Strubell at the University of Massachusetts Amherst.

That figure is equivalent to the lifetime emissions of five cars, fuel included. As technology becomes more complex and ubiquitous, its energy cost also increases, dragging with it consequences we have not yet learned to measure in full.
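The "five cars" equivalence is simple arithmetic, and it can be checked against the approximate figures published by Strubell and colleagues: roughly 626,000 lbs of CO2-equivalent to train a large NLP model (using neural architecture search), against roughly 126,000 lbs for one average US car over its lifetime, fuel included. A minimal sketch, using those rounded numbers:

```python
# Rough check of the "five cars" equivalence cited above.
# Both figures are approximations from Strubell et al. (2019).
LBS_PER_METRIC_TON = 2204.62

model_lbs = 626_000          # training emissions, lbs CO2e (with architecture search)
car_lifetime_lbs = 126_000   # one average US car's lifetime emissions, lbs CO2e

print(f"Model training: ~{model_lbs / LBS_PER_METRIC_TON:.0f} metric tons CO2e")
print(f"Equivalent to ~{model_lbs / car_lifetime_lbs:.1f} car lifetimes")
```

In metric tons the training figure comes to roughly 284, which is why the headline number is sometimes quoted as "nearly 300 tons" rather than "more than 300"; the 300-plus figure corresponds to US short tons.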

“Artificial intelligence needs a lithium mine and a thermal power plant”

For a machine to think, infrastructure is needed. And that infrastructure is not virtual. Huge data centers, servers with constant cooling, power networks that keep millions of parameters active: all of that is part of the machinery that gives life to AI.

Companies such as Google, Microsoft and Amazon manage these complexes in strategic countries, seeking to combine energy efficiency with reduced costs, but even in their “greener” versions, these systems consume an exorbitant amount of resources.

Training models like GPT-3 or GPT-4, according to Hugging Face estimates, can involve energy consumption equivalent to that of 126 American homes for a year.
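The household comparison is likewise back-of-the-envelope arithmetic. As a sketch, assuming roughly 1,287 MWh to train a GPT-3-class model (the estimate published by Patterson et al. in 2021) and roughly 10,500 kWh per year for an average US home (a ballpark EIA figure; the exact home count depends on which year's household average is used):

```python
# Back-of-the-envelope home equivalence for GPT-3-scale training.
# ~1,287 MWh training estimate: Patterson et al. (2021).
# ~10,500 kWh/year per home: approximate EIA average.
training_kwh = 1_287_000        # GPT-3 training energy, kWh (estimate)
home_kwh_per_year = 10_500      # average US household, kWh/year (approx.)

homes = training_kwh / home_kwh_per_year
print(f"Equivalent to ~{homes:.0f} US homes for one year")
```

The result lands in the same range as the 126-home figure cited above.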

In addition, much of this energy still comes from non-renewable sources. Coal and gas continue to be invisible protagonists in the expansion of artificial “thinking”.

It’s not just a matter of electricity. The materials needed to make the chips, processors, and devices that power AI also leave their mark. Lithium, cobalt, nickel, rare earths: all come from regions already facing environmental and social stress from intensive extraction. As the Swedish researcher Anders Sandberg of the Future of Humanity Institute warns,

“AI is not in the cloud: it is in the soil, in the minerals, in the mines and in the cables that cross continents”.

The paradox is obvious: a technology designed to optimize processes, which could help mitigate climate change through predictions and analysis, becomes part of the problem because of its operating structure. And in that awkward silence, sustainability is relegated to footnotes.

“Training a model costs more than lighting a stadium”

Behind the glamorous veil of artificial intelligence operate energy processes that rival the most polluting sectors. Language, computer vision, or climate prediction models need to be trained in multiple stages. Each iteration demands more calculations, more servers, more cooling. And once trained, they do not stop: they continue to operate, store data, attend to queries, reproduce results, with a constant energy demand.

This dynamic translates into a carbon footprint that grows with the ambition of the model. If the goal is to create more complex, more accurate, and more “human” systems, then the energy required also scales. In the words of the French philosopher Éric Sadin, author of The Silicolonization of the World, “we have delegated intelligence to systems that consume more than the planet can sustain.”

The problem is compounded by the lack of clear regulations or global standards on the environmental impact of AI. Unlike other industries, the technology sector still operates under the illusion of being clean, intangible, neutral. This perception fuels inaction.

There are no carbon labels for artificial intelligence applications. There are no mandatory energy consumption declarations for developers. And in the meantime, the models are multiplying.

In the countries where data centers are hosted, the impact is also felt. Iceland, for example, saw a 45% increase in its electricity consumption following the massive installation of servers for data mining and AI. In the United States, the Environmental Protection Agency reported that data centers account for more than 2% of the country’s total electricity consumption, a figure that grows every year with no signs of slowing down.

“An algorithm can also pollute”

The cases are piling up. OpenAI, in launching ChatGPT, used Microsoft’s infrastructure, which operates one of the world’s largest data centers in Iowa.

That center, while boasting efficiency, consumes more water than a small city to keep its servers cool. Meta, meanwhile, has been criticized for its centers in New Mexico, where local communities report water shortages while servers remain in constant operation.

In China, Baidu trained its Ernie language model in coal-fired facilities, sparking protests among environmentalists who denounced the doublespeak: on the one hand, technological development; on the other, the worsening of climate change.

At the same time, Amazon Web Services faces questions in India and Brazil, where its expansion puts pressure on local ecosystems and vulnerable communities.

Even “green” projects are not exempt. Google, which tries to operate on renewable energy, acknowledged in its 2023 sustainability report that its emissions grew by 20% due to the training of new AI models. Even when energy comes from clean sources, the magnitude of consumption is still an issue.

This scenario poses an ethical and political dilemma: are we willing to sacrifice finite natural resources so that a machine can write a poem, draw an image or solve an equation? Who decides what kind of energy consumption is justifiable in the name of technological progress?

In conclusion, artificial intelligence is not innocuous. Its accelerated expansion entails environmental costs that we have not yet integrated into the public debate. Carbon footprint, water consumption, mineral extraction and energy expenditure defy the promise of “clean” technology. Recognizing these impacts is the first step towards a more sustainable model, where the development of AI is not a threat to the ecological balance of the planet.

References:

  • Crawford, K. (2021). Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Yale University Press.
  • Strubell, E., Ganesh, A., & McCallum, A. (2019). Energy and Policy Considerations for Deep Learning in NLP. arXiv:1906.02243.
  • Sadin, É. (2016). The Silicolonization of the World. Caja Negra Editora.
  • Sandberg, A. (2020). Future of Humanity Institute, University of Oxford.
  • Google Environmental Report 2023.
  • EPA Data Center Energy Usage Report, USA.