This blog, written by our Technical Lead, Kyle Bouwknecht, breaks down the environmental impacts of AI and equips you with a starter pack for knowing when and how to use AI responsibly.

 

Key Takeaways

  • Not all AI tools are equal. Large, general-purpose models use far more energy and water than smaller or specialized tools built for specific tasks.
  • Image and video generation carry a much higher footprint. AI-generated visuals, especially video, can consume orders of magnitude more energy than text-based queries.
  • Efficiency gains aren’t guaranteed. Newer or more advanced AI models are not always more efficient. Higher performance often comes with higher resource use.
  • AI infrastructure affects real communities. Data centres place significant strain on electricity grids and freshwater supplies, with disproportionate impacts on nearby populations.
  • Intentional use makes a difference. Choosing the right tool, avoiding unnecessary AI use, and opting for smaller or local models can significantly reduce environmental impact.
Responsible AI Use at a Glance
  • Choose the right tool: Use specialized tools (e.g., Grammarly for writing, Semantic Scholar for research) instead of large general-purpose models.
  • Think smaller: When you need an LLM, opt for lightweight or local models like SmolLM WebGPU.
  • Don’t default to AI: If a web search or traditional tool works, AI may be unnecessary.
  • Use AI to support expertise: AI works best when it enhances existing knowledge.
  • Watch for passive AI use: Many search engines enable AI by default. Turn it off or switch to classic search options.

 

AI: The Elephant in the Room with a Big Footprint

Over the past few years, the technology sector has seen the rise of artificial intelligence (AI). Maybe you’ve actively chosen to use AI by asking ChatGPT a question or by having Meta AI create a funny picture. Or maybe it’s been added to systems or services you use every day, resulting in passive use. Either way, AI has become ubiquitous. Every major corporation is scrambling to level up its services with AI tools. Have a question to ask your bank? Use the AI chatbot. Need help making that email sound a little less intense? Ask Gemini. Need to put a cowboy hat on your head in that picture of you from your office’s Western party? Enter Adobe Firefly.
If you can think of a digital task, someone has come up with a way for AI to do it. This creates so much buzz and excitement that it becomes easy to get swept up in AI use, and before you know it, you’re consistently using a suite of AI tools to help with things you did without AI not too long ago.
So, the question becomes: what’s the cost, and what can we do about it?

 

Understanding AI

AI is a broad term that covers several different technologies. A conversational chatbot assistant, for example, is vastly different from a generative AI tool that makes videos. Different types of AI tools come with different considerations, but generative AI – particularly large language models (LLMs) and multimodal models like ChatGPT, Copilot, and Gemini – is the most widely used, so that is what we are focusing on in this post.
More recently, we’ve seen the introduction of ‘reasoning’ models, which perform extensive background processing before delivering a response. This extra work can consume up to 700 times more energy than a non-reasoning model would for the same query. Many providers of large LLMs, including OpenAI with ChatGPT, are looking to add reasoning to their products, which would significantly increase energy use and emissions intensity.
There is also AI image and video generation, which, if you’ve been anywhere on social media, seems to be taking over our feeds, even LinkedIn. According to Forbes, 71% of images on social media are now AI-generated, raising a host of ethical questions related to creative IP.
Impacts from AI can be generally categorized into two types: direct impacts and “higher-order” (indirect) impacts. Higher-order impacts are difficult to track and still poorly understood, so in this post we’re looking at the direct impacts, specifically energy and water, and their social implications.

 

Energy Use

You have probably heard or read about how AI is energy hungry – but just how hungry is it? Well, it depends. AI models are trained on massive amounts of data to tune their parameters, the internal values that enable them to generate a response to a request. The number of parameters loosely correlates with a model’s capacity to handle complex requests (though this assertion is being challenged) and with the energy it consumes. Most of the major AI companies, like OpenAI (ChatGPT), Google (Gemini), and Anthropic (Claude), do not provide any information on the number of parameters their models have or how much energy they consume; however, research suggests that some of these models likely have in excess of 1 trillion parameters. By comparison, an open-source model (SmolLM 1.7B) from HuggingFace has 1.7 billion parameters, at least 588 times fewer. To compare energy use across different models, we ran the numbers on energy consumption for one query using data from MIT and HuggingFace for four different models, and this is what we found:
Estimated energy consumption for a single query for four AI models. The number of parameters for GPT-4 is estimated to be one trillion

[Bar graph showing the four AI models and their energy consumption per query]

So, what do the numbers mean? The values in the graph are given in watt-hours (Wh), a unit of energy equal to one watt of power expended over one hour. A typical modern LED light bulb is rated at 6 watts, so leaving it on for one hour would consume 6 Wh of energy. Your electricity provider likely bills you per kilowatt-hour (kWh); one kWh is equivalent to one thousand Wh.

With that in mind, we can see that an average GPT-4 query consumes nearly half the energy of an LED light bulb used for an hour. This might not seem like much, but consider that ChatGPT alone has surpassed 2.5 billion queries per day. If all those queries were text only, that’s over 2.5 billion kWh a year. If ChatGPT were a country, it would be the world’s 147th-largest electricity consumer – and ChatGPT is just one of thousands of generative AI models out there, with more being released every day. Conversely, if those same 2.5 billion daily queries were performed on the SmolLM 1.7B model, it would use about 6.9 million kWh a year – about 370 times less than ChatGPT.
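For readers who want to check the arithmetic, here is a quick back-of-envelope sketch in Python. The per-query figures are our estimates derived from the chart above (roughly 2.83 Wh for a GPT-4 query and 0.0076 Wh for SmolLM 1.7B), not official numbers published by the providers:

```python
# Back-of-envelope check of the annual figures above.
# Assumed inputs: ~2.5 billion ChatGPT queries per day, and
# estimated per-query energy from the chart (not official data).
QUERIES_PER_DAY = 2.5e9
WH_PER_GPT4_QUERY = 2.83      # estimated for GPT-4
WH_PER_SMOLLM_QUERY = 0.0076  # estimated for SmolLM 1.7B

def annual_kwh(wh_per_query: float, queries_per_day: float = QUERIES_PER_DAY) -> float:
    """Convert per-query watt-hours into annual kilowatt-hours."""
    return wh_per_query * queries_per_day * 365 / 1000

gpt4 = annual_kwh(WH_PER_GPT4_QUERY)     # ~2.6 billion kWh/year
smol = annual_kwh(WH_PER_SMOLLM_QUERY)   # ~6.9 million kWh/year
print(f"GPT-4:  {gpt4:.2e} kWh/yr")
print(f"SmolLM: {smol:.2e} kWh/yr ({gpt4 / smol:.0f}x less)")
```

The ratio of roughly 370x comes straight out of the per-query estimates, since the query volume cancels.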
Then there’s the energy input for image and video generation. These energy requirements vary widely, according to the International Energy Agency (IEA). Image generation uses, on average, about 1.7 Wh per image – roughly half the energy of a text query on the models we discussed above. Video, on the other hand, requires generating a set of individual images that are stitched together. A typical video frame rate is 24 frames per second, so generating a 10-second video (about 240 images) would use 408 Wh of electricity – that’s 144 times more energy than a text query using GPT-4. Images, videos, and chat queries all contribute to the rapidly growing need for more data centres, a need which big AI companies are scrambling to fill regardless of environmental or social impacts. As image and video models evolve, pursuing higher detail and greater realism, these numbers will grow. This means energy consumption and resulting emissions vary depending on how you use AI and the models you choose.
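The video math works the same way. A small sketch, using the IEA’s average of 1.7 Wh per image and a 24 fps frame rate as the assumed inputs:

```python
# Rough arithmetic behind the video-generation figures.
# Assumed inputs: ~1.7 Wh per generated image (IEA average),
# 24 frames per second, ~2.83 Wh per GPT-4 text query (estimated).
WH_PER_IMAGE = 1.7
FPS = 24
WH_PER_GPT4_QUERY = 2.83

def video_wh(seconds: float) -> float:
    """Energy to generate a video as a stack of per-frame images."""
    return seconds * FPS * WH_PER_IMAGE

ten_second_clip = video_wh(10)  # 240 frames -> ~408 Wh
print(f"{ten_second_clip:.0f} Wh per 10-second video")
print(f"~{ten_second_clip / WH_PER_GPT4_QUERY:.0f}x a GPT-4 text query")
```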

 

Water Use

These data centres also consume massive amounts of water. The vast majority of data centres use water cooling to keep server racks at an operable temperature, with some using up to 19 million litres of water per day (equivalent to a town of up to 50,000 people). Due to how these systems operate, they must use fresh water, which puts stress on an already extremely limited resource: only 0.5% of water globally is fit for human consumption. Estimates suggest that data centres for generative AI models can use four to nine litres of water per kWh of compute energy. Thinking back to the ChatGPT estimate above and using an average value of 6.5 litres per kWh, 2.5 billion daily queries would “drink” about 46 million litres of water. Over the course of a year, that’s like draining 6,700 Olympic swimming pools and is roughly equivalent to the water consumed in a year by 206,000 Canadians. And again, ChatGPT is just one of thousands of models.
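Here is how those water figures fall out of the energy numbers, assuming the mid-range value of 6.5 litres per kWh and a nominal 2.5-million-litre Olympic pool:

```python
# Back-of-envelope water estimate (assumptions from the text above:
# 2.5 billion queries/day at ~2.83 Wh each, 6.5 L of fresh water per
# kWh of compute, and a nominal 2.5-million-litre Olympic pool).
queries_per_day = 2.5e9
wh_per_query = 2.83
litres_per_kwh = 6.5
olympic_pool_litres = 2.5e6

daily_kwh = queries_per_day * wh_per_query / 1000
daily_litres = daily_kwh * litres_per_kwh                # ~46 million L/day
annual_pools = daily_litres * 365 / olympic_pool_litres  # ~6,700 pools/year
print(f"{daily_litres:.2e} L/day, about {annual_pools:.0f} Olympic pools/year")
```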

 

The Emissions of AI

Large models require significant energy, and all this electricity consumption results in greenhouse gas emissions. These emissions depend on the source of the electricity generation – renewable energy, such as solar, wind, or hydroelectric, is the best option, but these sources often cannot be deployed at the rate at which data centres are being built, and in the case of solar and wind, they struggle to match the constant power demand of data centres. Based on 2024 data centre energy consumption and global average electricity emissions intensity, data centres produced nearly 205 million tonnes of CO2 emissions. That’s almost as much as the entire country of Ukraine. As useful as these tools are, we cannot ignore their significant energy demand and the resulting emissions.
It’s worth considering how generative AI compares with streaming video, since nearly everyone uses Netflix, Apple TV, YouTube, or one of the dozens of other platforms. A frequent talking point is that using generative AI is a “drop in the bucket” compared to streaming video, but, when looking at the numbers, this argument doesn’t hold up. A report from The Carbon Trust demonstrated that, on average, an hour of streaming video consumes about 21 Wh of electricity before it reaches your device. Streaming video does require more energy from your devices (about 89% of all streaming video energy consumption), but this impact isn’t included in AI energy estimates, so we’ve excluded it to allow for an apples-to-apples comparison. To put this into context, every 10-second video generated using Sora AI uses roughly the same amount of energy as streaming 45.5 hours of online video (when including the user device for streaming, generating a 10-second AI video still uses over 5.5 times more energy than an hour of streaming). In addition, streaming video energy consumption occurs almost exclusively in your home, and doesn’t have the same disproportionate impact on the communities that host the AI data centres.
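To see where the 45.5-hour figure comes from, divide the estimated energy of one AI-generated clip by the Carbon Trust’s per-hour streaming figure. Both inputs below are estimates taken from the sources cited above:

```python
# Comparing one AI-generated video with streaming video.
# Assumed inputs: ~955 Wh per 10-second Sora clip (the cited post
# puts it near 1 kWh), ~21 Wh per streamed hour upstream of the
# viewer's device (Carbon Trust average).
SORA_CLIP_WH = 955
STREAM_WH_PER_HOUR = 21

equivalent_hours = SORA_CLIP_WH / STREAM_WH_PER_HOUR
print(f"One 10-second clip ~= {equivalent_hours:.1f} hours of streaming")
```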

 

Social Implications of Energy and Water Consumption

This excessive water use strains communities near these data centres, with residents reporting contamination and reduced access to this finite resource. In addition to water use, the rapid development of new AI data centres is straining electricity grids, prompting some companies to generate their own power. xAI, for example, has over 400 MW of natural gas turbines operating to power its Colossus data centre outside of Memphis. These turbines generate significant greenhouse gas emissions and pollutants such as nitrogen oxides, which degrade local air quality. xAI is also the subject of a lawsuit stating that it is operating without proper permits in an area already impacted by poor air quality. This isn’t unique to xAI, either. Global energy demand for data centres is expected to increase from 460 TWh in 2024 to 1,050 TWh by 2026. This is like adding another Germany’s worth of electricity consumption to the world.
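The “another Germany” comparison follows directly from the IEA projections cited above (Germany’s annual electricity consumption is roughly 500–550 TWh):

```python
# Projected growth in data-centre electricity demand, using the
# IEA figures cited above: 460 TWh in 2024 rising to 1,050 TWh by 2026.
growth_twh = 1050 - 460
# ~590 TWh of added demand, on par with Germany's annual consumption
print(f"{growth_twh} TWh of new demand")
```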
We’re also learning that the notion that LLMs are getting more efficient isn’t exactly true, either. A recent post from HuggingFace demonstrated that, across 15 new models they assessed, nine of them had equal or greater energy use compared to similarly sized models tested earlier in the year.

 

Future Potential for Model Collapse

As we continue to produce more and more AI-generated content and upload it to the internet, we are saturating the very data source on which so many AI models are trained. If model training were to use recursively generated data indiscriminately, it’s very possible that future models could collapse altogether or, at the very least, that their training data could become so diluted by AI-generated content that little created by humans remains.

 

So What Does This Mean?

AI is not going away, and with its projected growth in the coming years, energy and water demands and the resulting emissions will increase. To mitigate this damage globally, we must advocate for increased use of renewable energy and alternative cooling solutions, such as immersion cooling, that avoid using freshwater. While these movements build, there are steps individuals can take to reduce their impact when using AI.

 

What Can We Do?

It can be daunting to figure out how to use AI responsibly without avoiding it altogether, but there are simple steps we can all take to leverage the benefits of AI while mitigating its impacts. Here are our top tips:
  1. Choose the right tool for the job. Large general-purpose models like ChatGPT use many more parameters so they can answer a wide variety of questions, but that means more energy use. A specialized model can be just as accurate for its specific task while using far fewer parameters, resulting in less energy use. This means if you’re writing code, use an AI coding assistant that has been trained on code. If you need help with grammar, try Grammarly. Some specialized tools we recommend are:
    1. For writing: Grammarly: Free AI Writing Assistance – built on Grammarly’s wealth of experience in helping people write better
    2. For research: Semantic Scholar | AI-Powered Research Tool – uses a model developed in-house to search for relevant research papers contextually
    3. General questions: SmolLM WebGPU
  2. Start with what you already know. AI is most effective when it’s used to bolster existing expertise. An expert knows when something should be done with “old tech” versus when AI adds meaningful value and has the insight to understand if AI results are valid or not.
  3. Think critically. If you find yourself defaulting to AI, step back and ask, “How would I have done this before ChatGPT existed?” If a web search is just as effective, avoid using AI. You don’t need ChatGPT to tell you how to boil an egg.
  4. Think small. If you need an LLM, look for a model with a small number of parameters. A favourite of ours is SmolLM WebGPU which, once loaded, can run locally in your browser, skipping the data centre altogether, reducing energy and water use.
  5. Switch search engines. Most search engines have now fully integrated AI into their results, meaning you are using AI without trying to. However, some search engines either don’t include AI at all or offer an option to turn it off. Go to your search settings and switch off automatic AI answers, or switch engines. Our favourite for a more classic search experience is Dogpile.com

 

Conclusion

As AI becomes more embedded in our daily lives, it’s crucial to recognize the energy footprint behind the technology. Choosing models wisely, resisting the urge to treat AI like a search engine, and advocating for sustainable practices such as renewable energy and innovative cooling methods are all steps we can take to reduce its impact. Most importantly, the conversation doesn’t end here. AI is a rapidly evolving topic, and by sharing what we learn, we can inspire others to think critically, act responsibly, and push for a future where innovation and sustainability go hand in hand.

 


 

References

United Nations Environment Programme. (2025, November 13). AI has an environmental problem. Here’s what the world can do about that. https://www.unep.org/news-and-stories/story/ai-has-environmental-problem-heres-what-world-can-do-about
Leffer, L. (2023, November 21). When it comes to AI models, bigger isn’t always better. Scientific American. https://www.scientificamerican.com/article/when-it-comes-to-ai-models-bigger-isnt-always-better/
O’Donnell, J., & Crownhart, C. (2025, May 20). We did the math on AI’s energy footprint. Here’s the story you haven’t heard. MIT Technology Review. https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/
AIEnergyScore. (n.d.). AI Energy Score leaderboard. Hugging Face. https://huggingface.co/spaces/AIEnergyScore/Leaderboard
DataStudios. (2024, July 22). ChatGPT hits record 2.5 billion queries per day amid rapid user expansion. https://www.datastudios.org/post/chatgpt-hits-record-2-5-billion-queries-per-day-amid-rapid-user-expansion
Ember. (n.d.). Yearly electricity data. https://ember-energy.org/data/yearly-electricity-data/
De Chant, T. (2025, June 18). xAI is facing a lawsuit for operating over 400 MW of gas turbines without permits. TechCrunch. https://techcrunch.com/2025/06/18/xai-is-facing-a-lawsuit-for-operating-over-400-mw-of-gas-turbines-without-permits/
International Energy Agency. (2024). Electricity 2024: Analysis and forecast to 2026. https://iea.blob.core.windows.net/assets/6b2fd954-2017-408e-bf08-952fdd62118a/Electricity2024-Analysisandforecastto2026.pdf
International Energy Agency. (2025). Electricity 2025: Emissions. https://www.iea.org/reports/electricity-2025/emissions
Yañez-Barnuevo, M. (2025, June 25). Data centers and water consumption. Environmental and Energy Study Institute. https://www.eesi.org/articles/view/data-centers-and-water-consumption
Luccioni, S. & Gamazaychikov, B. (2025, December 4). AI Energy Score v2: Refreshed Leaderboard, now with Reasoning. Hugging Face. https://huggingface.co/blog/sasha/ai-energy-score-v2
Marr, B. (2025, March 10). 15 mind blowing AI statistics everyone must know about now. Forbes. https://www.forbes.com/sites/bernardmarr/2025/03/10/15-mind-blowing-ai-statistics-everyone-must-know-about-now/
International Energy Agency. (2025, April 10). Energy and AI. https://www.iea.org/reports/energy-and-ai
Shumailov, I., Shumaylov, Z., Zhao, Y., et al. (2024). AI models collapse when trained on recursively generated data. Nature, 631, 755–759. https://doi.org/10.1038/s41586-024-07566-y
The Carbon Trust. (2021). Carbon impact of video streaming. Carbon-impact-of-video-streaming.pdf
Alexander, A. (2025, November 21). Every Sora AI video burns 1 Kilowatt hour and emits 466 grams of carbon. And for what, exactly? https://reclaimedsystems.substack.com/p/every-sora-ai-video-burns-1-kilowatt