Eight tips to use AI more sustainably
Using generative artificial intelligence (AI) tools comes with a direct environmental impact. Sustainability Committee member and legal counsel within the NatWest Group, Hannah Gardner, discusses how to use them as sustainably as possible.
Many of us now use generative AI tools at work (such as Copilot) or in our personal lives (such as ChatGPT). These tools can greatly increase efficiency and support innovation. Next-generation AI models have the potential to deliver huge sustainability benefits, but they also have a direct impact on the environment.
With COP30 taking place this week in Belém, Brazil, and ahead of our Climate Perspectives - AI and Sustainability event on 25 November, it's important to highlight one of the key environmental concerns with using generative AI.
Tools such as Copilot break your prompts and their responses into ‘tokens’, and processing each token uses computational resources. The longer your prompt or the output, the more tokens are processed, and that extra computation means higher energy consumption.
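For readers who want to see this in practice, the short sketch below counts the tokens in a verbose prompt versus a concise one. It uses the open tiktoken library purely as an illustrative stand-in; the exact tokenisers behind Copilot and ChatGPT are not public, but the principle that shorter, tighter prompts mean fewer tokens holds either way.

```python
# Illustrative only: tiktoken's cl100k_base encoding is used as a stand-in
# for the (non-public) tokenisers behind commercial AI assistants.
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")

verbose = ("Hello! Could you please, if it's not too much trouble, give me a "
           "nice summary of the key impacts of climate change on agriculture?")
concise = "List three key impacts of climate change on agriculture, each under 20 words."

for label, prompt in [("verbose", verbose), ("concise", concise)]:
    tokens = encoding.encode(prompt)  # split the text into tokens
    print(f"{label} prompt: {len(tokens)} tokens")
```

Running this shows the verbose prompt using roughly twice as many tokens as the concise one, before the model has even started generating a response.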
There are some quick and easy changes we can make in our daily use of generative AI that let us keep the benefits of these tools while using them more sustainably.
The following are some low-carbon prompting tips from our Head of Responsible AI at NatWest, which you can put into practice:
- Be specific and goal-oriented. For example, ‘List three key impacts of climate change on agriculture, each under 20 words’. Specific prompts keep responses short and focused, so fewer tokens are used overall.
- No need to be polite! Avoid ‘please’ or ‘can you’. Instead, give direct instructions: ‘Do X’. Direct instructions can also produce more accurate results.
- Limit output with caps. For example, ‘Summarise this regulation in 100 words’. This prevents token overuse.
- Ask for tables or lists. This forces concise responses.
- Combine tasks in one prompt. When you prompt, you are ‘calling’ the model. If you combine tasks (asking for several actions in one prompt), you make fewer model calls and so generate lower emissions.
- Prefer estimates when you don’t need precision. For example, ‘Approximately how many people live in Scotland?’. High-precision requests demand more resources.
- Trim context. Don’t upload huge documents if you only need the AI tool to look at one section.
- Reduce default AI use in searches. Search engines like Google and Bing now automatically use large language models to give you an immediate, consolidated response. If you add ‘-AI’ to your query in the search bar, the engine won’t use generative AI to answer it.
About the author
Hannah Gardner is a legal counsel (Outsourcing, Technology & IP) with The Royal Bank of Scotland, part of the NatWest Group, and a member of the Law Society of Scotland's Sustainability Committee.