In November 2022, the launch of ChatGPT made generative artificial intelligence (“AI”) applications available to a large public audience. AI is already used in many real-world applications, with widely varying capabilities. Machines may never achieve human-like self-awareness, but generative AIs are steadily improving at cognition-like behaviour, the ability to incorporate past actions and learning experiences into future practices. AIs now generate text, poems, research papers, drawings, and videos. To do all this, these models must be fed vast amounts of data, which requires significant computing power and therefore energy. Digiconomist estimates that by 2027, worldwide AI-related electricity consumption could increase by 85 to 134 TWh annually, comparable to the annual electricity needs of countries such as the Netherlands, Argentina, and Sweden. This potential growth highlights the need to be mindful about what we use AI for: it is energy intensive, so we should not put it into all kinds of applications where we do not actually need it.
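To put the 85–134 TWh range in perspective, a quick back-of-the-envelope comparison is sketched below. The national consumption figures are rough, circa-2022 approximations assumed here purely for illustration; they are not taken from the Digiconomist estimate itself.

```python
# Rough scale check: does the projected AI electricity range overlap with
# the approximate annual electricity consumption of a few countries?
ai_range_twh = (85, 134)  # projected AI-related consumption by 2027, TWh/year

# Approximate national electricity consumption (TWh/year, ~2022 era);
# assumed round numbers used only to illustrate the order of magnitude.
approx_national_twh = {
    "Netherlands": 110,
    "Argentina": 130,
    "Sweden": 130,
}

low, high = ai_range_twh
for country, twh in approx_national_twh.items():
    status = "within" if low <= twh <= high else "outside"
    print(f"{country}: ~{twh} TWh/year -> {status} the projected AI range")
```

Under these assumed figures, each country's consumption falls inside the projected range, which is the sense in which the comparison in the text is meant.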