Environmentally Sustainable Artificial Intelligence

AI and Sustainability

Environmental Dangers of AI Systems

There are many environmental dangers associated with AI. One of the most significant is the enormous amount of energy needed to train and run large language models such as ChatGPT. Training these models requires processing a tremendous amount of data across many GPUs. It is estimated that ChatGPT consumes over 500,000 kilowatt-hours of electricity every day, more than 17,000 times the daily consumption of the average American household. The problem is serious enough that OpenAI CEO Sam Altman has said a technological breakthrough will be needed to address it. (The New Yorker)
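As a rough back-of-envelope check of that comparison, the short sketch below converts the reported daily figure into household equivalents. The household figure used here (roughly 10,500 kWh per year, or about 29 kWh per day, for an average U.S. home) is an assumption based on typical published averages, not a number taken from the article cited above.

```python
# Back-of-envelope check of the energy comparison above.
CHATGPT_KWH_PER_DAY = 500_000          # reported daily estimate for ChatGPT
HOUSEHOLD_KWH_PER_DAY = 10_500 / 365   # assumed average U.S. household (~29 kWh/day)

household_equivalents = CHATGPT_KWH_PER_DAY / HOUSEHOLD_KWH_PER_DAY
print(f"ChatGPT's daily energy is roughly {household_equivalents:,.0f} average U.S. households")
# Prints a value a little over 17,000, consistent with the comparison in the text.
```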

Another significant danger of AI is water consumption. The data centers used to train these AI systems and large language models generate a great deal of heat through GPU usage, and water is used to cool the GPUs so they can run efficiently. Water is also consumed when electricity is generated at coal and nuclear power plants, so AI's large energy use increases water consumption in this way as well. It is estimated that around 16 ounces of water, about one water bottle, is used for every 10 to 50 responses ChatGPT generates. This adds up to an estimated 6.6 to 8.6 billion cubic yards of water use by AI by 2027, roughly 4 to 6 times the amount of water the entire nation of Denmark uses. (Newsweek)
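To make these numbers easier to compare, the sketch below converts them into metric units. It is just arithmetic on the figures quoted above, using standard conversion factors; no new estimates are introduced.

```python
# Rough unit conversions for the water figures above.
OZ_TO_LITERS = 0.0295735   # US fluid ounces to liters
YD3_TO_M3 = 0.764555       # cubic yards to cubic meters

water_per_bottle_l = 16 * OZ_TO_LITERS            # ~0.47 L per "bottle"
per_response_ml = (water_per_bottle_l / 50 * 1000,
                   water_per_bottle_l / 10 * 1000)
print(f"Water per ChatGPT response: roughly {per_response_ml[0]:.0f}-{per_response_ml[1]:.0f} mL")

projection_m3 = (6.6e9 * YD3_TO_M3, 8.6e9 * YD3_TO_M3)
print(f"2027 projection: about {projection_m3[0]/1e9:.1f}-{projection_m3[1]/1e9:.1f} billion cubic meters")
```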

There are various ways that companies and industry leaders are addressing these issues. Continue scrolling and exploring this website to learn more about them.

Addressing Energy Demands

Optimization

Optimizing the algorithms and the data centers used to train AI systems, for example by running computations in lower-precision arithmetic or using more efficient model architectures, can drastically reduce the amount of energy consumed.
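One concrete example of this kind of algorithmic optimization is mixed-precision training, which performs much of the arithmetic in 16-bit rather than 32-bit floating point, reducing the compute, memory, and energy each training step requires. The sketch below is a minimal illustration using PyTorch's automatic mixed precision; the tiny model, random data, and hyperparameters are placeholder assumptions, not part of any particular system described on this page.

```python
import torch
from torch import nn

# Minimal mixed-precision training loop (PyTorch AMP); illustrative only.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    inputs = torch.randn(64, 128, device=device)        # placeholder batch
    targets = torch.randint(0, 10, (64,), device=device)

    optimizer.zero_grad()
    # autocast runs the forward pass in float16 where it is safe to do so,
    # cutting the compute and memory (and therefore energy) per step.
    with torch.autocast(device_type=device, enabled=(device == "cuda")):
        loss = loss_fn(model(inputs), targets)

    scaler.scale(loss).backward()   # scale the loss to avoid float16 underflow
    scaler.step(optimizer)
    scaler.update()
```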

Renewable Energy

One of the biggest ways companies are addressing the energy needs of AI is by building more sources of renewable energy.

Heat Reuse

Almost all the energy used by AI is eventually released as heat. This waste heat can be captured and reused, for example to warm nearby buildings, to help increase sustainability.

Edge Computing

Edge computing shifts AI processing from centralized data centers to devices closer to where data is generated. This can make training and running AI systems more efficient and reduce the amount of centralized computing infrastructure required.
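As one illustration of preparing a model to run at the edge, the sketch below applies dynamic quantization, which stores a network's weights as 8-bit integers so it can run on a small, low-power device instead of in a data center. The model here is a placeholder assumption standing in for whatever network would actually be deployed; this is a minimal sketch of one edge-deployment technique, not a description of any specific company's approach.

```python
import torch
from torch import nn

# Illustrative only: shrink a small model with dynamic quantization so it
# can run on an edge device (phone, sensor hub) instead of a data center.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))
model.eval()

# Replace Linear layers with 8-bit quantized versions; the weights take
# roughly 4x less memory and inference uses cheaper integer arithmetic.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

with torch.no_grad():
    sample = torch.randn(1, 128)      # placeholder input
    print(quantized(sample).shape)    # runs entirely on the local device
```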