
AI is an energy hog

Artificial intelligence programs like ChatGPT are developing quickly, enabling computers to behave in ways perceived as more human-like. But a new study concludes that AI is also a very energy-intensive technology, with the potential to consume as much electricity as an entire country.

Having a chat with a bot, asking it to write a story or draw a landscape on your device, is made to look easy, but much more is going on behind the scenes. The electronic brain works hard, processing vast amounts of information and consuming large quantities of energy to produce what seems like a human response.

Alex de Vries of the Free University Amsterdam School of Business and Economics researches the environmental impact of digital technology. In findings recently published in the journal Joule, de Vries used published reports to estimate the extra energy that will be needed to support future AI systems.

Studies have shown AI programs like ChatGPT use ten times more energy than a regular Google search for similar queries. (CBC)

Both training an AI model and subsequently running it to satisfy user requests require a great deal of computing power, and thus electricity. De Vries cites previous studies that found AI programs such as ChatGPT or Alphabet's Bard consume approximately ten times more energy than a regular Google search for similar queries.

Given those findings, de Vries calculated that if Google were to enhance its search product with AI, the result could be up to a 30 per cent increase in the energy it uses. That would mean annual consumption of 29.3 TWh (terawatt-hours) of electricity, which is more than the entire country of Ireland uses in a year.

There are concerns over the extra burden electric cars will place on the grid. Now we need to consider the higher needs of artificial intelligence. Where will all that extra electricity come from — fossil fuels or green energy sources?

Efforts are underway to make artificial intelligence programs more efficient so they consume less energy. But as we have seen with other technologies, when something becomes more efficient, demand tends to rise, so overall energy consumption still increases, a pattern known as the Jevons paradox.

To meet the extra energy demand of AI, Microsoft is considering going nuclear.

In a job posting for a Principal Program Manager in Nuclear Technology, the company advertised that it is looking for an expert to investigate the use of small modular reactors (SMRs) to supply its energy needs. These reactors are smaller than a school bus, similar in scale to those that have powered nuclear aircraft carriers and submarines for decades.

Small modular reactors (SMRs) are generally designed to be small enough to manufacture in a factory and transport via a semi-truck or shipping container. Microsoft is looking into the use of SMRs to meet its energy needs. (U.S. Department of Energy)

The concept being developed for SMRs is for reactors to be made in factories, assembly-line style, to reduce costs. The self-contained units would then be transported to sites where they would provide roughly 300 MW of electricity, enough to power a small town or an industrial computer server farm.

There are many different designs for SMRs, and only prototypes have been built so far, but they could meet the 24/7 power demands of computing giants without carbon emissions.

Intelligent systems are a powerful tool for handling large datasets, advancing scientific research, improving medical diagnosis and enabling countless other applications in the future. It's useful, however, to be cognizant of the potential environmental costs of these advances.

