New AI Chatbots use only 13W of energy with no performance loss

Written by:

Farwa Mehmood
Posted: 18-07-2024

This is an AI-generated image created with Midjourney by Molly-Anna MaQuirl

Researchers at UC Santa Cruz have developed a method that dramatically reduces the energy consumption of modern AI models, especially large language models (LLMs), without compromising performance. Running on custom Field Programmable Gate Array (FPGA) hardware, a billion-parameter-scale LLM can operate on just 13W, about as much power as a household LED bulb (a 13W LED gives roughly the light of a 100W incandescent). 

Removal of complex multiplication from the neural network 

Matrix multiplication (MatMul) is the dominant operation in modern LLMs and accounts for most of their power draw during both training and inference. The UC Santa Cruz team eliminated MatMul entirely, replacing it with simpler operations to achieve remarkable efficiency gains. By restricting the model's weights to the ternary values -1, 0 and 1, the processor can replace multiplications with additions and subtractions, sharply reducing hardware cost with little to no loss in model quality. The team also introduced a time-based computation scheme in which the network carries a memory of its previous state forward, making the processing of long sequences more efficient. 
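The ternary-weight tweak described above can be illustrated with a minimal sketch: when every weight is -1, 0 or 1, a matrix-vector product reduces to selectively adding and subtracting inputs. This toy function is an assumption-laden illustration only, not the UC Santa Cruz implementation, which relies on a full MatMul-free architecture and custom FPGA hardware.

```python
import numpy as np

def ternary_matvec(W, x):
    """Matrix-vector product for ternary weights in {-1, 0, +1}.

    Each output element is just a sum of the inputs where the weight
    is +1, minus the sum where it is -1 -- no multiplications at all.
    """
    out = np.zeros(W.shape[0], dtype=x.dtype)
    for i in range(W.shape[0]):
        out[i] = x[W[i] == 1].sum() - x[W[i] == -1].sum()  # add/subtract only
    return out

W = np.array([[1, 0, -1],
              [0, 1, 1]])          # ternary weight matrix
x = np.array([2.0, 3.0, 5.0])     # input activations
print(ternary_matvec(W, x))       # matches W @ x without any multiplies
```

On hardware, this is why the trick saves so much power: adders are far cheaper, in both silicon area and energy, than multipliers.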

Environmental risks associated with current AI models 

AI chatbots impact the environment and raise some concerns, including: 

High energy consumption 

Modern AI models, such as LLMs, require extensive energy. Powerful graphics processing units (GPUs) such as the Nvidia H100 and H200 draw up to 700W each, and data centers need thousands of them to run these models. The upcoming Blackwell B200 is expected to draw up to 1200W per GPU. 

Carbon footprint impact

This extensive electricity consumption translates into a significant carbon footprint, contributing to global warming and environmental degradation. 

Strain on power grid resources 

The immense power demand strains power grids and increases the operational costs for data centers to run these models. 

Sustainability growth concerns

Researchers warn that if AI power consumption keeps growing at its current rate, it will consume an unsustainable share of national electricity supplies, putting the technology's own development at risk. Arm's CEO has warned that AI could account for a quarter of US power consumption by 2030. Against that backdrop, cutting power consumption to roughly 1/50 of current levels represents a massive improvement. 
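The "1/50" figure above is consistent with the numbers quoted in this article, as a quick back-of-envelope check shows (illustrative arithmetic only, using the per-GPU figure reported earlier):

```python
# Rough sanity check of the ~1/50 power-reduction claim.
gpu_watts = 700   # reported draw of one Nvidia H100 GPU
fpga_watts = 13   # reported draw of the FPGA-based MatMul-free model

reduction = gpu_watts / fpga_watts
print(f"~{reduction:.0f}x less power")  # roughly 54x, i.e. about 1/50
```

This compares a single GPU against the whole FPGA setup, so it is a ballpark figure rather than a rigorous benchmark.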

Is this AI energy efficiency breakthrough a promising solution? 

This innovative approach, which eliminates complex matrix multiplication and cuts energy consumption by up to 50 times, could give AI a far smaller and more sustainable climate footprint.

In the future, we expect tech giants such as OpenAI, Google, Nvidia and Meta to leverage this breakthrough to power their models more efficiently and sustainably. If adopted widely and wisely, it would bring the industry closer to an environmentally friendly era of AI, delivering cost-effective advancements while reducing the risks posed by current models.
