This is an AI-generated image created with Midjourney by Molly-Anna MaQuirl
The UC Santa Cruz AI research team has developed a remarkable method for running modern AI models, especially large language models (LLMs), that slashes energy consumption without compromising performance. Running on custom Field Programmable Gate Array (FPGA) hardware, their billion-parameter LLM operated on just 13W, roughly the power drawn by a 100W-equivalent LED bulb.
Matrix multiplication (MatMul), a computationally heavy operation, accounts for most of the power consumed in training and running modern LLMs. The UC Santa Cruz team eliminated MatMul from their model entirely, using simpler operations to achieve remarkable efficiency gains. By restricting the weights to the values -1, 0 and 1, the processors can simply add or subtract numbers rather than multiply them, cutting hardware costs considerably with little to no loss in model quality. The team also introduced a time-based computation scheme that lets the hardware carry information forward from past steps, improving efficiency further.
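To make the sum-instead-of-multiply idea concrete, here is a minimal Python sketch (my illustration, not the team's code): with every weight restricted to -1, 0 or 1, a matrix-vector product needs only additions and subtractions.

```python
import numpy as np

def ternary_matvec(W, x):
    """Compute W @ x where every entry of W is -1, 0 or +1,
    using only additions and subtractions."""
    out = np.zeros(W.shape[0], dtype=x.dtype)
    for i in range(W.shape[0]):
        acc = 0.0
        for j in range(W.shape[1]):
            if W[i, j] == 1:
                acc += x[j]      # +1 weight: add the input
            elif W[i, j] == -1:
                acc -= x[j]      # -1 weight: subtract the input
            # 0 weight: skip entirely, no work at all
        out[i] = acc
    return out

# Quick check against an ordinary matrix multiplication
rng = np.random.default_rng(0)
W = rng.integers(-1, 2, size=(4, 8)).astype(np.float32)  # entries in {-1, 0, 1}
x = rng.standard_normal(8).astype(np.float32)
assert np.allclose(ternary_matvec(W, x), W @ x)
```

On dedicated hardware such as an FPGA, the zero entries cost nothing at all, and the adders that replace multipliers are far smaller and cheaper, which is where the power savings come from.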
AI chatbots impact the environment and raise some concerns, including:
Modern AI models, such as LLMs, require extensive energy. Powerful graphics processing units (GPUs) such as the Nvidia H100 and H200 draw up to 700W each, and data centers must run thousands of them to train and serve these models. The upcoming Blackwell B200 draws up to 1,200W per GPU.
This extensive electricity consumption translates into a significant carbon footprint, contributing to climate change, global warming and environmental degradation.
The immense power demand strains electrical grids and drives up the operational costs of the data centers running these models.
Researchers warn that if AI power consumption keeps growing at its current pace, it will claim an unsustainably large share of national electricity supplies. Arm's CEO has warned that AI could consume a quarter of US power by 2030. Against that backdrop, cutting power consumption to roughly 1/50 of current levels, as the quick calculation below illustrates, would be a massive improvement.
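A back-of-the-envelope calculation (illustrative assumptions only: the published per-device power figures above and a hypothetical 10,000-device cluster running around the clock, not measurements from the paper) shows where the 1/50 figure comes from and what it would mean at scale.

```python
# Rough arithmetic behind the ~1/50 power figure (illustrative, not from the paper).

GPU_POWER_W = 700    # reported draw of an Nvidia H100-class GPU
FPGA_POWER_W = 13    # reported draw of the MatMul-free model on custom FPGA hardware

reduction = GPU_POWER_W / FPGA_POWER_W
print(f"Reduction factor: ~{reduction:.0f}x")      # ~54x, i.e. roughly 1/50

# Annual energy for a hypothetical cluster of 10,000 devices running 24/7
DEVICES = 10_000
HOURS_PER_YEAR = 24 * 365

gpu_mwh = DEVICES * GPU_POWER_W * HOURS_PER_YEAR / 1e6    # Wh -> MWh
fpga_mwh = DEVICES * FPGA_POWER_W * HOURS_PER_YEAR / 1e6

print(f"10,000 GPUs:  {gpu_mwh:,.0f} MWh/year")    # ~61,320 MWh
print(f"10,000 FPGAs: {fpga_mwh:,.0f} MWh/year")   # ~1,139 MWh
```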
This innovative approach, which eliminates complex matrix multiplication and cuts energy consumption by up to 50 times, could allow AI to scale sustainably rather than compound its climate impact.
In the future, we can expect tech giants such as OpenAI, Google, Nvidia and Meta to leverage this breakthrough to power their models more efficiently and sustainably. If adopted widely and wisely, it brings us closer to an environmentally friendly era of AI, one of sustainable, cost-effective advancement with fewer of the risks posed by today's models.