New chip uses AI to shrink large language models’ energy footprint by 50%
Oregon State University College of Engineering researchers have developed a more efficient chip as an antidote to the vast amounts of electricity consumed by large-language-model artificial intelligence applications like Gemini and GPT-4.
“We have designed and fabricated a new chip that consumes half the energy compared to traditional designs,” said doctoral student Ramin Javadi, who, along with Tejasvi Anand, associate professor of electrical engineering, presented the technology at the IEEE Custom Integrated Circuits Conference in Boston.
“The problem is that the energy required to transmit a single bit is not being reduced at the same rate as the data rate demand is increasing,” said Anand, who directs the Mixed Signal Circuits and Systems Lab at OSU. “That’s what is causing data centers to use so much power.”
The new chip itself is based on AI principles that reduce electricity use for signal processing, Javadi said.
“Large language models need to send and receive tremendous amounts of data over wireline, copper-based communication links in data centers, and that requires significant energy,” he said. “One solution is to develop more efficient wireline communication chips.”
When data is sent at high speeds, Javadi explained, it gets corrupted at the receiver and has to be cleaned up. Most conventional wireline communication systems use an equalizer to perform this task, and equalizers are comparatively power-hungry.
“We are using those AI principles on-chip to recover the data in a smarter and more efficient way by training the on-chip classifier to recognize and correct the errors,” Javadi said.
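The general idea can be illustrated with a toy software sketch. This is not the OSU team’s actual design, whose internals are not described in the article; it is a minimal, hypothetical model in which a channel smears each bit into the next one (intersymbol interference), a naive threshold receiver misreads some of the resulting samples, and a small classifier trained on a known preamble recovers them by learning one decision boundary per previous-bit context. All names and parameter values below (`transmit`, `ISI`, `NOISE`, and so on) are invented for illustration.

```python
import random

random.seed(0)

ISI, NOISE = 0.6, 0.3  # illustrative channel parameters, not measured values

def transmit(bits):
    """Toy channel: each received sample mixes the current bit with a
    fraction of the previous one (intersymbol interference) plus noise."""
    samples, prev = [], 0.0
    for b in bits:
        level = 1.0 if b else -1.0
        samples.append(level + ISI * prev + random.gauss(0.0, NOISE))
        prev = level
    return samples

def train(samples, bits):
    """Learn one centroid per (bit, previous bit) pair from a known
    training sequence -- a minimal nearest-centroid classifier."""
    sums, prev = {}, 0
    for s, b in zip(samples, bits):
        tot, n = sums.get((b, prev), (0.0, 0))
        sums[(b, prev)] = (tot + s, n + 1)
        prev = b
    return {key: tot / n for key, (tot, n) in sums.items()}

def classify(samples, centroids):
    """Decode each sample by nearest centroid, feeding the previous
    decision back in so the interference it caused is accounted for."""
    out, prev = [], 0
    for s in samples:
        bit = min((0, 1), key=lambda b: abs(s - centroids[(b, prev)]))
        out.append(bit)
        prev = bit
    return out

preamble = [random.randint(0, 1) for _ in range(2000)]
payload  = [random.randint(0, 1) for _ in range(2000)]

centroids = train(transmit(preamble), preamble)
rx        = transmit(payload)
recovered = classify(rx, centroids)
naive     = [1 if s > 0.0 else 0 for s in rx]  # plain threshold receiver

acc_clf   = sum(r == b for r, b in zip(recovered, payload)) / len(payload)
acc_naive = sum(n == b for n, b in zip(naive, payload)) / len(payload)
print(f"classifier: {acc_clf:.3f}  threshold: {acc_naive:.3f}")
```

On this toy channel the trained classifier recovers noticeably more bits than the bare threshold, because its previous decision tells it which of two shifted decision boundaries to apply. A hardware on-chip classifier would presumably gain efficiency by replacing power-hungry analog equalization with learned decisions, though the article does not detail the mechanism.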
Javadi and Anand are working on the next iteration of the chip, which they expect to bring further gains in energy efficiency.