Researchers upend AI status quo by eliminating matrix multiplication in LLMs

Ars Technica · 6m

Researchers have proposed a new method to run AI language models more efficiently by eliminating matrix multiplication, a core operation in neural networks. The study suggests the approach could significantly reduce the environmental impact and operational costs of AI systems. The researchers built a custom 2.7 billion parameter model without matrix multiplication that achieved performance comparable to conventional large language models. They also demonstrated a 1.3 billion parameter model running on a GPU accelerated by a custom-programmed FPGA chip, with the FPGA consuming about 13 watts of power. The findings challenge the prevailing belief that matrix multiplication is essential for high-performing language models and could make large models more accessible and sustainable.
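The summary doesn't spell out how the multiplications are avoided; in the underlying paper, weights are constrained to ternary values {-1, 0, +1}, so each dot product collapses into additions and subtractions. A minimal NumPy sketch of that idea (the function name ternary_linear is hypothetical, and this is illustrative, not the authors' implementation):

```python
import numpy as np

def ternary_linear(x, w_ternary):
    """Hypothetical sketch of a matmul-free linear layer.

    Weights are restricted to {-1, 0, +1}, so each output element is
    a sum of selected inputs minus another sum -- no multiplications.
    Illustrative only, not the paper's actual kernel.
    """
    out = np.zeros((x.shape[0], w_ternary.shape[1]))
    for j in range(w_ternary.shape[1]):
        plus = x[:, w_ternary[:, j] == 1].sum(axis=1)    # add inputs where weight is +1
        minus = x[:, w_ternary[:, j] == -1].sum(axis=1)  # subtract inputs where weight is -1
        out[:, j] = plus - minus
    return out

# Toy check: the addition-only result matches an ordinary matmul
x = np.array([[0.5, -1.0, 2.0, 0.25]])
w = np.array([[ 1,  0, -1],
              [ 0,  1,  1],
              [-1,  1,  0],
              [ 1,  0,  1]])
print(ternary_linear(x, w))  # [[-1.25  1.   -1.25]]
print(x @ w)                 # same values via conventional matmul
```

Because the weights carry no magnitude information, the hardware never needs a multiplier for these layers, which is what makes low-power implementations such as the 13-watt FPGA demonstration plausible.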
