Do you want to save the planet from AI? Throw in an FPGA and throw away the matrix

Large language models can be made 50 times more energy efficient with alternative mathematics and custom hardware, researchers at the University of California, Santa Cruz claim. In an article titled “Scalable MatMul-free Language Modeling,” authors Rui-Jie Zhu, Yu Zhang, Ethan Sifferman, Tyler Sheaves, Yiqiao Wang, Dustin Richmond, Peng Zhou, and Jason Eshraghian describe how to …
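The core idea behind "MatMul-free" modeling is to constrain weights to the ternary values {-1, 0, +1}, so every dot product collapses into additions and subtractions of the inputs, with no multiplications at all. The sketch below is illustrative only (the function name and shapes are our own, not the authors'): it shows that such a ternary "matrix product" needs nothing beyond sums and differences, yet matches an ordinary matrix-vector product.

```python
import numpy as np

def ternary_matvec(W_ternary, x):
    """Multiplication-free matrix-vector product for ternary weights.

    With entries restricted to {-1, 0, +1}, each output element is just
    the sum of the inputs where the weight is +1 minus the sum where it
    is -1 -- no multiply hardware required.
    """
    out = np.zeros(W_ternary.shape[0], dtype=x.dtype)
    for i, row in enumerate(W_ternary):
        out[i] = x[row == 1].sum() - x[row == -1].sum()
    return out

rng = np.random.default_rng(0)
W = rng.integers(-1, 2, size=(4, 8))  # random ternary weight matrix
x = rng.standard_normal(8)            # dense input vector

# Agrees with the conventional matrix-vector product
assert np.allclose(ternary_matvec(W, x), W @ x)
```

On custom hardware such as an FPGA, the add/subtract datapath this implies is far cheaper in silicon area and energy than the multiply-accumulate units a dense matmul requires, which is where the claimed efficiency gains come from.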