Researchers upend AI status quo by eliminating matrix multiplication in LLMs (arstechnica.com)
Posted by BrikoX (@BrikoX@lemmy.zip) to Technology@lemmy.zip • 3 months ago
bitfucker (@bitfucker@programming.dev) • 3 months ago
Good
Edit: Oh shit nvm. It still requires dedicated HW (an FPGA). This is no different from, say, an NPU. But to be fair, they also said the researchers tested the model on a traditional GPU too, and it reduced memory consumption.
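For anyone wondering what "eliminating matrix multiplication" means in practice: as I understand the article, the weights are constrained to ternary values (-1, 0, +1), so each "multiply" in a linear layer collapses into an add, a subtract, or a skip. Here's a minimal sketch of that idea in NumPy; the function name and shapes are mine for illustration, not the paper's actual implementation:

```python
import numpy as np

def ternary_linear(x, w_ternary):
    """Linear layer with ternary weights in {-1, 0, +1}.

    Produces the same result as x @ w_ternary, but computed with
    additions/subtractions only, which is why custom hardware
    (e.g. an FPGA) maps onto it so well.
    """
    out = np.zeros((x.shape[0], w_ternary.shape[1]), dtype=x.dtype)
    for j in range(w_ternary.shape[1]):
        plus = x[:, w_ternary[:, j] == 1].sum(axis=1)    # weights of +1: add
        minus = x[:, w_ternary[:, j] == -1].sum(axis=1)  # weights of -1: subtract
        out[:, j] = plus - minus                          # weights of 0: skipped
    return out

# Tiny sanity check against a regular matmul
x = np.random.randn(2, 4).astype(np.float32)
w = np.random.choice([-1, 0, 1], size=(4, 3)).astype(np.int8)
print(np.allclose(ternary_linear(x, w), x @ w.astype(np.float32)))  # True
```

The memory savings the researchers reported on a regular GPU make sense under that assumption too: a ternary weight needs under 2 bits instead of 16.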