Researchers upend AI status quo by eliminating matrix multiplication in LLMs
-
Running AI models without floating point matrix math could mean far less power consumption.
Unfortunately, no one can be told what matrix multiplication is. You have to create the dot product yourself.
-
Running AI models without floating point matrix math could mean far less power consumption.
Unfortunately, no one can be told what matrix multiplication is. You have to create the dot product yourself.
-
Running AI models without floating point matrix math could mean far less power consumption.
Unfortunately, no one can be told what matrix multiplication is. You have to create the dot product yourself.
Article wrote:
Running AI models without floating point matrix math could mean far less power consumption.
That could be a good thing... they'll now have to do things correctly, because everything is fully integer :rolleyes: :-D
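For anyone curious what "fully integer" could look like in practice: approaches like the one in the article constrain weights to ternary values {-1, 0, +1}, so a matrix-vector product reduces to integer additions and subtractions with no floating-point multiplies. This is only an illustrative sketch of that idea (`ternary_matvec` is a made-up name, not from the paper):

```python
import numpy as np

def ternary_matvec(W, x):
    """Compute W @ x using only integer additions/subtractions.

    W: 2-D array with entries restricted to {-1, 0, +1}
    x: 1-D integer array (e.g. quantized activations)
    """
    out = np.zeros(W.shape[0], dtype=np.int64)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            w = W[i, j]
            if w == 1:
                out[i] += x[j]   # +1 weight: add the activation
            elif w == -1:
                out[i] -= x[j]   # -1 weight: subtract it
            # 0 weight: contributes nothing, skip entirely
    return out

W = np.array([[1, 0, -1],
              [0, 1, 1]])
x = np.array([3, 5, 2])
print(ternary_matvec(W, x))  # same result as W @ x, no multiplies needed
```

Since every operation is an integer add or subtract, the result is exact and reproducible across hardware — one reason integer-only inference is attractive for low-power accelerators.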
M.D.V. ;)
If something has a solution, why do we have to worry about it? If it has no solution, for what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you.
Rating helpful answers is nice, but saying thanks can be even nicer.