Code Project
Researchers upend AI status quo by eliminating matrix multiplication in LLMs

The Insider News (3 posts, 3 posters)
Kent Sharkey (#1) wrote:
Ars Technica:

    Running AI models without floating point matrix math could mean far less power consumption.

    Unfortunately, no one can be told what matrix multiplication is. You have to create the dot product yourself.
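For anyone who does want to be told: each entry of a matrix product is the dot product of one row of the left matrix with one column of the right matrix. A minimal plain-Python sketch (illustrative only; note that with integer inputs the result stays integer, no floating point needed):

```python
def matmul(a, b):
    # Each output entry c[i][j] is the dot product of row i of a
    # with column j of b.
    rows, inner, cols = len(a), len(b), len(b[0])
    assert all(len(row) == inner for row in a)
    return [[sum(a[i][k] * b[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

# 2x2 example with integer inputs -- the result stays integer.
print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```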

jochance (#2), replying to Kent Sharkey, wrote:
My gut says that if they've done this (the work is yet to be peer-reviewed), they've probably found a better way of doing matmul, but are focusing on how it is different rather than on how it is the same.
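If I'm reading the article right, the core trick is constraining weights to ternary values {-1, 0, +1}, so each per-element "multiply" in the matrix product degenerates into an add, a subtract, or a skip. A minimal sketch of that idea (my illustration, not the paper's actual implementation):

```python
def ternary_matvec(weights, x):
    # With every weight in {-1, 0, +1}, no multiplier is needed:
    # each term of the dot product is an add, a subtract, or a skip.
    out = []
    for row in weights:
        acc = 0
        for w, v in zip(row, x):
            if w == 1:
                acc += v
            elif w == -1:
                acc -= v
            # w == 0: contributes nothing, so skip it entirely
        out.append(acc)
    return out

print(ternary_matvec([[1, -1, 0], [0, 1, 1]], [3, 5, 7]))  # [-2, 12]
```

Seen that way, it is still a (heavily restricted) matmul; the structure is the same, only the arithmetic per element got cheaper.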

Nelek (#3) wrote:
        Article wrote:

        Running AI models without floating point matrix math could mean far less power consumption.

That could be a good thing... they will now have to do things exactly, because the math is fully integer :rolleyes: :-D
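The quip has a kernel of truth: integer arithmetic is exact, while floating-point accumulation rounds. A two-line illustration:

```python
# Floating-point sums round, so "obvious" equalities can fail...
print(0.1 + 0.2 == 0.3)  # False (0.1 + 0.2 is 0.30000000000000004)

# ...while integer arithmetic is exact, with no rounding at all.
print(1 + 2 == 3)  # True
```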

M.D.V. ;)
If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you.
Rating helpful answers is nice, but saying thanks can be even nicer.
