Researchers say there’s a vulgar but more accurate term for AI hallucinations

The Insider News
Kent Sharkey
#1

    Futurism[^]:

In a new paper published in the journal Ethics and Information Technology, a trio of philosophy researchers from the University of Glasgow in Scotland argue that chatbots' propensity to make crap up shouldn't be referred to as "hallucinations," because it's actually something much less flattering.

    The technical (but accurate) term

    Sorry to your little sister

Dave Kreskowiak (in reply to Kent Sharkey)
#2

      In Trump circles, it's known as "alternative facts".

Asking questions is a skill. CodeProject Forum Guidelines. Google: C#. How to debug code. Seriously, go read these articles.
Dave Kreskowiak

TNCaver (in reply to Kent Sharkey)
#3

        Let's face it, calling any LLM program artificial "intelligence" is itself bovine excrement.

        There are no solutions, only trade-offs.
           - Thomas Sowell

        A day can really slip by when you're deliberately avoiding what you're supposed to do.
           - Calvin (Bill Watterson, Calvin & Hobbes)

Richard Deeming (in reply to Kent Sharkey)
#4

          Love it! :laugh: Now where's the browser extension to automatically replace all references to "AI" or "LLM" with "bullsh!t machine"?


          "These people looked deep within my soul and assigned me a number based on the order in which I joined." - Homer

obermd (in reply to Kent Sharkey)
#5

            When you consider the training source (public internet), of course LLMs spit out crap.

Daniel Pfeffer (in reply to Richard Deeming)
#6

              Replacing LLM with BSM would preserve the layout of the page. :)

              Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.

jochance (in reply to Richard Deeming)
#7

                "digishyster"?
