Researchers say there’s a vulgar but more accurate term for AI hallucinations
-
In a new paper published in the journal Ethics and Information Technology, a trio of philosophy researchers from the University of Glasgow in Scotland argue that chatbots' propensity to make crap up shouldn't be called "hallucinations," because it's actually something much less flattering.
-
In Trump circles, it's known as "alternative facts".
Asking questions is a skill: CodeProject Forum Guidelines | Google: C# | How to debug code. Seriously, go read these articles. - Dave Kreskowiak
-
Let's face it, calling any LLM program artificial "intelligence" is itself bovine excrement.
There are no solutions, only trade-offs. - Thomas Sowell
A day can really slip by when you're deliberately avoiding what you're supposed to do. - Calvin (Bill Watterson, Calvin & Hobbes)
-
Love it! :laugh: Now where's the browser extension to automatically replace all references to "AI" or "LLM" with "bullsh!t machine"?
"These people looked deep within my soul and assigned me a number based on the order in which I joined." - Homer
-
Replacing LLM with BSM would preserve the layout of the page. :)
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.
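For the curious, here's a minimal sketch of that swap as a userscript. The file name bsm.user.js is made up for illustration, and this isn't a published extension; the point is that since "LLM" and "BSM" are both three characters, the replacement really would preserve the page layout.

// bsm.user.js - hypothetical sketch, not a real published extension.
// Walks every text node in the page and swaps "LLM" for "BSM".
// Equal string lengths mean the page layout is unchanged.
(function () {
    const walker = document.createTreeWalker(document.body, NodeFilter.SHOW_TEXT);
    let node;
    while ((node = walker.nextNode())) {
        if (node.nodeValue !== null && node.nodeValue.includes("LLM")) {
            node.nodeValue = node.nodeValue.replaceAll("LLM", "BSM");
        }
    }
})();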