It’s almost as if these large language models are extremely unsuited to the primary application being imagined for them: knowledge retrieval.

The choice of the term “hallucination” to describe what is happening with these language models is fundamentally misleading, and feels calculated to obscure their probabilistic nature.

Characterizing the fabricated output as “hallucination” means, by implication, that correct output is “knowledge”—that the incorrect results are aberrations, not the result of the exact same blind processes that sometimes produce factually sound results.

Hey, ChatGPT, would I call the answers I get from you that are factually incorrect lies, untruths, or falsehoods?

Call them hallucinations.

Robotistry

@robotistry@sciencemastodon.com

You could argue that "hallucination" is a more accurate description - these systems literally have no mechanism to separate facts from lies - they have no intent to lie or tell the truth and can't represent those concepts.

Humans recognize hallucinations as wrong because they have systems in the brain that say "that can't have been real".

LLMs can't recognize lies because they don't have referents for "real".

May 28, 2023 at 11:33:38 AM

LLMs can’t “recognize” anything. They can’t “perceive” anything. And that’s why using sensory-oriented terminology (like “hallucination”) with LLMs is misleading and incorrect. It’s wrong both about what human hallucinations are and what’s going on in an LLM.

It’s more like when Trump is rambling on in one of his speeches, just stringing together phrases and thoughts haphazardly. So I’d like to propose that it be called “trumpeting”.

Didn't Sarah Palin do it first? How about "palining" (to avoid confusion since repurposing words can be tricky! 😆)
