LLM Hallucination Seems Like a Big Problem, Not a Mere Speedbump

freddiedeboer.substack.com

8 points by blueridge 4 months ago · 3 comments

poulpy123 4 months ago

LLMs are statistical language models. They don't hallucinate, because they have no brain or senses to hallucinate with.
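
(As a minimal sketch of what "statistical language model" means here: a toy next-token sampler over a hypothetical bigram table. The table below is invented purely for illustration; a real LLM replaces it with a neural network trained on huge corpora, but the sampling principle is the same, and nothing in the statistics distinguishes true continuations from false ones.)

    import random

    # Hypothetical toy "language model": next-token probabilities learned
    # purely from co-occurrence counts in some training text. A real LLM
    # uses a neural network, but generation is still sampling from a
    # learned distribution over next tokens.
    bigram_probs = {
        "the":     {"capital": 1.0},
        "capital": {"of": 1.0},
        "of":      {"france": 0.5, "mars": 0.5},  # frequency, not truth
        "france":  {"is": 1.0},
        "mars":    {"is": 1.0},
        "is":      {"paris": 0.5, "olympus": 0.5},
    }

    def generate(token, max_len=6):
        """Sample a continuation token by token from the learned statistics."""
        out = [token]
        for _ in range(max_len):
            dist = bigram_probs.get(token)
            if not dist:
                break
            tokens, weights = zip(*dist.items())
            token = random.choices(tokens, weights=weights)[0]
            out.append(token)
        return " ".join(out)

    # Both outputs are equally fluent to the model:
    # "the capital of france is paris" or "the capital of mars is olympus".
    # The false one is not a malfunction; it falls out of the same sampling.
    print(generate("the"))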

  • more_corn 4 months ago

    Pedantic comment. It's commonly understood that "hallucination" means "made-up crap generated by an LLM." We could push for a better name like "fabrication," but then we'd have to re-educate the 95% of the population who don't even know LLMs aren't trustworthy.

    • poulpy123 4 months ago

      > "commonly understood"

      That's the point: even on Hacker News, LLMs are not understood at even the most basic level. I refuse to bow to the industry's buzzwords and help them confuse and scam people who don't know better.
