A Calif. teen trusted ChatGPT for drug advice. He died from an overdose

sfgate.com

4 points by c420 a month ago · 2 comments

uyzstvqs a month ago

So "drug advice" refers to drug abuse, not medicine. ChatGPT gave several bad replies, but this was in between constant warnings about the dangers, which were ignored. The guy even told ChatGPT to "[not] get into the medical stuff about the dangers".

Why did ChatGPT give the bad replies? It appears that it fell for the false "harm reduction" narrative. This should obviously be improved.

Saying that he "trusted ChatGPT for drug advice" or attributing the overdose to ChatGPT is straight up misleading. This is ragebait, and clearly not from a reliable source.

chenyusu a month ago

We should develop methods for detecting the nonsense embedded in the coherent stories that LLMs tell.
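One way to read this suggestion is a self-consistency check: confabulated details tend to vary when the same question is asked repeatedly, while well-grounded answers tend to repeat. Below is a minimal sketch of that heuristic; `ask_model` is a hypothetical wrapper around whatever LLM API is in use, and the sample count and agreement threshold are illustrative, not tuned values.

```python
from collections import Counter
from typing import Callable, List


def looks_like_nonsense(
    ask_model: Callable[[str], str],  # hypothetical: wraps your LLM call
    prompt: str,
    n_samples: int = 5,
    agreement_threshold: float = 0.6,
) -> bool:
    """Ask the same question several times and flag the answer as suspect
    when the model cannot agree with itself across samples."""
    answers: List[str] = [ask_model(prompt).strip().lower() for _ in range(n_samples)]
    _, top_count = Counter(answers).most_common(1)[0]
    agreement = top_count / n_samples
    # Low agreement = the "coherent story" changes on every retelling.
    return agreement < agreement_threshold
```

Exact-string agreement is obviously crude; a real detector would compare answers semantically or check claims against a reference source, but the sampling-and-comparing structure would stay the same.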
