Using GPT-3 to structure clinical notes
I'm a product designer experimenting with applications for GPT-3 in the mental healthcare space, and I'm encouraged by the results of some experiments in automating the updating of a patient's chart from unstructured clinical notes.
Today, this workflow (note -> chart) is almost entirely manual and consumes a large portion of a clinician's time. Automating this process could free up that time to be better spent delivering care.
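A minimal sketch of what that note -> chart step might look like: prompt the model to emit only a fixed set of chart fields as JSON, then parse and whitelist the reply before it touches the chart. The field names, prompt wording, and `parse_chart_update` helper here are all illustrative assumptions, not the actual design.

```python
import json

# Illustrative chart schema -- an assumption, not a real EHR schema.
CHART_FIELDS = ["chief_complaint", "medications", "follow_up"]

def build_prompt(note: str) -> str:
    """Ask the model to return only the listed fields as JSON."""
    return (
        "Extract the following fields from the clinical note below as JSON "
        f"with keys {CHART_FIELDS}. Use null for anything not stated.\n\n"
        f"Note:\n{note}"
    )

def parse_chart_update(model_output: str) -> dict:
    """Parse the model's reply, keeping only the expected keys.

    Whitelisting keys guards against the model inventing extra fields.
    """
    data = json.loads(model_output)
    return {k: data.get(k) for k in CHART_FIELDS}

# Example: a reply the model might produce for a short note.
reply = (
    '{"chief_complaint": "insomnia", '
    '"medications": ["trazodone"], "follow_up": "2 weeks"}'
)
print(parse_chart_update(reply))
```

The actual completion call is omitted; the point is that the structured-output contract (fixed keys, JSON only, null for missing data) does most of the work of making the output machine-checkable.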
The fact that someone without deep NLP/ML experience can bootstrap something with this much potential impact is incredibly exciting to me.
Isn't it well known by now that ChatGPT isn't reliable enough to build anything really serious on top of?
It's not a good store of knowledge. It's pretty good at transforming information from one form to another.
I have the feeling we really don't know for certain what it's good at. Because of its highly statistical nature and completely opaque internal representation, we can only observe some behaviors empirically. It's a total black box: nothing about it can be proved or introspected.
Building anything related to people's health on top of that is just wrong imho.
Exactly. GPT happens to hallucinate a lot of "facts", making it unreliable for a large breadth of tasks. However, it's quite adept at many NLP tasks and can be fine-tuned to further improve its domain expertise.
In any sensitive application like clinical charting, one would also want a workflow for reviewing GPT's output for erroneous data before accepting it into the record.