
My dad's a doctor and the amount of paperwork you need to write these days has gone up a whole bunch, even higher for him than the 40% of time mentioned in the article. There's a big temptation to fudge on the reports too, since it's mostly checking boxes to CYA for lawsuits. Thus, I think this is actually one area where, because the people writing the report are somewhat misaligned with the purpose of the report, ML-powered report writing is a better solution.


If the purpose of the report is to CYA, then you 100% do not want an AI filling it out. Your A will not be C'd.


It would be if it became standard practice and the report-filling software were an FDA-approved medical device.


A report which may or may not contain hallucinations and random shit will never be enough to cover your ass.

How would that work?


Reports can already contain errors and random shit. All you need to cover your ass is the ability to pass liability onto someone else. If the errors are because of a software defect rather than because of user error, your ass is more covered, not less. Though it's true that certain sorts of errors in certain sorts of reports could be problematic in and of themselves.


From what I've heard, a lot of doctors and cops make up shit in reports today anyway. If the summarization is citing back the source in the audio transcript and generally doing RAG stuff, I actually think it's more aligned with the interests of the public, especially in cases where the doctor or cop wants to fudge or smooth over a bad situation.
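To make the "citing back the source" idea concrete: here's a minimal sketch (entirely hypothetical, no real product or model is assumed) of the auditable half of that pipeline, where every generated report line carries a pointer to the transcript segment it was drawn from, so a reviewer can check claims against the source audio. The keyword filter stands in for whatever retrieval/summarization step an actual system would use.

```python
# Hypothetical sketch: citation-grounded report lines from a transcript.
# Each output line cites the (index, timestamp) of the segment supporting
# it, so the report can be audited against the original audio transcript.

def summarize_with_citations(segments, keywords):
    """segments: list of (timestamp, text) tuples from the transcript.
    keywords: stand-in for a real retrieval/relevance step.
    Returns report lines, each tagged with its source segment."""
    report = []
    for i, (ts, text) in enumerate(segments):
        if any(k in text.lower() for k in keywords):
            report.append(f"{text} [source: segment {i} @ {ts}]")
    return report

transcript = [
    ("00:01", "Patient reports chest pain since morning."),
    ("00:45", "Discussed weekend plans."),
    ("01:10", "Blood pressure measured at 140/90."),
]

lines = summarize_with_citations(transcript, ["pain", "blood pressure"])
for line in lines:
    print(line)
```

The point isn't the summarization itself but the audit trail: a fudged claim with no citation, or a citation that doesn't support it, is immediately visible, which is what makes this more aligned with the public's interest than a free-text report.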



