
"Nine out of 20 AI systems reportedly "fabricated information and made suggestions to patients' treatment plans" that weren't discussed in the recordings. According to the report, evaluators spotted potentially devastating incorrect information in the sample reports, such as no masses being found, or patients being anxious, even though these things were never discussed in the recordings."
"Twelve of the 20 systems evaluated inserted incorrect drug information into patient notes, while 17 of the systems "missed key details about the patients' mental health issues" that were discussed in the recordings. Six of the systems "missed the patients' mental health issues fully or partially or were missing key details," per the report."
"As part of the procurement process, officials conducted evaluations using simulated doctor-patient recordings. Medical professionals then reviewed the original recordings alongside the AI-generated notes to evaluate their accuracy."
"The AI systems approved for Ontario healthcare providers routinely missed critical details, inserted incorrect information, and hallucinated content that neither patients nor clinicians mentioned, according to a provincial audit of 20 approved vendors' systems."
Read the full story at The Register.