AI Scribe Program Gave Hallucinated Notes to Ontario Doctors: Report

An auditor general report into the use of an artificial intelligence program by Ontario doctors has found that the technology frequently produced hallucinated and erroneous notes, raising serious concerns about patient safety and privacy.

AI Scribe Program Under Fire

The report, released Tuesday by Auditor General Shelley Spence, examined AI Scribe, a program designed to relieve physicians from note-taking during patient consultations. However, the investigation revealed widespread inaccuracies, including incorrect information, missing details, and AI-generated fabrications.

Since 2023, Ontario physicians have been permitted to use AI scribe technology only with patient consent. The program listens to the conversation between a patient and doctor and compiles the information into a SOAP (subjective, objective, assessment, and plan) note.
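As a rough illustration of the process described above, the sketch below shows how labeled transcript sections might be mapped into the four SOAP fields. This is a hypothetical simplification, not the actual AI Scribe implementation; the `SOAPNote` and `compile_note` names are invented for the example.

```python
from dataclasses import dataclass


@dataclass
class SOAPNote:
    subjective: str = ""  # patient's reported symptoms and history
    objective: str = ""   # clinician's observations and measurements
    assessment: str = ""  # diagnosis or clinical impression
    plan: str = ""        # treatment plan and follow-up


def compile_note(transcript_sections: dict) -> SOAPNote:
    """Map labeled transcript sections into a SOAP note.

    Sections absent from the transcript stay empty rather than being
    filled in -- the auditor general found that real scribes sometimes
    fabricated content for exactly these gaps.
    """
    return SOAPNote(
        subjective=transcript_sections.get("subjective", ""),
        objective=transcript_sections.get("objective", ""),
        assessment=transcript_sections.get("assessment", ""),
        plan=transcript_sections.get("plan", ""),
    )


note = compile_note({"subjective": "Patient reports a persistent cough."})
print(note.plan)  # empty: no plan was discussed, so none is invented
```

The deliberate design choice here, leaving undiscussed sections blank, is the opposite of the hallucination behaviour the report documents, where systems invented treatment plans that were never mentioned.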


Key Findings of the Report

Spence's report identified several types of errors across 20 vendors:

  • Incorrect information in 12 of 20 vendors, such as capturing a different drug than prescribed.
  • AI hallucinations in nine of 20 vendors, where the system fabricated information not based on any provided data.
  • Incomplete information in six of 20 vendors, omitting critical details like patients' mental health issues.
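Expressed as shares of the 20 vendors examined, the figures above work out as follows (a simple restatement of the report's counts, not additional data from the audit):

```python
# Counts of vendors exhibiting each error type, out of 20 examined,
# as reported by the auditor general.
TOTAL_VENDORS = 20
findings = {
    "incorrect information": 12,
    "hallucinations": 9,
    "incomplete information": 6,
}

# Convert counts to percentages of the vendor sample.
rates = {error: count / TOTAL_VENDORS * 100 for error, count in findings.items()}
for error, pct in rates.items():
    print(f"{error}: {pct:.0f}% of vendors")
```

So incorrect information appeared in 60% of vendors, hallucinations in 45%, and incomplete information in 30%.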

Hallucinations included fabricated suggestions for treatment plans, such as referring a patient to therapy or ordering blood tests that were never discussed. Another example involved statements that "no masses were found" or that a patient had anxiety issues, even though these topics were never mentioned in the simulated recordings.

Privacy Risks Highlighted

The report also emphasized significant privacy risks. AI-generated notes contained missing or incomplete information 85% of the time, often omitting vital details about patients' mental health. The auditor general warned that using AI in healthcare must not compromise the confidentiality of intimate patient information.

"When Ontarians see their doctor, they need to share intimate information about their health, their bodies, and their personal lives to receive proper care," the report stated. "Ontarians expect this extremely personal information to be kept private and confidential. Using AI to assist in providing health care must not come at the cost of compromising privacy."

The findings have sparked calls for stricter oversight and safeguards before AI tools are further integrated into medical practice.
