Federal Court Judge Rebukes Lawyer for Submitting AI-Generated 'Hallucinated' Cases

Federal Court Judge Issues Stern Rebuke Over AI-Generated Legal Citations

A Federal Court of Canada judge has delivered a sharp rebuke to a lawyer who submitted court documents containing four completely fabricated legal precedents generated by artificial intelligence. The incident occurred during proceedings related to Ottawa's termination of an Indigenous fisheries program, raising significant concerns about the growing misuse of AI technology in legal practice.

Non-Existent Cases Cited in Judicial Review Request

According to court documents, the lawyer representing an Indigenous fisheries agency cited four supposed court decisions to support a request for additional time to file for judicial review against the federal Fisheries Department. However, Justice Danielle Ferron discovered during a March hearing in Montreal that these cases were entirely fictional.

"There is one substantial problem with all of these decisions: They simply do not exist," wrote Justice Ferron in her ruling, highlighting what she described as a serious breach of professional standards.


AI 'Hallucinations' in Legal Filings

The Fisheries Department, which opposed both the filing extension and the judicial review application regarding its termination of a contract with the National Indigenous Fisheries Initiative, argued that the "only logical reason" for the inclusion of these fabricated cases was the lawyer's reliance on generative AI tools without proper disclosure.

Joseph MacKinnon, the in-house lawyer for the Indigenous organization, apologized to the court, explaining that he had subcontracted the legal research due to urgent filing deadlines and the non-profit's limited resources. He admitted failing to verify the AI-generated references before submission.

Violation of Court Directives and Professional Standards

Justice Ferron characterized the submission as "careless and contrary to a lawyer's professional code," emphasizing that presenting AI-generated hallucinations as legitimate legal precedents constitutes more than a simple error. She noted that such actions can:

  • Mislead the court and waste judicial resources
  • Put a litigant's case at significant risk
  • Cause reputational damage to clients
  • Constitute an abuse of process comparable to making false statements

The court imposed higher-than-normal costs against the lawyer as a consequence of this violation. This decision aligns with a 2024 Federal Court directive requiring any document generated by artificial intelligence to be clearly labeled as such—a policy now adopted by courts across Canada, including the British Columbia Supreme Court.

Growing Judicial Concern About AI in Legal Practice

This incident reflects increasing judicial anxiety about artificial intelligence's role in legal proceedings. Justice Ferron referenced a 2024 British Columbia ruling that equated presenting erroneous AI content with making false statements, calling it an abuse of process.

Similarly, a 2025 Ontario case addressed comparable issues, with the judge declaring the matter "serious" despite the litigant's claims of personal circumstances. Retired Federal Court Chief Justice Paul Crampton recently expressed concern in an interview with an online legal magazine about the low disclosure rates of AI use in court filings.

For those who "thumb their noses" at the court, Crampton warned, "the court won't hesitate to award costs against them or hold them in contempt of court," highlighting the judiciary's growing intolerance for undisclosed AI assistance in legal work.

Broader Implications for Legal Profession

This case underscores critical questions about:

  1. The ethical responsibilities of lawyers when using AI tools
  2. The verification processes necessary for AI-generated content
  3. The potential consequences for clients when lawyers misuse technology
  4. The balance between efficiency gains and professional standards in legal practice

As artificial intelligence becomes increasingly integrated into legal research and document preparation, this Federal Court ruling serves as a cautionary tale about the dangers of over-reliance on unverified AI outputs and the professional obligations that accompany technological adoption in the justice system.