Ontario Judge Criticizes Lawyer for Fabricated Legal Quotes in Court Filing

An Ontario lawyer has come under severe judicial scrutiny after submitting seven completely fabricated quotations from court cases in a legal filing. The presiding judge questioned whether the lawyer's explanation, that the fabrications were "human error" rather than the product of artificial intelligence, actually makes the situation worse.

Judge Questions Lawyer's Explanation of Fabricated Legal Material

Ontario Superior Court Judge Frederick Myers delivered a scathing written decision regarding lawyer Khalid Parvaiz's submission of fake legal quotations during court proceedings. The judge noted that Parvaiz had cited real cases with correct citations but then added quotations that simply do not exist in those cases.

"This decision may involve the next generation of A.I. hallucinations," Judge Myers wrote in his decision, referencing the phenomenon where artificial intelligence systems generate plausible but false information.


Seven Fabricated Quotations Discovered

In his ruling, Judge Myers meticulously documented seven paragraphs of made-up quotations that appeared inside quotation marks and were attributed to real case law in Parvaiz's submitted material. After each false quotation, the judge wrote: "Nothing like this quotation appears in the case. It is wholly made up."

The bizarre situation emerged during court proceedings involving a real estate investment dispute. After losing his motion regarding legal costs at a hearing that featured significant acrimony between lawyers, both sides were instructed to submit written factums arguing how much costs should be awarded.

Lawyer Claims Human Error, Not AI Use

On February 26, Parvaiz responded in writing to the court's concerns about his submission. "I acknowledge that on review my recitation of the legal principles and substance of the cases referenced was not accurate," the lawyer wrote, according to the published court decision.

Parvaiz characterized the errors as "clear errors on my part and the result of a lack of due care." He specifically stated: "These were, however, human errors and while I take full responsibility for them, I wish to advise the Court that I did not use or rely on artificial intelligence or other such tools in preparing the reply factum."

The lawyer claimed the errors arose from his misreading of the cases cited and expressed that he had reflected on them and learned from the experience.

Judge Considers Multiple Possibilities

Judge Myers outlined two possible explanations for the fabricated quotations in his decision: either the lawyer used AI and then misled the court about it, or he made up rulings himself and presented them as real law. The judge added that "the cover-up may be worse than the initial error."

The judge noted that "the most obvious explanation for these fake quotations is that counsel used A.I. to draft the factum," but clarified he was not making that finding definitively since he had not received full submissions on the issue.

Professional Consequences and Referral

Khalid Parvaiz has been referred to the Law Society of Ontario, the governing body of the legal profession in the province. Judge Myers mused that a referral to police could have been an option as well, indicating the seriousness with which he viewed the submission of fabricated legal material.

The opposing lawyers in the case had sought increased costs on the ground that Parvaiz had allegedly submitted false, AI-generated material, though Parvaiz has consistently denied using any artificial intelligence tools in preparing his submission.

This case highlights growing concerns within the legal profession about the potential misuse of artificial intelligence tools and the ethical obligations of lawyers to verify the accuracy of all material submitted to courts. The incident serves as a cautionary tale about the importance of due diligence in legal research and the serious consequences that can result from submitting inaccurate or fabricated legal authorities.
