AI "hallucination" in B.C. court case called wake-up call for justice system

"It doesn't surprise me that lawyers don't know a lot about it"

Vancouver tech lawyer Ryan Black's work with video game companies put him in a position to watch the rise of artificial intelligence in the industry.

Now he finds himself on the front lines again as his own profession grapples with the technology.

"The degree to which it was impacting game studios really surprised people," said Black, who helped the Law Society of British Columbia draft advice for lawyers about the use of AI.

"The generative (AI) revolution kind of has really hit people really hard in terms of, 'Oh my gosh, we have to really pay attention to this now,' so I would say that it's a new thing for a lot of people," he said, referring to the type of technology that can create arguments and essays based on prompts from a user.

"It doesn't surprise me that lawyers don't know a lot about it."

The rise of generative AI tools like ChatGPT, he said, is a "revolutionary change to the practice of law," but a recent ruling by the B.C. Supreme Court shows lawyers must use the technology cautiously and skeptically, legal experts say.

In a costs ruling released Feb. 20 related to a child custody case, it was revealed that Vancouver lawyer Chong Ke had used ChatGPT to prepare material submitted in the case.

The material included citations to cases that don't exist, something her opponent in the case called an AI "hallucination."

Ke told the court that discovering that the cited cases were fictitious was "mortifying," and she quickly informed the Law Society and admitted a "lack of knowledge of the risks" of using AI to draft court submissions.

"I am now aware of the dangers of relying on AI-generated materials," Ke said in an affidavit. "I understand that this issue has arisen in other jurisdictions and that the Law Society has published materials in recent months intended to alert lawyers in B.C. to these dangers."

Ke apologized to the court and her fellow lawyers.

Her lawyer John Forstrom said in an email that the case "has provoked significant public interest, but the substance of what happened is otherwise unremarkable."

"I'm not sure that the case has any significant implications regarding the use of generative AI in court proceedings generally," Forstrom said.

"Ms. Ke's use of AI in this case was an acknowledged mistake. The question if or how generative AI might appropriately be employed in legal work did not arise."

The society is now investigating Ke's conduct, spokeswoman Christine Tam said in an email.

"While recognizing the potential benefits of using AI in the delivery of legal services, the Law Society has also issued guidance to lawyers on the appropriate use of AI and expects lawyers to comply with the standards of conduct expected of a competent lawyer if they do rely on AI in serving their clients," Tam said.

The law society's guidance, issued in late 2023, urges lawyers to seek training in the use of the technology, and to be aware of confidentiality issues around data security, plagiarism and copyright concerns, and potential bias in materials produced by the technology.

Law societies and courts in other provinces and territories have also produced guidance on the use of AI. For instance, the Supreme Court of Yukon said in a June 2023 practice direction that if any lawyer relies on AI "for their legal research or submissions in any matter and in any form," they must tell the court.

For Black, with the firm DLA Piper, the use of AI is causing a lot of "necessary angst about relying on a tool like this to do any real heavy lifting."

Black said delivering justice requires the impartiality of a "human peer," capable of evaluating and making important legally binding decisions.

He said he's encountered lawyers and judges ranging from "completely dialed into it, to completely averse to it, to completely agnostic to it."

He said he's been "impressed by the pace of the technology," but caution and skepticism around any materials generated by the technology are essential for lawyers now and into the future.

Reflecting on the Ke case and others like it, Black said tools like ChatGPT are "really good autocorrect tools that do a fantastic job of relating text to other text, but they have no understanding of the world, they have no understanding of reality."

UBC law professor Kristen Thomasen said in an interview that the B.C. Supreme Court case shows not only the limitations of the technology, but also the need for lawyers and other professionals "to be critical of the technologies that they're using."

Thomasen said evaluating the strengths and weaknesses of technology has to be done "in spite of what is often a lot of hype."

She said it's important not to delegate work that requires a human element to a computer system in "high stakes" professions like law and policing, where new, potentially problematic technologies should be approached and employed with caution.

Thomasen said the technology has been described as a "living thing" or an existential threat to humanity, or thought of as a "superhuman ghost in the machine," but despite being highly sophisticated, it's just doing math based on data scraped from the internet.

She said that stepping back from seeing it as a "person" would help institutions, students and teachers better understand what the technology actually does.

"As we see how it progresses, I think it makes sense to then, kind of like the law societies, keep developing more refined and detailed guidelines or rules as we gain a better understanding of what the technology looks like," she said.

The judge in the case that involved Ke said it would be "prudent" for her to tell the court and opposing lawyers if any other material employed AI technology like ChatGPT.

"As this case has unfortunately made clear, generative AI is still no substitute for the professional expertise that the justice system requires of lawyers," Justice David Masuhara wrote in his costs ruling. "Competence in the selection and use of any technology tools, including those powered by AI, is critical. The integrity of the justice system requires no less."

Black said artificial intelligence technology isn't going away, and any rules developed now will likely need changing due to the "breakneck speed" of its evolution.

"We are for sure now in a world where AI will exist," he said. "There is no un-ringing this bell as far as I'm concerned."
