VestNexus.com


AI a growing concern in the workers comp legal community

The use of artificial intelligence in workers compensation courts is a growing concern, attorneys say, as evidence and documents suspected to have been generated with AI become more common.

A California law firm was sanctioned in July after the state’s Workers’ Compensation Appeals Board suspected it used AI to draft a petition.

AI use in the legal community is common, as some programs can help with grammar, writing and crafting legal summaries. Professionals should remain cautious, however, as AI has been known to “hallucinate,” or fabricate, case law, according to Sure Log, of counsel with El Segundo, California-based workers comp defense firm Michael Sullivan & Associates.

Such was the case at Orlando, Florida-based personal injury law firm Morgan & Morgan, where three attorneys were sanctioned over motions containing nine case citations, eight of which a federal judge alleged were fabricated, according to documents filed in February in federal court in Wyoming that used the word "hallucination."

“I’ve seen the AI hallucinate, and it would make up cases, or there’d be situations where one would cite the right case, but it would say it stood for the wrong thing,” Mr. Log said. Workers comp is ripe for such confusion, as every state has its own case law, often used in documents, he said.

That’s what’s suspected in John Richard Sedano vs. Live Action General Engineering Inc., where law firm Dietz, Gilmor & Chazen was sanctioned $750 — down from $2,500 after promptly admitting it had made errors — over a court document that “failed to cite the evidentiary record, improperly attached documents already contained in the record, raised issues not raised or decided at trial, failed to cite to proper legal authority, and appeared to create citations that did not exist,” according to a document on sanctions issued July 29 by the Workers Compensation Appeals Board in California.

In that case, the Appeals Board said in its citation that the presiding workers compensation judge “speculates” that the firm “may have used artificial intelligence” and that the “suspicion is based on both the formatting and syntax of the Petition, along with the nature of the errors, such as confusing a 15% attorney’s fee award as a 15% permanent disability award, the inclusion of an unrelated issue, and the citations to legal authority that are irrelevant, do not exist, and/or are not citeable.”

Dietz, Gilmor & Chazen "responded and apologized for the filing and the sanctions were reduced," and there "was no litigation or sanction regarding an allegation of AI," Richard Lynn, Rancho Cucamonga, California-based partner and managing attorney at the firm, said in an email. "The attorney that filed the document is no longer with our firm. This has never happened before, and safeguards have been put into place to make sure that this does not happen again."

Last year the American Bar Association issued ethics guidelines for attorneys using “AI tools,” saying lawyers should understand “the benefits and risks associated” with the technologies.

While the California comp case was considered rare, as the Appeals Board noted, those handling workers comp cases say AI-generated court documents are a common issue on the plaintiffs' side, mostly from injured workers representing themselves.

Jeff Adelson, a partner and general counsel with Irvine, California-based law firm Bober, Peterson & Koby LLP, said he is working on a case where a plaintiff submitted as an exhibit a document created using AI.

There are now programs that can detect AI use and “immediately tell the court from the telltale signs whether the document was prepared via AI and can check the citations to see if they’re valid,” according to Mr. Adelson, who called the practice of using AI in the legal community “dangerous.”

“The days of anyone getting away with this are slowly going to be gone,” he said.

Chesterfield, Missouri-based Harris Dowell Fisher & Young does not allow the use of any AI in drafting court petitions, said shareholder Bradley Young. Mostly, he said, AI is used to draft medical summaries, generated by having the AI review hundreds of pages of records, which are provided alongside the summaries. That way, treating physicians and medical examiners get a bird's-eye view of a claim along with the underlying documents.

Such parameters, however, don’t stop AI from showing up in some form in cases, Mr. Young said, adding that the wayward documents can clog up the courts. Last year, he said, the Missouri Court of Appeals dismissed a case in which a pro se plaintiff submitted citations of nonexistent cases.

“Now, I’ve got a case with a pro se claimant. She did the same thing. She filed a brief that ChatGPT wrote.”