Comp adopts AI, girds for potential liabilities
- August 3, 2025
- Posted by: Web workers
- Category: Workers Comp
The workers compensation industry is embracing artificial intelligence yet remains cautious, as experts see potential pitfalls ranging from increased liability to challenges in interpreting the mounds of information that are a cornerstone of AI’s capabilities.
With some variance in capabilities, companies managing workers comp claims are using AI in predictive modeling, which pulls from a database of personal, injury and recovery information to help predict an injured worker’s claim trajectory and whether there is a risk of complications or litigation. The industry is also using generative AI to review hundreds of medical documents and produce claim summaries, which can provide recommendations for an injured worker’s care.
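As a rough illustration of the predictive-modeling side, the sketch below trains a toy classifier on hypothetical claim fields (age, days off work, prior claims, injury severity) to score litigation risk. The field names, sample data and scikit-learn model choice are assumptions for illustration only, not any carrier’s actual system.

```python
# A minimal, hypothetical sketch of claim-trajectory scoring; the fields,
# data and model choice are illustrative, not any insurer's actual system.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Toy claims history: demographic, injury and recovery signals.
claims = pd.DataFrame({
    "age":             [34, 52, 45, 29, 61, 38, 50, 44],
    "days_off_work":   [5, 120, 30, 2, 200, 14, 90, 60],
    "prior_claims":    [0, 2, 1, 0, 3, 0, 1, 2],
    "injury_severity": [1, 4, 2, 1, 5, 2, 3, 3],   # 1 = minor, 5 = severe
    "litigated":       [0, 1, 0, 0, 1, 0, 1, 1],   # target: did the claim litigate?
})

X = claims.drop(columns="litigated")
y = claims["litigated"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Score an open claim; the probability feeds a triage queue, not a final decision.
new_claim = pd.DataFrame([{"age": 47, "days_off_work": 75,
                           "prior_claims": 1, "injury_severity": 4}])
print(f"Estimated litigation risk: {model.predict_proba(new_claim)[0, 1]:.0%}")
```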
AI has gained traction in workers comp in part because it helps manage mundane tasks, freeing claims handlers to focus on claims that need more attention. Many programs have been deemed successful, saving insurers and employers money, particularly by getting injured workers the care they need and keeping claimants out of the courts. The industry also says AI is aiding in fraud detection, identifying commonalities in claims data to detect suspicious activity or providers.
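On the fraud-detection point, finding “commonalities” often starts with something as simple as profiling providers across the claims book. The snippet below is a hypothetical sketch that flags providers whose average billed amount sits well above the book-wide median; the data, field names and threshold are invented for illustration.

```python
# Hypothetical sketch of provider-level pattern screening; the data, fields
# and 2x-median threshold are illustrative, not a real detection rule.
import pandas as pd

claims = pd.DataFrame({
    "provider":   ["A", "A", "B", "B", "B", "C", "C", "D", "D", "D"],
    "billed_amt": [900, 1100, 950, 1000, 980, 4800, 5200, 1020, 990, 1010],
})

# Profile each provider: how much they bill on average, and how often.
profile = claims.groupby("provider")["billed_amt"].agg(["mean", "count"])

# Flag providers billing well above the book-wide norm for human investigation.
median_bill = claims["billed_amt"].median()
profile["flagged"] = profile["mean"] > 2 * median_bill

print(profile)  # provider C stands out; an investigator takes it from there
```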
“We’re using AI to help our adjusters get up to speed quickly on a claim and to help make the right decisions,” said Joe Powell, Fort Wayne, Indiana-based chief digital officer at Gallagher Bassett Services Inc.
Concern about giving a computer program too much power in workers comp was a sticking point during discussions in May at the National Council on Compensation Insurance’s Annual Insights Symposium, where experts chimed in on the fast inroads AI is making in the insurance industry.

Claims professionals tend to see “a huge number” of documents and medical records, particularly with larger claims, said Neil DeBlock, Palatine, Illinois-based vice president, head of workers compensation claims for Zurich North America.
“We can use AI to simplify those claims, bring them down, synthesize them, pick out the correct areas to look at; that helps us save so much time,” Mr. DeBlock said.
The medical community also is embracing AI in comp, while still relying on doctors and claims professionals as the “humans in the loop” on decision-making, said Dr. Jill Rosenthal, Boca Raton, Florida-based senior vice president and chief medical officer at Zenith Insurance, a unit of Fairfax Financial Co.
“As a physician, I receive hundreds or thousands of pages of records, and now I have a tool that will help me synthesize that and create a timeline,” Dr. Rosenthal said. “I want to know what happened that first day this person got hurt, even if it’s 20 years later. I want to understand the progression, to know how we got here today, so that I know where to go tomorrow.”
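The timeline Dr. Rosenthal describes can be pictured, at its simplest, as pulling dated events out of record text and sorting them chronologically. The sketch below does that with a basic date pattern; real summarization tools layer OCR, language models and clinical review on top, and the sample notes are invented.

```python
# Hypothetical sketch: extract dated events from record text and order them.
# Real tools add OCR, NLP and clinical review; these notes are invented.
import re
from datetime import datetime

notes = [
    "06/02/2005 Follow-up visit, MRI of lumbar spine ordered",
    "03/14/2005 Initial injury reported, low back pain after fall from ladder",
    "01/20/2015 Lumbar fusion performed, post-op PT prescribed",
]

events = []
for line in notes:
    match = re.search(r"\d{2}/\d{2}/\d{4}", line)
    if match:
        events.append((datetime.strptime(match.group(), "%m/%d/%Y"), line))

# Chronological timeline, oldest first.
for when, text in sorted(events):
    print(when.date(), "-", text)
```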
The industry is steering away from using AI to make claims decisions on compensability or treatments, experts say, as litigation plagues the much larger health insurance industry, which faces lawsuits alleging denial of care by what many have referred to as an algorithm. (See related story below.)
Using AI to curate reports has risks if the information is flawed or too limited, said Kevin Kalinich, Glen Ellyn, Illinois-based intangible assets global collaboration leader at Aon PLC.
“Generative AI is generating new information based on the data it has, so if the data is biased and discriminatory, saying this particular classification (of worker) is more likely to suffer this type of injury or fatality, or recover quicker or not, that’s one of the drawbacks,” he said.
One risk management strategy is to constantly test the AI model, according to Mr. Powell, who said cautious comp organizations can compare an AI model’s output with a human’s determination on claim summaries. Such a side-by-side look also can help train the AI model, he said.
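A minimal version of that side-by-side test might look like the sketch below, which scores agreement between an AI-generated summary and an adjuster’s own summary using a crude word-overlap metric and routes low-agreement claims back to a person. The metric, threshold and sample text are assumptions for illustration only.

```python
# Hypothetical audit sketch: compare AI claim summaries against what a human
# adjuster wrote, using crude token overlap. A real evaluation would use
# richer metrics and a formal review workflow.
def overlap_score(ai_summary: str, human_summary: str) -> float:
    """Jaccard similarity over lowercase word sets (0 = no overlap, 1 = identical)."""
    ai_words = set(ai_summary.lower().split())
    human_words = set(human_summary.lower().split())
    if not ai_words and not human_words:
        return 1.0
    return len(ai_words & human_words) / len(ai_words | human_words)

# Invented side-by-side samples pulled for audit: (AI summary, human summary).
pairs = [
    ("Lumbar strain from lifting incident 6 weeks PT recommended",
     "Lumbar strain lifting incident physician recommends 6 weeks of PT"),
    ("Wrist fracture surgery scheduled out of work 8 weeks",
     "Shoulder tear MRI pending modified duty available"),
]

for claim_id, (ai, human) in enumerate(pairs, start=1):
    score = overlap_score(ai, human)
    status = "OK" if score >= 0.5 else "ROUTE TO HUMAN REVIEW"
    print(f"claim {claim_id}: agreement {score:.2f} -> {status}")
```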
Another area of concern is evolving workers compensation regulation, with rules that vary by jurisdiction, and whether AI can keep up.
“Workers comp is so complicated,” said Irina Simpson, Philadelphia-based executive vice president, workers compensation for Gallagher Bassett. “The regulations are getting more complex in every single state … and it’s going to get more complicated, more comprehensive.”
Humans will be needed to interpret rules, which vary significantly depending on the business or organization involved. For example, private employers don’t have to navigate presumption laws that apply to first responders and others in the public sector, she said. “I don’t see where AI is quite there in terms of being able to do that level of analysis,” Ms. Simpson said.
While he lauded AI’s capabilities, “You’ve got to keep the human element involved,” said Greg Larson, Stevens Point, Wisconsin-based assistant vice president of workers compensation claims at Sentry Insurance.
“You have to have someone who’s monitoring and watching the technology,” Mr. Larson said. “It is still technology, and how that information is pulled together has to be consistently monitored. What we have in place today is monitored daily, just to make sure that it’s pulling the accurate information together.”
Health insurers face lawsuits alleging AI use factored into claim denial
Two large health insurers, UnitedHealth Group Inc. and Cigna Corp., are defendants in ongoing class-action lawsuits filed in 2023, and analysts predict more litigation could emerge as artificial intelligence use evolves.
In Estate of Gene B. Lokken et al. v. UnitedHealth Group, Inc. et al., which a federal judge in February allowed to proceed in part, estates of patients whose post-acute care coverage was terminated filed a class-action complaint alleging that the insurer’s reliance on AI tools to deny some medical claims under Medicare Advantage plans constituted breach of contract, breach of the implied covenant of good faith and fair dealing, unjust enrichment and insurance bad faith.
Plaintiffs in the ongoing Kisting-Leung et al. v. Cigna Corp. et al. made similar claims.
Denying services based on AI is a “high risk” for the comp industry, said Greg Larson, Stevens Point, Wisconsin-based assistant vice president of workers compensation claims at Sentry Insurance.
“That’s why we’ve watched it very closely, especially with what we see in group health,” he said.
AI is used mostly to augment claims handling, said Bill Chepulis, Chicago-based head of large casualty for U.S. national accounts at Zurich North America. “We’re not letting a machine or any kind of artificial intelligence make a decision,” he said.
“We use AI to be able to go through hundreds and hundreds of claims documents and look for trends,” Mr. Chepulis said, adding that patterns among types of injuries, industries and claimants can help steer the claims manager. “It allows us to ask, is (the claim) being handled by the right adjuster? Do we need somebody to interact with this fast and get in front of it all?”
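A toy version of that steering logic, with invented flags, weights and cutoff, might look like the sketch below: claims whose patterns suggest complexity get routed to a senior adjuster for early intervention.

```python
# Hypothetical routing sketch; the fields, weights and cutoff are invented
# for illustration and do not reflect any insurer's actual triage rules.
def route_claim(injury_type: str, days_open: int, attorney_involved: bool) -> str:
    """Return a suggested handling queue based on simple pattern flags."""
    complexity = 0
    complexity += 2 if injury_type in {"back", "shoulder"} else 0  # slower-recovery injuries
    complexity += 2 if days_open > 90 else 0                       # aging claim
    complexity += 3 if attorney_involved else 0                    # litigation signal
    return "senior adjuster - early intervention" if complexity >= 4 else "standard queue"

print(route_claim("back", days_open=120, attorney_involved=False))  # senior adjuster
print(route_claim("wrist", days_open=10, attorney_involved=False))  # standard queue
```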
Keeping humans involved will be paramount, said Joe Powell, Fort Wayne, Indiana-based chief digital officer at Gallagher Bassett Services Inc. “That’s where lawsuits will pop up, where you’ve taken the people out of the loop.”
AI in health care is like self-driving cars, a technology consumers remain cautious about, said Joel Raedeke, Chicago-based technology and data science officer at Broadspire Services Inc. “We still believe that the adjuster who’s assigned to the claim needs to be the strategic owner of the decisions made on that claim,” he said. “You could think of it like self-driving cars; until you prove that you can drive relatively perfectly, you’re not ready to go to self-driving.”


