Senate Finance Committee Weighs Healthcare AI Oversight


Senators weighed how best to oversee artificial intelligence applications in healthcare at a Senate Finance Committee hearing on Thursday, focusing in particular on preventing algorithmic bias and unfair care denials by health plans.

AI holds a lot of promise to improve efficiency, alleviate burnout among stressed providers and lower ever-increasing healthcare costs, Sen. Ron Wyden, D-Ore., said during the hearing. But the products could also replicate racial, gender or disability bias and potentially worsen existing healthcare disparities, Wyden said.

“It is very clear that not enough is being done to protect patients from bias in AI,” he said. “[…] Congress now has an obligation to ensure the good outcomes from AI set the rules of the road for new innovations in American healthcare.”

AI could do more harm than good without careful oversight, said Ziad Obermeyer, an associate professor at the University of California, Berkeley. His research on a family of algorithms designed to flag patients at higher risk of future health problems found significant racial bias.

The algorithms used cost data to predict future care needs, but underserved patients generate less spending because of access barriers or discrimination. As a result, Black patients were less likely to be flagged for extra care than their White counterparts.

“The AI saw that fact clearly, it predicted the cost accurately,” Obermeyer said. “But instead of undoing that inequality, it reinforced it and enshrined it in policy.”
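To make the mechanism Obermeyer described concrete, here is a minimal, hypothetical sketch in Python (not his actual model or data): two simulated groups have identical underlying health needs, but one group's observed spending is suppressed by access barriers, so an algorithm that ranks patients by predicted cost flags that group far less often.

```python
# Hypothetical illustration (not Obermeyer's model or data): when an algorithm
# is trained on healthcare *cost* as a stand-in for *need*, any group whose
# spending is suppressed by access barriers gets flagged less often for extra
# care, even if its underlying health needs are identical.
import random

random.seed(0)

def simulate_patient(access_barrier: bool):
    need = random.gauss(50, 15)                      # true (unobserved) health need
    cost = need * (0.6 if access_barrier else 1.0)   # barriers suppress observed spending
    return need, cost

group_a = [simulate_patient(access_barrier=False) for _ in range(10_000)]
group_b = [simulate_patient(access_barrier=True) for _ in range(10_000)]

# "Algorithm": flag the patients with the highest predicted cost for extra care.
all_costs = sorted(cost for _, cost in group_a + group_b)
threshold = all_costs[int(0.8 * len(all_costs))]     # top ~20% by cost

def flag_rate(group):
    return sum(cost >= threshold for _, cost in group) / len(group)

print(f"Flag rate, no barriers:   {flag_rate(group_a):.1%}")
print(f"Flag rate, with barriers: {flag_rate(group_b):.1%}")
# Both groups have the same distribution of need, but the barrier group is
# flagged far less often -- the cost proxy reproduces the access gap as policy.
```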

Witnesses listed a number of ways to keep an eye on the technology’s use in the sector, including convening groups of experts to hash out standards that federal programs like Medicare could enforce, or regulating the tools the way the Food and Drug Administration evaluates medicines.

Courts might be able to play a role, but they may struggle to understand complex AI when cases come before them, said Sen. Bill Cassidy, R-La. 

Healthcare professional organizations could also help evaluate AI products; for example, the American College of Cardiology could serve as a third-party validator for tools geared toward cardiologists. But that might quickly become complicated, since patients’ conditions often overlap across specialties.

“A cardiology patient with congestive heart failure can have kidney disease, and can have diabetes and hypertension and be at risk of stroke,” he said. “[…] But actually, that’s the one that seems most valid to me, because you actually have subject matter expertise kind of penetrating there.”

Senators also raised concerns about insurers using predictive algorithms to inform coverage decisions, particularly in the Medicare Advantage program. 

Some payers — like Humana, UnitedHealth and Cigna — have faced lawsuits alleging they use algorithms to improperly deny claims. The UnitedHealth suit cites a Stat investigation published in November that found the insurer used an algorithm to predict length of stay in rehabilitation facilities, and pushed employees to cut off coverage even for seriously ill seniors.  

“Until CMS can verify that AI algorithms reliably adhere to Medicare coverage standards by law, then my view on this is CMS should prohibit insurance companies from using them in their MA plans for coverage decisions,” said Sen. Elizabeth Warren, D-Mass. “They’ve got to prove they work before they put them in place.”

Medicare is a flagship healthcare program, and other insurers follow its lead, Wyden said. But Medicaid beneficiaries are also affected by algorithmic benefit decisions, and they may be even more vulnerable.

Rates of appeals and overturned decisions aren’t enough to determine whether a person in the loop is correcting errors in Medicaid programs, said Michelle Mello, a professor of law and health policy at Stanford University.

“This group of enrollees does not appeal in force,” she said. “They just simply don’t have the social capital to do that.”
