California Limits Health Plan Use of AI in Utilization Management


California Governor Newsom signed Senate Bill 1120 into law, known as the Physicians Make Decisions Act. At a high level, the Act aims to safeguard patient access to treatments by mandating a certain level of health care provider oversight when payors use AI to review the medical necessity of requested medical services and, by extension, coverage for such medical services.

Generally, health plans use a process known as utilization management, pursuant to which plans review requests for services (also known as prior authorization requests) in an effort to limit utilization of insurance benefits to services that are medically necessary and to avoid costs for unnecessary treatments. Increasingly, health plans are relying on AI to streamline internal operations, including to automate review of prior authorization requests. In particular, AI has shown some promise in reducing costs and in addressing lag times in responding to prior authorization requests. Despite that promise, the use of AI has also raised challenges, such as concerns about AI producing results that are inaccurate, biased, or that ultimately result in wrongful denials of claims. Many of these concerns come down to questions of oversight, and that is precisely what the Act aims to address.

As a starting point, the Act applies to health care service plans and entities with which plans contract for services that include utilization review or utilization management functions ("Regulated Parties"). For purposes of the Act, a "health care service plan" includes health plans licensed by the California Department of Managed Health Care ("DMHC"). Significantly, the Act contains a number of specific requirements applicable to a Regulated Party's use of an AI tool that has utilization review or utilization management functions, including most notably:

  • The AI tool must base determinations as to medical necessity on:
    • The enrollee's medical or other clinical history;
    • The enrollee's individual clinical circumstances, as presented by the requesting provider; and
    • Other relevant clinical information contained in the enrollee's medical or other clinical record.
  • The AI tool cannot base determinations solely on a group dataset.
  • The AI tool cannot "supplant health care provider decision making."
  • The AI tool may not discriminate, directly or indirectly, against enrollees in a manner that violates federal or state law.
  • The AI tool must be fairly and equitably applied.
  • The AI tool, including specifically its algorithm, must be open to inspection for audit or compliance review by the DMHC.
  • Outcomes derived from use of the AI tool must be periodically reviewed and assessed to ensure compliance with the Act and to ensure accuracy and reliability.
  • The AI tool must limit its use of patient data consistent with California's Confidentiality of Medical Information Act and HIPAA.
  • The AI tool cannot directly or indirectly cause harm to enrollees.

Further, a Regulated Party must include disclosures about its use and oversight of AI in the written policies and procedures that establish the process by which it reviews and approves, modifies, delays, or denies, based in whole or in part on medical necessity, requests by providers of health care services for plan enrollees.

Most significantly, the Act provides that a determination of medical necessity must be made only by a licensed physician or a licensed health care professional who is competent to evaluate the specific clinical issues involved in the health care services requested by the provider. In other words, the buck stops with the provider, and AI cannot replace the provider's role.

The Act is likely just the tip of the spear in terms of AI-related regulation that may develop in the healthcare space. That is particularly true because the use of AI can have tremendous real-life consequences. For example, if an AI tool produces incorrect results in utilization management activities that lead to inappropriate denials of benefits, patients may lose access to coverage for medically necessary services and may suffer adverse health consequences. Similarly, disputes between health plans and providers can arise where providers believe that health plans have inappropriately denied coverage for claims, which can be particularly problematic where an AI tool has followed a pattern of decision-making that affected a large number of claims. All of the foregoing could have significant impacts on patients, providers, and health plans.

We encourage Regulated Parties to take steps to ensure compliance with the Act. Regulated Parties with questions or seeking counsel can contact any member of our Healthcare Team for assistance.

Also, consider registering for our upcoming webinar, How to Build an Effective AI Governance Program: Considerations for Group Health Plans and Health Insurance Issuers, on November 13, 2024.