When Software Says No
By Erica Daily
A clinician reviews a patient chart while a translucent interface displays "Prior Authorization: Denied," illustrating algorithm-driven insurance denials.

Last week, while at the DMV waiting for my number to be called, the woman beside me turned and asked how long renewals usually take. After that, we made the usual small talk, commiserating about the job market, traffic, and the impending end of summer, before she shared what she was really struggling with. Her mother, who was recovering from a stroke, had her rehab cut short after the insurer said she had "plateaued." The care team disagreed, and she spent the next few weeks appealing, emailing, and calling to get the care restored. Her mother's coverage was eventually reinstated, but in that gap, she lost function she's still fighting to regain.

 

Unfortunately, her story isn't an outlier. Across the country, insurers are embedding algorithms in prior authorization and claims review. While insurers say these tools improve efficiency, their swift blanket denials often hit seniors, low-income patients, people of color, and the chronically ill first. Lawsuits against major insurers, new federal pilots, and emerging guardrails create an opportunity to set the rules before code becomes policy by default.

What's Actually Happening Behind the Scenes

Insurers are increasingly using automated tools to allocate care: predicting lengths of stay, flagging "low-value" services, and batch-screening claims against diagnosis codes. In practice, this means coverage decisions shift from your doctor's clinical judgment to an opaque algorithmic score that patients never see.
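To make the mechanics concrete, here is a minimal, purely illustrative sketch of what batch screening against diagnosis codes can look like. Nothing here reflects any insurer's actual system; the policy table, codes, and function names are hypothetical, and real tools layer predictive models on top of logic like this.

    # Hypothetical illustration only: a batch screen that auto-flags claims whose
    # service/diagnosis pairing falls outside a preset policy table, with no
    # clinician reading the chart before the denial letter goes out.
    COVERED_PAIRS = {  # assumed policy table: service code -> diagnosis codes that justify it
        "97110": {"I63.9", "S72.001A"},  # therapeutic exercise: stroke, hip fracture
    }

    def screen_claim(service_code: str, diagnosis_code: str) -> str:
        allowed = COVERED_PAIRS.get(service_code)
        if allowed is None or diagnosis_code in allowed:
            return "route to payment"
        return "deny: not medically necessary per criteria"

    # One rule, applied to thousands of claims in seconds; every nuance a
    # clinician would weigh becomes a denial the patient has to appeal.
    print(screen_claim("97110", "M54.5"))  # back pain diagnosis -> automatic denial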

 

Two high-profile cases show how this usually plays out. In Minnesota, families sued UnitedHealth over allegations that its nH Predict tool systematically cut off post-acute rehab for Medicare Advantage patients; a federal court allowed those claims to proceed in February 2025 and, just last week, rejected the company's bid to limit discovery. Meanwhile, in California, a class action against Cigna's PxDx system is moving forward over allegations that the tool enabled batch denials without any review of patient records. The tools may be different, but the pattern is clear: algorithms speed up denials, while patients are left to deal with the same slow, manual appeals process.

 

In response to this growing issue, regulators are pushing back. In February 2024, CMS drew a clear line: Medicare Advantage plans may use algorithmic tools to assist with coverage decisions, but licensed clinicians have the final say on medical necessity. Put simply, software advises, but doctors decide. California went a step further with SB 1120, which took effect January 1, 2025, and mandates human oversight and transparency whenever insurers use AI to influence coverage decisions. In May, the state doubled down with detailed compliance guidance.

 

However, the use of AI in coverage decisions is expanding beyond Medicare Advantage. Starting January 1, 2026, CMS will roll out its Wasteful and Inappropriate Service Reduction (WISeR) model in six states, leveraging AI and machine learning to "ensure timely and appropriate Medicare payment for select items and services." While the pilot does include mandatory clinician review, this hybrid approach could go either way: reducing administrative burden, or perpetuating the same opaque decision-making that has plagued Medicare Advantage. Its success depends entirely on implementation. Responsible deployment would mean transparent disclosure whenever AI is used, detailed audit logs, and genuine recourse when algorithms deny legitimate care.

 

As these tools spread from Medicare Advantage to Original Medicare and across insurance markets, their impact follows a familiar pattern of inequality, and where they are deployed is no accident. They cluster in areas with narrow provider networks, among patients whose caregivers lack the bandwidth or language skills to appeal, and in communities cut off by digital divides. Efficiency is the sales pitch, but vulnerability is the target: these algorithms underestimate recovery needs for Medicare seniors with complex conditions and trap Medicaid patients in endless prior authorization loops. To make matters worse, these systems learn from historically biased data about who received rehab, imaging, or specialist referrals, essentially programming yesterday's discrimination into tomorrow's care decisions at massive scale. CMS's new guardrails are a start, but without transparency requirements and real enforcement mechanisms, they are little more than suggestions.

 

So What Can You Do?

Fighting a faceless algorithm can feel impossible, but here's a practical checklist that can tip the odds in your favor:

  • Read the notice for tells: Look for phrasing like "not medically necessary per criteria" or "fails predictive model," or decisions issued within hours of submission. Ask (in writing) whether an algorithm assisted the decision and which policy or guideline was applied.
  • Demand a human review: Under CMS guidance and laws like California's SB 1120, a licensed clinician must be accountable for coverage decisions. Request the physician reviewer's credentials and a case-specific rationale that addresses your medical facts.
    • Example language to use in an appeal:
      • "My clinician attests this service is medically necessary based on [diagnosis/facts]. Please cite the exact Medicare NCD/LCD or plan policy used and explain how my case was evaluated individually, as required."
      • "If a software tool assisted the decision, identify it and confirm that a licensed physician reviewed my file and made the final determination, per CMS."
      • "I request an expedited review due to the risk of health deterioration."
  • Be strategic about how you escalate: Ask for a peer-to-peer review. Log calls, save portal messages, and, if you are a Medicare beneficiary, contact your State Health Insurance Assistance Program (SHIP) or legal aid. Note whether the denial conflicts with Medicare coverage rules.
  • Know your rights (Medicare Advantage): Plans can use algorithms to assist, but they cannot deny or terminate care without an individualized medical-necessity review by clinicians, and they must follow Medicare coverage criteria. If you suspect a batch or automated denial, call it out and request documentation.

Individual action helps, but families shouldn't have to fight case by case. Here's what policymakers should do next:

    1. Mandate audit trails: Every AI-assisted coverage decision should generate a readable log: what tool and criteria were used, what data were considered, performance by subgroup, and which clinician signed off (a minimal sketch of what such a record might contain follows this list). California's SB 1120 and its subsequent guidance are starting points; other states should follow.
    2. Publish denial, appeal, and reversal rates (by line of business and ZIP code): If specific tools or vendors correlate with high reversal rates (i.e., many denials overturned on appeal), regulators and the public should know.
    3. Tie transparency to participation: For federal pilots like WISeR, require model documentation, bias testing, and patient-facing disclosures as conditions of entry. Publish independent evaluations early and often.
    4. Protect continuity of care: When algorithms trigger cuts to rehab days or therapies, require short grace periods while appeals are pending to prevent harm that cannot be undone.
    5. Invest in navigation, not just negation: If a system can flag "inappropriate" services in milliseconds, it can also surface covered alternatives, community resources, and streamlined approvals for evidence-based care.
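As a thought experiment, here is a minimal sketch of what the audit record in item 1 and the reversal-rate figure in item 2 could look like in practice. The field names and structure are assumptions made for illustration, not a CMS- or state-mandated schema.

    # Illustrative sketch only; field names are assumptions, not a required schema.
    from dataclasses import dataclass

    @dataclass
    class CoverageDecisionAudit:
        claim_id: str
        tool_name: str               # vendor model or rules engine that assisted
        criteria_cited: str          # the NCD/LCD or plan policy applied
        data_considered: list[str]   # chart sections, codes, prior-auth history
        clinician_npi: str           # licensed reviewer who signed off
        decision: str                # "approved" or "denied"
        appealed: bool = False
        overturned: bool = False

    def reversal_rate(audits: list[CoverageDecisionAudit]) -> float:
        """Share of appealed denials that were overturned (the figure item 2 asks regulators to publish)."""
        appealed = [a for a in audits if a.decision == "denied" and a.appealed]
        if not appealed:
            return 0.0
        return sum(a.overturned for a in appealed) / len(appealed)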

What Happens Next

We're at an inflection point. The woman at the DMV whose story sparked this piece shouldn't have had to spend weeks fighting for her mother's rehab. Right now, while courts are still hearing cases and regulators are writing rules, we have a small window to ensure algorithms serve patients rather than spreadsheets. The question isn't whether AI will reshape healthcare; it already has. The costs of getting this wrong extend further than you may know. That is something I'll explore in my next article, at the epicenter of America's AI boom, where the true price of algorithmic healthcare is written in power bills and water usage.

 

Want the bigger picture on how AI could close or widen health gaps? Read our explainer: AI in Healthcare: Equity, Disparities & Opportunities
