Surveillance, Metadata & AI Profiling in Healthcare

Where data becomes danger.

This category breaks down how metadata, call logs, AI flagging, and internal systems were quietly used to profile, escalate, and erase patients from care.

From unauthorized surveillance to algorithmic bias, these posts expose the hidden infrastructure used to dehumanize vulnerable individuals—especially transgender and disabled patients.

Surveillance infrastructure

Two Bites at the Apple: How UnitedHealthcare Targeted a Patient Through DHS and Local Police

The Story They Don’t Want Told

On January 15, 2025, Rocky Mountain Health Plans (RMHP) crossed a line no insurer is legally or ethically permitted to cross. Acting under the corporate umbrella of UnitedHealthcare of Colorado (UHC-CO) and UnitedHealth Group (UHG), RMHP took five recorded patient calls—concerning denied coverage for gender-affirming care—and packaged them with protected health information (PHI): surgical history, psychiatric medications, and identity details.

They sent it to the Grand Junction Police Department (GJPD).

But what most don’t know: that was the second attempt. Internal routing metadata and correspondence indicate that UHC, operating at the corporate level, first sought to refer the same PHI to the Department of Homeland Security (DHS). When that federal referral failed to produce action, RMHP transmitted the same material locally to police.

Two bites at the apple—both without warrant, subpoena, or emergency justification.

And the question that matters most: why did they do this?

Why: Retaliation Disguised as “Safety”

The answer isn’t hidden in statute books. It’s political, personal, and strategic.

Silencing Advocacy. The patient—a trans woman—was documenting coverage denials, building a digital archive, and preparing litigation. AdministrativeErasure.org was already exposing UHC’s practices. Escalating her to DHS and police wasn’t about “safety.” It was about silencing speech and making advocacy carry the cost of fear.

Shifting Liability. At the time, both UHC and RMHP faced exposure over gender-care denials and potential HIPAA non-compliance. By flipping the script, they reframed the patient not as a whistleblower but as a threat—diverting scrutiny from their own misconduct.

Profiling a Vulnerable Target. Trans patients are disproportionately surveilled and stigmatized. RMHP exploited that bias, betting that law enforcement and media would default to suspicion rather than empathy.

This was not an accident. It was administrative erasure: weaponizing bureaucratic tools to erase dissent and credibility.

“Deny. Defend. Depose.”

Just days after Luigi Mangione allegedly shot and killed UnitedHealthcare CEO Brian Thompson, the patient used three words to describe what RMHP and UHC were already doing to her:

“Deny. Defend. Depose.”

Deny needed care—estrogen, surgery, coverage. Defend the denial with corporate compliance maneuvers. Depose the whistleblower by framing her as a threat and dragging her into court.

Those words weren’t rhetorical flourish; they predicted exactly what came next: escalation to DHS, then police, then litigation. They captured the corporate playbook—deny the claim, defend the denial, and depose the patient into silence.

Now preserved in the civil complaint, the phrase crystallizes UHC’s and RMHP’s retaliatory posture. It is the blueprint by which insurers turn patients into defendants, and defendants into erasures.

Exhibit L: The Paper Trail

The referral is memorialized in Exhibit L, a January 15, 2025 cover letter from RMHP to the Grand Junction Police Department. It lists:

Five audio recordings of patient calls;

Identifying PHI, including surgical and medication details;

Language portraying the patient as a potential threat.

What it does not include:

A warrant;

A subpoena;

Any reference to imminent danger.

Exhibit L reads not as a compliance response but as RMHP’s calculated attempt to shift reputational risk from the insurer to law enforcement.

During follow-up communications, an RMHP employee reportedly acknowledged the impropriety, saying:

“I’m not supposed to do that — but it’s done.”

That single line captures the state of mind behind the disclosure: awareness of violation, followed by willful completion. It turns a policy failure into a conscious act.

The 35-Day “Myth” of Imminent Threat

HIPAA permits PHI disclosure without consent only under 45 C.F.R. § 164.512(j)(1)(i)—when there is a good-faith belief in a serious and imminent threat.

But RMHP’s own records destroy any “imminent” claim.

The last call was in December 2024.

RMHP waited 35 days before sending those calls to police.

Police closed the referral within 72 hours, filing no charges.

As detailed in Section XVI, “The 35-Day ‘Myth’ of Imminent Threat,” no threat is imminent after five weeks of silence. The delay itself proves the defense hollow.
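The arithmetic is easy to verify. Here is a minimal sketch, assuming the final call fell on December 11, 2024, a date back-calculated from RMHP's own 35-day figure rather than quoted from the record:

```python
from datetime import date

# Timeline arithmetic behind the "imminent threat" defense.
# The exact date of the final December call is not quoted in the record;
# December 11, 2024 is back-calculated here from RMHP's own 35-day gap.
last_patient_call = date(2024, 12, 11)    # final recorded call (inferred)
referral_to_police = date(2025, 1, 15)    # Exhibit L cover letter date

delay_days = (referral_to_police - last_patient_call).days
print(f"Delay between last call and police referral: {delay_days} days")
# -> Delay between last call and police referral: 35 days

# 45 C.F.R. § 164.512(j)(1)(i) requires a good-faith belief in a *serious
# and imminent* threat. Whatever "imminent" means, a five-week wait is not it.
IMMINENCE_WINDOW_DAYS = 3  # illustrative threshold, not a statutory number
print("Plausibly imminent?", delay_days <= IMMINENCE_WINDOW_DAYS)  # -> False
```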

From Algorithm to Accusation

This wasn’t the work of one rogue employee. UnitedHealth’s ecosystem runs on AI-driven monitoring platforms—Verint, NICE, and nH Predict—that flag patient calls for “risk” markers and auto-escalate them.

A distressed patient calling about estrogen-coverage denials was flagged not as vulnerable, but as dangerous. The escalation pipeline—UHC’s algorithms feeding RMHP’s local actions—was likely triggered by algorithmic misclassification.

Automation gave UHC cover: “the system flagged it.” But what the system really did was criminalize distress.
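To make the failure mode concrete, here is a deliberately simplified sketch. The actual Verint, NICE, and nH Predict systems are proprietary and far more complex; every keyword, threshold, and function name below is a hypothetical stand-in, included only to show how context-blind scoring can turn distress into "risk":

```python
# Deliberately simplified sketch of context-blind call scoring. This is NOT
# the actual logic of Verint, NICE, or nH Predict (those systems are
# proprietary); the keyword list, threshold, and names are hypothetical
# stand-ins that illustrate the failure mode, not reproduce any vendor.

RISK_TERMS = {"fight", "desperate", "last resort", "can't take"}  # assumed
ESCALATION_THRESHOLD = 2                                          # assumed

def score_call(transcript: str) -> int:
    """Count naive keyword hits -- no context, no clinical judgment."""
    text = transcript.lower()
    return sum(term in text for term in RISK_TERMS)

def should_escalate(transcript: str) -> bool:
    """Auto-escalate once the crude score crosses the tuning threshold."""
    return score_call(transcript) >= ESCALATION_THRESHOLD

# A patient pleading about a denied estrogen claim trips the same wires as a
# genuine threat, because the model sees tokens, not meaning.
call = ("I'm desperate. This appeal is my last resort -- "
        "I will fight for my care.")
print(should_escalate(call))  # -> True: distress misread as "risk"
```

The point is not that any vendor uses this exact logic; it is that any scoring system tuned without context will reproduce this misclassification at scale.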

Media Echo: How Retaliation Travels

The police dropped the referral in three days. But the damage didn’t end there.

In July 2025, a local news article resurfaced the incident. By sequencing events and omitting context, it presented the patient as “investigated for threats.” The investigation’s closure—the fact that no charges were ever filed—was buried.

This media echo effect amplified stigma while shielding RMHP and UHC. Neither insurer needed to speak publicly; the narrative they seeded had already taken on a life of its own.

Why This Matters Beyond One Case

Civil Rights. If insurers can send patients to DHS for coverage complaints, the chilling effect is obvious.

HIPAA Integrity. The 35-day delay and 72-hour dismissal expose how weak the “imminent threat” safeguard truly is.

Trans Profiling. Distress over denied care was repackaged as public danger, reinforcing toxic stereotypes.

Policy Urgency. Current law doesn’t prevent serial escalations—DHS first, local next—until some agency takes action.

This is not merely a violation. It is a blueprint.

Administrative Erasure in Action

The Administrative Erasure project defines this pattern: using bureaucratic processes to erase dissenters by reframing them as risks.

Escalation Ladder. UHC tried federal; RMHP tried local—until someone might act.

Narrative Laundering. They didn’t need to prove a crime; they only needed to seed suspicion.

Public Stigma. Once law enforcement was involved, even briefly, media could echo the narrative without liability.

What began as coverage disputes over estrogen and surgery became a manufactured threat investigation. That’s erasure, not error.

The Human Cost

This isn’t only about statutes and exhibits. It’s about a trans woman whose identity and medical history were sent to DHS and local police without justification.

It’s about waking up to learn your calls for care were reframed as “threats.” It’s about carrying the stigma of an “investigation” even after it’s dismissed. It’s about knowing your insurer can erase your credibility with one email.

That is the cost of proxy surveillance: fear becomes currency, reputation the collateral, dignity the loss.

Read the Full Report

The complete 23-page analysis, with exhibits and footnotes, is available here:
👉 Surveillance by Proxy: How UnitedHealthcare Evaded HIPAA Using Local Law Enforcement (PDF)

Parallel Violations, Parallel Survivors: How the Mangione and Dorn Cases Reveal a National Pattern of PHI Weaponization


Two very different people. Two very different legal arenas. One unmistakable pattern.

In July 2025, defense attorneys for Luigi Mangione filed a blistering court motion accusing Aetna—a CVS Health subsidiary—of unlawfully disclosing their client’s protected health information (PHI) to prosecutors. The information included mental health and medication history, and was handed over without a valid subpoena, without Mangione’s consent, and without meeting the narrow legal exceptions outlined under HIPAA.

Meanwhile, a separate but equally devastating story was unfolding just miles away. In a forthcoming civil action, trans woman and Medicaid patient Samara Dorn is preparing to sue UnitedHealthcare of Colorado, Rocky Mountain Health Plans, and UnitedHealth Group for disclosing her PHI—including gender identity, surgical history, and metadata-laced call logs—to local law enforcement and, chillingly, to the Department of Homeland Security.

The disclosure occurred thirty-five days after final contact. No warrant. No subpoena. No clinical emergency. Just a bureaucratic escalation, fueled by metadata and convenience, masquerading as concern.

One Playbook, Two Victims

While the legal details differ, both cases trace back to the same industry playbook. In Mangione’s case, Aetna’s unlawful disclosure placed him at heightened risk in a capital murder prosecution. The leaked information was used to paint him as unstable and dangerous—shaping a death penalty narrative rooted not in evidence, but in psychiatric speculation.

In Dorn’s case, the disclosure to law enforcement created a parallel narrative: not of criminal guilt, but of institutional threat. She was flagged not because she committed a crime, but because her voice, identity, and digital footprint were deemed inconvenient. Through the use of backend tagging, AI-generated profiling, and misclassification, UnitedHealthcare constructed a narrative of risk that never existed—then passed that narrative on to police and federal agencies.

This Was Not Care. It Was Control.

Both cases demonstrate a catastrophic breach of trust and legality—not because the PHI disclosures failed to help, but because they were never meant to help. In each instance, patient records were disclosed for the convenience and liability protection of the insurer, not for the safety of the individual or the public.

In Mangione’s case, the mental health data was handed over after investigators began seeking the death penalty—raising serious questions about motive, legality, and institutional betrayal.

In Dorn’s case, the PHI disclosure occurred more than a month after any clinical interaction, in violation of HIPAA’s “imminent threat” standard under 45 C.F.R. § 164.512(j). Her data was used not to intervene in an emergency, but to justify reputational abandonment and surveillance escalation.

Administrative Erasure in Action

Dorn’s civil complaint outlines how metadata—call tags, risk flags, internal notes—was used to construct a false paper trail. This digital narrative was then used to reclassify her from “patient” to “public threat,” providing justification for disclosure to law enforcement and DHS. This process, which she calls administrative erasure, mirrors the logic in Mangione’s case: that PHI can be converted into reputational ammunition by the same system that claims to protect it.

What links these two cases is not merely the entity that caused the harm. It’s the infrastructure—the policies, the tools, the logic—that converts care into containment, healing into harm, and records into weapons.
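A toy model makes the reclassification mechanics concrete. Nothing below reflects UnitedHealthcare's actual schema; the field names and tags are invented for illustration, showing how layered metadata can relabel a patient while the underlying facts never change:

```python
# Toy data model of the "false paper trail": the underlying call facts never
# change, but each layer of backend tagging rewrites what the record *means*.
# Every field name and tag below is invented for illustration.
from dataclasses import dataclass, field

@dataclass
class CallRecord:
    caller: str
    subject: str
    tags: list[str] = field(default_factory=list)

record = CallRecord(caller="member-4417", subject="appeal of estrogen denial")

record.tags.append("AUTO_RISK_FLAG")      # step 1: automated flag, no human review
record.tags.append("ESCALATED_BEHAVIOR")  # step 2: internal note reclassifies tone
record.tags.append("POTENTIAL_THREAT")    # step 3: the label that travels to police

# Same call, same words -- but the exported record now reads as a threat file.
print(record)
```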

One Shared Fight

These cases are not isolated. They are flashpoints in a growing national pattern: vulnerable individuals being profiled, criminalized, or erased under the guise of healthcare compliance.

Aetna gave PHI to prosecutors.

UnitedHealthcare gave PHI to police.

Both actions occurred without proper legal justification.

Both targeted those already marginalized.

Both used medical information to destroy, not protect.

Luigi Mangione is currently fighting for his life in a criminal courtroom. Samara Dorn is preparing to fight for hers in a civil one. Their stories are different—but the machine harming them is the same.

Read the Motion

We encourage the public, the press, and policymakers to read the Mangione defense team's powerful motion for themselves. It is available here:

📄 Download the Motion – 2025.07.17 HIPAA Violation – Mangione Defense (Google Drive)

This document is more than a legal filing. It is a warning.

Closing Statement

The idea that PHI can be quietly weaponized behind closed doors should terrify everyone. What happened to Luigi Mangione could happen to any criminal defendant. What happened to Samara Dorn could happen to any trans patient, any disabled person, or any Medicaid recipient who speaks too loudly.

We are no longer talking about privacy. We are talking about targeting.

We are no longer talking about compliance. We are talking about complicity.

We are no longer talking about care. We are talking about power.

And together, these cases demand accountability.

AI, Healthcare, and Trans Futures: Charting a Path Beyond Administrative Erasure


As machine learning and predictive algorithms become the scaffolding of modern healthcare, we can’t ignore the ways these tools inherit and amplify the biases of the world around them. In the last year we’ve watched a major insurer mine call logs and metadata to classify a trans woman as a “threat” for challenging her care denial; we’ve seen risk scores determine who gets a surgery approval and who is kicked to the curb. This isn’t some distant science fiction; it’s happening right now, in our own communities.

At their best, AI systems could help flag patterns of discrimination, streamline access to gender‑affirming care, or surface unseen symptoms that human doctors miss. At their worst, they become black boxes that encode transphobia, racism, and ableism into the very logic of care. When a health insurer uses an algorithm to mark certain patients as “high risk” based on their identity or advocacy, that’s not innovation—that’s administrative erasure in a new, shinier wrapper.

What does a future beyond this look like? It starts with transparency. Patients have a right to know when algorithms are making decisions about their care and what data is being fed into those models. Insurers and hospitals must be held accountable for the outcomes of their automated systems. And we, as a community, need to resist the myth that data is neutral. Data is always collected, cleaned, and interpreted by humans with their own agendas; without oversight it can reproduce harm at scale.

This isn’t a call to abandon technology. It’s a call to reclaim it. Imagine AI that actually serves trans people: recommendation engines that connect us to affirming providers, predictive models that anticipate hormone shortages and reroute supply, or chatbots that offer real‑time support without judgment. These are all possible—but only if the people most affected are at the table designing and governing these systems.

We also have to get loud about policy. Laws like HIPAA were never built for the age of predictive policing; we need updates that explicitly prohibit the sharing of sensitive health data with law enforcement absent due process. We need regulatory frameworks that audit algorithms for bias and provide mechanisms for patients to contest automated decisions. And we need to fund grassroots tech projects that prioritize community control over corporate profit.
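What might such an audit look like in practice? A minimal sketch follows, using fabricated numbers and the four-fifths rule of thumb borrowed from employment law; a real regulatory audit would run on actual claims data with formally specified fairness metrics:

```python
# Minimal sketch of a disparate-impact audit over automated coverage
# decisions. The records and group labels are fabricated placeholders; a real
# audit would run on actual claims data with formally specified metrics.
from collections import Counter

decisions = [  # (group, automated_decision) -- illustrative only
    ("trans", "deny"), ("trans", "deny"), ("trans", "approve"),
    ("cis", "approve"), ("cis", "approve"), ("cis", "deny"),
]

denials = Counter(group for group, outcome in decisions if outcome == "deny")
totals = Counter(group for group, _ in decisions)
approval_rate = {g: 1 - denials[g] / totals[g] for g in totals}
print(approval_rate)  # {'trans': 0.333..., 'cis': 0.666...}

# Four-fifths rule of thumb (borrowed from employment law): flag the system
# if one group's approval rate falls below 80% of another group's.
ratio = min(approval_rate.values()) / max(approval_rate.values())
print(f"Disparate-impact ratio: {ratio:.2f}")  # 0.50 -> well below 0.8, flag it
```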

Ultimately, the future of AI in healthcare can be either a dystopian surveillance apparatus or a tool for liberation. Which path we choose depends on us. If we stay passive, insurers will continue to deploy opaque risk scores that decide who is deserving of care. If we organize, educate, and demand accountability, we can harness technology to amplify our resilience and creativity.

As we build this archive of administrative erasure, let’s also build a blueprint for something better. Algorithms don’t have to erase us; with intention and care, they can help us write ourselves back into the story.

Administrative Erasure: Why Every American Should Pay Attention

Abstract

In an era of predictive policing, algorithmic triage, and privatized surveillance, a dangerous new frontier of civil rights abuse has emerged: administrative erasure.

This exposé outlines how UnitedHealthcare weaponized metadata and indirect police collaboration to erase the voice, safety, and medical autonomy of a transgender patient who dared to speak up.

[Cover image: redacted medical codes and metadata overlays]


Drawing from Real Documents

This report draws from:

- 🚨 Whistleblower disclosures
- 🧠 Metadata forensics
- 🎙️ Internal voice profiling records

Together, they paint a disturbing picture of institutional denial retooled as digital erasure—and a growing threat to civil rights across the board.

This is not theoretical.
This is not speculative.


⚠️ This Is Not Hypothetical

⚖️ This Is Happening Now

Behind the curtain of managed care and "population health" are data triggers, auto-escalation protocols, and system rules that punish outliers, profile vulnerability, and silence the inconvenient.

A trans patient’s voice was flagged.
Her metadata was logged.
Her care was denied.
And her profile—generated not by doctors, but by algorithms—was sent to police.


📄 Download the Full Exposé

You can read the full exposé and decide for yourself:
Download PDF



This isn’t just a trans story.
It’s a red flag for every American.
If data can erase one voice, it can erase many.

📍 For background documents and legal disclosures, see:
- The Evidence They Can’t Ignore
- Exhibit AA – The Whistleblower Files
- Press Room – Administrative Erasure in the Media

"> ');