This article examines NYC Health + Hospitals' decision to end its Palantir contract, the debate over de-identified patient data used to secure public benefits, and how activist campaigns are influencing policy in the UK's National Health Service.
What happened in New York City’s public hospital system
New York City's public hospital system won't renew its short-term contract with Palantir, which is set to expire in October. This comes after news that NYC Health + Hospitals has paid Palantir nearly $4 million since November 2023 to review patient notes and help secure more public benefits such as Medicaid.
Contract specifics and data handling
The contract allowed Palantir, with agency permission, to “de-identify” protected health information and use it for “purposes other than research.” Privacy advocates quickly raised alarms, pointing to risks of re-identification and mission creep outside clinical care. The hospital system says it’ll move to fully in-house systems and cut off Palantir access or data sharing after the contract ends.
Privacy concerns and the de-identification debate
NYC Health + Hospitals claims its in-house data capabilities will keep things tightly controlled. Still, critics argue that de-identification isn’t always enough to prevent re-identification, especially with detailed clinical notes in play.
Critics keep asking whether governance, consent, and access controls actually do enough to protect patient privacy while health systems pursue program benefits. The debate is far from settled.
What Palantir says about data ownership and security
Palantir says it doesn’t own customer data and that client environments are secure and auditable. The company insists any misuse would be illegal and a breach of contract, emphasizing a wall between client data and Palantir’s own platforms.
UK NHS context: Palantir’s role under external scrutiny
Meanwhile, Palantir faces growing scrutiny in the UK over a £330 million NHS deal, with fresh concerns about data privacy and the risk of re-identifying supposedly de-identified records. UK officials claim the NHS federated data platform uses de-identified data with strict controls, but MPs and campaigners are pushing for investigations and maybe even the end of Palantir contracts.
Activist campaigns and policy implications
Activist groups like Purge Palantir, backed by unions and community organizations, helped push policy change in New York and are now targeting NHS England. Medact and Amnesty International UK argue Palantir’s software could enable state uses beyond health care, like immigration enforcement, fueling fears that these tech tools could be repurposed for social control.
What this means for patients, health systems, and data governance
Experts warn that re-identification risks aren't just theoretical. AI advances make it harder to keep de-identified data separate from other sources, raising tough questions about consent, governance, and where to draw the line on data use in public health. Going forward, observers point to several changes that could address these concerns:
- Strengthened in-house data capabilities and clearer governance frameworks to manage who gets access and how data gets used.
- Greater transparency about who can see health data and for what reasons, so there’s less confusion about where data goes.
- Stronger safeguards to stop health data from being used for enforcement or other non-clinical purposes.
- Increased public trust by showing real privacy protections and responsible data practices.
What comes next
NYC is wrapping up its move to fully in-house systems. Meanwhile, NHS England is still figuring out its relationship with Palantir.
The debate over data privacy, patient rights, and data-driven care isn't going anywhere. Stakeholders continue to weigh the same trade-off: encouraging innovation while ensuring real safeguards.
No one wants to see health data misused, and there's genuine hope for using it to improve public health and care. But the conversation is far from over, and it's hard to say where it will land next.
Source article: New York City hospitals drop Palantir as controversial AI firm expands in UK