NHS Staff Boycott Palantir's £330 Million Data Platform Over Ethics and Privacy Concerns
NHS staff in England are boycotting Palantir's Federated Data Platform (FDP), citing ethical concerns, privacy fears, and doubts about the system's usefulness, despite the £330 million contract awarded in 2023.
The Resistance
- Boycott: Clinical and administrative staff refusing to use the system
- Work slowdown: Some deliberately slow their work pace when forced to use FDP
- Ethical objections: One official called Palantir "ethically bankrupt"
- Practical concerns: Staff say the system "doesn't do anything new for us"
Why Staff Resist
- Palantir-ICE connection: Palantir's work with US ICE on deportation efforts under the Trump administration
- Privacy fears: Concerns about patient data being accessible to a US defense contractor
- US government ties: Long-standing worries about Palantir's relationship with US intelligence
- Trust deficit: Staff feel uncomfortable logging into a system they don't ethically support
Institutional Pushback
- British Medical Association: Called for doctors to stop using the system
- Care boards: Some have delayed implementation
- Manchester NHS: Deferred their FDP rollout
The £330M Contract
The FDP was designed to connect disparate NHS systems into a single searchable database to help clear care backlogs. But staff resistance to adoption threatens the value of the entire investment.
Broader Implications
This case illustrates the growing tension between:
- Government technology contracts awarded through formal procurement processes
- End-user acceptance of controversial technology vendors
- Ethical considerations in public sector technology deployments
When the people who actually use a system refuse to do so, even the largest contracts can fail.