On March 26th, 2018, I stepped into the London headquarters of Palantir. Mercy Corps had been using Palantir’s software in Syria and Jordan pro bono as part of Palantir's Philanthropy Engineering Program. The company claimed that its software, Foundry, could revolutionize how humanitarian groups use data for their work. So, I made a trip to London, feeling a sense of cautious optimism about this new partnership. Instead, what I observed that week opened my eyes to the harm that Palantir’s technology could cause in the world.
The Technology
Data interoperability is one of the biggest challenges I encounter in my work. It refers to the ability of different computer systems and software to share and understand data. The problem usually stems from organizations collecting information in outdated and complicated systems. These uncoordinated systems, unable to connect with one another, leave data siloed and underutilized.
Palantir’s flagship software, Foundry, helps to solve the data interoperability challenge by using low or no-code tools to connect various data sources. It can then conduct large-scale data analysis, highlighting trends and patterns companies may have missed before. What makes Palantir’s software so appealing is that it is not trying to change or fix how companies collect data. Rather, it is like a layer companies can add to their system that maps, transforms, and pulls data together.
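To make the "layer on top" idea concrete, here is a minimal sketch in plain Python of what such a mapping layer does conceptually. This is not Foundry or its API; the two "legacy exports" and the shared schema are invented for illustration. One system exports registrations as CSV, another exports locations as JSON, and a thin normalization step joins them into one unified view without changing either source system:

```python
import csv
import io
import json

# Two hypothetical "legacy systems" describe the same beneficiaries
# with different formats and field names.
csv_export = io.StringIO(
    "beneficiary_id,registered_on\n"
    "B-001,2018-01-15\n"
    "B-002,2018-02-03\n"
)
json_export = json.loads(
    '[{"id": "B-001", "district": "Kathmandu"},'
    ' {"id": "B-002", "district": "Lalitpur"}]'
)

# The "mapping layer": pull each source into one shared schema,
# keyed by beneficiary ID, without touching the systems that
# produced the data.
records = {}
for row in csv.DictReader(csv_export):
    records[row["beneficiary_id"]] = {"registered": row["registered_on"]}
for item in json_export:
    records.setdefault(item["id"], {})["district"] = item["district"]

for bid, rec in sorted(records.items()):
    print(bid, rec)
```

The point of the sketch is the shape of the solution: neither source changed how it collects data; a small translation layer made the two exports interoperable.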
The Partnership
By the time I was introduced to the Philanthropy Engineering team, Mercy Corps had already been working with them in Syria. They were using Foundry to combine data on when and where air strikes, shelling events, and barrel bombs occurred. When layered onto a map, this information helped to paint a real-time picture for Mercy Corps’ teams planning supply and aid deliveries.
When Palantir wanted to expand its work with Mercy Corps into Nepal, I was skeptical of their intentions. Companies often have mixed motives for their philanthropic efforts. There’s usually a cost tied to the money they donate. Palantir was also associated with some unsavory corporate and government clients. However, when Palantir showed up at Mercy Corps’ office in Nepal, I tried to keep an open mind. As the regional data and evaluation advisor, I would need to spend several days with Palantir learning how to use Foundry before we could finalize the partnership.
The Office
Palantir’s London office was completely different from the working environment I was used to. Chic workspaces with the latest technology, an onsite chef to make you the perfect omelet in the morning, and places to relax, play video games, or take a nap. I was used to trying to work with the electricity and internet regularly going out while sweating through my “nice” work shirts.
Palantir works with some of the biggest companies and government agencies in the world. At the beginning of our training week, Palantir presented how Airbus was using Foundry to streamline its operations. It was impressive to see how Foundry could sort through massive amounts of disorganized data into clean data pipelines. I was excited to try it out for myself. Using Foundry was like learning a new language. Inside the bootcamp, we learned to structure and connect our data with a specific use case in mind (in Palantir-speak, this was called building the “operational ontology”). We presented our use cases to a small team of engineers and got feedback on how to improve our designs. The technology was impressive and unparalleled.
Things Begin to Unravel
However, as the week progressed, I started to have the feeling that we had made a huge mistake in working with Palantir. The first clue was a framed article about Palantir’s work with U.S. Immigration and Customs Enforcement (ICE). In what can only be described as propaganda, Palantir portrayed its work with ICE as courageously combating human trafficking and terrorism. With the Trump administration’s family separation policy in full swing in the U.S. at the time, my skepticism about Palantir accelerated into distrust.
The next incident unfolded in real time while we were in the Foundry boot camp. In summer 2014, a start-up named Cambridge Analytica, aided by a Palantir employee, scraped private data from 50 million American Facebook users [1]. They used this data to create psychological profiles, which Ted Cruz’s and Donald Trump’s political campaigns reportedly used in 2016 [2]. On March 17, 2018, the Guardian and New York Times broke the story about a Cambridge Analytica whistleblower. Ten days later, while I was working in the Palantir office, the whistleblower testified before the UK Parliament. In his testimony, Christopher Wylie, the former research director at Cambridge Analytica, gave details about the illegal use of Facebook users’ data, aided by the Palantir employee.
I could not get out of the Palantir office fast enough. I was not going to be using Foundry, and I didn’t think Mercy Corps should continue the partnership. It took the leadership at Mercy Corps a bit longer to come to this conclusion, but eventually, the partnership dissolved. What we now know is that Palantir is fully leaning into its role as the Trump administration’s data broker. They have built a database that, when coupled with facial recognition technology, can identify the real-time location of people ICE is pursuing for deportation [3].
This experience forced me to confront the ethical cost of partnerships with companies whose values don’t reflect my own. Palantir promised efficiency, but in the end, only offered morally compromising solutions. What appeared to be an innovative tool for humanitarian work revealed itself as part of a much larger ecosystem where data can be weaponized against the very people that humanitarian organizations aim to protect.
[1] https://www.nytimes.com/2018/03/27/us/cambridge-analytica-palantir.html
[2] https://www.npr.org/2018/03/20/595338116/what-did-cambridge-analytica-do-during-the-2016-election#:~:text=No%2C%20Cambridge%20Analytica%20couldn’t,ve%20had%20with%20other%20people.%22
[3] https://www.nytimes.com/2026/01/30/technology/tech-ice-facial-recognition-palantir.html
