The UK's Financial Conduct Authority (FCA) has awarded Palantir, the US data-analytics and AI company, a contract to analyze its internal intelligence data, sparking fresh concerns about the firm's deepening reach into the British state. As reported by The Guardian, the deal is part of the FCA's effort to use digital intelligence to tackle financial crime, including fraud, money laundering, and insider trading.
The contract, worth over £30,000 per week for a three-month trial, will see Palantir apply its AI system, known as Foundry, to vast quantities of sensitive information held by the watchdog. This includes case intelligence files, reports from lenders about proven and suspected frauds, and data about the public, including consumer complaints to the financial ombudsman. The data also encompasses recordings of phone calls, emails, and social media posts, raising significant privacy concerns.
Palantir, co-founded by billionaire Donald Trump donor Peter Thiel, has already secured over £500m in UK public deals, including contracts with the NHS, military, and police. The company's technology is used by the Israeli military and in the US president's ICE immigration crackdown, prompting leftwing MPs to label it a "highly questionable" and "ghastly" company. The FCA's decision to award Palantir this contract has raised concerns inside the regulator, with one source questioning the company's ethical reliability.
Privacy Concerns and Expert Analysis
Prof Michael Levi, an internationally recognized expert on money laundering at Cardiff University, acknowledges that AI can be a valuable tool in tackling financial crime. However, he raises concerns about whether Palantir's owners might use the methodologies learned from the FCA's data for their own purposes, and questions what protocols the FCA and Palantir have agreed regarding the onward use of the data and the intellectual property derived from it.
Christopher Houssemayne du Boulay, a partner and barrister at Hickman & Rose, highlights the significant privacy concerns associated with using real data in the pilot. He notes that the FCA's enforcement investigations can compel firms to hand over vast quantities of data, including personal information, which may be ingested and used to train Palantir's AI system.
The FCA has stated that Palantir will be a "data processor" and not a "data controller," meaning it can only act on instruction from the regulator. The FCA will retain exclusive control over the encryption keys for the most sensitive files, and the data will be hosted and stored solely in the UK. Palantir will be required to destroy the data after completing the contract, and any intellectual property derived from the data trawling should be retained by the FCA.
Contract Details and Implications
The contract has sparked warnings of "very significant privacy concerns" from campaign groups and experts. The FCA considered using dummy data or scrambling company and individual names but decided that using real data was the only worthwhile test. The regulator ran a competitive procurement process and has strict controls in place to ensure data is protected, according to a spokesperson.
Palantir has previously defended its work, citing its contributions to the NHS and UK police. However, the company's involvement in the FCA's efforts to tackle financial crime has raised questions about its role in the British state and the potential implications for privacy and human rights. As the contract progresses, it remains to be seen how Palantir's technology will be used and what measures will be taken to ensure the protection of sensitive data.
Broader Implications and Future Developments
The FCA's decision to award Palantir this contract is part of a broader trend of UK agencies turning to digital intelligence to tackle financial crime. As AI becomes more prevalent in the public sector, the implications for privacy and human rights warrant careful consideration. How the FCA balances the promise of effective technology against its duty to protect sensitive data will be closely watched, and the outcome of this trial could shape the future of AI in the British state.