TL;DR: Few people have read Trump's Big Beautiful Bill in full - even those tasked with voting on it. Hidden inside, the bill gives unprecedented access and power to Artificial Intelligence and makes not cooperating with federal AI projects a criminal offense. Meanwhile, Palantir has been granted over $1B in government contracts to date and is working with, or in talks with, federal agencies across the US.
Everyone is talking about the Big Beautiful Bill. But have they read page 278?
That’s where it says states must remove any obstacles to AI, make adoption easier, and impose no fees or requirements - and states that don't comply face criminal penalties.
Given Trump’s executive order in March of this year to remove data silos, and the huge increase in federal contracts with Palantir, one must wonder if this is part of a larger agenda.
Palantir’s very first client was the CIA, and since then they have specialized in military and government intelligence software. They’ve now racked up $673M in contracts and just landed a new $795M contract with the military.
What is Palantir? It's a big data company specializing in government and military work. It helps organizations take huge swaths of data from different sources, create digital representations of real-world objects like people, assets, and events - and then consolidate it all in one place so it can be searched by AI. This gives these organizations a real-time look at what’s happening.
Palantir’s products also connect back to these systems, giving AI-enabled autonomous control - for instance, shutting down a production center and re-scheduling or re-routing all of its deliveries.
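To make the consolidation idea concrete, here's a minimal sketch in Python of pulling records from separate systems into a single searchable profile. The sources, field names, and matching rule are invented for illustration - this is not Palantir's code or API.

```python
from collections import defaultdict

# Records from separate systems, each describing the same person in different terms.
dmv_records = [{"name": "Jane Doe", "vehicle": "blue sedan", "address": "12 Oak St"}]
arrest_records = [{"name": "Jane Doe", "charge": "trespassing", "date": "2023-04-01"}]
plate_scans = [{"name": "Jane Doe", "seen_at": "5th & Main", "time": "2024-11-02 22:14"}]

def consolidate(*source_lists):
    """Merge records from many sources into one profile per person (matched on name)."""
    profiles = defaultdict(dict)
    for source in source_lists:
        for record in source:
            key = record["name"]          # naive match; real systems do heavy entity resolution
            profiles[key].update(record)  # each source enriches the same profile
    return profiles

profiles = consolidate(dmv_records, arrest_records, plate_scans)
print(profiles["Jane Doe"])  # one object with address, vehicle, charge, and last-seen location
```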
Sounds efficient? Just wait till you see how the products are used by the government.
Here's a look at some of Palantir's signature projects.
Gotham was Palantir's first software product, and it's where they developed the idea of building a digital representation of everything known in a military setting. They called this technology “Ontology”. Gotham used satellite and GIS imagery and overlaid friendly assets, but more importantly it identified and predicted whether objects on the ground were military targets.
The AI could prioritize targets, and all a human operator needed to do was sign off to send the strike.
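In the abstract, that workflow is a ranked queue with a sign-off gate. The toy sketch below uses made-up scoring and fields - it is not drawn from Gotham, only meant to show how little is left for the human to do.

```python
detections = [
    {"id": "obj-17", "confidence": 0.92, "category": "vehicle"},
    {"id": "obj-03", "confidence": 0.41, "category": "structure"},
    {"id": "obj-22", "confidence": 0.77, "category": "vehicle"},
]

# The software proposes an ordering by model confidence...
queue = sorted(detections, key=lambda d: d["confidence"], reverse=True)

def act_on(detection, human_approved):
    """...and the only gate between a ranked suggestion and an action is a sign-off flag."""
    if not human_approved:
        return "held for review"
    return f"action dispatched for {detection['id']}"

for d in queue:
    print(d["id"], act_on(d, human_approved=False))
```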
Gotham proved to be so good at this that police departments sought out the technology. BuzzFeed News submitted FOIA requests and received instructional documents on how the LAPD used Gotham. In 2013, an LAPD crime officer said he used Gotham on a daily basis, and that half of the officers in the LAPD were using the tool.
Gotham was fed daily data from the LAPD, like arrest records and police reports. But it also consolidated data from other sources via data-sharing agreements with other police departments, and even with the police departments at universities and schools. Traffic camera images and license plate scans were also included.
Any individual interacting with the LAPD or its partner agencies had a profile inside Palantir. This profile could include their home address, vehicle, known associates / family, and even when they were last seen driving through the city.
Police could search using a text-based description of a suspect they were looking for - “White male, skull tattoo, part of the X gang” - and get a list of matching results.
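Once profiles from many sources live in one system, a description like that becomes a simple structured query. Here's a toy sketch with entirely hypothetical field names and values - not how Gotham actually stores or queries data, just the shape of the idea.

```python
profiles = [
    {"name": "A. Smith", "sex": "male", "race": "white", "tattoos": ["skull"], "gang": "X"},
    {"name": "B. Jones", "sex": "male", "race": "white", "tattoos": [], "gang": None},
]

def matches(profile, **attributes):
    """True if the profile satisfies every attribute in the query."""
    for field, wanted in attributes.items():
        value = profile.get(field)
        if isinstance(value, list):
            if wanted not in value:   # e.g. the tattoo must appear in the tattoo list
                return False
        elif value != wanted:
            return False
    return True

# "White male, skull tattoo, part of the X gang"
query = {"sex": "male", "race": "white", "tattoos": "skull", "gang": "X"}
print([p["name"] for p in profiles if matches(p, **query)])  # ['A. Smith']
```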
We’ve never seen a tool like this before - and its use is likely to spread.
Foundry is the ‘corporate’ version of Gotham, built for businesses instead of the military. It helps create a digital representation of large companies all the way down to the employee and process level. Once the model is complete, Foundry searches for optimizations, runs simulations on potential decisions, and can act as a mission control for organizational changes. The human makes the decision and the AI carries it out throughout the organization. We’ve seen Foundry used by the CDC to help make decisions on vaccine distribution.
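At its simplest, a "what-if" run over a digital twin is just a comparison of scenarios against a model. Here's a deliberately tiny sketch - invented sites and numbers, nothing from Foundry itself - of the kind of question such a system answers before anyone pulls the trigger on a change.

```python
sites = {"plant_a": 120, "plant_b": 80, "plant_c": 50}  # units produced per day

def simulate(sites, shut_down=None):
    """Total daily output if one site is taken offline."""
    return sum(out for name, out in sites.items() if name != shut_down)

baseline = simulate(sites)
scenario = simulate(sites, shut_down="plant_b")
print(f"baseline {baseline}/day, without plant_b {scenario}/day, lost {baseline - scenario}/day")
```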
Most controversially, Palantir is involved in Maven - a Pentagon project that Google had previously pulled out of due to public pressure. Originally named the Algorithmic Warfare Cross-Functional Team, Maven uses data fusion and machine learning to process data from many sources. The project involves at least 20 companies, and it collects intelligence from satellites, sensors, radar, and more in order to identify potential targets. The first AI-enabled artillery strike took place from Fort Bragg in 2020.
We’ve never seen such tools being used by the military, government, and police - and they further dehumanize United States citizens, who all become potential suspects inside Palantir's systems.
Data-sharing agreements between governments, corporations, and local entities will create a web of complete surveillance - anything you do could show up in a Palantir-like system.
With AI-assisted targeting taking the human even further out of the equation, this type of software has major moral implications - and it accelerates the AI-powered war march of modern governments.
Solutions:
Be mindful of who you're sharing your data with and what you're sharing. Anytime you share your data with a government body, you risk it being added to a centralized system like Gotham.
One of the best ways to protect your data is to stop using big tech devices. It can feel like a big task, but when we see what massive tech companies like Palantir are actually doing, it seems well worth it.
Watch this full episode
Follow the #TBOT Show
Get privacy gear
Share this post