
How US Intelligence and an American Company Feed Israel’s Killing Machine in Gaza



April 12, 2024

Because it isn’t so much the bombs that kill but the list that puts civilians in the way of the bombs.

Employees from the World Central Kitchen were killed in an Israeli air strike on their vehicles in the central Gaza Strip. (Majdi Fathi / NurPhoto via Getty Images)

Glilot Junction, in the Israeli city of Ramat HaSharon, is a crossroads where the Trans-Samaria Highway bisects the Coastal Highway. It is also a key crossroads for Israel’s human and technical spies. On the east side of the highway is Hertzog Camp and to the west is Dane Camp. Together, the 462 acres of land make up Camp Moshe Dayan, the most secret military base in the country.

Surrounded by a variety of intrusion-prevention devices and high steel fences topped with barbed wire, Camp Dayan is home to a number of highly sensitive military intelligence installations, including its central training base; the advanced Targeting Center; Unit 81, a secret technology unit attached to the Special Operations Division; and the IDF Signals Intelligence College.

But by far the most secret is the headquarters of Unit 8200, which specializes in eavesdropping, codebreaking, and cyber warfare—Israel’s equivalent of the American National Security Agency. One of Unit 8200’s newest and most important organizations is the Data Science and Artificial Intelligence Center, which, according to a spokesman, was responsible for developing the AI systems that “transformed the entire concept of targets in the IDF.” Back in 2021, the Israeli military described its 11-day war on Gaza as the world’s first “AI war.” Israel’s ongoing invasion of Gaza offers a more recent—and devastating—example.

More than 70 years ago, that same patch of land was home to the Palestinian village of Ajleel, until the residents were killed or forced to abandon their homes and flee in fear during the Nakba in 1948. Now, soldiers and intelligence specialists are being trained at Camp Moshe Dayan to finish the job—to bomb, shoot, or starve to death the descendants of the Palestinians forced into the squalor of militarily occupied Gaza decades ago.

Earlier this month, that effort continued with the targeting of three well-marked and fully approved aid vehicles belonging to World Central Kitchen, killing their seven occupants and ensuring that the food would never reach those dying of starvation. The targeting was precise—placing missiles dead center in the aid agency’s rooftop logos. Israel, however, said it was simply a mistake, similar to the “mistaken” killing of nearly 200 other aid workers in just a matter of months—more than all the aid workers killed in all the wars in the rest of the world over the last 30 years combined, according to the Aid Worker Security Database.

Such horrendous “mistakes” are hard to understand, considering the enormous amount of advanced targeting AI hardware and software provided to the Israeli military and spy agencies—some of it by one American company in particular: Palantir Technologies. “We stand with Israel,” the Denver-based company said in posts on X and LinkedIn. “The board of directors of Palantir will be gathering in Tel Aviv next week for its first meeting of the new year. Our work in the region has never been more vital. And it will continue.” As one of the world’s most advanced data-mining companies, with ties to the CIA, Palantir’s “work” was supplying Israel’s military and intelligence agencies with advanced and powerful targeting capabilities—the precise capabilities that allowed Israel to place three drone-fired missiles into three clearly marked aid vehicles.


“I am pretty encouraged about talent here and that we are getting the best people,” Alex Karp, cofounder and CEO of the company, told a group soon after arriving in Tel Aviv last January. “What I see in Israel is this hybrid of talent that is qualitative and argumentative.” Immediately after the talk, Karp traveled to a military headquarters where he signed an upgraded agreement with Israel’s Ministry of Defense. “Both parties have mutually agreed to harness Palantir’s advanced technology in support of war-related missions,” said Executive Vice President Josh Harris.

The project involved selling the ministry an Artificial Intelligence Platform that uses reams of classified intelligence reports to make life-or-death determinations about which targets to attack. In an understatement several years ago, Karp admitted, “Our product is used on occasion to kill people,” the morality of which even he himself occasionally questions. “I have asked myself if I were younger at college: ‘Would I be protesting me?’” Recently, a number of Karp’s employees decided to quit rather than be involved with a company supporting the ongoing genocide in Gaza. And in London’s Soho Square, dozens of pro-Palestine protesters and health workers gathered at Palantir’s UK headquarters to accuse the firm of being “complicit” in war crimes.

Palantir’s AI machines need data for fuel—data in the form of intelligence reports on Palestinians in the occupied territories. And for decades a key and highly secret source of that data for Israel has been the US National Security Agency, according to documents released by NSA whistleblower Edward Snowden. After fleeing to Hong Kong in 2013 with a pocket full of flash drives containing some of the agency’s highest secrets, Snowden ended up in Moscow where, soon after he arrived, I met with him for Wired magazine. And in the interview, he told me that “one of the biggest abuses” he saw while at the agency was how the NSA secretly provided Israel with raw, unredacted phone and e-mail communications between Palestinian Americans in the US and their relatives in the occupied territories. Snowden was concerned that as a result of sharing those private conversations with Israel, the Palestinians in Gaza and the West Bank would be at great risk of being targeted for arrest or worse.

According to the Top Secret/Special Intelligence agreement between the NSA and Israel, “NSA routinely sends ISNU [Israeli SIGINT National Unit] minimized and unminimized raw collection…as part of the SIGINT relationship between the two organizations.” It adds, “Raw SIGINT includes, but is not limited to, unevaluated and unminimized transcripts, gists, facsimiles, telex, voice and Digital Network Intelligence metadata and content.”

Now, with Israel’s ongoing war in Gaza, critical information from NSA continues to be used by Unit 8200, according to a number of sources, to target tens of thousands of Palestinians for death—often with US-supplied 2,000-pound bombs and other weapons. And it is extremely powerful data-mining software, such as that from Palantir, that helps the IDF to select targets. While the company does not disclose operational details, some indications of the power and speed of its AI can be understood by examining its activities on behalf of another client at war: Ukraine. Palantir is “responsible for most of the targeting in Ukraine,” according to Karp. “From the moment the algorithms set to work detecting their targets [i.e., people] until these targets are prosecuted [i.e., killed]—a term of art in the field—no more than two or three minutes elapse,” noted Bruno Macaes, a former senior Portuguese official who was given a tour of Palantir’s London headquarters last year. “In the old world, the process might take six hours.”

The company is currently developing an even more powerful AI targeting system called TITAN (for “Tactical Intelligence Targeting Access Node”). According to Palantir, TITAN is a “next-generation Intelligence, Surveillance, and Reconnaissance ground station enabled by Artificial Intelligence and Machine Learning to process data received from Space, High Altitude, Aerial and Terrestrial layers.” Although designed for use by the US Army, it’s possible that the company could test prototypes against Palestinians in Gaza. “How precise and accurate can you know a system is going to be unless it’s already been trained and tested on people?” said Catherine Connolly of the Stop Killer Robots coalition, which includes Human Rights Watch and Amnesty International.

The most in-depth examination of the connection between AI and the massive numbers of innocent Palestinian men, women, and children slaughtered in Gaza by Israel comes from an investigation recently published by +972 Magazine and Local Call. Although Palantir is not mentioned by name, the AI systems discussed by the journalists appear to fit into the same category. According to the lengthy investigation, Unit 8200 is currently using a system called “Lavender” to target thousands of alleged Hamas fighters. The magazine also reported that, while Israel’s rules of engagement before the brutal attack on October 7 tightly restricted the number of noncombatant casualties allowed in targeting a single alleged Hamas militant, those limits have been loosened in the months since, to the point of permitting the slaughter of dozens of Palestinian noncombatants, including women and children, for each targeted individual. President Joe Biden’s December warning that Israel was losing international support because of its “indiscriminate bombing” of Gaza appears to have had no effect.

The +972 Magazine report details how the Israeli military uses powerful algorithms to sort through enormous volumes of surveillance data, including phone, text, and digital communications, to come up with lengthy kill lists of targets. And adding to that haul would be the data from the NSA intercepts of Palestinians in the United States communicating with their families in Gaza—a process that continued after Snowden left NSA, according to a number of sources.

In 2014—more than a year after Snowden turned up in Moscow—Unit 8200’s targeting of innocent Palestinians in the occupied territories was so extreme that it caused 43 veterans of the unit, including many serving in the reserves, to go public and accuse the organization of startling abuses. They declared that they had a “moral duty” to no longer “take part in the state’s actions against Palestinians.”

In a letter to their commanders, Prime Minister Benjamin Netanyahu, and the head of the Israeli army, they charged that Israel used the information collected against innocent Palestinians for “political persecution.” And in testimonies and interviews given to the media, they specified that data were gathered on Palestinians’ sexual orientations, infidelities, money problems, family medical conditions, and other private matters that could be used to coerce Palestinians into becoming collaborators or to create divisions in their society.

Several years ago, Brig. Gen. Yossi Sariel, the current director of Unit 8200, published a book outlining a supposedly fictional and far-reaching AI system. But the journalists from +972 and Local Call discovered that the super-powerful target generation machine he wrote about then as fiction actually exists. “During the first weeks of the war, the army almost completely relied on Lavender,” they write, “which clocked as many as 37,000 Palestinians as suspected militants—and their homes—for possible air strikes.” And from the beginning, there was little attempt to verify or justify the thousands of names generated by the machine’s “kill list.”

“One source,” they write, “stated that human personnel often served only as a ‘rubber stamp’ for the machine’s decisions, adding that, normally, they would personally devote only about ‘20 seconds’ to each target before authorizing a bombing—just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as ‘errors’ in approximately 10 percent of cases and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.”

But it gets even worse. They found that the Israeli army deliberately and systematically bombed the homes of targeted individuals—killing entire families—simply because another AI algorithm told them the individual was there. “The result,” they write, “as the sources testified, is that thousands of Palestinians—most of them women and children or people who were not involved in the fighting—were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions.”

Such actions were likely contributing factors in the recent decision by the United Nations Human Rights Council to adopt a resolution calling for Israel to be held accountable for possible war crimes and crimes against humanity committed in Gaza. Twenty-eight countries voted in favor, with only six voting against the resolution. Still, in the White House and on Capitol Hill, there seems little concern for the dangers of Israel’s deadly weaponization of AI and the technology’s connection to the staggering numbers of innocent men, women, children, and whole families massacred in Gaza. Washington director for Human Rights Watch Sarah Yager told Politico, “Nobody has any insight including, I would say, U.S. policymakers on how Israel is conducting this war.”

For years, the United States has had strict regulations on the export of weapon systems to foreign countries because of the lack of accountability once they are in the users’ possession, and the potential for serious war crimes. Even Palantir CEO Alex Karp has argued that “the power of advanced algorithmic warfare systems is now so great that it equates to having tactical nuclear weapons against an adversary with only conventional ones.” Israel’s indiscriminate killing in Gaza offers the perfect example of why it’s time to also begin far stricter regulation of the export of AI systems like those developed by Palantir—systems that, as Karp suggested, are the digital equivalent of a weapon of mass destruction. After all, it’s not just the bomb that kills, but the list that puts you and your family under it.


James Bamford

James Bamford is a best-selling author, Emmy-nominated filmmaker, and winner of the National Magazine Award for Reporting. His most recent book is Spyfail: Foreign Spies, Moles, Saboteurs, and the Collapse of America’s Counterintelligence, published by Twelve Books.
