Israel is rubber-stamping AI bombing targets

If they aren't stopped, these tools will be in the hands of a military near you!

MajorLinux - Editor-in-chief

We are now over six months into Israel’s ethnic cleansing of Palestinians in Gaza. This whole time, they have covered it under the guise of targeting Hamas militants in the region. Israel initially said they were going after the hostages taken on October 7, but nobody is really talking about that anymore. What I’d like to talk about is how Israel has chosen its targets in Gaza.

Israel’s bombing campaign

Israel’s bombing campaign has looked somewhat random. Every now and then, you’d hear that the military had shelled near an area where suspected Hamas militants or Israeli hostages were thought to be held. Other than that, it’s another neighborhood, school, or place of worship being randomly targeted. What if I told you they weren’t randomly targeted?

Well, I have. I’ve mentioned it several times on Tech Talk Thursdays, and there’s an article on DCA about Project Nimbus, Google and Amazon’s joint project to provide cloud services to Israel. Others stateside, like The Verge, have reported on Google Photos and other tools being used to inaccurately target Hamas members.

What I’m here to talk about today is the name of the tool, how it works, and what oversight it has. This is necessary for a lot of reasons. The reason it’s being talked about now is that World Central Kitchen staff were targeted not once but three times by the Israeli military, on a route they had agreed upon with that same military.

Lavender

The tool in question is called Lavender. It is an AI tool that was spun up shortly after the October 7 attacks. This news was brought to the world by +972 Magazine and Local Call. According to them, Lavender has marked over 37,000 Palestinians in Gaza as “Hamas militants”. What’s wild about this number is that, according to Axios, Hamas has only an estimated 30,000 to 40,000 fighters in total.

Of course, Israel has denied that such a tool is used this way. An Israeli military spokesperson told CNN that AI isn’t being used to identify targets, but didn’t deny Lavender’s existence. To them, these are “merely tools for analysts in the target identification process.”

This is being called into question by +972. Their reporting states that Israeli intelligence officers said they aren’t required to conduct independent verification of the targets. They just rubber-stamp them. The most they would do with a target is verify that they’re male.

Lavender was trained on a set of known Hamas and Palestinian Islamic Jihad operatives, along with people who are only loosely associated with Hamas. This included employees of Gaza’s Internal Security Ministry.

“I was bothered by the fact that when Lavender was trained, they used the term ‘Hamas operative’ loosely, and included people who were civil defense workers in the training dataset.”

A source told +972 Magazine

Lavender was said to be 90 percent accurate when targeting people. That means 10 percent of the people it marked could be killed simply because they happened to share a name or nickname with an operative. Worse, you might be a relative of an operative and have nothing to do with any of this. But the Israelis treated this as a margin of error: “mistakes were treated statistically,” a source said.

“Because of the scope and magnitude, the protocol was that even if you don’t know for sure that the machine is right, you know statistically that it’s fine. So you go for it.”

A source told +972 Magazine
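
To put that “margin of error” into perspective, here is a rough back-of-the-envelope sketch in Python. It assumes, purely for illustration, that the reported 90 percent accuracy behaves like a uniform error rate across the 37,000 people Lavender marked; the system’s actual error model has not been published.

# Illustration only: assumes the reported 90% accuracy applies uniformly
# to everyone the system marked (an assumption, not a published figure).
marked = 37_000            # Palestinians reportedly marked by Lavender
reported_accuracy = 0.90   # accuracy figure cited in the +972 reporting

wrongly_marked = marked * (1 - reported_accuracy)
print(f"Wrongly marked at that rate: {wrongly_marked:,.0f}")  # -> 3,700

Even under that generous reading, roughly 3,700 people would be marked as “Hamas militants” purely by mistake, before any collateral damage allowance is applied.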

Collateral damage

But it gets worse. Intelligence officers were given collateral damage allowances. For any low-level Hamas operative, they were allowed 15 to 20 civilian deaths. For senior officials, the Israeli military authorized “hundreds” of civilian deaths as collateral damage.

Where’s Daddy?

Then there’s “Where’s Daddy?”, a system for targeting Hamas operatives in their homes. It put targets identified by Lavender under surveillance and tracked them until they returned home to their families, at which point the Israeli military would bomb the house. But sometimes, they’d get it wrong: houses were bombed without verifying whether the actual target was even home.

“It happened to me many times that we attacked a house, but the person wasn’t even home. The result is that you killed a family for no reason.”

A source told +972 Magazine

The Gospel

I’ll leave you with one more. The Israeli military has something called “The Gospel”, which marks buildings that Hamas might be operating from. This tool has also been responsible for a huge number of civilian casualties.

“When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed — that it was a price worth paying in order to hit [another] target.”

A military source told +972 Magazine

This has gone way beyond collective punishment of people who just want to be free and live in their own land. It shows how far a government and its military are willing to go to barbarically slaughter a group of people over resources and real estate.

This government policy has been developed and perfected for over 75 years. And if we don’t check it now, it’ll be on your doorstep before you know it.

In some instances, it already is.

Source: The Verge

Marcus Summers is a Linux system administrator by trade. He has been working with Linux for nearly 15 years and has become a fan of open source ideals. He self identifies as a socialist and believes that the world's information should be free for all.