Israel Now Using AI Technology To Kill Palestinians

Started by yankeedoodle, October 01, 2022, 03:00:19 PM


yankeedoodle

Israel Now Using AI Technology To Kill Palestinians
Israel has a long track record of testing weapons technology on the Palestinians it rules over in the occupied territories; however, the Israeli military's latest move has gone largely unnoticed. Israeli AI-powered rifles and crowd-control technology are now in action, with potentially lethal consequences.

There are embedded videos with this article, which you can see by reading the article here:  https://www.thelastamericanvagabond.com/israel-now-using-ai-technology-to-kill-palestinians/

Tensions inside the Israeli-occupied West Bank continue to escalate, in a year of violence not witnessed inside the territory since 2007. Following a lethal Israeli invasion of the Jenin refugee camp, in the north of the West Bank, which killed four Palestinians and injured 44, Israel's military seems to have been given the green light for even greater horrors. The chief of staff of the Israeli army, Aviv Kohavi, was reported to have permitted the use of attack drones to carry out airstrike assassinations inside the territory, something that had previously only taken place in the Gaza Strip.

Even more shocking, in recent weeks Palestinians have noted another appalling development: Israel is deploying AI-powered guns at checkpoints and on military vehicles. The most prominently reported instance is the system installed at an Israeli checkpoint in the city of Al-Khalil (Hebron). The Israeli company Smart Shooter created the new technology, which the Israeli military claims was installed at the Shuhada Street checkpoint as a prototype for testing purposes.

According to an Israeli military official who spoke to Haaretz, the system does not yet use live fire, but instead fires stun grenades, tear gas, and sponge-tipped bullets. Although the claim is that the technology is not intended to be lethal, sponge-tipped bullets will kill when fired from short range, and even from a distance they have killed Palestinians in the past, as have tear gas canisters. A local Palestinian activist, Issa Amro, who spoke to Haaretz, gave the following response:

Quote"The system was placed in the centre of a heavily populated area, with hundreds of people passing by. Any failure of this technology could have an impact on many people...I see that as part of the transition from human to technological control. We Palestinians have become an object for training the high tech-industry of the Israeli army, which is not held to account for what it does."

The technology, however, has not just been spotted at the ever-busy Shuhada Street checkpoint; it has been reported elsewhere too. In one case, Palestinians filmed an Israeli military jeep that appeared to have a similar AI-powered rifle system attached to it, one that local reports say is being deployed against Palestinian demonstrators. Israel's use of such technology has not been limited to the West Bank. In fact, it was revealed that the Israeli Mossad team that assassinated Iran's top nuclear scientist, Mohsen Fakhrizadeh, in November 2020 did so using an AI-powered automatic weapon.

In the past, Israel has been embroiled in multiple surveillance-based scandals. One example is the Israeli military's use of 'Blue Wolf', a secretive military technology program that works as an application on Israeli soldiers' phones. The app logs photos and uses facial recognition technology to match them against a database run by Israel's military and intelligence services. According to a report from The Washington Post, occupation soldiers were encouraged to photograph as many Palestinians as possible, often against their will.

According to documents obtained by The Intercept, Google's controversial contract with Israel may also be providing a service for the military to target Palestinians and even read their intentions through body language. The infamous 'Project Nimbus' is a cloud computing system, jointly built by Amazon and Google, that was sold to Israel last year under a 1.2-billion-dollar contract.

The Google contract reportedly caused internal disputes inside the company itself, which led to the purging of a number of employees who were critical of the deal. Ariel Koren, the former director of marketing for Google's educational products department, was one of those who resigned in opposition to the deal and who faced consequences for her opinions.

The Project Nimbus technology is designed to fit in with Israel's ever-tightening security state, monitoring and processing Palestinians under what all the world's leading human rights groups call an apartheid system. One of its potential applications is deployment as part of AI systems that would discern an individual's intention before any given action even takes place, something that no technology on earth is successfully able to do. The major fear is that the use of this technology as part of weapons systems could cause untold death and destruction, and the worst part is that Israel constantly uses Palestinians as lab rats for its new technology. It is not out of the question that the Google technology will be used in this way.

According to documents obtained by The Intercept, there seems to be real fear that Project Nimbus will be used in the occupied territories, possibly for a number of purposes. Earlier this year, Israel issued a 97-page ordinance that imposes even tighter restrictions on the occupied West Bank's Palestinian population, even forcing newly married couples to register their marriage with the Israeli military or face consequences. The ordinance has severely restricted foreign travel into the occupied territory, requiring people to declare their intention to travel there before arriving at an Israeli port of entry, and has introduced further means of monitoring and tracking Palestinians.

"The former head of Security[sic] for Google Enterprise — who now heads Oracle's Israel branch — has publicly argued that one of the goals of Nimbus is preventing the German government from requesting data relating on the Israel Defence Forces for the International Criminal Court," according to reporting in The Intercept.

Now we have proof that Israel is using AI-powered weapons in the West Bank, weapons that will be tested on Palestinians in order to judge what a threat from protesters looks like. A common practice of the Israeli government is to sell weapons as "battle-tested" and "combat-proven" if they were first used on Palestinians in the occupied territories.

Most likely, Israel does not need this technology for crowd control; its existing methods have proven effective against children and young people throwing stones for years. If the AI-powered weapon systems prove even somewhat successful, there is a strong possibility that Israel will sell the technology and/or develop it for lethal purposes. Even now, while the technology is said to be non-lethal in nature, it poses an enormous threat to the lives of innocent Palestinians.

Another fear is that, using Google's AI technology, Israel may even attempt to imprison Palestinians for intending to commit crimes, something the AI is said to be developing ways of figuring out. If true, this would be horrifying: a world in which we are told that an AI system can alert authorities to someone's intent to commit a crime (whether or not this is true) based on body language alone. Perhaps in nations around the world, outside the Holy Land, courts would easily throw out such cases, but in the occupied territories Palestinians face a 99% conviction rate and can be held for up to 20 years without any charge at all.