
Unveiling the Use of AI in Targeted Killings: Insights into Israel’s Military Tactics


Watan – Foreign Policy magazine revealed how the Israeli occupation army uses artificial intelligence to select which Palestinians to kill.

The magazine stated that since October 7th of last year, the occupation army has abandoned its old methods of carefully targeting armed leaders of high and medium rank, replacing them with artificial intelligence tools that determine whom to target.

It added that, according to investigative journalism reports published by the Israeli website “Local Call”, with an English version in “+972” magazine, the occupation army relies on a new system that examines massive amounts of data to identify potential targets, then delivers them to human analysts to decide what action should be taken.

In most cases, human analysts recommend carrying out airstrikes against the identified targets.

Lavender system and identifying Hamas members

The report discussed how the AI-based information system used by the occupation army, called “Lavender”, works.

The system tracks nearly every individual in Gaza by name, gathering extensive intelligence inputs, such as captured videos and messages, data from social media platforms, and basic analyses of communication networks, to estimate the likelihood that a person is a fighter from Hamas or another Palestinian armed group.

The Lavender system relies on artificial intelligence to identify targets.

The report indicated that it is up to the occupation army to set the acceptable error rate for target identification by the “Lavender” system, explaining that during most periods of the war, that rate was 10%.

Targeting via artificial intelligence

According to the report, once targets are identified within the accepted 10% error margin, the names to be targeted are sent to the operations team.

A human analyst reviews each name “in a period estimated at only about 20 seconds, usually just to determine whether the name is male or female”, on the assumption that women are not fighters.

The report mentioned that the occupation army considered it an acceptable wartime mistake to target the 10% of people whom the information system flagged in error, for example, because their names resembled those of Hamas members.

Another information system called “Where’s Dad” determines whether the targeted individuals are in their homes or outside.


The Israeli “Local Call” website, which published the investigative reports on this topic, stated that the occupation army prefers to strike targeted individuals in their homes because reaching them there is much easier than targeting them during clashes with the army.

The website added that the families and neighbors of the targeted individuals, who are potential members of Hamas, are considered by the occupation army as “insignificant collateral damage”.

Israeli estimates of the nature of those targeted

An Israeli intelligence officer described most of those targeted as “unimportant individuals”, as they are low-ranking members of Hamas, but he considered them “legitimate targets because they are fighters, even if they do not have significant strategic importance”, according to his claim.

The Israeli “Local Call” website explained that what the occupation army is doing may amount to criminal negligence, given its readiness to strike targets even when the risk of death to those nearby is extremely high.

Claim of not targeting civilians

However, because the “Lavender” system selects its targets from among militants, the occupation army claims it does not target civilians.

Foreign Policy magazine noted that most lawyers and ethicists believe that the occupation army’s strikes in Gaza have failed to adhere to reasonable standards of proportionality, which means “they may remain war crimes”.
