AI warfare may conjure images of killer robots and autonomous drones, but a different reality is unfolding in the Gaza Strip. There, artificial intelligence has been suggesting targets in Israel’s retaliatory campaign to root out Hamas following the group’s Oct. 7, 2023, attack. A program known as “The Gospel” generates suggestions for buildings and structures in which militants may be operating. “Lavender” is programmed to identify suspected members of Hamas and other armed groups for assassination, from commanders all the way down to foot soldiers. “Where’s Daddy?” reportedly follows their movements by tracking their phones in order to target them—often to their homes, where their presence is regarded as confirmation of their identity. The airstrike that follows might kill everyone in the target’s family, if not everyone in the apartment building.
The body of a Palestinian is retrieved from the rubble of a house destroyed in an Israeli strike at the Nuseirat refugee camp in the central Gaza Strip on Nov. 12, 2024. Majdi Fathi—NurPhoto/Reuters
The treaties that govern armed conflict are non-specific when it comes to the tools used to deliver military effect. The elements of international law covering war—on proportionality, precautions, and distinctions between civilians and combatants—apply whether the weapon being used is a crossbow, a tank, or an AI-powered database. But some advocates, including the International Committee of the Red Cross, argue that AI requires a new legal instrument, noting the crucial need to ensure human control and accountability as AI weapons systems become more advanced.
One intelligence officer tasked with authorizing a strike recalled dedicating roughly 20 seconds to personally confirming a target, which could amount to verifying that the individual in question was male.
Graves are prepared for the funeral of Palestinians killed in overnight Israeli strikes at a cemetery in Rafah, in the southern Gaza Strip, on Feb. 21, 2024. Said Khatib—AFP/Getty Images
The IDF did not specifically dispute the reporting of Yuval Abraham, the Israeli journalist whose investigation for +972 Magazine detailed the targeting programs, including Lavender’s reported 10% error rate or the claim that an analyst might spend as little as 20 seconds on each target. But in a statement to TIME, a spokesperson said that analysts “verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives.”
Converting data into target lists is not incompatible with the laws of war. Indeed, a scholar at West Point, assessing the Israeli programs, observed that more information could make for greater accuracy. By some contemporary accounts, that may have been the case the last time Israel went to war in Gaza, in 2021. That brief conflict apparently marked the first time the IDF used artificial intelligence in a war, and afterward, the then-head of UNRWA, the U.N. agency that provides health, education, and advocacy for Palestinians, remarked on “a huge sophistication in the way the Israeli military struck over the last 11 days.” But the 2021 round of combat, which produced 232 Palestinian deaths, was a different kind of war. It was fought under Israeli rules of engagement ostensibly intended to minimize civilian casualties, including by “knocking on the roof”—dropping a small charge on the rooftop of a building to warn occupants that it was about to be destroyed and that they should evacuate.
In the current war, launched more than 14 months ago to retaliate for the worst attack on Jews since the Holocaust, Israeli leaders shut off water and power to all of Gaza, launched 6,000 airstrikes in the space of just five days, and suspended some measures intended to limit civilian casualties. “This time we are not going to ‘knock on the roof’ and ask them to evacuate the homes,” former Israeli military intelligence chief Amos Yadlin told TIME five days after Oct. 7, warning that the weeks ahead would be “very bloody” in Gaza. “We are going to attack every Hamas operative and especially the leaders and make sure that they will think twice before they will even think about attacking Israel.” Abraham reported that targeting officers were told it was acceptable to kill 15 to 20 noncombatants in order to kill a Hamas soldier (the number in previous conflicts, he reports, was zero), and as many as 100 civilians to kill a commander. The IDF did not comment on those figures.
Israeli army battle tank at a position along the border with Gaza, on March 19, 2024. Jack Guez—AFP/Getty Images

A Ukrainian analyst views drone footage of Russian trenches near Bakhmut, Ukraine, on Jan. 6, 2023. Nicole Tung—The New York Times/Redux
In Ukraine, with fighting largely confined to battlefields where both Russian and Ukrainian forces are dug in, the issues that animate the debate over Gaza have not been in the foreground. But any country with an advanced military—including the U.S.—is likely to soon confront the issues that come with machine learning.
“A more comprehensive and public approach is necessary to address the risk of AI weapons and maintain America’s leadership in ethical technology development,” Sen. Peter Welch of Vermont says, “as well as establish international norms in this critical space.”