
"The darkness in the watchtower was a condition of the terrain. The darkness inside the algorithm is a condition of the design. In both cases, the blindness was chosen. It was chosen because blindness is useful: it creates deniability, it makes the violence feel inevitable, it moves the question of who decided from a person to a procedure."
"Israel's recent war in Gaza has been described as the first major AI war—the first war in which AI systems have played a central role in generating Israel's list of purported Hamas and Islamic jihad militants to target. Systems that processed billions of data points to rank the probability that any given person in the territory was a combatant."
"The weapons were precise. Munitions experts described the targeting as incredibly accurate, each building individually struck, nothing missed. The problem was not the execution. The problem was intelligence."
Israel's 'fog procedure' originated during the Second Intifada as an unofficial military rule requiring soldiers at military posts to fire into the darkness on the theory that unseen threats might be present. The violence was licensed by blindness and justified as deterrence. Modern AI targeting systems have systematized this logic: by processing billions of data points to rank the probability that a person is a combatant, they create algorithmic darkness in place of terrain darkness. Both forms of blindness are deliberately chosen because they create deniability, make the violence feel inevitable, and shift responsibility from individuals to procedures. The darkness inside the algorithm is designed opacity rather than an environmental condition, yet it serves the same purpose: obscuring who decided while the violence proceeds.
#ai-warfare #algorithmic-accountability #military-targeting-systems #civilian-casualties #designed-opacity
Read at www.theguardian.com