The real danger of military AI isn't killer robots; it's worse human judgement
Briefly

"The more you use AI, the more you will use your brain in a different way. And so [we need to] be able to have some oversight, to be able to critique what we see from AI, and to be sure you are not fooled by a sort of false presentation of things. It's something we need to take care of."
"Because of the pace and the urgency associated with deploying these models, maybe that hasn't been put in place."
"Especially as you get deeper into any conflict, there will be more and more pressure to find more targets. That happens in every conflict as well."
"Wide use of large language models can undermine human thought and communication, homogenizing thinking among users and reinforcing dominant styles while marginalizing alternative voices and reasoning strategies."
The Pentagon's swift integration of commercial AI tools poses risks to military decision-making, potentially impairing the ability to discern truth from falsehood. Research indicates that reliance on AI can erode native cognitive skills. Military leaders, including NATO's Adm. Pierre Vandier, emphasize the need for oversight to avoid being misled by AI outputs. Yet there is little evidence that the Pentagon is monitoring the cognitive effects of AI use. The urgency to deploy these tools also increases pressure on military personnel to generate targets quickly, compounding the risk of degraded judgement.
Read at Nextgov.com