Anthropic vs. the Pentagon: What's actually at stake? | TechCrunch

"Anthropic doesn't want its AI models to be used for mass surveillance of Americans or for autonomous weapons with no human in the loop for targeting and firing decisions. Traditional defense contractors typically have little say in how their products will be used, but Anthropic has argued from its inception that AI technology poses unique risks and therefore requires unique safeguards."
"At its core, this fight is about who controls powerful AI systems - the companies that build them, or the government that wants to deploy them. Secretary Hegseth has argued the Department of Defense shouldn't be limited by the rules of a vendor, arguing any lawful use of the technology should be permitted."
"The U.S. military already relies on highly automated systems, some of which are lethal. The decision to use lethal force has historically been left to humans, but there are few legal restrictions on military use of autonomous weapons. The DoD doesn't categorically ban fully autonomous weapons systems."
Anthropic CEO Dario Amodei and Defense Secretary Pete Hegseth are engaged in a significant dispute over military AI deployment. Anthropic maintains strict ethical boundaries: it refuses to allow its AI models to be used for mass surveillance of Americans or for fully autonomous weapons that lack human control over targeting and firing decisions. Hegseth argues the Department of Defense should not be constrained by vendor rules and that any lawful use of the technology should be permitted. The core conflict centers on control: whether the companies that build AI systems or the government that deploys them should set the terms of use. Anthropic has argued from its inception that AI technology poses unique risks requiring unique safeguards, a stance that distinguishes it from traditional defense contractors. The U.S. military already operates highly automated systems, and existing 2023 DoD directives permit AI systems to select and engage targets without human intervention under certain conditions.