
"These actions are unprecedented and unlawful. The Constitution does not allow the government to wield its enormous power to punish a company for its protected speech. No federal statute authorizes the actions taken here. Anthropic turns to the judiciary as a last resort to vindicate its rights and halt the Executive's unlawful campaign of retaliation."
"Anthropic said it sought to restrict its technology from being used for two high-level usages: mass surveillance of Americans and fully autonomous weapons. Defense Secretary Pete Hegseth and other officials insisted the company must accept 'all lawful uses' of Claude and threatened punishment if the company did not comply."
"Designating the company a supply chain risk cuts off Anthropic's defense work using an authority that was designed to prevent foreign adversaries from harming national security systems. It was the first time the federal government is known to have used the designation against a U.S. company."
Anthropic filed two lawsuits challenging the Pentagon's decision to designate the company a supply chain risk after disputes over military applications of its Claude chatbot. The company sought to bar its technology from use in mass surveillance of Americans and fully autonomous weapons, but Defense Secretary Pete Hegseth demanded that it accept all lawful military uses. The supply chain risk designation, an authority typically wielded against foreign adversaries, is the first known to be applied to a U.S. company. Anthropic argues the government's actions violate constitutional protections and lack statutory authorization, characterizing them as unlawful retaliation for protected speech. President Trump ordered federal agencies to stop using Claude, giving the Pentagon six months to phase the technology out of classified military systems.
#ai-regulation #military-technology #government-litigation #autonomous-weapons #supply-chain-security
Read at ABC7 Los Angeles