
"Last week, Anthropic pulled out of a pending agreement with the Defense Department over stated concerns about how its technologies could be used for autonomous warfare and mass domestic surveillance. OpenAI, the maker of ChatGPT, the next day announced a deal with the Pentagon that some experts say lacks safeguards to protect citizens' privacy and limit robot-aided war."
"The dealings sparked worries that Silicon Valley AI firms, competing among each other and against China, will leave issues of safety and privacy behind in the frenzied scramble for technological supremacy. "There have been some dynamics of this that have been race to the bottom-y," said Nathan Calvin, vice-president of state affairs and chief attorney at Encode."
""There's a concern that there's a sense of inevitability, that, 'If we don't do it, then somebody else will.'" This sentiment reflects the competitive pressure driving AI companies to accept military contracts despite ethical concerns about surveillance and autonomous weapons applications."
Anthropic and OpenAI have taken divergent approaches to military contracts, creating tension in Silicon Valley's relationship with the U.S. Department of Defense. Anthropic withdrew from a Pentagon agreement citing concerns about autonomous warfare and mass domestic surveillance applications. OpenAI simultaneously announced a Pentagon deal that experts argue lacks adequate privacy protections and safeguards against autonomous weapons. This competition between AI firms, intensified by rivalry with China, has sparked concerns about a "race to the bottom" where companies prioritize technological supremacy and market advantage over ethical considerations. The dispute reflects broader tensions about Silicon Valley's military involvement, including questions about government surveillance capabilities and autonomous weapons systems.
#ai-military-contracts #autonomous-weapons #government-surveillance #silicon-valley-ethics #pentagon-technology-deals
Read at The Mercury News