
"The next time you chat with Claude, you'll have the option to have it reference your previous conversation to inform its outputs. Anthropic first made its chatbot capable of remembering past interactions last August, before giving it the ability to compartmentalize memories in the fall. Making memory a free feature is well-timed."
"If after enabling memory you decide to turn it off, you can either pause the feature, preserving Claude's memories for use down the road, or completely delete them so they're not saved on Anthropic's servers."
"US Defense Secretary Pete Hegseth labeled the company a supply chain risk after it refused to sign a contract that would allow the Pentagon to use Anthropic models for mass surveillance against Americans and in fully autonomous weapons."
Anthropic is expanding Claude's capabilities by offering memory functionality to free users, enabling the chatbot to reference past conversations. The feature, first launched last August and given memory compartmentalization in the fall, lets users pause memory (preserving stored memories for later) or delete memories entirely from Anthropic's servers. The timing coincides with Claude's rise to the top of the App Store's free app charts and Anthropic's new tool for importing conversations from competing chatbots. Separately, Anthropic is in a contract dispute with the US government: Defense Secretary Pete Hegseth designated the company a supply chain risk after it refused to sign a Pentagon contract that would permit use of its models for mass surveillance of Americans and in fully autonomous weapons.
Read at Engadget