
"claims the first developer beta of iOS 26.1 includes code to "suggest" Apple is working toward implementing support for the open-source Model Context Protocol (MCP) developed by Anthropic in its operating systems. (MCP is widely used by other AI companies, including OpenAI and Google.) If true, this potentially unlocks the kind of useful, human-centred AI tasks we believe Apple dreams of - contextually aware solutions that can answer complex questions by interacting with one, two, or even more apps."
"Hustle over to GitHub and you'll find a collection of independently-developed Apple-native tools for MCP. The developer says these tools give Macs "superpowers." If true, these powers include: Create, search through and find Notes with spoken interrogations. Find your contact details without endless scrolling. Send emails, including attachments, using voice. Search emails the same, too. Create and search for events, listing upcoming event, and reminders."
Apple has an ecosystem and hardware capable of running advanced AI, with software integration coming next. iOS 26.1 developer beta code suggests potential support for Anthropic’s open-source Model Context Protocol (MCP), a protocol also used by OpenAI and Google. MCP support could enable context-aware, multi-app AI that answers complex questions by interacting across apps. Independent Apple-native MCP tools on GitHub demonstrate features like creating and searching Notes by voice, finding contacts, composing emails with attachments via voice, searching email, creating and searching events and reminders, and chaining commands across apps.
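To make the protocol concrete: MCP is built on JSON-RPC 2.0, and servers expose their capabilities through methods such as `tools/list` and `tools/call`. The sketch below is a minimal, hypothetical dispatcher in plain Python, not the official MCP SDK; the `notes_search` tool name and its behaviour are invented for illustration, standing in for the kind of Apple Notes tool the GitHub collection provides.

```python
import json

# Hypothetical tool registry in the shape MCP servers report via "tools/list":
# each tool carries a name, a description, and a JSON-Schema input definition.
TOOLS = {
    "notes_search": {
        "description": "Search Apple Notes by query string (hypothetical tool).",
        "inputSchema": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}

def handle_request(request: dict) -> dict:
    """Dispatch a JSON-RPC 2.0 request the way an MCP server would."""
    method = request.get("method")
    if method == "tools/list":
        result = {"tools": [{"name": n, **meta} for n, meta in TOOLS.items()]}
    elif method == "tools/call":
        name = request["params"]["name"]
        args = request["params"]["arguments"]
        # A real server would query the Notes app here; we echo for illustration.
        result = {"content": [{"type": "text",
                               "text": f"{name} called with {json.dumps(args)}"}]}
    else:
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "error": {"code": -32601, "message": "method not found"}}
    return {"jsonrpc": "2.0", "id": request.get("id"), "result": result}
```

An AI client chains multi-app tasks by listing the tools of several such servers (Notes, Mail, Calendar) and issuing `tools/call` requests against each in turn, which is the "contextually aware, multi-app" behaviour the article describes.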
Read at Computerworld