How developers are using Apple's local AI models with iOS 26 | TechCrunch
Briefly
"Earlier this year, Apple introduced its Foundation Models framework during WWDC 2025, which allows developers to use the company's local AI models to power features in their applications. The company touted that with this framework, developers gain access to AI models without worrying about any inference cost. Plus, these local models have capabilities such as guided generation and tool calling built in."
"As iOS 26 is rolling out to all users, developers have been updating their apps to include features powered by Apple's local AI models. Apple's models are small compared with leading models from OpenAI, Anthropic, Google, or Meta. That is why local-only features largely improve quality of life with these apps rather than introducing major changes to the app's workflow."
Apple's Foundation Models framework lets apps run features on the company's on-device AI models, with guided generation and tool calling built in and no inference costs for developers. With iOS 26 rolling out, developers are adding local-model features that favor quality-of-life improvements over major workflow changes, since Apple's models are smaller than leading cloud models. Examples include Lil Artist's AI story creator and emoji-suggestion prototype, MoneyCoach's spending insights and automatic categorization, and a word-learning app that generates example sentences and prompts users to explain usage. On-device capabilities are being prototyped across planner, finance, education, and children's apps.
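As a rough illustration of what guided generation looks like in practice, here is a minimal Swift sketch based on Apple's published Foundation Models API (LanguageModelSession, @Generable, @Guide); exact signatures may differ, and the SpendingInsight type, monthlyInsight function, and prompt are hypothetical examples in the spirit of the MoneyCoach feature, not code from any of the apps mentioned.

```swift
import FoundationModels

// Hypothetical structured output for a spending-insight feature.
// @Generable lets the framework constrain the on-device model's output
// to this shape (guided generation), so the app receives typed data
// instead of free-form text it would have to parse.
@Generable
struct SpendingInsight {
    @Guide(description: "One-sentence summary of this month's spending")
    var summary: String

    @Guide(description: "Suggested budget category, e.g. Groceries or Travel")
    var category: String
}

enum InsightError: Error {
    case modelUnavailable
}

func monthlyInsight(from transactions: String) async throws -> SpendingInsight {
    // Confirm the on-device model is available before prompting it
    // (it can be unavailable on unsupported hardware or while downloading).
    guard case .available = SystemLanguageModel.default.availability else {
        throw InsightError.modelUnavailable
    }

    // Ask the local model for a SpendingInsight; generation runs entirely
    // on device, so there is no per-request inference cost.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Summarize this month's spending and suggest a category:\n\(transactions)",
        generating: SpendingInsight.self
    )
    return response.content
}
```

Tool calling follows a similar pattern: the app exposes typed tools the model can invoke during a session, which is how features like automatic transaction categorization can pull in app data without sending it to a server.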
Read at TechCrunch