How developers are using Apple's local AI models with iOS 26 | TechCrunch

"Earlier this year, Apple introduced its Foundation Models framework during WWDC 2025, which allows developers to use the company's local AI models to power features in their applications. The company touted that with this framework, developers gain access to AI models without worrying about any inference cost. Plus, these local models have capabilities such as guided generation and tool calling built in."
"As iOS 26 is rolling out to all users, developers have been updating their apps to include features powered by Apple's local AI models. Apple's models are small compared with leading models from OpenAI, Anthropic, Google, or Meta. That is why local-only features largely improve quality of life with these apps rather than introducing major changes to the app's workflow."
Apple introduced the Foundation Models framework at WWDC 2025, letting developers run the company's local AI models inside their apps without paying for inference. The models ship with capabilities such as guided generation and tool calling built in. Because Apple's models are small compared with the leading cloud models from OpenAI, Anthropic, Google, or Meta, the features developers are adding as iOS 26 rolls out tend to improve quality of life rather than overhaul an app's workflow. Early integrations include Lil Artist's AI story creator, Daylish's emoji-suggestion prototype, MoneyCoach's spending insights and category autofill, and LookUp's learning mode, which generates example sentences and asks users to explain how a word is used.
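For developers wondering what this integration work looks like, the snippet below is a minimal sketch of the guided-generation path in Swift, based on the Foundation Models API Apple documented at WWDC 2025 (LanguageModelSession, the @Generable and @Guide macros). The SpendingCategory type, prompt text, and helper function are illustrative assumptions, not code from any of the apps mentioned above.

import FoundationModels

// Illustrative output type: the @Generable macro constrains the on-device
// model to produce data that decodes directly into this Swift struct.
@Generable
struct SpendingCategory {
    @Guide(description: "A short category name, such as Groceries or Transport")
    var name: String

    @Guide(description: "A one-sentence reason for the suggested category")
    var rationale: String
}

// Hypothetical helper for a category-autofill feature. The session runs
// entirely on device, so each call carries no per-request inference cost.
func suggestCategory(for transactionText: String) async throws -> SpendingCategory {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Suggest a spending category for this transaction: \(transactionText)",
        generating: SpendingCategory.self
    )
    return response.content
}

Structured output of this kind is what would let a feature like category autofill map free-form text onto an app's existing data model without hand-written parsing; tool calling follows a similar pattern through the framework's Tool protocol.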
Read at TechCrunch