How developers are using Apple’s local AI models with iOS 26
At WWDC 2025 earlier this year, Apple introduced its Foundation Models framework, which lets developers use the company's local AI models to power features in their applications.
Ivan Mehta
Published on October 3, 2025 · Updated October 5, 2025

The company touted that, because these models run entirely on device, the framework gives developers access to AI capabilities without any inference costs. The local models also ship with built-in capabilities such as guided generation and tool calling.
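To make guided generation concrete, here is a minimal sketch based on Apple's published Foundation Models API. The `@Generable` and `@Guide` macros and `LanguageModelSession` are real framework symbols; the `StoryIdea` type and `generateStory` function are illustrative names invented for this example (loosely inspired by the story-creator use case below), not code from any shipping app.

```swift
import FoundationModels

// Guided generation: @Generable constrains the on-device model to
// produce a typed Swift value instead of free-form text, so the app
// never has to parse the model's output by hand.
@Generable
struct StoryIdea {
    @Guide(description: "A short, child-friendly story title")
    var title: String

    @Guide(description: "A one-paragraph plot summary")
    var summary: String
}

// Hypothetical helper: asks the local model for a structured story idea.
func generateStory(character: String, theme: String) async throws -> StoryIdea {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Write a story idea starring \(character) about \(theme).",
        generating: StoryIdea.self
    )
    return response.content
}
```

Because the response arrives as a typed `StoryIdea` rather than raw text, an app can drop the fields straight into its UI. Note this requires a device and SDK that support Apple Intelligence (iOS 26 or later).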
As iOS 26 rolls out to all users, developers have been updating their apps with features powered by Apple's local AI models. Apple's models are small compared with the leading models from OpenAI, Anthropic, Google, or Meta, so local-only features largely bring quality-of-life improvements to these apps rather than major workflow changes.
Below are some of the first apps to tap into Apple’s AI framework.
Lil Artist
The Lil Artist app offers various interactive experiences to help kids learn different skills like creativity, math, and music. Developers Arima Jain and Aman Jain shipped an AI story creator with the iOS 26 update. This allows users to select a character and a theme, with the app generating a story using AI. The text generation in the story is powered entirely by Apple’s on-device model.
Daylish
The developer of Daylish, a daily planner app, is working on a prototype that automatically suggests an emoji for each timeline event based on its title.
MoneyCoach
MoneyCoach, a finance tracking app, now features two capabilities powered by local models. The first shows spending insights — for instance, if you spent more than average on groceries that week. The second feature automatically suggests categories and subcategories for quick entries.
LookUp
The word-learning app LookUp has added two new modes using Apple’s local AI models. One is a new learning mode that leverages a local model to create examples for a given word, prompting users to explain the usage of that word in a sentence. The developer is also using on-device models to generate a map view of a word’s origin.
Tasks
The Tasks app now automatically suggests tags for new entries using local models. It can also detect recurring tasks and schedule them automatically. Users can even dictate tasks, with the app using the local model to break speech into individual tasks without an internet connection.
Day One
Day One, the journaling app owned by Automattic, uses Apple’s on-device models to generate highlights, suggest titles for entries, and create reflective prompts encouraging deeper writing based on previous content.
Crouton
The recipe app Crouton leverages Apple Intelligence to suggest tags for recipes, name timers, and break down text into easy-to-follow cooking steps.
Signeasy
Signeasy, a digital signing app, uses local models to extract key insights from contracts and summarize documents for users before signing.
Dark Noise
Dark Noise, the background sound app, uses Apple’s local models to let users describe a soundscape in words and generate one automatically. Users can then fine-tune the sound by adjusting the levels of individual elements.
Lights Out
Lights Out, an F1 tracking app from developer Shihab Mehboob (creator of Avery and Mammoth), uses on-device AI to summarize live race commentary and provide key takeaways during Grand Prix events.
Capture
The note-taking app Capture uses local AI to suggest categories as users type notes or tasks, improving organization and tagging speed.
Lumy
The sun and weather-tracking app Lumy now uses AI to provide weather-related suggestions and personalized forecasts.
CardPointers
CardPointers, an app for managing credit card benefits, now lets users ask questions about their cards and offers using Apple’s on-device AI. It also provides personalized tips for maximizing rewards.
Guitar Wiz
Guitar Wiz integrates Apple's Foundation Models framework to explain chords, give advanced learners real-time insights, and offer multilingual support in more than 15 languages.
SmartGym
SmartGym uses local AI to convert workout descriptions into step-by-step sets with reps, intervals, and equipment details. It also generates monthly summaries and performance reports.
Stoic
Stoic, a journaling app, uses Apple’s models to provide personalized prompts based on user mood tracking. It also summarizes posts and helps users search and organize past entries.
SwingVision
SwingVision helps tennis and pickleball players improve their form through video analysis. The app now uses Apple’s Foundation Models to provide specific, actionable feedback on gameplay.
Zoho
Zoho, the India-based productivity suite, is leveraging Apple’s local models for summarization, translation, and transcription across apps like Notebook and Tables.
TrainFitness
TrainFitness uses on-device models to suggest exercise alternatives when users lack specific equipment, enhancing workout flexibility.
Stuff
Stuff, a to-do list app, includes a new listen mode powered by Apple’s AI models that converts spoken words into individual tasks.
We will continue updating this list as more apps adopt Apple’s on-device AI framework in iOS 26.