
DigiBanker

Bringing you cutting-edge new technologies and disruptive financial innovations.


Apple’s Foundation Models framework gives developers API access to the on-device 3-billion-parameter LLM, powering 20+ apps including SmartGym, Stoic, and OmniFocus with zero-cost inference, no token limits, and offline functionality

October 1, 2025 //  by Finnovate

Proving Apple Intelligence’s worth, third-party developers are now using it to make apps more personal to users, and users more productive. Starting with iOS 26, iPadOS 26, and macOS 26, Apple’s Foundation Models framework gives app developers an API for passing prompts to the on-device LLM at the core of Apple Intelligence. Requests are handled privately, and with specific limitations — but also specific freedoms: no limit on user requests; no tokens or API keys for the user to install; and access to the same Apple Intelligence model that runs on device. That last part is significant, because it is both a limitation and a guarantee of privacy: developers can’t use the full Apple Intelligence LLM in the cloud, nor can they use extensions such as making requests directly of ChatGPT.

Nonetheless, developers have been implementing Apple Intelligence across a wide range of apps, and Apple has now picked out more than 20 to champion, ranging from to-do apps to mental health ones. CellWalk, which takes users on a 3D journey around molecules, can now automatically tailor its explanations to the user’s level of knowledge. In the fitness app SmartGym, users can describe a workout and turn it into a structured routine with sets, reps, rest times, and equipment adaptation; the app also generates summaries of workout data. In the journaling app Stoic, users can receive contextual journaling prompts generated from their recent entries.
SwingVision, an app that helps users improve their tennis or pickleball skills, uses the framework to generate advice for players. Foundation Models powers new Listen Mode and Scan Mode options in the to-do app Stuff. And the heavyweight task manager OmniFocus 4 can now generate whole projects and suggested next steps on a user’s behalf, such as helping them know what to pack for an upcoming trip.
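To make the developer experience concrete, here is a minimal sketch of what calling the on-device model looks like with the Foundation Models framework. The session API shown (`SystemLanguageModel`, `LanguageModelSession`) matches Apple’s published framework, but the availability handling, instructions, and prompt wording are illustrative, and the packing-list use case is borrowed from the OmniFocus example above:

```swift
import FoundationModels

// Illustrative helper: ask the on-device Apple Intelligence model
// for a packing list. No API key, no token billing — inference
// runs entirely on device (iOS 26 / iPadOS 26 / macOS 26).
@available(iOS 26.0, macOS 26.0, *)
func suggestPackingList(for trip: String) async throws -> String {
    // The model may be unavailable (e.g. Apple Intelligence is
    // disabled or the device doesn't support it), so check first.
    guard case .available = SystemLanguageModel.default.availability else {
        return "On-device model unavailable on this device."
    }

    // A session carries optional instructions that frame every prompt.
    let session = LanguageModelSession(
        instructions: "You help users prepare for upcoming trips."
    )

    // Pass the prompt to Apple Intelligence; the response never
    // leaves the device.
    let response = try await session.respond(
        to: "Suggest a short packing list for \(trip)."
    )
    return response.content
}
```

Because there are no per-request quotas, an app can call a session like this freely — the practical limits are the 3-billion-parameter model’s capability and the device requirement, not cost.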

Read Article

Category: Apple, Companies and Organizations


Copyright © 2025 Finnovate Research · All Rights Reserved · Privacy Policy
Finnovate Research · Knyvett House · Watermans Business Park · The Causeway Staines · TW18 3BA · United Kingdom · About · Contact Us · Tel: +44-20-3070-0188
