Apple overhauled its operating system naming at WWDC 2025, adopting the year as part of each name: iOS 26, iPadOS 26, macOS 26, tvOS 26, watchOS 26, and visionOS 26. The change is meant to make the naming scheme clearer and more consistent across platforms, aligning Apple with rivals like Samsung and Microsoft and making it easier for users to identify the latest updates. The accompanying design overhaul, known as "Liquid Glass," features a translucent interface.
Apple introduces live translation across Messages, FaceTime, and Phone at WWDC 25
Apple is introducing Live Translation, powered by Apple Intelligence, for Messages, FaceTime, and Phone calls. Live Translation can translate conversations on the fly. The feature is "enabled by Apple-built models that run entirely on your device so your personal conversations stay personal," the company says. In Messages, Live Translation will automatically translate text as you type and deliver it in your preferred language; likewise, when the person you're texting responds, each text can be instantly translated. When catching up on FaceTime, the feature will provide live translated captions. And on a phone call, whether or not you're talking to an Apple user, your words can be translated as you talk, and the translation is spoken aloud for the call recipient. As the person you're speaking to responds in their own language, you'll hear a spoken translation of their voice.
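Live Translation itself is a system feature, but iOS already exposes on-device translation to developers through the Translation framework. A minimal SwiftUI sketch of that existing API, assuming the needed language assets are downloaded; the view, language pair, and sample phrase are illustrative, not from Apple's announcement:

```swift
import SwiftUI
import Translation

struct TranslateDemo: View {
    @State private var output = ""
    // Illustrative language pair: English to Spanish.
    @State private var config = TranslationSession.Configuration(
        source: .init(identifier: "en"),
        target: .init(identifier: "es")
    )

    var body: some View {
        Text(output)
            .translationTask(config) { session in
                do {
                    // Runs on-device once the language assets are installed.
                    let response = try await session.translate("Where is the train station?")
                    output = response.targetText
                } catch {
                    output = "Translation failed: \(error)"
                }
            }
    }
}
```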
Developers can use API keys to bring AI models from other providers to Xcode
Apple has released a new version of Xcode, its app development suite, which integrates OpenAI's ChatGPT for coding, documentation generation, and more. Developers can also use API keys to bring AI models from other providers into Xcode for AI-powered programming suggestions. The new AI integrations let developers generate code previews, iterate on designs, and fix errors. ChatGPT can be accessed without creating an account, and paid ChatGPT users can raise their rate limits. Apple also launched the Foundation Models framework, giving developers access to its on-device AI models with as little as three lines of code. Notably, Apple chose to ship the ChatGPT integration rather than the vibe-coding tool it has reportedly been developing in partnership with Anthropic.
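The "three lines of code" claim maps to the framework's simplest path: create a session, send a prompt, read the response. A minimal sketch, assuming the `LanguageModelSession` API as presented at WWDC; the prompt here is illustrative:

```swift
import FoundationModels

// Assumes an async context; the model runs entirely on-device, no network required.
let session = LanguageModelSession()
let response = try await session.respond(to: "Suggest three names for a hiking app.")
print(response.content)
```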
Apple’s widgets, now integrated into your space with visionOS 26, offer personalized information at a glance
Personalized spatial widgets: Apple's widgets, now integrated into your space with visionOS 26, offer personalized information at a glance. Users can customize widgets' size, color, and depth, and add features like customizable clocks, weather updates, quick music access, and photos that can transform into panoramas.

Adding depth to 2D images: Apple has updated the visionOS Photos app with an AI algorithm that creates multiple perspectives for 2D photos, allowing users to "lean right into them and look around." Spatial browsing in Safari can also enhance web browsing by hiding distractions and revealing inline photos, and developers can add it to their own apps.

Talking heads: Apple has introduced more realistic Personas, its AI avatars for video calls on the Vision Pro. The new avatars, created using volumetric rendering and machine learning, are more accurate in appearance, down to hair, eyelashes, and complexion, and are generated on-device in seconds.

Immerse together: visionOS 26 allows users to collaborate with other headset-wearing friends to watch movies or play spatial games. The feature is also being marketed to enterprise clients, such as Dassault Systèmes, which uses the 3DLive app to visualize 3D designs in person and with remote colleagues.

Enterprise APIs and tools: visionOS 26 also lets organizations share devices among team members, securely saving each user's eye and hand data, vision prescription, and accessibility settings to their iPhone. The system includes a "for your eyes only" mode to restrict access to confidential materials. Additionally, Apple introduced Logitech Muse, a spatial accessory for Vision Pro that allows precise 3D drawing and collaboration, and the company plans to add more APIs for app development.
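Spatial widgets build on the same WidgetKit code that powers iPhone and iPad widgets; visionOS 26 then lets users pin them in a room. A minimal sketch of a standard WidgetKit widget that visionOS could place in space; the `ClockWidget` naming is hypothetical, and visionOS-specific placement is handled by the system rather than shown here:

```swift
import WidgetKit
import SwiftUI

struct ClockEntry: TimelineEntry {
    let date: Date
}

struct ClockProvider: TimelineProvider {
    func placeholder(in context: Context) -> ClockEntry { ClockEntry(date: .now) }

    func getSnapshot(in context: Context, completion: @escaping (ClockEntry) -> Void) {
        completion(ClockEntry(date: .now))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<ClockEntry>) -> Void) {
        // Ask for a refresh in about a minute; the system decides exact timing.
        let timeline = Timeline(entries: [ClockEntry(date: .now)],
                                policy: .after(.now.addingTimeInterval(60)))
        completion(timeline)
    }
}

struct ClockWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "ClockWidget", provider: ClockProvider()) { entry in
            Text(entry.date, style: .time)
                .font(.largeTitle)
        }
        .configurationDisplayName("Clock")
        .description("The current time, pinnable to a wall in visionOS 26.")
    }
}
```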
WWDC 25 was notably quiet on a more personalized, AI-powered Siri
Apple announced several updates to its operating systems, services, and software, including a new look called "Liquid Glass" and a rebranded naming convention. However, the company was notably quiet about the more personalized, AI-powered Siri it first introduced at WWDC 24. Craig Federighi, Apple's SVP of Software Engineering, mentioned the Siri update only briefly during the keynote, saying the work needed more time to reach the company's quality bar. That phrasing suggests Apple won't have news about the update until 2026, a significant delay in the fast-moving AI era. The more personalized Siri is expected to bring deeper AI capabilities to the virtual assistant built into the iPhone and other Apple devices. Bloomberg reported that the in-development version was functional but not consistently reliable, making it unviable to ship. Apple officially acknowledged in March that the Siri update would take longer to deliver than anticipated.
Apple Intelligence opened up to all developers with Foundation Models Framework
Apple has announced that developers will soon be able to access the on-device large language models that power Apple Intelligence in their own apps through the Foundation Models framework. Third-party apps will be able to use the models for image creation, text generation, and more. Because processing happens on-device, the resulting AI features are fast, privacy-preserving, and available without an internet connection. Apple also plans to expand the number of languages its AI platform supports and to make the underlying generative models more capable and efficient. The move is part of Apple's broader push to open its intelligence systems to third-party apps.
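Beyond plain text prompts, the framework was also shown generating typed Swift values directly, so apps don't have to parse free-form model output. A sketch of that guided-generation pattern, assuming the `@Generable` and `@Guide` macros and the `respond(to:generating:)` call as demoed; the `Itinerary` type and prompt are illustrative:

```swift
import FoundationModels

// Illustrative type: the model fills in a structured value rather than free text.
@Generable
struct Itinerary {
    @Guide(description: "A short, catchy trip title")
    var title: String
    @Guide(description: "One activity per day")
    var days: [String]
}

// Assumes an async context.
let session = LanguageModelSession()
let response = try await session.respond(
    to: "Plan a relaxed three-day weekend in Kyoto.",
    generating: Itinerary.self
)
print(response.content.title, response.content.days)
```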
Apple Vision Pro 'Spatial Widgets' blend digital life into your real space
Apple has introduced spatial widgets in its Vision Pro headset, allowing users to pin interactive elements like clocks, music controls, weather panels, and photo galleries directly into their physical space. The widgets are customizable in size, color, depth, and layout, and are meant to feel like part of the room. The update marks a clear step toward persistent spatial computing, with widgets like Photos, Clock, Weather, and Music playing the role of physical objects. The experience also raises questions about how digital environments are changing the way we relate to physical ones: Vision Pro is still shown in cozy, furnished homes, but folding digital objects into physical spaces could lead somewhere quite different. The visionOS 26 update is currently in developer beta, with a public release expected in fall 2025. As more developers build spatial widgets, the headset might start to feel useful in quiet, everyday ways. The end goal of AR/VR is the augmentation of reality itself, overlaying digital things onto the analog world. Apple is not pushing that path for now; it would be crucified if it did. The company has a decent track record for a corporation, despite the potential for a dystopian future where the technology works against us.
iOS 26 lets you know how long it'll take your battery to charge
iOS 26 introduces a new feature that shows the estimated time remaining until your battery is fully charged, whether you're plugged in or on a wireless charger. That lets users optimize their charging habits and gauge how fast a given charger is. The estimate appears in the Battery section of the Settings app. Apple has not yet added a widget for the feature, though one could arrive in the future. The charging estimates are available in the iOS 26 developer beta now, with the update expected to roll out to everyone this fall.
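Apple hasn't said whether apps get access to the new estimate; the system presumably derives it from the observed charge rate. A back-of-envelope sketch of that idea using UIKit's public battery monitoring; the sampling logic and extrapolation are purely illustrative, and the real system estimate is far more sophisticated (charge curves taper near 100%):

```swift
import UIKit

// Illustrative estimator: sample the battery level twice and extrapolate linearly.
final class ChargeEstimator {
    private var lastSample: (level: Float, time: Date)?

    init() {
        UIDevice.current.isBatteryMonitoringEnabled = true
    }

    /// Call periodically while charging; returns estimated seconds to full, if computable.
    func estimateSecondsToFull() -> TimeInterval? {
        let device = UIDevice.current
        guard device.batteryState == .charging else { return nil }

        let now = (level: device.batteryLevel, time: Date())
        defer { lastSample = now }

        guard let last = lastSample, now.level > last.level else { return nil }
        let ratePerSecond = Double(now.level - last.level) / now.time.timeIntervalSince(last.time)
        return Double(1.0 - now.level) / ratePerSecond
    }
}
```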
iOS 26 upgrades CarPlay with new compact view for incoming phone calls
Apple announced iOS 26, and the upcoming software update includes several new features and changes for CarPlay in vehicles.

Liquid Glass Design: When you use CarPlay with an iPhone running iOS 26, the new Liquid Glass design extends to the CarPlay interface. As on the iPhone, the new look includes shimmery app icons and translucent user interface elements.

New Messages App Features: Starting with iOS 26, you can respond to messages with standard Tapbacks, like a heart, thumbs up, or exclamation marks, directly through CarPlay. Plus, you can now view your pinned conversations in the Messages app on CarPlay.

Compact View for Phone Calls: CarPlay has a new compact view for incoming phone calls, so you can still see other information on the screen, such as turn-by-turn directions.

Live Activities: CarPlay's Dashboard screen can now show Live Activities, letting you keep track of things like a flight's arrival time at a glance (see the sketch below).

Widgets: The regular version of CarPlay now has a customizable widgets screen for things like calendar appointments and HomeKit accessory controls.
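Live Activities on the CarPlay Dashboard come from the same ActivityKit activities apps already start on iPhone; the system simply surfaces them in the car. A minimal sketch of starting one, with a hypothetical `FlightAttributes` type (names and times are illustrative); the activity's UI would also need an `ActivityConfiguration` in a widget extension, omitted here:

```swift
import ActivityKit

// Hypothetical attributes for a flight-tracking Live Activity.
struct FlightAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var arrivalTime: Date   // updated as the flight progresses
    }
    var flightNumber: String
}

func startFlightActivity() throws {
    let attributes = FlightAttributes(flightNumber: "UA 123")
    let state = FlightAttributes.ContentState(
        arrivalTime: .now.addingTimeInterval(90 * 60)
    )
    // Once requested, the Live Activity appears on the Lock Screen,
    // in the Dynamic Island, and (with iOS 26) on CarPlay's Dashboard.
    _ = try Activity.request(
        attributes: attributes,
        content: .init(state: state, staleDate: nil)
    )
}
```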
Apple improves app discovery on the App Store with new tags
Apple is introducing App Store Tags: labels that highlight specific features and functionality found in an app. The tags will initially be generated by Apple's large language models from sources like an app's metadata, then human-reviewed before being applied to apps on the App Store. Customers will be able to use the tags when searching for apps, where they appear alongside the categories on the search page and on the apps in search results. Apple says the new tags will help surface information that's often buried in app listings, such as the app's App Store description, category, metadata, or even its screenshots. The tags should help users more easily find apps that offer the functionality they're looking for, while also giving developers a better idea of how their apps are being discovered. When App Store users tap one of the new tags, they'll be taken to a new page offering a curated collection of apps and games with similar features or functionality, an extension of the App Store's existing suggestions of apps users "might also like," found at the bottom of individual listings. Developers will also be able to create custom product pages that appear when a user searches for apps using particular keywords.