Apple has introduced spatial widgets in its Vision Pro headset, allowing users to pin interactive elements like clocks, music controls, weather panels, and photo galleries directly into their physical space. These widgets are customizable in size, color, depth, and layout, and are meant to persist as part of the user’s space. The Vision Pro update marks a clear step toward persistent spatial computing, with widgets like Photos, Clock, Weather, and Music behaving like physical objects. However, the experience of using spatial widgets raises questions about how digital environments are changing the way we relate to physical ones. While Vision Pro is still shown in cozy, furnished homes, the integration of digital objects into physical spaces could lead to a different reality. The visionOS 26 update is currently available in developer beta, with a public release expected in fall 2025. As more developers build spatial widgets, the headset might feel useful in quiet, everyday ways. The end goal of AR/VR is the augmentation of reality: overlaying digital things on analog reality. However, Apple is not pushing this path for now, as it would be crucified if it did. The company has a decent track record for a corporation, despite the potential for a dystopian future where technology works against us.
iOS 26 lets you know how long it’ll take your battery to charge
iOS 26 introduces a new feature that shows an estimate of how long the battery will take to charge when the iPhone is plugged in or on a wireless charger. This allows users to optimize their charging practices and determine the speed of a charger. The estimated time remaining until a full charge can be found in the Battery section of the Settings app. Apple has not yet added a dedicated widget for this feature, but one could be added in the future. The iPhone battery charging estimates are available to developers in iOS 26, with the update expected to be released to everyone this fall.
iOS 26 upgrades CarPlay with new compact view for incoming phone calls
Apple announced iOS 26, and the upcoming software update includes several new features and changes for CarPlay in vehicles.

Liquid Glass Design: When you are using CarPlay with an iPhone running iOS 26, the new Liquid Glass design extends to the CarPlay interface. Like on the iPhone, the new look includes more shimmery app icons and translucent user interface elements.

New Messages App Features: Starting with iOS 26, you can respond to messages with standard Tapbacks like a heart, thumbs up, or exclamation marks directly through CarPlay. Plus, you can now view your pinned conversations in the Messages app on CarPlay.

Compact View for Phone Calls: CarPlay has a new compact view for incoming phone calls, so that you can still see other information on the screen, such as turn-by-turn directions.

Live Activities: CarPlay’s Dashboard screen can now show Live Activities, letting you keep track of things like a flight’s arrival time at a glance.

Widgets: The regular version of CarPlay now has a customizable widgets screen, for things like calendar appointments and HomeKit accessory controls.
Apple improves app discovery on the App Store with new tags that help surface information often buried in app listings
Apple is introducing App Store Tags — labels that highlight specific features and functionalities found in an app. These tags will initially be generated by Apple’s LLMs using various sources, like the app’s metadata. They’ll then be human-reviewed before being applied to apps on the App Store. Apple customers will be able to use these tags when searching for apps on the App Store, where the tags appear alongside the categories on the search page and the apps that appear in the search results. Apple says the new tags will help surface information that’s often buried in app listings, like the app’s App Store description, category, metadata, or even in its screenshots. The tags will help users more easily find the apps that offer the functionality they’re looking for, while also giving developers a better idea about how their apps are being discovered. When App Store users tap on one of the new tags, they’ll be taken to a new page offering a curated collection of all the apps and games that offer similar features or functionality — an extension to the App Store’s existing feature that points users to apps they “might also like,” found at the bottom of individual listings. Developers will also be able to create custom product pages that appear when a user searches for apps using particular keywords.
New macOS 26 Phone app seamlessly brings iPhone calls to the desktop; the iPhone’s onboard voicemail is also accessible from the macOS Phone app
With the introduction of the Phone app on macOS, you can now take and place calls directly from your Mac. The macOS Phone app takes many of its cues from the updated iOS Phone app. The new Unified view is recreated on the bigger screen, with the default appearance being a list of recent calls and favorite callers at the top. These all take advantage of contact photos and posters, if those contacts have set them up. Selecting a recent call will bring up a larger version of the contact poster, along with more information about that person. There’s also an option to Manage Filtering, which on an iPhone would bring up the Settings app with options on how to handle unknown or spam callers. An Edit button lets you change your Favorites list or select multiple logs in a list for mass deletion. The onboard voicemail function of the iPhone is also accessible from the macOS Phone app. If there is a voicemail recording and transcript available on the connected iPhone, these can be heard and read from the Mac directly. When you place a call by pressing the relevant icon or using the on-screen keypad, or receive a call, a box will appear in the top right corner of your Mac’s display. The buttons on the box offer extra features, including a compact keypad for menu systems. One brings up options for enabling Call Recording, Live Translation, Hold Assist, and Screen Sharing.
Overhauled Shortcuts app in iOS 26 supports Apple Intelligence models for actions like summarizing PDFs, generating recipes, answering questions, and more
Apple overhauled the Shortcuts app in iOS 26, iPadOS 26, and macOS Tahoe, and there are now Apple Intelligence options that users can take advantage of. The app supports Apple Intelligence models for things like summarizing PDFs, generating recipes, answering questions, and more. Here’s what Apple offers, along with the descriptions:

Morning Summary – Use Model to describe the day ahead of you.
Action Items From Meeting Notes – Use Model to grab action items from meeting notes.
Summarize PDF – Use Model to summarize the open PDF in Safari.
Is Severance Season 3 Out? – Use Model to find out if something has been released.
ASCII Art – Use Model to draw you some ASCII art.
Document Review – Use Model to help you compare and contrast documents.
Reminders Roulette – Use Model to punt an unimportant reminder to tomorrow.
Get Started With Language Models – A tutorial for Use Model with examples.

As the last pre-made Shortcut suggests, you can create your own shortcuts that incorporate Apple’s AI model, and Apple’s offerings serve as examples. When you go to create a Shortcut, there’s a new Apple Intelligence section. You can opt to use an on-device model, a cloud model that takes advantage of Private Cloud Compute, or ChatGPT. There are some pre-determined options, so you can do things like open Visual Intelligence or generate an image with Image Playground. There are several Writing Tools features for adjusting the tone of text, proofreading, creating a list from text, summarizing text, or rewriting text. When you tap on Cloud, On-Device Model, or ChatGPT, there’s an open-ended prompt where you can write in what you want to do. You need to work within the confines of the model that Apple provides, pairing it with other functionality in Shortcuts. You can pull in data from the Weather app, your Calendar, and Reminders, then ask the model to prepare a summary, for example. AI models can be incorporated into any Shortcut.
iOS 26 update allows users to deploy Visual Intelligence on anything on their iPhone’s screen, without requiring them to point the iPhone camera at it
Apple has made a small update to Visual Intelligence in iOS 26, yet the impact of being able to use it on any image is huge; it at least doubles the usefulness of this one feature. Previously, Visual Intelligence required pointing your iPhone camera at whatever you were interested in. With iOS 26, Apple has taken that step away. Everything else is the same, but you no longer have to use your camera: you can instead deploy Visual Intelligence on anything on your iPhone’s screen. This one change means that researchers can find out more about objects they see on websites, and shoppers can freeze-frame a YouTube video and use Visual Intelligence to track down the bag that influencer is wearing. One wrinkle is that there are now two different ways to use Visual Intelligence, each started in a different way; the new version is an extra part of Visual Intelligence, not a replacement. Still, being able to identify just about anything on your screen is a huge boon. Apple increased the usefulness of Visual Intelligence simply by removing the step where you point your iPhone camera at something.
Apple Intelligence’s transcription tool is as accurate as, and twice as fast as, OpenAI’s Whisper
Newly released to developers, Apple Intelligence’s transcription tools are fast, accurate, and typically double the speed of OpenAI’s longstanding equivalent. Pitting Apple Intelligence against MacWhisper’s Large V3 Turbo model showed a dramatic difference: Apple’s Speech framework tools were consistently just over twice the speed of that Whisper-based app. A test 4K 7GB video file was read and transcribed into subtitles by Apple Intelligence in 45 seconds. It took MacWhisper with the Large V3 Turbo model a total of 1 minute and 41 seconds. The MacWhisper Large C2 model took 3 minutes and 55 seconds to do the same job. None of these transcriptions were perfect, and all required editing. But the Apple Intelligence version was as accurate as the Whisper-based tools, and twice as fast. As well as releasing these Apple Intelligence tools to developers, Apple has published videos with details of how to implement the technology.
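The reported timings can be turned into the speedup figures quoted in these tests. A quick arithmetic check, assuming the 45-second, 1:41, and 3:55 results above:

```python
# Reported transcription times (in seconds) for the same 4K 7GB test video.
apple_s = 45            # Apple's Speech framework tools
whisper_turbo_s = 101   # MacWhisper, Large V3 Turbo (1 min 41 s)
whisper_large_s = 235   # MacWhisper, Large C2 (3 min 55 s)

# Speedup factors relative to Apple's time.
turbo_speedup = whisper_turbo_s / apple_s   # ~2.24x, i.e. "just over twice the speed"
large_speedup = whisper_large_s / apple_s   # ~5.2x

# "Percent faster" expressed as time saved relative to the slower run,
# which matches the roughly 55% figure reported elsewhere for the
# fastest Whisper model.
pct_faster = (whisper_turbo_s - apple_s) / whisper_turbo_s * 100  # ~55.4%

print(turbo_speedup, large_speedup, pct_faster)
```

So "twice as fast" and "55% faster" describe the same comparison, just framed as a speed ratio versus time saved.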
Apple’s speech transcription AI is twice as fast as OpenAI’s Whisper, and more cost-effective
Apple’s speech transcription AI is twice as fast as OpenAI’s Whisper, and more cost-effective, according to early testing by MacStories. The AI is used in Apple’s apps like Notes and phone call transcriptions, and Apple has made its native speech frameworks available to developers within macOS Tahoe. The AI processes a 7GB, 34-minute video file in just 45 seconds, 55% faster than Whisper’s fastest model. This is due to Apple processing speech on the device, making it faster and more secure. This indicates that Apple will continue to introduce new large language models (LLMs) to drive software solutions that compete well in the market, boosted by privacy and price.
iPadOS 26 turns iPad into a productivity powerhouse: it lets iPad users export or download large files in the background while they do other stuff, open several windows at once and freely resize them, and access downloads and documents right from the Dock, making it more Mac-like
iPadOS 26 is going to boost iPad users’ productivity not only with the new design, but with several new features that make the iPad with a Magic Keyboard the ultimate laptop replacement. Here are five ways iPadOS 26 is going to improve productivity for iPad users:

Folders in the Dock: For the first time, users will be able to access downloads, documents, and other folders right from the Dock, making it more Mac-like.

Supercharged Files app: The Files app is a key part of the iPad experience. With iPadOS 26, Apple takes this application to the next level, from an updated list view with resizable columns to collapsible folders. Users can add colors and other customization options to make it easier to find important documents. They can also set default apps for opening specific file types.

Preview app: It’s easier than ever to open, edit, and mark up PDFs and images. Apple says the new Preview app was designed for a proper Apple Pencil experience, which means signing documents and taking notes should be faster and more reliable than ever.

Background Tasks: Believe it or not, iPadOS 26 finally unlocks true background tasks. Users can now export or download large files in the background while they do other stuff. This might be one of the best iPadOS 26 productivity features.

Better windowing system: Apple revamped the iPadOS 18 windowing system. Forget about Stage Manager, Split View, and Slide Over. With the upcoming iPadOS 26 update, users will be able to open several windows at once and freely resize and arrange them. There are also new ways to control windows with a familiar menu bar and Mac-like controls.