Personalized spatial widgets: Apple’s widgets, now integrated into your space with visionOS 26, offer personalized information at a glance. Users can customize widgets by size, color, and depth, and add features like customizable clocks, weather adapters, quick music access, and photos that can transform into panoramas.

Adding depth to 2D images: Apple has updated the visionOS Photos app with an AI algorithm that creates multiple perspectives for 2D photos, allowing users to “lean right into them and look around.” Spatial browsing in Safari can also enhance web browsing by hiding distractions and revealing inline photos, and developers can add it to their apps.

Talking heads: Apple has introduced Personas, AI avatars for video calls on the Vision Pro. The new avatars, created using volumetric rendering and machine learning technology, are more realistic and accurate in appearance, including hair, eyelashes, and complexion. They are created on-device in seconds.

Immerse together: visionOS 26 allows users to collaborate with headset-wearing friends to watch movies or play spatial games. The feature is also being marketed to enterprise clients, such as Dassault Systèmes, which uses the 3DLive app to visualize 3D designs in person and with remote colleagues.

Enterprise APIs and tools: visionOS 26 lets organizations share devices among team members and securely save eye and hand data, vision prescription, and accessibility settings to iPhones. The system also includes a “for your eyes only” mode to restrict access to confidential materials. Additionally, Apple has introduced Logitech Muse, a spatial accessory for Vision Pro that allows precise 3D drawing and collaboration. The company plans to add more APIs for app development.
WWDC 25 was notably quiet on a more personalized, AI-powered Siri
Apple announced several updates to its operating systems, services, and software, including a new look called “Liquid Glass” and a rebranded naming convention. However, the company was notably quiet on the more personalized, AI-powered Siri that was first introduced at WWDC 24. Apple’s SVP of Software Engineering, Craig Federighi, only briefly mentioned the Siri update during the keynote address, stating that the work needed more time to reach the company’s quality standards. The timing suggests that Apple won’t have news about the Siri update until 2026, a significant delay in the fast-moving AI era. The more personalized Siri is expected to bring artificial intelligence updates to the virtual assistant built into the iPhone and other Apple devices. Bloomberg reported that the in-development version of the more personalized Siri was functional, but it was not consistently working properly, making it unviable to ship. Apple officially announced in March that the Siri update would take longer to deliver than anticipated.
Apple Intelligence opened up to all developers with Foundation Models Framework
Apple has announced that developers will soon be able to access the on-device large language models that power Apple Intelligence in their own apps through the Foundation Models framework. This will allow third-party apps to use the features for image creation, text generation, and more. The on-device processing will allow for fast, powerful, privacy-focused AI features that are available without an internet connection. Apple has also announced plans to expand the number of languages its AI platform supports and make the generative models that power it more capable and efficient. The framework marks Apple’s broadest step yet toward making its intelligence systems accessible to third-party apps.
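In practice, the framework is exposed to Swift apps through a session-based API. The sketch below is a minimal illustration based on what Apple previewed at WWDC25; the exact names and signatures may differ in the shipping SDK, and the `suggestTitle` helper is a hypothetical example, not Apple sample code.

```swift
import FoundationModels  // Apple's on-device LLM framework, previewed at WWDC25

// A hedged sketch: ask the on-device model to title a note.
// Sessions run entirely on-device, so no data leaves the phone
// and no network connection is required.
func suggestTitle(for note: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "Suggest a short, descriptive title for the user's note."
    )
    let response = try await session.respond(to: note)
    return response.content
}
```

Because inference happens locally, a call like this works offline and incurs no per-request cost, which is the main draw for third-party developers.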
Apple Vision Pro ‘Spatial Widgets’ blend digital life into your real space
Apple has introduced spatial widgets in its Vision Pro headset, allowing users to pin interactive elements like clocks, music controls, weather panels, and photo galleries directly into their physical space. These widgets are customizable in size, color, depth, and layout, and are meant to be part of the user’s space. The Vision Pro update marks a clear step toward persistent spatial computing, with widgets like Photos, Clock, Weather, and Music playing the role of physical objects. However, the experience of using spatial widgets raises questions about how digital environments are changing the way we relate to physical ones. While Vision Pro is still shown in cozy, furnished homes, the integration of digital objects into physical spaces could lead to a different reality. The visionOS 26 update is currently available in developer beta, with a public release expected in fall 2025. As more developers build spatial widgets, the headset might feel useful in quiet, everyday ways. The end goal of AR/VR is the augmentation of reality, overlaying digital things on analog reality. However, Apple is not pushing that path for now, as it would be crucified if it did. The company has a decent track record for a corporation, despite the potential for a dystopian future where technology works against us.
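For developers, spatial widgets build on the existing WidgetKit model: on visionOS 26 the system handles placement, persistence, and the user-side customization (size, color, depth), so a standard widget definition is the starting point. The clock widget below is a minimal illustrative sketch under that assumption, not Apple sample code.

```swift
import WidgetKit
import SwiftUI

// The timeline entry: one timestamp per refresh.
struct ClockEntry: TimelineEntry {
    let date: Date
}

struct ClockProvider: TimelineProvider {
    func placeholder(in context: Context) -> ClockEntry {
        ClockEntry(date: .now)
    }

    func getSnapshot(in context: Context, completion: @escaping (ClockEntry) -> Void) {
        completion(ClockEntry(date: .now))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<ClockEntry>) -> Void) {
        // Refresh once a minute for the next hour; on visionOS 26 the system
        // keeps the widget pinned in the user's space between refreshes.
        let entries = (0..<60).map { ClockEntry(date: .now.addingTimeInterval(Double($0) * 60)) }
        completion(Timeline(entries: entries, policy: .atEnd))
    }
}

// A hypothetical widget that a user could pin to a wall or desk.
struct SpatialClockWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "SpatialClock", provider: ClockProvider()) { entry in
            Text(entry.date, style: .time)
                .font(.largeTitle)
        }
        .configurationDisplayName("Clock")
        .description("A clock you can pin into your space.")
    }
}
```

The design choice here is that the widget code stays platform-agnostic; the spatial behavior (depth, mounting, persistence) is applied by the system and the user, not the developer.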
iOS 26 lets you know how long it’ll take your battery to charge
iOS 26 introduces a new feature that allows users to know the estimated time it takes for their battery to charge when plugged in or on a wireless charger. This allows users to optimize their charging practices and determine the speed of a charger. The estimated time remaining until a full charge can be found in the Battery section of the Settings app. Apple has not yet added a new widget for this feature, but it could be added in the future. The iPhone battery charging estimates are available to developers in iOS 26, with the update expected to be released to everyone this fall.
iOS 26 upgrades CarPlay with new compact view for incoming phone calls
Apple announced iOS 26, and the upcoming software update includes several new features and changes for CarPlay in vehicles.

Liquid Glass Design: When you are using CarPlay with an iPhone running iOS 26, the new Liquid Glass design extends to the CarPlay interface. Like on the iPhone, the new look includes more shimmery app icons and translucent user interface elements.

New Messages App Features: Starting with iOS 26, you can respond to messages with standard Tapbacks like a heart, thumbs up, or exclamation marks directly through CarPlay. Plus, you can now view your pinned conversations in the Messages app on CarPlay.

Compact View for Phone Calls: CarPlay has a new compact view for incoming phone calls, so that you can still see other information on the screen, such as turn-by-turn directions.

Live Activities: CarPlay’s Dashboard screen can now show Live Activities, letting you keep track of things like a flight’s arrival time at a glance.

Widgets: The regular version of CarPlay now has a customizable widgets screen, for things like calendar appointments and HomeKit accessory controls.
iOS 26 lets you report spam voicemails with a new “Report Spam” button for voicemails from unknown numbers
iOS 26 has an updated Phone app with several new functions. When you tap into a voicemail from an unknown number, you’ll see a new “Report Spam” button that you can tap if it is a spam call. Tapping the option sends the voicemail to Apple, and you can either report the message as spam and keep it, or report it and delete it. The Call Screening option in iOS 26 intercepts calls from numbers that are not saved in your contacts list, and asks the caller for more information like a name and reason for calling before forwarding the call along to you. The Messages app also has a refined spam reporting workflow in iOS 26. Messages that Apple detects are spam are sent to a dedicated Spam folder, which is now distinct from the Unknown Senders folder. Messages from numbers that aren’t in your contacts, such as 2FA messages, go in Unknown Senders, while scam messages are sent to the Spam folder. Messages from unknown senders and spam messages are both silenced and you won’t get a notification for them, but you will see a badge at the top of the Messages app. You can disable these features in the Messages section of the Settings app, if desired. There is no automatic filtering of spam voicemails yet, but that is a feature Apple could add in the future after receiving enough voicemails that people flag as spam.
PayPal USD (PYUSD) plans to use Stellar for real-world payments, commerce, and micro-financing leveraging the network’s speed, low transaction costs, and ease of integration
PayPal plans to make the PayPal USD stablecoin available on the Stellar network pending regulatory approval by the New York State Department of Financial Services (NYDFS). By potentially expanding to Stellar, PYUSD leverages the network’s speed, low transaction costs, and ease of integration to enhance its utility for real-world payments, commerce, and micro-financing, offering an additional option to Ethereum and Solana. PYUSD on Stellar can be used for fast, affordable cross-border payments and expanded access to essential financial services while bridging the digital and physical world. Users may also benefit from improved daily payment options and financing solutions such as working capital and business loans – use cases already thriving on the Stellar network – ultimately enabling a more seamless flow of value across global markets. An expansion on Stellar would give PYUSD users access to the network’s vast array of on and off ramps, providing additional access through digital wallets connected to local payment systems and cash networks. Access to this infrastructure will enhance how people can use PYUSD in their everyday financial activities, from payments to remittances to merchant services. PYUSD on Stellar can also enhance liquidity and financing opportunities through Payment Financing or ‘PayFi’, an emerging innovation in digital finance. Small and medium-sized businesses that face delayed receivables or pre-funding requirements would be able to access new sources of real-time working capital, disbursed in PYUSD. This capital can be used to pay suppliers, manage inventory, or address other operational needs – with instant settlement on Stellar. Liquidity providers can fund these opportunities and earn potentially sustainable returns from real-world economic activity.
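The “low transaction costs” claim is concrete: Stellar charges a flat per-operation fee counted in stroops, where 1 XLM = 10,000,000 stroops and the network minimum base fee is 100 stroops (fees can rise under surge pricing). A quick sketch of what that means for a single stablecoin payment:

```swift
// Stellar fees are flat per operation, denominated in stroops.
// 1 XLM = 10_000_000 stroops; the network minimum base fee is 100 stroops.
let minimumBaseFeeStroops = 100
let stroopsPerXLM = 10_000_000.0

// Fee for a transaction containing `operations` operations at a given base fee.
func feeInXLM(operations: Int, baseFeeStroops: Int = minimumBaseFeeStroops) -> Double {
    Double(operations * baseFeeStroops) / stroopsPerXLM
}

// A single PYUSD payment is one operation: 0.00001 XLM at the minimum fee,
// a small fraction of a cent even at several dollars per XLM.
let singlePaymentFee = feeInXLM(operations: 1)
```

This flat-fee structure, independent of the amount transferred, is what makes the micro-financing and remittance use cases described above economically plausible.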
Ripple claims XRP could capture 14% of SWIFT volume within five years driven by a focus on liquidity rather than messaging infrastructure
Ripple CEO Brad Garlinghouse shared a bold projection: XRP could capture 14% of SWIFT’s volume over the next five years, driven by a focus on liquidity rather than messaging infrastructure. Garlinghouse’s comments reflect a broader ambition within Ripple to challenge traditional financial rails by leveraging crypto-based liquidity solutions. While SWIFT currently dominates the interbank messaging landscape for cross-border payments, Garlinghouse argued that the true battleground lies in liquidity, the ability to move money, not just send instructions. “There are two parts to SWIFT today: messaging and liquidity,” Garlinghouse said. “Liquidity is owned by the banks. I think less about the messaging and more about liquidity. If you’re driving all the liquidity, it is good for XRP… so I’ll say five years, 14%.” By targeting liquidity, Ripple aims to plug into the critical layer of cross-border finance that determines how quickly and cheaply value can move across borders. Ripple’s Chief Legal Officer also highlighted the potential for rapid growth in the tokenized asset sector. “Hundreds of billions of tokenized global assets [will emerge] fairly quickly,” he said, signaling Ripple’s intent to position XRP as a foundational layer in that transformation.
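Garlinghouse’s 14% figure is easy to put in rough numbers. As an illustrative assumption (the article gives no volume figure), suppose SWIFT carries on the order of 45 million messages per day:

```swift
// Back-of-envelope sizing of the 14% projection.
// The 45M messages/day figure is an illustrative assumption, not from the article.
let swiftDailyMessages = 45_000_000.0
let projectedShare = 0.14  // Garlinghouse's five-year projection

// Roughly 6.3 million messages' worth of daily volume would flow through XRP.
let impliedDailyVolume = swiftDailyMessages * projectedShare
```

Whatever the real baseline, the point of the projection is scale: capturing even a mid-teens share of SWIFT-level liquidity flows would dwarf XRP’s current settlement volume.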
Apple improves app discovery on the App Store with new tags that surface features often buried in app listings
Apple is introducing App Store Tags — labels that highlight specific features and functionalities found in an app. These tags will initially be generated by Apple’s LLMs using various sources, like the app’s metadata. They’ll then be human-reviewed before being applied to apps on the App Store. Apple customers will be able to use these tags when searching for apps on the App Store, where the tags appear alongside the categories on the search page and on the apps shown in the search results. Apple says the new tags will help surface information that’s often buried in app listings, like the app’s App Store description, category, metadata, or even its screenshots. The tags will help users more easily find the apps that offer the functionality they’re looking for, while also giving developers a better idea of how their apps are being discovered. When App Store users tap on one of the new tags, they’ll be taken to a new page offering a curated collection of all the apps and games that offer similar features or functionality — an extension of the App Store’s existing feature that points users to apps they “might also like,” found at the bottom of individual listings. Developers will also be able to create custom product pages that appear when a user searches for apps using particular keywords.