With the introduction of the Phone app on macOS, you can now place and take calls directly from your Mac. The macOS Phone app takes many of its cues from the updated iOS Phone app. The new Unified view is recreated on the bigger screen, defaulting to a list of recent calls with favorite callers at the top. These take advantage of contact photos and posters, if your contacts have set them up. Selecting a recent call brings up a larger version of the contact poster, along with more information about that person. There’s also a Manage Filtering option, which on an iPhone opens the Settings app with options for handling unknown or spam callers. An Edit button lets you change your Favorites list or select multiple entries in a list for mass deletion. The iPhone’s onboard voicemail function is also accessible from the macOS Phone app: if a voicemail recording and transcript are available on the connected iPhone, they can be heard and read directly from the Mac. When you place a call by pressing the relevant icon or using the on-screen keypad, or receive one, a box appears in the top-right corner of your Mac’s display. The buttons on the box offer extra features, including a compact keypad for menu systems, and one brings up options for enabling Call Recording, Live Translation, Hold Assist, and Screen Sharing.
Overhauled Shortcuts app in iOS 26 supports Apple Intelligence models for actions like summarizing PDFs, generating recipes, answering questions, and more
Apple overhauled the Shortcuts app in iOS 26, iPadOS 26, and macOS Tahoe, and there are now Apple Intelligence options that users can take advantage of. The app supports Apple Intelligence models for things like summarizing PDFs, generating recipes, answering questions, and more. Here’s what Apple offers, along with the descriptions:

Morning Summary – Use Model to describe the day ahead of you.
Action Items From Meeting Notes – Use Model to grab action items from meeting notes.
Summarize PDF – Use Model to summarize the open PDF in Safari.
Is Severance Season 3 Out? – Use Model to find out if something has been released.
ASCII Art – Use Model to draw you some ASCII art.
Document Review – Use Model to help you compare and contrast documents.
Reminders Roulette – Use Model to punt an unimportant reminder to tomorrow.
Get Started With Language Models – A tutorial for Use Model with examples.

As the last pre-made Shortcut suggests, you can create your own shortcuts that incorporate Apple’s AI models, with Apple’s offerings serving as examples. When you go to create a Shortcut, there’s a new Apple Intelligence section. You can opt to use an on-device model, a cloud model that takes advantage of Private Cloud Compute, or ChatGPT. There are some pre-determined options, so you can do things like open Visual Intelligence or generate an image with Image Playground. There are also several Writing Tools features for adjusting the tone of text, proofreading, creating a list from text, summarizing text, or rewriting text. When you tap on Cloud, On-Device Model, or ChatGPT, there’s an open-ended prompt where you can write in what you want the model to do. You need to work within the confines of the models Apple provides, pairing them with other functionality in Shortcuts. You can pull in data from the Weather app, your Calendar, and Reminders, then ask the model to prepare a summary, for example. AI models can be incorporated into any Shortcut.
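For developers, the same on-device and Private Cloud Compute models that Shortcuts exposes are available through Apple’s Foundation Models framework. A minimal Swift sketch of a "Morning Summary"-style prompt, assuming the `LanguageModelSession` and `SystemLanguageModel` APIs from that framework; the function name, prompt wording, and input data are illustrative, not from the article:

```swift
import FoundationModels

// Sketch: ask the on-device Apple Intelligence model to summarize
// calendar and reminder data, mirroring the "Morning Summary" shortcut.
// Assumes the FoundationModels framework's LanguageModelSession API.
func morningSummary(events: [String], reminders: [String]) async throws -> String {
    // The model may be unavailable (unsupported hardware, disabled, etc.).
    guard case .available = SystemLanguageModel.default.availability else {
        return "Apple Intelligence model is not available on this device."
    }

    let session = LanguageModelSession()
    let prompt = """
        Summarize the day ahead in two sentences.
        Events: \(events.joined(separator: "; "))
        Reminders: \(reminders.joined(separator: "; "))
        """
    let response = try await session.respond(to: prompt)
    return response.content
}
```

As in the Shortcuts app, the same call could be pointed at the Private Cloud Compute model instead of the on-device one; the prompt-plus-app-data pattern is identical.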
iOS 26 update allows users to deploy Visual Intelligence on anything on their iPhone’s screen, without requiring them to point the iPhone camera at it
Apple has made what looks like the smallest of updates to Visual Intelligence in iOS 26, yet the impact of being able to use it on any image is huge, and at least doubles the usefulness of this one feature. Previously, Visual Intelligence involved pointing your iPhone camera at whatever you were interested in. What Apple has done with iOS 26 is take that step away. Everything else is the same, but you no longer have to use your camera: you can instead deploy Visual Intelligence on anything on your iPhone’s screen. This one change means that researchers can find out more about objects they see on websites, and shoppers can freeze-frame a YouTube video and use Visual Intelligence to track down the bag that influencer is wearing. One wrinkle is that there are now two different ways to use Visual Intelligence, each started in a different way, because the new version is an addition to Visual Intelligence, not a replacement. Visual Intelligence is already replete with different ways to use it, one of which provides a very different service to the rest. Yet being able to identify just about anything on your screen is a huge boon, and Apple has increased the usefulness of Visual Intelligence simply by removing the step where you point your iPhone camera at something.
Apple Intelligence’s transcription tool is as accurate as, and twice as fast as, OpenAI’s Whisper
Newly released to developers, Apple Intelligence’s transcription tools are fast, accurate, and typically double the speed of OpenAI’s longstanding equivalent. Pitting Apple Intelligence against MacWhisper’s Large V3 Turbo model showed a dramatic difference: Apple’s Speech framework tools were consistently just over twice the speed of that Whisper-based app. A test 4K, 7GB video file was read and transcribed into subtitles by Apple Intelligence in 45 seconds. It took MacWhisper with the Large V3 Turbo model a total of 1 minute and 41 seconds, and the MacWhisper Large V2 model took 3 minutes and 55 seconds to do the same job. None of these transcriptions were perfect, and all required editing. But the Apple Intelligence version was as accurate as the Whisper-based tools, and twice as fast. As well as releasing these Apple Intelligence tools to developers, Apple has published videos with details of how to implement the technology.
Apple’s speech transcription AI is twice as fast as, and more cost-effective than, OpenAI’s Whisper
Apple’s speech transcription AI is twice as fast as, and more cost-effective than, OpenAI’s Whisper, according to early testing by MacStories. The AI is used in Apple’s apps like Notes and phone call transcriptions, and Apple has made its native speech frameworks available to developers within macOS Tahoe. The AI processed a 7GB, 34-minute video file in just 45 seconds, 55% faster than Whisper’s fastest model. This is partly due to Apple processing speech on the device, making it faster and more secure. This indicates that Apple will continue to introduce new large language models (LLMs) to drive software solutions that compete well in the market, boosted by privacy and price.
iPadOS 26 turns iPad into a productivity powerhouse: it lets iPad users export or download large files in the background while they do other things, open several windows at once and freely resize them, and access downloads and documents right from the Dock, making it more Mac-like
iPadOS 26 is going to boost iPad users’ productivity not only with the new design, but with several new features that make the iPad with a Magic Keyboard the ultimate laptop replacement. Here are five ways iPadOS 26 is going to improve productivity for iPad users:

Folders in the Dock: For the first time, users will be able to access downloads, documents, and other folders right from the Dock, making it more Mac-like.

Supercharged Files app: The Files app is a key part of the iPad experience. With iPadOS 26, Apple takes this application to the next level, from an updated list view with resizable columns to collapsible folders. Users can add colors and other customization options to make it easier to find important documents. They can also set default apps for opening specific file types.

Preview app: It’s easier than ever to open, edit, and mark up PDFs and images. Apple says the new Preview app was designed for a proper Apple Pencil experience, which means signing documents and taking notes should be faster and more reliable than ever.

Background Tasks: Believe it or not, iPadOS 26 finally unlocks true background tasks. Users can now export or download large files in the background while they do other things. This might be one of the best iPadOS 26 productivity features.

Better windowing system: Apple revamped the iPadOS 18 windowing system. Forget about Stage Manager, Split View, and Slide Over. With the upcoming iPadOS 26 update, users will be able to open several windows at once and freely resize and arrange them. There are also new ways to control windows, with a familiar menu bar and Mac-like controls.
Car makers are holding off on Apple’s CarPlay Ultra in favor of their own solutions, citing design and UI challenges along with a financial incentive: the infotainment system and in-car services remain an avenue for selling subscriptions to drivers
Apple’s CarPlay Ultra faces a long road to becoming a widely-used feature, as car makers are pushing back on supporting Apple’s system in favor of their own solutions. Car manufacturers Mercedes-Benz, Audi, Volvo, Polestar, and Renault have no interest in including CarPlay Ultra support in their vehicles. While Volvo is among those rejecting CarPlay Ultra, chief executive Hakan Samuelsson did admit that car makers don’t do software as well as tech companies. “There are others who can do that better, and then we should offer that in our cars,” he insisted. While design and interface discussions are the more obvious reasons for holding off from CarPlay Ultra, manufacturers also have another incentive: the infotainment system and in-car services are said to still be a possible revenue source for car makers. This was one of the reasons why GM ditched CarPlay in favor of its own system in 2023, due to the potential to sell subscriptions to drivers. Some car manufacturers shying away from handing over control to CarPlay Ultra are stopping short of blocking Apple entirely. In most cases, the current limited CarPlay will still be offered, in tandem with their own systems. BMW insisted that CarPlay will be used in its infotainment system. Meanwhile, Audi believes it should provide drivers “a customized and seamless digital experience” of its own creation, while still maintaining CarPlay support.
Apple’s Swift coding language to add support for the Android platform, with a focus on improving support for the official distribution, determining the range of supported Android API levels, and developing support for debugging Swift applications
Apple usually doesn’t give Android the time of day, but that’s not stopping the company’s Swift coding language from expanding over to Android app development. Android apps are generally coded in Kotlin, but Apple is looking to provide its Swift coding language as an alternative. Apple first launched its coding language back in 2014 with its own platforms in mind, but it now also officially supports Windows and Linux. Swift has opened up an “Android Working Group” which will “establish and maintain Android as an officially supported platform for Swift.” A few of the key pillars the Working Group will look to accomplish include:

1) Improve and maintain Android support for the official Swift distribution, eliminating the need for out-of-tree or downstream patches.
2) Recommend enhancements to core Swift packages such as Foundation and Dispatch to work better with Android idioms.
3) Work with the Platform Steering Group to officially define platform support levels generally, and then work towards achieving official support of a particular level for Android.
4) Determine the range of supported Android API levels and architectures for Swift integration.
5) Develop continuous integration for the Swift project that includes Android testing in pull request checks.
6) Identify and recommend best practices for bridging between Swift and Android’s Java SDK and packaging Swift libraries with Android apps.
7) Develop support for debugging Swift applications on Android.
8) Advise and assist with adding support for Android to various community Swift packages.
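In practice, cross-platform Swift code handles a new platform like Android through conditional compilation. A minimal sketch, assuming the `os(Android)` compilation condition used by the community Android ports; the function name and the directory paths are illustrative, not part of the working group's proposal:

```swift
import Foundation

// Sketch: platform-conditional Swift, the pattern cross-platform
// packages use when Android becomes a supported target.
// The directory paths below are illustrative, not normative.
func defaultCacheDirectory() -> String {
    #if os(Android)
    // Android apps are sandboxed; a real app would obtain this path
    // from the Android context via Java SDK bridging.
    return "/data/local/tmp/cache"
    #elseif os(Windows)
    return ProcessInfo.processInfo.environment["LOCALAPPDATA"] ?? "C:\\Temp"
    #else
    // macOS and Linux
    return NSTemporaryDirectory()
    #endif
}
```

Official platform support would reduce how much of this per-platform branching library authors need, since core packages like Foundation and Dispatch would behave consistently on Android out of the box.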