A new rumor suggests Apple will introduce a web search feature backed by Apple Foundation Models that can call out to Google Gemini to enhance Siri’s ability to gather and summarize information. The new Siri will have three core components: a planner, a search operator, and a summarizer. Apple’s Foundation Models will act as the planner and search operator, since those components handle on-device personal data, while gathering data from the web and collating it may fall to the Google model.

There is still a lot to learn about Apple’s approach to AI going forward, as the company had to scrap its previous Siri architecture entirely. The new Siri powered by Apple Intelligence LLMs is expected to launch in early 2026 with iOS 26.4.

That isn’t to say third parties won’t be involved, especially since Apple’s search deal with Google will continue. Google is the default search engine in Safari (users can change it), and it is also the search engine Siri uses. Some queries rely on something called Siri intelligence, an old term that predates Apple Intelligence and refers to algorithms derived from device and web data.

It seems on-device Apple Foundation Models will be responsible for parsing app intent systems and personal data. These on-device systems will power contextual actions, system-wide suggestions, and more. However, when Siri and its new LLM backend detect that a user query requires additional resources, Siri will call out to an AI agent equipped to handle the topic. Initially, that agent will be Google Gemini, running on Private Cloud Compute servers controlled by Apple. That model will supply the so-called “world knowledge,” which will then be summarized and presented to the user.

This arrangement differs from Apple’s ChatGPT partnership. When Apple Intelligence passes a query to ChatGPT, the query runs on OpenAI’s servers, though a contract requires OpenAI to discard queries and user data.
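To make the rumored division of labor concrete, here is a minimal Swift sketch of a planner / search operator / summarizer pipeline that routes personal queries to an on-device model and world-knowledge queries to a cloud model. Every type, function, and heuristic below is hypothetical; the rumor describes only which component runs where, not any actual API.

```swift
import Foundation

// Hypothetical sketch of the rumored three-part Siri pipeline.
// None of these types are real Apple APIs; all names are illustrative.

enum QueryRoute {
    case onDevice      // personal data and app intents (Apple Foundation Models)
    case privateCloud  // "world knowledge" (per the rumor, Gemini on Private Cloud Compute)
}

struct SiriPipeline {
    // Planner: an on-device model decides where the query should go.
    func plan(_ query: String) -> QueryRoute {
        // Placeholder heuristic; the real planner would presumably be an
        // on-device LLM classifying the query, not a keyword check.
        let personalKeywords = ["my", "photos", "messages", "calendar"]
        let isPersonal = personalKeywords.contains { query.lowercased().contains($0) }
        return isPersonal ? .onDevice : .privateCloud
    }

    // Search operator: gathers raw results from the chosen backend.
    func search(_ query: String, via route: QueryRoute) async -> [String] {
        switch route {
        case .onDevice:
            // Would query app intents and personal data on the device itself.
            return ["<on-device personal data results>"]
        case .privateCloud:
            // Would call out to a server-side model, which the rumor says is
            // Gemini running inside Apple-controlled Private Cloud Compute.
            return ["<world-knowledge results from cloud model>"]
        }
    }

    // Summarizer: collates the results into a single answer for the user.
    func summarize(_ results: [String]) -> String {
        results.joined(separator: "\n")
    }

    func answer(_ query: String) async -> String {
        let route = plan(query)
        let results = await search(query, via: route)
        return summarize(results)
    }
}
```

In this sketch, the keyword heuristic stands in for what would presumably be model-driven classification; deciding exactly when a query needs “additional resources” is the part of the system the rumor leaves least specified.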