
iOS 27 Siri Extensions. What Apple Is Opening, and Why It Matters.
At WWDC on June 8, 2026, Apple is expected to introduce iOS 27 with a new Siri capability called Extensions. Per Bloomberg's Mark Gurman, confirmed by Reuters and MacRumors, Extensions will let users plug third-party AI assistants directly into Siri. Claude, ChatGPT, Gemini, Perplexity, Grok. A Settings menu under Apple Intelligence and Siri will let you choose which AI handles which kind of request.
This is a more significant change than most coverage suggests. For fifteen years, Siri was a single-vendor Apple product. Starting in fall 2026, it becomes a router. The assistant you speak to on your iPhone becomes a menu of other companies' AI.
That is not a concession Apple would have made in 2021. It is Apple correctly reading the architecture of the current AI phase, which is that the right answer for personal AI right now is model-agnostic routing plus specialists. For users, this is good news. For Apple, it is a graceful acknowledgement that the destination Apple Intelligence described at WWDC 2024 is easier to reach by opening the platform than by building every layer internally.
The more interesting story sits behind the announcement. Extensions solves the chat layer. The next layer, which is the agent layer, is where personal AI in 2026 is actually being decided.
What Extensions actually does
A plain description. In iOS 27, iPadOS 27, and macOS 27, there will be a new Settings section under Apple Intelligence and Siri called Extensions, per Bloomberg and MacRumors. You will install AI apps from the App Store and designate which ones Siri can hand off to. When you ask Siri a question that native Apple models do not handle as well as a specialized model would, Siri routes the request to your chosen third-party AI. The response comes back through Siri's interface.
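Mechanically, the routing described above amounts to a dispatch table with a native-first fallback: classify the request, look up the user's chosen assistant for that category, and keep anything unconfigured with native Siri. A minimal sketch in Python; every name here (the categories, the classifier, the provider strings) is illustrative, not Apple's actual API.

```python
# Hypothetical sketch of Extensions-style routing. Categories, provider
# names, and the classifier are illustrative, not a real Apple interface.

NATIVE = "siri"  # native Apple models keep OS-level actions and act as fallback

# The user's Settings choices: which assistant handles which kind of request.
extensions = {
    "research": "perplexity",
    "writing": "claude",
    "general_chat": "chatgpt",
}

def classify(request: str) -> str:
    """Toy classifier standing in for Siri's on-device intent detection."""
    if "timer" in request or "alarm" in request:
        return "os_action"  # stays native
    if "cite" in request or "sources" in request:
        return "research"
    if request.startswith(("draft", "write")):
        return "writing"
    return "general_chat"

def route(request: str) -> str:
    """Return which assistant the request is handed to.

    Anything without a configured extension stays with native Siri,
    mirroring how OS-level actions remain native in iOS 27.
    """
    return extensions.get(classify(request), NATIVE)

print(route("draft a reply to my landlord"))   # claude
print(route("summarize this with sources"))    # perplexity
print(route("set a timer for 10 minutes"))     # siri
```

The design point is that the intelligence lives in the classifier and the user's table, not in any single model; swapping a provider is a one-line Settings change, which is exactly what makes the marketplace framing work.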
There is also, separately, a standalone Siri app in testing. It gives you a chat window and voice interface alongside the usual ambient Siri, with preserved history, per Gurman's reporting. In other words, Siri becomes shaped like ChatGPT, Claude, and Gemini already are, with an App Store section underneath it for the models.
Two things Extensions does not do.
It does not replace the core Siri intelligence. The conversational Siri overhaul Apple plans to ship at WWDC 2026 is built on Apple Foundation Models, which are being rebuilt in part using Google Gemini as a distillation partner, per The Verge.
It does not let third-party AI take deep actions across your iPhone the way Apple's WWDC 2024 Apple Intelligence demos described. Booking lunch around a landing flight. Pulling the photo your mother sent. Drafting a text with knowledge of your calendar. Those features are still native Siri, and they are still the delayed piece the Ternus era has to deliver.
Why this is happening now
Three forces converged in the twelve months before the iOS 27 announcement.
Apple's own AI did not meet internal thresholds. Per Bloomberg's reporting through 2025, some Siri features failed up to a third of the time in internal testing. Craig Federighi, Apple's software chief, pushed back on releasing at that rate. Apple delayed. That delay is now three years long.
Legal exposure. Elon Musk's xAI filed a lawsuit in August 2025 alleging that Apple's exclusive ChatGPT integration was anticompetitive. Opening Siri to all major providers, per analysis by Winbuzzer and MacDailyNews, reduces that exposure. A neutral marketplace is harder to challenge than a single partnership.
Revenue opportunity. The App Store takes a 30 percent commission on subscriptions. If the AI your Extensions menu routes to is a paid subscription purchased through the App Store, Apple participates in revenue it did not generate, per Bloomberg. Distribution becomes a business model in a category Apple has not been the lead builder in.
All three are real. The third is the most durable, and it describes Apple's strategy correctly. Own the front door. Set the rules. Let others compete behind it. That is how the iPhone has worked for nearly two decades.
The Extensions lineup, honestly evaluated
The expected initial lineup, per reporting from Bloomberg, MacRumors, and Winbuzzer, is Claude (Anthropic), Gemini (Google), ChatGPT (OpenAI), Perplexity, Grok (xAI), and possibly Copilot (Microsoft). Each is well funded, each has genuine strengths.
Where they do similar work. Questions, drafting, summarizing, coding, research. Any of them will outperform native Siri at these tasks in 2026.
Where they differ. Claude is the most careful at writing and long-document reasoning. Gemini is strongest inside Google workflows, especially Workspace and Search. ChatGPT is the most fluent generalist with the most mature memory. Perplexity is the strongest for research with citations. Grok is the most connected to real-time X data.
Where none of them yet excels. Personal context with agency. An assistant that knows your calendar, your messages, your health data, and your training history, and uses that context to take actions on your behalf. The Extensions lineup as announced is, in the honest language of the DEV Community's March 2026 analysis, a set of digital assistants. They live on your screen. They are not yet agents that act inside specific domains.
The real story is the agent layer
This is the part most Extensions coverage misses, and it is the thing that actually matters for personal AI in the next two years.
The industry-wide shift in 2026 is from chat to agent. Claude's tool use features. OpenAI's agent mode. Google's Project Mariner. Perplexity's shopping agent. Each is a different bet on the same architectural thesis, which is that the next phase of AI is not about better answers. It is about the model taking actions in the real world with the user's consent. Book the flight. Update the calendar. Draft and send the message. Coordinate between the user and the services the user uses.
In narrow high-value domains, specialist agents are delivering on this first because the action space is clear and the user benefit is concrete. Personal health is one of the best examples because the downstream actions (modifying a plan, updating an order, scheduling a clinician follow-up) are clear and auditable.
This is where the most interesting post-iOS-27 composition lives. General purpose chat routed through Siri Extensions for the information layer. Specialist agents handling the action layer inside specific categories. Your personal AI on iPhone becomes a stack, not a single product.
The personal health cohort, named
Because naming peers helps any reader calibrate, here is the category we work in.
Zoe focuses on nutrition with microbiome data. Levels focuses on continuous glucose monitoring and metabolic response. Function Health and Superpower anchor comprehensive lab panels with interpretation. Whoop, Oura, and Lumen each own a different wearable or respiration signal with coaching layers on top. Apple Watch and Apple Health together provide the data storage layer that most of these products integrate with.
moccet sits on the personal AI side of the cohort. We are building a personal AI designed to help you live better, with health as the first and deepest domain. Three specialist agents live inside the platform. chef handles nutrition, reading lab, CGM, wearable, and calendar data and generating plans that update when your biology updates. trainer handles training, reading heart rate variability, sleep, and recovery, and adjusting programs when your body is under stress. medic handles general health questions, reading any data you upload, from labs to imaging reports, and answering with evidence.
The layer underneath, which is the part that feels genuinely new, is the active agent layer. The agents do not only recommend. They act. A change to your meal plan updates your grocery order. A modified training session enters your calendar. A follow-up with your clinician is drafted and coordinated on your behalf. This is, in a specific domain, what Apple Intelligence described at WWDC 2024 and what the iOS 27 Extensions layer does not yet reach.
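The recommend-versus-act distinction above can be made concrete with a small sketch. Everything here is hypothetical and illustrative (the action kinds, the consent set, the runtime); it is not moccet's actual implementation, only the shape of an agent layer where proposed changes fan out into real-world actions gated by user consent.

```python
# Hypothetical sketch of an active agent layer: a recommendation becomes
# a real-world action only if the user has consented to that action type,
# and every decision is audited.
from dataclasses import dataclass, field

@dataclass
class Action:
    kind: str      # e.g. "grocery_order", "calendar", "clinician_message"
    payload: str   # human-readable description of the change

@dataclass
class AgentRuntime:
    consented: set = field(default_factory=set)  # action kinds the user allowed
    log: list = field(default_factory=list)      # audit trail of every decision

    def execute(self, action: Action) -> bool:
        """Run the action only if its kind is consented; always record it."""
        allowed = action.kind in self.consented
        self.log.append((action.kind, "executed" if allowed else "held for approval"))
        return allowed

# A single meal-plan change fans out into downstream actions.
runtime = AgentRuntime(consented={"grocery_order", "calendar"})
plan_change = [
    Action("grocery_order", "swap salmon for tofu in this week's order"),
    Action("calendar", "move Thursday training to 7am"),
    Action("clinician_message", "draft follow-up about latest lipid panel"),
]
results = [runtime.execute(a) for a in plan_change]
print(results)  # [True, True, False] -- the clinician draft waits for approval
```

The property that matters is auditability: every action, executed or held, leaves a record, which is what makes agency in a health domain tolerable rather than alarming.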
moccet integrates with Apple Health, Garmin, Whoop, Oura, Fitbit, and continuous glucose monitors including Dexcom, Levels, and Lingo. HIPAA and SOC 2 compliant. Biometric data is encrypted end to end and never used to train generic models. Three days are free at moccet.ai/invite/relax.
How to set up your iPhone for the iOS 27 era
Practical, because that is what readers of this piece want.
When iOS 27 arrives in the fall, expect a Settings section under Apple Intelligence and Siri called Extensions. You will pick which AI apps Siri can route to. Most people will want three things composed together.
A general purpose AI for chat, writing, and research. Claude or ChatGPT is the cleanest default. Perplexity is the cleanest for research specifically.
A productivity AI for calendar and email. Native Siri is the likely winner here because of depth of iOS integration, plus a specialist for heavy users.
A specialist for whatever matters most in your life. For health, a purpose built product because the general models do not yet handle biometric context with agency. For finance, productivity, coding, or any other vertical, the specialist that covers it.
The configuration is composition. The point of Extensions is that the composition becomes trivial inside Siri. The point of the specialists is that the action layer moves fastest in narrow domains.
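The three-slot setup above can be written down as plain configuration. A sketch, with slot names and app choices purely illustrative (there is no real Settings schema to conform to yet):

```python
# Illustrative three-slot stack for the iOS 27 era. Slot names and app
# names are examples, not a real configuration format.
RECOMMENDED_SLOTS = {"general", "productivity", "specialist"}

my_stack = {
    "general": "claude",           # chat, writing, research
    "productivity": "native_siri", # calendar, mail, OS-level actions
    "specialist": "moccet",        # the domain that matters most (health here)
}

def missing_slots(stack: dict) -> list:
    """Which recommended slots are still unfilled, sorted for stable output."""
    return sorted(RECOMMENDED_SLOTS - stack.keys())

print(missing_slots(my_stack))                # []
print(missing_slots({"general": "chatgpt"}))  # ['productivity', 'specialist']
```

A chat-only setup leaves the productivity and specialist slots empty, which is most people's configuration today and the gap Extensions plus specialists is meant to close.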
The strategic reading
Apple's trajectory, legible from a series of 2025 and 2026 signals, is consistent. Extensions opens the platform on the routing layer. The Gemini partnership uses external models on the model layer. The Ternus promotion puts a product engineer on the execution layer. The $130 billion cash position, per The Information, gives Apple the option to acquire in the category if the build pace does not match the market.
The likely Apple outcome by 2028 is a deeper, more agentic Siri with serious personal context, and an Apple Watch that is a genuine health agent. Ternus is the right chief executive to deliver that, given the track record on Apple silicon.
The likely user outcome between now and then is that the best personal AI on your iPhone is a stack. General purpose chat via Extensions. Specialist agents for high-value domains. Apple's native Siri for calendar, mail, and OS-level actions. The composition works today, will work better in September, and will keep improving.
If you want to try the health specialist piece of that composition now, moccet is live on the Apple hardware you already own. Three free days at moccet.ai/invite/relax.
Frequently asked questions
What is iOS 27 Siri Extensions? A system, previewing at WWDC on June 8, 2026, that lets third-party AI assistants plug directly into Siri. The expected initial lineup includes Claude, ChatGPT, Gemini, Perplexity, and Grok. Users pick which AI handles which kind of request in Settings.
When is iOS 27 released? Preview at WWDC on June 8, 2026, expected public release in fall 2026.
Does this mean Siri is being replaced? No. Core Siri intelligence remains native, built on Apple Foundation Models trained in part using Google Gemini. Extensions handles third-party handoff for chat and Q&A.
Which AI should I use on iPhone in 2026? Depends on use case. Claude or ChatGPT for general chat and writing. Perplexity for research. Native Siri for OS actions. For personal health specifically, a purpose built product with action capability, including moccet, Zoe, Levels, Function Health, Superpower, Whoop, and Oura.
What is the agent layer beyond Extensions? The shift from AI that answers to AI that acts. Book calendar items. Update plans when data changes. Draft and coordinate messages. This is moving fastest in narrow domains where the action space is clear, including personal health.
What is moccet? A personal AI designed to help you live better. Three specialist agents (chef for nutrition, trainer for training, medic for any health question), plus an active agent layer that can take actions on your behalf. HIPAA and SOC 2 compliant. A three-day free trial is at moccet.ai/invite/relax.
Sources
Bloomberg, "Apple Plans to Open Up Siri to Rival AI Assistants Beyond ChatGPT in iOS 27," March 26, 2026.
MacRumors, "iOS 27 Rumored to Feature All-New Siri App With Extensions Feature," March 29, 2026.
Reuters, confirmation reporting, late March 2026.
The Verge, coverage of the Apple-Google Gemini deal, 2026.
Winbuzzer, "Apple Plans to Open Siri to ChatGPT Rivals in iOS 27," March 31, 2026.
Gadget Hacks, "iOS 27 Siri Third-Party AI Assistants Explained," April 2026.
MacDailyNews, March 27, 2026.
AppleInsider, March 26, 2026.
moccet, moccet.ai, 2026.