Apple has upgraded Siri as part of its Apple Intelligence rollout to better understand context and follow up on queries, but it does not have the brainpower or conversational capabilities of ChatGPT.
According to a recent report by Bloomberg, Apple is currently working hard to develop an "LLM Siri" that will allow the voice assistant to hold natural conversations in the way Google Gemini Voice, Meta AI Voice, and ChatGPT Advanced Voice already can.
Apple already uses on-device language models to power Apple Intelligence, but those models are not powerful enough to support fluid conversational AI like ChatGPT, which relies on much larger cloud-based models.
According to Bloomberg, LLM Siri will be built on an entirely new Apple AI model and will lean more heavily on App Intents, the framework that lets developers expose in-app functionality to Siri so the assistant can take actions inside third-party apps, as sketched below.
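For context, App Intents is Apple's existing Swift framework for declaring in-app actions the system can invoke. A minimal sketch of what such an intent looks like today; the intent name, parameter, and dialog here are hypothetical, not from the report:

```swift
import AppIntents

// Hypothetical intent: a Siri-invokable action that reports
// the status of an order inside a shopping app.
struct OpenOrderStatusIntent: AppIntent {
    static var title: LocalizedStringResource = "Check Order Status"
    static var description = IntentDescription("Shows the status of a recent order.")

    // A parameter Siri could fill in from the user's spoken request.
    @Parameter(title: "Order Number")
    var orderNumber: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would look this up in its own data layer.
        let status = "shipped" // placeholder result
        return .result(dialog: "Order \(orderNumber) is \(status).")
    }
}
```

The Bloomberg report suggests an LLM-powered Siri could chain and converse over actions like this one, rather than merely triggering them from fixed phrases.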
I have previously written about how ChatGPT's integration into Siri risks rendering the voice assistant itself nearly redundant: anything creative or conversational simply gets handed off to ChatGPT.
Siri is outdated and feels like a relic from a different era. Recent updates have improved it, but it is still far behind Gemini Live, which is a problem for Apple as it competes with Android.
Giving Siri a dedicated LLM brain and voice mode would require it to run on a custom-built cloud model, likely via Apple's Private Cloud Compute, rather than on-device, and building out that kind of global voice assistant infrastructure takes time.
A report by Mark Gurman suggests that LLM Siri may initially ship as a separate app so users can provide feedback. An announcement is expected at WWDC next June, and integration will likely be part of iOS 19 and macOS 16, but the feature may not actually go live until spring 2026.
In the meantime, Apple is looking to expand Siri's ChatGPT integration by bringing in other partners, such as Anthropic's Claude and Google's Gemini, as a "stop-gap" until LLM Siri is ready.