We’ve already heard that Apple’s upcoming iOS 18 software could be the “biggest” update in history, thanks to a bunch of new AI features coming to the iPhone. This includes everything from summarizing emails and web pages to transcribing voice notes and AI-generated images and emoji.
But Apple is reportedly not going it alone when it comes to rolling out new AI features. Reports surfaced in late April that Apple had reopened dialogue with OpenAI to bring ChatGPT features to the iPhone; in mid-May, we heard that Apple and OpenAI were reportedly close to a deal.
Now, this deal appears to have been finalized, as The Information reports that OpenAI CEO Sam Altman has struck a deal with Apple. This is apparently a major blow to Google, which was hoping to bring its Gemini AI tools to the iPhone to complement its existing multi-billion dollar Google search partnership.
So how could bringing ChatGPT to Apple’s coveted smartphone transform your iPhone experience? If Apple decides to use the latest GPT-4o technology, it could make Siri a much more powerful voice and visual assistant. Our AI editor Ryan Morrisson got a live demo of GPT-4o’s voice capabilities and said that, if anything, the technology is underhyped.
Imagine Siri, but with far more intelligence and emotion. You can pause it, ask follow-up questions, and get live translations instantly. Most importantly, GPT-4o is more conversational, and that’s exactly the injection Siri needs.
Siri + ChatGPT = ?
According to The Information’s report, Apple engineers tested ChatGPT behind the scenes and linked it to Siri. As a result, they were able to create “impressive demonstrations of Siri handling more complex questions than usual, in part by better understanding the context of what users were saying.”
This bit about context is hugely important, because the existing Siri had a terrible time keeping up with the infamous Rabbit R1 AI device when I tested the two against each other with a series of questions. Siri just isn’t very good at answering follow-up questions right now.
To be clear, Apple has not announced an OpenAI partnership or announced any plans for how it might use that technology. But Apple has reportedly discussed using OpenAI for its Siri voice assistant to answer questions it normally wouldn’t be able to answer on its own. Apple may also decide to roll out a separate chatbot-like app, powered by OpenAI.
But it now seems clear that the buzzy GPT-4o demos were intended, at least in part, to seal the deal with Apple. A big part of those demos was OpenAI’s improved vision features: you should be able to point your phone’s camera at objects for identification, or at people to gauge their mood or emotional state. We had a chance to try out GPT-4o’s vision features with several prompts and were impressed, although it made some mistakes.
Expect OpenAI disclaimers
So it’s no surprise that Apple will likely let users know when a response is generated by OpenAI and ChatGPT rather than by its own models, because AI gets things wrong and can “hallucinate.”
Either way, adding vision features to Siri could be a game changer as Apple tries to catch up with Google Lens, which is getting an even bigger boost with Gemini Live. Even without OpenAI, Siri is reportedly getting a major upgrade with iOS 18: we’ve heard Siri could provide voice control for several apps, including Notes, Photos, and Mail, with this capability reportedly coming to third-party apps in the future.
The bottom line is that iOS 18 and Siri will become much smarter with the help of OpenAI, assuming this deal is legit and announced at WWDC next week.