Your iPhone is about to get smarter. Apple will spend about $1 billion a year to tap Google's strongest AI technology.
Apple has paid a high tuition fee for its shortcomings in AI.
According to the latest report from Bloomberg's Mark Gurman, Apple is close to a multi-year agreement with Google under which it will pay around $1 billion a year for Google's Gemini model to supply the core AI capabilities of the new Siri.
Under the plan, Gemini will handle Siri's most critical functions, the summarizer and the task planner, while other functions will still run on Apple's own, smaller models.
A key clause of the agreement is that Gemini will run on Apple's Private Cloud Compute servers, so user data is never exposed to Google's systems. Users in mainland China currently cannot use Gemini, and Apple is expected to prepare a separate plan for that market.
Such a huge investment is, in essence, insurance that the AI-powered Siri ships on time.

The best choice for Apple now
What does Apple want in this transaction?
The answer is simple: time.
At the WWDC developer conference in June 2024, Apple showed a new Siri powered by Apple Intelligence, with features such as stronger contextual understanding, on-screen awareness, and cross-app actions.
According to Apple's earlier statements, the new Siri features were supposed to roll out gradually over the iOS 18 update cycle. Instead, many key AI features have been delayed, and the latest target for their initial release has slipped to next spring.

Behind such a long delay lies Apple's weakness in large model technology.
To bridge this gap, Apple has to seek support from outside. Reports say Google is providing Apple with a Gemini model of roughly 1.2 trillion parameters, far beyond Apple's current 150-billion-parameter model.
For reference, Moonshot AI released its Kimi-K2-Instruct model in July this year, with about 1 trillion total parameters, the first domestic open-source model to cross the trillion-parameter mark.
This huge gap in parameter scale translates directly into differences in reasoning ability, breadth of knowledge, and the complexity of tasks a model can handle, which are exactly the technical foundations the new Siri needs for core functions like summarization and task planning.
To train self-developed models with comparable parameter scales and performance within a short time, Apple not only needs massive computing power and high-quality training data, but also a stable and experienced research and development team.
The core of the problem is that Apple's AI team is bleeding talent. Since July of this year, roughly a dozen core members of Apple's AI team have left for other companies.

Ruoming Pang, who led Apple's foundation models team, was poached by Meta with a pay package reported at around US$200 million. Ke Yang, who had only just been put in charge of Siri's intelligent-search project, also decided to join Meta. Several other key researchers have followed them out the door.
This team of barely more than a hundred people has lost its leaders at exactly the moment it most needs to make progress.
This is a full-blown crisis of confidence. When your employees vote with their feet, it means the problem cannot be solved by simply offering more pay raises.
Apple's culture of secrecy used to be its moat. Strict information control made product launches full of surprises and left competitors unable to anticipate them.
But in the AI era, this approach no longer works. When researchers cannot freely publish papers and build their reputations in the academic community, they are cut off from the rapid iteration of the wider AI field, which runs on openness.
Moreover, Apple has relatively limited computing power resources, and its training data is restricted by privacy policies.
While OpenAI and Google can throw thousands of GPUs at training large models, Apple has to balance user-privacy protection against the scale of data it can use, which further constrains its progress on large-model training.
So Apple had no choice but to seek help from outside.
Why Google instead of others?
According to earlier reports, Apple evaluated OpenAI's ChatGPT and Anthropic's Claude when selecting a partner, before ultimately choosing Google's Gemini.
Although it may seem like an unexpected choice, it is actually inevitable. Firstly, Google is powerful and stable enough.
Google is a veteran giant in the AI field: its Gemini 2.5 Pro ranks high on most large-model leaderboards, and its technical strength is beyond question. That strength also shows up in token usage.

Last month, Logan Kilpatrick, who leads product for Google AI Studio, said on social media that Google now processes about 1.3 quadrillion tokens per month, a scale of compute consumption unmatched in the industry.
Moreover, Google's advantages go beyond this point.
As one of the few overseas giants doing full-stack AI research and development, Google has world-leading cloud infrastructure and engineering teams capable of supporting Siri's enormous daily request volume, something that is beyond the reach of start-ups.
The history of cooperation has also paved the way for this transaction.
From the earliest iPhones shipping with Google Maps and YouTube built in, to the Safari search-engine agreement now worth over $20 billion a year, to Apple storing part of its iCloud data on Google Cloud, the trust the two companies have accumulated over the years paved the way for this deal.

Google's willingness to compromise is crucial.
According to the agreement, Google's Gemini model will run on Apple's Private Cloud Compute servers, and user data will never touch Google's systems. Apple gets Google's technology while keeping control of user privacy.
Note that this is precisely what Apple cares most about.
It is worth mentioning that Apple is positioning the new Siri as a new generation of search entry on devices.
If the knowledge and reasoning behind Siri are supplied by Google, this becomes an extension and upgrade of the two companies' existing alliance in search: when users ask Siri a question, it is still Google's technology at work, just in a different form.
Given Apple's limited options, Google is the only partner that meets the requirements on technology, trust, control, and commercial terms.
A graceful rescue
The most direct benefit of integrating Google's Gemini is a much higher probability of shipping on time.
If Apple insisted on a purely self-developed route, given the talent drain and the technology gap, it is far from certain it could hit the target by next spring. By bringing in Google's already mature model, Apple has found itself a dependable shortcut.
The Siri overhaul is reportedly led by Mike Rockwell, who previously headed the Vision Pro headset effort, and software engineering chief Craig Federighi, and the new Siri is code-named 'Linwood' inside Apple.

Apple's emphasis on this upgraded AI Siri can be seen from its personnel arrangements.
Gemini will be responsible for Siri's summarization and task-planning functions, the core abilities for integrating information and deciding how to execute complex tasks, while other functions will still be handled by Apple's own models. This dual-track arrangement buys in outside capability exactly where Apple is weakest and keeps the rest in-house.
It is worth mentioning that Apple's technical architecture is already prepared for this kind of integration.
The new Siri uses a modular design: a small on-device model handles simple tasks and privacy-sensitive operations, while a large cloud model handles complex reasoning and knowledge queries. This architecture means the cloud-side model can be swapped out without rebuilding the rest of the system.
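As a rough illustration of that split, the sketch below shows how such a router might be structured in Swift. It is a minimal, hypothetical example: the type names and routing rules are assumptions for illustration, not Apple's actual implementation or API.

```swift
import Foundation

// Hypothetical sketch: routing an assistant request between an on-device
// model and a cloud model. All names here are illustrative, not Apple API.

enum RequestKind {
    case simpleCommand        // e.g. set a timer
    case privacySensitive     // e.g. touches local messages or health data
    case complexReasoning     // e.g. summarization, multi-step planning
}

protocol LanguageModel {
    func respond(to prompt: String) -> String
}

struct OnDeviceModel: LanguageModel {
    func respond(to prompt: String) -> String {
        "[on-device] handled: \(prompt)"
    }
}

struct CloudModel: LanguageModel {
    // In the arrangement described above, this slot would be filled by Gemini
    // running inside Apple's Private Cloud Compute, so prompts never reach Google.
    func respond(to prompt: String) -> String {
        "[private-cloud] handled: \(prompt)"
    }
}

struct AssistantRouter {
    let local: LanguageModel
    let cloud: LanguageModel

    // Simple tasks and privacy-sensitive operations stay on the device;
    // complex reasoning and broad knowledge queries go to the cloud model.
    func handle(_ kind: RequestKind, prompt: String) -> String {
        switch kind {
        case .simpleCommand, .privacySensitive:
            return local.respond(to: prompt)
        case .complexReasoning:
            return cloud.respond(to: prompt)
        }
    }
}

let router = AssistantRouter(local: OnDeviceModel(), cloud: CloudModel())
print(router.handle(.simpleCommand, prompt: "Set a timer for 10 minutes"))
print(router.handle(.complexReasoning, prompt: "Summarize my unread newsletters"))
```

The point of such a design is that the cloud slot could be backed by Gemini, a self-developed model, or a local partner's model without the rest of the assistant changing, which is exactly the flexibility discussed below.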

Of course, the Chinese-market version of AI Siri is not expected to use Gemini.
Apple will have to prepare different AI solutions for different markets, for example by cooperating with local providers such as Alibaba or Baidu, or by using special versions of its own models. That flexibility is another advantage of the modular architecture.
The underlying problem is postponed, not solved
In the past, Apple was used to a wait-and-see strategy, and each time it came from behind by polishing the product experience to the utmost. That strategy rested on a premise: technological evolution is linear, and you always have time to catch up.

But AI has broken this rule.
Today, although the debate over whether Scaling Laws will continue to hold is still open, first-mover advantage is clearly more pronounced in AI: each generation of models is trained on the foundations laid by the previous one, so the gap tends to widen rather than close.
In effect, the billion dollars buys Apple a chance to catch its breath.
This is also Apple's last chance to salvage AI Siri's reputation while users still have patience. Onlookers, users, and Apple executives alike know there is little room left for error.
This article is from the WeChat public account “APPSO”.


