Apple's Deal with Google Lets It Use Gemini to Train Smaller AI Models
Under a deal announced in January, Apple reportedly has "complete access" to Gemini inside Google's data centers, including the ability to use Gemini to train smaller, device-optimized AI models through a technique called distillation.
The Distillation Strategy
Model distillation is a technique where a large "teacher" model (in this case, Google's Gemini) is used to train a smaller "student" model. The student model learns to replicate the teacher's capabilities while requiring significantly less computing power and memory — making it suitable for running on consumer devices like iPhones, iPads, and Macs.
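Apple's actual training pipeline is not public, but the core idea above can be sketched numerically. A common distillation objective is the KL divergence between the teacher's and student's output distributions, with a temperature parameter that softens the teacher's probabilities so the student also learns the relative likelihoods of non-top answers. The function names and temperature value below are illustrative, not from the source:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature > 1 flattens the distribution, exposing the teacher's
    # "dark knowledge" about how similar the non-top classes are.
    z = logits / temperature
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student): the student is penalized wherever its
    # softened distribution diverges from the teacher's.
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    return float(np.sum(p_t * (np.log(p_t) - np.log(p_s))))

# A student that exactly reproduces the teacher's logits incurs zero loss.
teacher = np.array([4.0, 1.0, 0.5])
print(distillation_loss(teacher.copy(), teacher))          # → 0.0
print(distillation_loss(np.array([0.0, 3.0, 1.0]), teacher) > 0)  # → True
```

In practice this loss is minimized over a large dataset while the student network (much smaller than the teacher) updates its weights, which is what makes the resulting model cheap enough to run on a phone's chip.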
What Apple Gets
- Complete access to Gemini in Google's data centers
- Ability to train specialized student models tuned for Apple's hardware
- Reduced computing requirements for on-device AI processing
- Faster iteration cycles without building massive training infrastructure from scratch
Strategic Implications
This partnership lets Apple skip years of in-house AI development. Rather than building foundation models from scratch, Apple can leverage Google's cutting-edge technology and customize it for its ecosystem. This matters because Apple's on-device AI strategy depends on models that run efficiently on mobile chips with limited memory and power budgets.
Context: Apple's AI Push
Apple has lagged behind competitors such as Google, Microsoft, and OpenAI in the AI race, and has faced setbacks including leadership changes in its AI division. This Google deal, combined with the iOS 27 Extensions feature that will let users choose between multiple AI chatbots for Siri, represents Apple's most aggressive AI strategy to date.
Apple's WWDC 2026, scheduled for June 8-12, is expected to showcase the first major fruits of this partnership.