News

Apple Intelligence still broken: seven years after Google AI hire, Siri features remain undelivered

May 19, 2025

Key Points

  • Apple Intelligence, unveiled at WWDC 2024 as central to the iPhone 16, shipped with none of its promised features, which have since arrived piecemeal over months, triggering class action lawsuits alleging false advertising.
  • Seven years after hiring Google AI executive John Giannandrea, Apple still lacks basic speech-to-text competency in Siri and uses outdated transcription technology instead of modern transformer architectures.
  • Apple's roughly $20 billion in annual profit from the Google search deal creates a structural vulnerability if Google faces AI disruption, leaving the company dependent on a competitor even though it owns the device, the hardware, and the user's gateway to the internet.

Summary

Apple's $3 trillion valuation masks a deepening AI crisis that threatens iPhone dominance and future product categories like robots. Seven years after hiring John Giannandrea from Google to lead AI efforts, the company has fallen further behind competitors, not closer to parity.

Giannandrea arrived in 2018 as a significant hire. He had run Google's search and AI groups, overseeing deployment of cutting-edge AI in Photos, Translate, and Gmail. Apple's leadership reorganized around him, uniting fragmented AI work under a single umbrella reporting directly to Tim Cook and placing AI on equal footing with hardware, software, and services.

That structure has not delivered results. Since OpenAI's ChatGPT reset expectations in 2022, every major tech company has accelerated large language model development and integration into voice assistants. Apple's rollout has been defined by delays and mediocre execution.

At WWDC in June 2024, Apple unveiled Apple Intelligence, described as "AI for the rest of us." The company promised writing tools, email summaries, notification prioritization, custom emoji generation, and a reimagined Siri capable of pulling information from emails and texts to answer complex questions, such as assembling a travel itinerary. The announcement drove the stock price higher. In September, the iPhone 16 shipped as "built from the ground up for Apple Intelligence." It had none of the features.

Basic writing tools and summaries did not arrive until November, six weeks after the device launched. Genmoji didn't ship until December. Notification summaries didn't roll out until March 2025. The Siri upgrade, originally targeted for April 2025, was delayed indefinitely after Craig Federighi beta-tested it on his own phone weeks before the planned OS release and discovered that core features, including pulling up a driver's license number via voice search, did not work.

The WWDC demos were videos of early prototypes, not production-ready features. Class action lawsuits accusing the company of false advertising followed. The features continued to appear in iPhone 16 commercials even as delivery slipped.

The execution failures point to deeper problems. Writing tools are straightforward: pass text into an LLM with a summarization prompt and return the output. That Apple could not ship this with the device, or within weeks of launch, signals a fundamental execution failure, not an optimization delay. Siri still lacks basic competency in speech-to-text and text-to-speech, the foundation any voice assistant requires before advanced features can be layered on top. Its transcription technology remains outdated rather than upgraded to modern transformer-based architectures such as Whisper.
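To illustrate how thin a wrapper a summarization-style writing tool is, here is a minimal sketch. The function below is hypothetical (the prompt wording, word limit, and model name are assumptions, not Apple's implementation); the commented call site follows the OpenAI Python SDK's chat-completions interface, and a production system would of course add on-device routing, privacy handling, and error recovery.

```python
def build_summary_request(text: str, max_words: int = 60) -> list[dict]:
    """Wrap user text in a summarization prompt.

    This message list is essentially the entire 'feature': the heavy
    lifting happens inside the LLM, not in the application code.
    """
    return [
        {"role": "system",
         "content": f"Summarize the following text in at most {max_words} words."},
        {"role": "user", "content": text},
    ]

# Hypothetical call site, assuming an OpenAI-compatible endpoint:
# from openai import OpenAI
# client = OpenAI()
# reply = client.chat.completions.create(
#     model="gpt-4o-mini",
#     messages=build_summary_request(email_body),
# )
# summary = reply.choices[0].message.content
```

The point is not that Apple should use this exact stack, but that the application-side logic for "summarize this email" amounts to a prompt template plus one model call.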

Apple's structural vulnerability runs deeper still. About a quarter of Apple's profit comes from the Google search deal, roughly $20 billion annually. If Google itself becomes fragile due to AI disruption, Apple loses control over a core profit stream. The company that owns the device, the hardware, and the portal to the internet remains a follower in the technology that should amplify those advantages most.