
Apple embracing third-party LLMs to power the next version of Siri

Posted on April 11, 2025

Following the management shakeup that saw Apple strip John Giannandrea of responsibility for Siri and bring in trusted manager Craig Federighi to oversee its redevelopment, there's news that Apple is no longer wedded to the idea of using only its own LLMs.

Large Language Models (LLMs) are the foundation upon which AI applications are built, and there are now dozens available to developers – OpenAI's GPT models, DeepSeek, xAI's Grok and Google's Gemini are among the best known. But, until Federighi took over, Apple's engineers could only use those models to benchmark the performance of Apple's own.

The big shift is that Apple's engineers can now, per a report published at The Information [subscription], integrate third-party LLMs into Siri for better performance. From an end user's perspective, the actual LLM that is used, whether that's Apple's bespoke model or a third-party one, is irrelevant. What matters is the speed and accuracy of the tool.
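As a rough illustration of that point, here is a hypothetical Swift sketch (none of these types or names are Apple's real APIs): if the assistant talks to its language model through a common interface, the backend can be an in-house model today and a third-party one tomorrow without any change visible to the rest of the system.

import Foundation

// Hypothetical abstraction: the assistant only knows it is talking to
// "a language model", not which vendor provides it.
protocol LanguageModelBackend {
    func respond(to prompt: String) async throws -> String
}

// Stand-in for an in-house model.
struct InHouseModel: LanguageModelBackend {
    func respond(to prompt: String) async throws -> String {
        "In-house answer to: \(prompt)"   // placeholder response
    }
}

// Stand-in for a third-party hosted model.
struct ThirdPartyModel: LanguageModelBackend {
    func respond(to prompt: String) async throws -> String {
        "Third-party answer to: \(prompt)"   // placeholder response
    }
}

// The assistant layer depends only on the protocol, so the backend can
// be swapped without touching this code.
struct Assistant {
    let model: LanguageModelBackend

    func answer(_ question: String) async throws -> String {
        try await model.respond(to: question)
    }
}

Swapping ThirdPartyModel() for InHouseModel() when constructing the Assistant would be the only change needed, which is the sense in which the underlying model is invisible to everything built on top of it.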

One of the interesting pieces of information from that article is that Apple demonstrated features at WWDC 2024 that were not yet built. Typically, when Apple shows off a new feature, it has it working on demo devices and in early-stage software. But the Apple Intelligence features shown off last year were supposedly faked.

The move to third-party LLMs is critical. The poor reception of Apple Intelligence was always likely to hang over WWDC 2025. But by integrating third-party LLMs, Apple can move faster than if it were developing its own models alone. It gets to ship a better product now, and it buys the company time to keep developing its own LLMs and gradually reduce its reliance on third-party providers.

In a way, this mirrors Apple's hardware strategy. Apple put its own chips into the iPhone and iPad before moving to the Mac. The release of the company's first modem chip begins a transition away from Qualcomm, and a new Wi-Fi chip will let it reduce its dependence on Broadcom. While Apple will use OpenAI's, and potentially other providers', LLMs to power the next iteration of Siri, it can slowly wean itself off them as the performance of its own models improves.

Anthony Caruana

Anthony is the founder of Australian Apple News. He is a long-time Apple user and former editor of Australian Macworld. He has contributed to many technology magazines and newspapers as well as appearing regularly on radio and occasionally on TV.


