Most people’s experience of AI is through generative AI tools. You ask them a question or ask them to create something and they deliver a result. That can be anything from writing an email to creating an image or a video. Agentic AI goes further. The Model Context Protocol is the next step towards creating autonomous systems that can operate more independently. And Apple is bringing it to iPhones, iPads, Macs and more.
The first developer release of iOS 26.1 included support for the Model Context Protocol (MCP). Created by Anthropic, MCP has quickly become an industry standard that solves a problem all AI systems face.
AI systems are made up of data models and algorithms. When you ask an AI system to do something, its algorithms interpret your query using the data in the underlying models. Once the system ‘believes’ it understands your query, it responds using that data.
For an AI to answer complex questions it needs access to lots of data. MCP makes it easier for AI systems to access different data sources, and this addresses a major challenge for AI developers. MCP standardises the way AI systems access data. Without it, developers need to build a bespoke connector for each different data source, which is labour intensive and complex.
How is Apple implementing MCP in iOS?
Apple is integrating MCP into iOS through App Intents, one of the underlying frameworks in Apple’s operating systems. It allows native iOS, macOS, tvOS and watchOS apps to expose custom actions, or intents, to the system.
This means that MCP-friendly AI models like ChatGPT, Claude and Gemini can interact with Mac, iPhone and iPad apps and autonomously take actions within them.
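To make that concrete, here is a rough sketch of what exposing an action through App Intents looks like for a developer. The journaling example and all its names are illustrative, not Apple’s actual Journal code, and the snippet requires Apple’s SDKs to build:

```swift
import AppIntents

// A hypothetical intent for a journaling app. Once declared, the system
// (and, with MCP integration, an AI agent) can discover and invoke it.
struct CreateJournalEntryIntent: AppIntent {
    static var title: LocalizedStringResource = "Create Journal Entry"
    static var description = IntentDescription("Adds a new entry to the journal.")

    // Parameters describe the inputs the action needs.
    @Parameter(title: "Entry Text")
    var text: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would save `text` to its own data store here.
        return .result(dialog: "Added your journal entry.")
    }
}
```

The point is that the developer describes the action once, in one place, and every system that understands App Intents can use it, rather than the developer writing a separate integration for each AI model.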

The key word here is ‘autonomously’. This is where agentic AI differs from generative AI.
Generative AI requires a user to enter a prompt to elicit a response. Agentic AI can ‘perceive’ what is going on around it and act independently to achieve an outcome.
The MCP technical notes provide examples of how it can be used.
- Agents can access your Google Calendar and Notion, acting as a more personalized AI assistant.
- Claude Code can generate an entire web app using a Figma design.
- Enterprise chatbots can connect to multiple databases across an organization, empowering users to analyze data using chat.
- AI models can create 3D designs in Blender and print them on a 3D printer.
MCP provides a way to address my biggest bugbear in the Journal app – its lack of integration with other apps.
What’s in it for developers and users?
From the evidence we have today, it looks like Apple will add system-level MCP integration that will enable developers to open their apps to a broad suite of AI tools without having to create bespoke connectors.
Because MCP will be integrated into Apple’s software, we can safely assume privacy and security will be a foundation. It also means users will benefit from agentic AI sooner, as developers will be able to focus on delivering better functionality instead of creating custom connectors for each different AI model they want to support.
Generative AI has garnered significant attention but it is a stepping stone. Agentic AI is the next stone in the path to autonomous systems that can act independently.

Anthony is the founder of Australian Apple News. He is a long-time Apple user and former editor of Australian Macworld. He has contributed to many technology magazines and newspapers as well as appearing regularly on radio and occasionally on TV.