VENDORiQ: ‘Agenticify’: The Made-Up Word That Will Change Enterprise Architecture Forever

Google’s new Model Context Protocol (MCP) servers enable structured AI orchestration at the cloud infrastructure layer, accelerating predictable, governable automation for enterprises.

The Latest

Google has announced the release of fully managed, remote Model Context Protocol (MCP) servers, enhancing its existing API infrastructure. What is MCP?

MCP is an open protocol that allows AI agents to connect to, and act upon, external tools and information services. Essentially, MCP translates the actions requested by an AI agent into structured, executable operations on underlying enterprise systems and infrastructure.
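To make this concrete, the sketch below (in Python) shows the kind of JSON-RPC 2.0 ‘tools/call’ message an MCP client sends on an agent’s behalf; the MCP server is what maps this structured request onto the underlying service. The tool name ‘run_query’ and its arguments are hypothetical, not part of any vendor’s published schema.

```python
import json

# A minimal sketch of the MCP wire format (JSON-RPC 2.0).
# The tool name 'run_query' and its arguments are hypothetical;
# real tool names are discovered via the server's 'tools/list' method.
tool_call = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "run_query",
        "arguments": {
            "dataset": "sales",
            "sql": "SELECT region, SUM(revenue) FROM orders GROUP BY region",
        },
    },
}

# An MCP client would send this over the transport (for remote servers,
# typically streamable HTTP) and receive a structured result back.
print(json.dumps(tool_call, indent=2))
```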

With this announcement, Google aims to provide a unified layer for AI agents to control Google and Google Cloud services. Initial MCP support is rolling out for several key services, including Google Maps Platform for geospatial data grounding, BigQuery for reasoning over enterprise data, Google Compute Engine (GCE) for autonomous infrastructure management, and Google Kubernetes Engine (GKE) for autonomous container operations. This initiative positions MCP as a method for AI agents to interact with a broad application stack, from databases to infrastructure. In effect, it exposes the underlying infrastructure of cloud applications to AI services.

Why it Matters

There are three important but largely hidden implications stemming from Google’s move.

AI Access at the Foundational Layer of Business Solutions

Any enterprise application built on Google Cloud Platform (GCP) foundational services will immediately gain MCP capabilities. Rather than building bespoke MCP wrappers for their business solutions running on GCP, developers will have the ability to ‘agenticify’ (yes, it’s a totally made-up word!) business services at an architectural level. This has the potential to make every part of a business solution a service that AI services can call upon, and it will accelerate the development of AI capabilities for cloud-native business solutions.

SaaS AI Acceleration

Related to the above, bringing MCP to the foundational layers of cloud solutions enables SaaS vendors running on GCP to rapidly release MCP access to every aspect of their offerings. Just as SaaS solutions thrive by providing a rich catalogue of APIs that allow clients to extend and integrate them into their business, IBRS expects rich MCP catalogues to emerge so that SaaS solutions fit within AI workflows.

Google’s ‘MCP at the cloud service layer’ strategy will enable SaaS vendors running on its platform to dramatically increase the speed at which they deploy these expected MCP services.

MCP is Not Just Agentic – Classic Automation is Where the Money Is!

Hidden in Google’s release notes is a comment on ‘structured automation’. This warrants closer examination. The announcement highlights agents’ ability to interpret schemas, execute queries, and manage infrastructure within defined parameters. This contrasts with the notion of largely autonomous AI applications, aligning instead with the perspective that AI’s most impactful enterprise applications will reside within structured, process-driven workflows. 

For organisations exploring AI adoption, this development reinforces the IBRS argument that orchestrated AI will be a dominant pattern, where AI services operate within pre-defined parameters. This is essentially classic, deterministic workflow automation, but with AI services added to the mix. For the bulk of business tasks, AI orchestration offers a more practical, secure, governable and cost-effective approach than agentic deployments. It should also be noted that true agentic services can be sub-components of a larger orchestration. Importantly, such orchestrations have predictable results and thus predictable and measurable benefits realisation. 
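A minimal sketch of this orchestration pattern follows, assuming hypothetical ‘call_llm’ and ‘run_mcp_tool’ helpers: the workflow’s steps, inputs and allowed tools are fixed in advance, and the AI service operates only inside those parameters.

```python
# A sketch of orchestrated AI: a deterministic workflow with an AI step
# constrained to pre-defined parameters. The helper functions are
# hypothetical stand-ins for an LLM client and an MCP tool invocation.

ALLOWED_TOOLS = {"lookup_customer"}  # allow-list fixed ahead of time


def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; swap in your provider's SDK."""
    raise NotImplementedError


def run_mcp_tool(name: str, arguments: dict) -> dict:
    """Hypothetical MCP tool invocation via an MCP client."""
    if name not in ALLOWED_TOOLS:
        raise PermissionError(f"Tool '{name}' is not permitted in this workflow")
    raise NotImplementedError


def triage_ticket(ticket: dict) -> dict:
    """Deterministic workflow: every step and its order is fixed."""
    # Step 1: deterministic data access via a governed MCP tool.
    customer = run_mcp_tool("lookup_customer", {"customer_id": ticket["customer_id"]})

    # Step 2: AI step, bounded to a single well-defined task.
    summary = call_llm(f"Summarise this support ticket in two sentences:\n{ticket['body']}")

    # Step 3: deterministic business rule, not left to the model.
    priority = "high" if customer.get("tier") == "enterprise" else "normal"

    return {"summary": summary, "priority": priority}
```

Because the sequence of steps is fixed, the outputs (and therefore the benefits) of such a workflow can be measured in the same way as any conventional automation.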

By providing structured access to services like BigQuery and GKE, Google’s MCP strategy facilitates predictable interactions. For instance, agents utilising BigQuery MCP servers can access and query enterprise data without the associated risks or latency of data movement, ensuring governance and security by keeping data in place. Similarly, GKE MCP servers offer a structured interface for agents to reliably interact with Kubernetes APIs, mitigating the challenges of parsing unstructured outputs or complex command-line sequences. This controlled interaction contrasts with the complexities and potential inefficiencies of entirely autonomous, less constrained agentic operations. 
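The contrast between structured and unstructured interaction is easiest to see side by side. The sketch below compares a schema-shaped result of the kind an MCP server can return with the brittle text parsing an agent would otherwise need; the field names and CLI output are illustrative only, not Google’s actual GKE MCP schema.

```python
# Structured interaction: the MCP server returns typed, schema-shaped data,
# so the agent (or orchestrator) can act on fields directly.
structured_result = {
    "workloads": [
        {"name": "checkout", "namespace": "prod", "ready_replicas": 3, "desired_replicas": 3},
        {"name": "search", "namespace": "prod", "ready_replicas": 1, "desired_replicas": 3},
    ]
}
unhealthy = [w["name"] for w in structured_result["workloads"]
             if w["ready_replicas"] < w["desired_replicas"]]

# Unstructured alternative: parsing free-form CLI output is fragile and
# breaks whenever column widths or wording change.
cli_output = """NAME      READY   UP-TO-DATE   AVAILABLE
checkout  3/3     3            3
search    1/3     3            1"""
parsed = [line.split()[0] for line in cli_output.splitlines()[1:]
          if line.split()[1].split("/")[0] != line.split()[1].split("/")[1]]

print(unhealthy, parsed)  # both identify 'search', but only one approach is robust
```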

The implication is that structured orchestration at the foundational layers of cloud services not only enhances safety and reduces operational costs associated with adding business solutions to AI workflows, but also contributes to improved quality and productivity by embedding AI within established, predictable enterprise processes.

How Google’s Approach Differs from AWS and Microsoft

AWS MCP Server: AI Coding Assistance Focus

AWS launched the ‘AWS MCP Server’ (in preview as of late 2024/early 2025), which is a fully managed, remote MCP server designed to give agents access to the broader AWS ecosystem.

  • What it does: It consolidates access to over 15,000 AWS APIs and integrates with AWS Knowledge (documentation). This allows agents to perform tasks like provisioning EC2 instances, managing S3 buckets, or troubleshooting Lambda functions using natural language.
  • Key Difference: AWS emphasises a ‘Serverless MCP Server’ and specific tools for coding assistants, while Google’s announcement leans more heavily on data grounding (Maps/BigQuery).

Microsoft Azure MCP Server: Different Path with a Business Layer Focus

Microsoft is integrating MCP deeply, but its marketing often focuses on business applications (SaaS) and developer tools rather than foundational infrastructure.

  • The Offering: Microsoft introduced the Azure MCP Server, which enables agents to interact with Azure resources via the Azure CLI and Azure Developer CLI.
  • API Management Support: Microsoft supports MCP within Azure API Management, effectively allowing any enterprise to turn its own APIs into MCP servers for agents to use (see the sketch after this list). This is a strong strategy that aligns with much of the analysis in this paper, though from a different architectural trajectory to Google’s.
  • Business/ERP Layer: Microsoft has been very aggressive at the SaaS layer, and launched the Dynamics 365 ERP MCP server, allowing agents to perform complex business tasks (like financial reconciliation) across their ERP suite.
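The API Management point deserves a concrete illustration. The sketch below, assuming the open-source MCP Python SDK (FastMCP) and a hypothetical internal REST endpoint, shows the general pattern of exposing an existing enterprise API as an MCP tool; managed offerings such as Azure API Management aim to achieve the equivalent through configuration rather than code.

```python
import requests
from mcp.server.fastmcp import FastMCP

# Sketch only: exposes one existing REST endpoint as an MCP tool.
# 'orders.internal.example.com' is a hypothetical internal API, and the
# FastMCP usage assumes the open-source MCP Python SDK.
mcp = FastMCP("order-service")


@mcp.tool()
def get_order_status(order_id: str) -> dict:
    """Return the current status of an order from the existing order API."""
    resp = requests.get(
        f"https://orders.internal.example.com/v1/orders/{order_id}/status",
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    mcp.run()  # serves the tool to MCP clients (stdio transport by default)
```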

Who’s Impacted?

  • Chief Information Officers (CIOs): Google will not be the only cloud provider adding MCP to core cloud services; monitor AWS and Azure for similar capabilities. Assess the potential for integrating structured MCP services into existing cloud infrastructure (namely GCP at this time) to enhance service delivery and operational efficiency, particularly concerning data governance and security.
  • AI Project Teams: Evaluate how MCP can standardise AI service interactions with enterprise services, promoting scalable and manageable AI deployments.
  • Enterprise Architects: Become familiar with the architecture of MCP and how it integrates with cloud environments, and monitor the development of these services over time. Consider how ‘cloud infrastructure layer’ MCP will fit into robust and secure AI-driven solutions.
  • Development Team Leads: Review the developer tools and emerging MCP coding patterns to understand how teams can build and deploy AI agents that leverage structured automation.
  • Operations Teams/Site Reliability Engineers (SREs): Prepare for the autonomous infrastructure management capabilities offered through GCE and GKE MCP servers, which will require new skills for monitoring and managing AI-driven operations.
  • Governance/Security Teams: Understand the implications of MCP opening up new data access pathways. Work with AI project teams, development teams, and operations teams to implement new policies and best practices that mitigate MCP risks. Note that previous work on determining safe API usage and governance over APIs provides strong parallels to MCP.

Next Steps

  • Conduct a thorough assessment of existing enterprise workflows to identify areas where structured AI-driven automation, facilitated by frameworks like MCP, could deliver tangible benefits.
  • Prioritise pilot projects that integrate AI with established data sources and operational tools, focusing on quantifiable outcomes in cost reduction, efficiency, or quality improvement.
  • Develop a clear governance framework for AI orchestration within workflows, emphasising controlled access, data security, and auditability in line with the principles of orchestrated AI.
  • Invest in upskilling IT and development teams in the use of structured AI tools and platforms, ensuring they can effectively design, implement, and maintain these systems.
  • Engage with your cloud vendors to understand their long-term roadmaps for MCP on foundational cloud services and structured automation, and how these align with internal strategic objectives.
