Observations
Enabling the AICoE to Support Innovation
An AICoE should be developed as the operational heartbeat of a modern AI strategy. It is a specialised unit, or functional grouping, designed to industrialise data capabilities, champion data literacy, and, most critically, accelerate innovation by providing the AI tools and guardrails that allow teams to move fast without breaking the organisation.
Where innovation is a leading priority, as it is for many startup organisations, the AICoE is likely to be centralised and, for a period of time, inverted from the traditional AI governance model that treats governance as a gatekeeper. Instead, the AICoE supports a rapid test-and-fail mantra, providing governance as guardrails.1 As a startup organisation’s products and processes mature, the AICoE returns to providing governance as a gatekeeper.
The AICoE’s true value-add includes helping organisations move beyond constant AI experimentation by assessing the exploration and experimentation stages of each AI innovation project. During these assessments, it can provide guidance on relevant predefined AI tools, including generative AI (large language models – LLMs), predictive AI (traditional machine learning – ML), and graph neural networks (GNNs). Whichever AI solution is recommended, the AICoE should provide the project with relevant key performance indicators (KPIs) for the business case. The following are examples of the range of KPIs the AICoE should apply to help a project determine innovation success.
- Business Impact: Revenue growth, market share, risk mitigation, etc.
- Operational Efficiency: Lifecycle reductions, automation efficiencies, lower transaction costs.
- AI Model Performance: F1 score, recall, accuracy, hallucination rate, stability under load.
- Customer Experience: CSAT, NPS, etc.
- Innovation Capacity: Innovations delivered per quarter, workforce upskilling rate, and adoption rate.
- Economic Efficiency: ROI, total cost of ownership (TCO), payback period.
- Business Agility: Faster pivots to new markets or regulatory changes2.
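As a concrete illustration of the model performance KPIs listed above, the sketch below derives accuracy, precision, recall, and F1 from a binary confusion matrix using only the standard library. The function name and the example counts are illustrative assumptions, not AICoE prescriptions.

```python
# Minimal sketch: computing the AI model performance KPIs named above
# (accuracy, recall, F1 score) from raw prediction counts.
# All names and example values here are hypothetical.

def model_kpis(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Derive accuracy, precision, recall, and F1 from a binary confusion matrix."""
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total if total else 0.0
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall, "f1": f1}

# Example: validation results for a hypothetical churn-prediction model.
kpis = model_kpis(tp=80, fp=20, fn=10, tn=90)
```

A project team would report these alongside the business-impact and efficiency KPIs, since a strong F1 score alone does not demonstrate innovation success.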
The AICoE’s role is to recommend a suitable framework for achieving the project outcomes. Each of the following models is an example of a structure and set of skills needed to support AI innovation, depending on the organisation’s maturity.
1. Structural Models for an AICoE
There is no one-size-fits-all. The structure must match your organisation’s maturity and culture. The three dominant models are centralised, decentralised, and federated (hub-and-spoke).
A. Centralised Model
All AI talent (product managers, ML engineers, AI architects) sits in one central team, often reporting to a CAIO (chief AI officer) or CDO. They act as a consultancy to the rest of the business.
- Innovation Assessment: Low–Medium (in an enterprise context). Great for building core platforms, but often too slow for rapid business experimentation. By contrast, a startup organisation may benefit from a centralised model to accelerate innovation during its infancy.
B. Decentralised Model
Each business unit (marketing, finance, supply chain) hires its own AI staff to develop solutions. There is no central authority.
- Innovation Assessment: High (Locally) but Chaotic. Innovation happens fast but cannot be scaled or reused across the organisation.
C. The Federated (Hub-and-Spoke) Model
A federated model may be seen as the optimal balance for achieving an innovation-first organisation; however, small- to medium-sized businesses may find it unachievable given the additional resources it requires, making it better suited to larger, less resource-constrained enterprises.
In this model, a small core hub leads AI strategy, platform choices, and governance. AI expertise is embedded in the business units, with a dotted-line report back to the hub for support and standards3.
- Innovation Assessment: High and Scalable. This structure allows the business to drive the what (innovation ideas) while the AICoE enables the how (technical execution).
2. AICoE Roles & Responsibilities: Supporting Innovation
To prioritise innovation, roles must be defined by what they enable, not just what they control. Here, we examine the roles and business interactions needed to support an AI-led approach to Innovation, supported by an effective AICoE.
| Role | Functional Description | Innovation-First Interaction |
| --- | --- | --- |
| Head of AI/AICoE Lead | A leader who aligns AI strategy with business goals. Helping secure budget, talent, and executive buy-in. | The Diplomat: Focus on value realisation rather than compliance metrics alone, and reduce bureaucratic interference to maintain momentum for innovation projects4. |
| AI Solution Architect | Designs the technical Lego blocks (platforms & tools) that teams use to build solutions. | The Enabler: Instead of reviewing every line of code, their role is to build pre-approved sandboxes with ready-to-use data and tools, allowing teams to start building on Day 1, avoiding IT procurement delays5. |
| Product Manager (AI) | The bridge between the business and the tech team. Defines the Why and What of the product. | The Value Hunter: Actively identifies friction points in the business that AI can solve. They prioritise the most impactful use cases based on potential business value and innovation potential6. |
| AI/ML Engineer / MLOps | Focuses on automating the deployment and monitoring of AI models, including the supporting infrastructure. | The Accelerator: Provides pre-packaged standardised templates, platform-agnostic and platform-specific tools, and reusable code, all designed to reduce time to innovation7. |
| Data Scientist | The explorer who builds the models and algorithms to solve complex problems. | The Inventor: Given a safe space in the sandbox to experiment with radical ideas without needing approval for every new dataset or library. |
| AI Ethicist/Governance Lead | Ensures solutions are safe, unbiased, and compliant. | The Guardrail Builder: (Critical Change) Instead of acting as a police officer at the end of a project, they work with the team at the start to define safety boundaries. As long as the team stays within those boundaries, they do not intervene8. |
3. How to Build an AICoE: A Phased Approach
Building an AICoE is a journey, not simply the flip of a switch. Using a phased approach as described below will help build credibility within the organisation. However, highly regulated industries, such as healthcare and finance, may find the proposed timelines unachievable. These highly regulated industries should develop a bespoke approach for each horizon to align with and address any regulatory constraints.
The timeline below is an example of a framework developed by an organisation called Dataiku. Its goal was to rapidly build its AI platform, leaving little time in horizon 1 to document and review DevOps process innovations or develop a clear framework. As AI projects progressed, learnings from horizon 1 became reusable artefacts and knowledge sources, reducing the need for constant experimentation. Horizon 2 saw these innovations carried forward from the early stages of the AICoE.
As the DevOps process innovations matured, they reduced build time and avoided the delays previously experienced. After Dataiku completed the software development program, the organisation fed the learnings back into the AICoE, which then supported ongoing AI maturity9.
Growth in AI projects and the efficiencies derived from an effective AICoE can be considered by adapting the following framework, as developed by McKinsey & Co10. Each horizon requires a specific goal, action, and innovation strategy that leads to the final outcome in horizon 3.
Horizon 1: Incubate (Months 1–6)
- Goal: Prove value and build trust.
- Action: Assemble a short-term tiger team of 3–5 top performers to deliver value quickly. Pick 2 high-value, low-risk use cases (e.g., customer churn prediction, automated invoice processing). Include metrics in the business case that trigger an exit from the project if the agreed value is not achieved.
- Innovation Strategy: Provide the team the freedom to bypass standard IT red tape, excluding security protocols, to deliver value fast for the limited set of designated projects. This speed and agility may rely on the AI solution architect having built pre-approved AI sandboxes.
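The pre-approved sandbox idea above can be expressed as policy-as-code: a boundary defined as data, with an automated check that a project request stays within it. This is a minimal sketch; the policy fields, values, and spend cap are hypothetical examples, not a real sandbox specification.

```python
# Illustrative sketch of "governance as guardrails": a pre-approved sandbox
# boundary expressed as data, checked automatically per project request.
# Every field and value below is a hypothetical example.

APPROVED_SANDBOX = {
    "allowed_data_classes": {"public", "internal"},  # e.g. no PII in horizon 1
    "allowed_model_types": {"llm", "ml", "gnn"},
    "max_monthly_spend": 5000,                       # illustrative cap
}

def within_guardrails(request: dict) -> tuple[bool, list[str]]:
    """Return (approved, violations) for a project's sandbox request."""
    violations = []
    if not set(request["data_classes"]) <= APPROVED_SANDBOX["allowed_data_classes"]:
        violations.append("data class outside pre-approved boundary")
    if request["model_type"] not in APPROVED_SANDBOX["allowed_model_types"]:
        violations.append("model type not pre-approved")
    if request["monthly_spend"] > APPROVED_SANDBOX["max_monthly_spend"]:
        violations.append("spend exceeds sandbox cap")
    return (not violations, violations)

# A request inside the boundary needs no human sign-off.
ok, why = within_guardrails(
    {"data_classes": ["internal"], "model_type": "llm", "monthly_spend": 1200}
)
```

Requests inside the boundary proceed on Day 1; only violations escalate to the governance lead, which is what allows speed without bypassing security protocols.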
Horizon 2: Scale (Months 7–18)
- Goal: Standardise and democratise.
- Action: As AI projects move to production, the AICoE becomes a knowledge centre that helps business teams and data scientists improve AI platforms, reducing the need for centralised teams focused on AI build.
- Innovation Strategy: Firm up the number of pre-secured AI sandboxes available for testing AI theories. Accessing these pre-approved AI sandboxes would not require formal approval11.
Horizon 3: Transform (Month 18+)
- Goal: AI everywhere.
- Action: The AICoE shifts from doing to enabling. It focuses on training and self-service tools, which may have been developed in horizon 1 by the tiger team, and progresses to a more federated model so business users (citizen data scientists) can build their own simple solutions.
- Innovation Strategy: Governance becomes automated. Code scanners and model monitors run in the background, enabling thousands of micro-innovations each day without human bottlenecks.
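The background model monitors described above can be as simple as a scheduled check that compares live model scores against a baseline window and flags drift without a human in the loop. The sketch below uses a deliberately simplified metric (shift in mean prediction score) and an illustrative threshold; real monitors would use richer drift statistics.

```python
# Sketch of automated, background governance: a model monitor that flags
# drift against a pre-agreed threshold with no human bottleneck.
# The metric and threshold here are simplified, illustrative choices.

from statistics import mean

DRIFT_THRESHOLD = 0.10  # illustrative: flag if the mean score shifts by > 0.10

def check_drift(baseline_scores: list[float], live_scores: list[float]) -> dict:
    """Compare live model scores against a baseline window and flag drift."""
    shift = abs(mean(live_scores) - mean(baseline_scores))
    return {"shift": round(shift, 3), "drifted": shift > DRIFT_THRESHOLD}

# Example: a production model whose live scores have moved well off baseline.
result = check_drift(
    baseline_scores=[0.70, 0.72, 0.68],
    live_scores=[0.50, 0.52, 0.48],
)
```

A flagged model might be automatically rolled back or routed to the governance lead, while the thousands of in-bounds micro-innovations continue unreviewed.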
Next Steps
- Decide now on an AICoE model that can benefit your organisation.
- Conduct an AICoE resources and capability gap analysis.
- Conduct timely health checks of AI projects to assess maturity.
Related IBRS articles
‘AI as the Engine of the Innovation Economy: Part 1 – Strategy’, IBRS, 2025.
‘AI as the Engine of the Innovation Economy: Part 2 – Integrating AI into the Business’, IBRS.
‘AI as the Engine of the Innovation Economy: Part 3 – Navigating the AI Talent Trap’, IBRS.
‘AI as the Engine of the Innovation Economy: Part 4 – AI Governance Models’, IBRS.
‘AI as the Engine of the Innovation Economy: Part 5 – Value Realisation’, IBRS.
Footnotes
- ‘Principled Artificial Intelligence: Mapping Consensus in Ethical and Rights-Based Approaches to Principles for AI’, J. Fjeld et al., 2020.
- ‘AI Success Metrics: AI KPIs and Tracking Strategic Impact’, A. Kharche, 2025.
- ‘What Every Enterprise Needs to Know about Federated Analytics’, IIA, 2025.
- ‘Establish an AI Center of Excellence – Cloud Adoption Framework’, Microsoft Learn, 2025.
- ‘Shaping the AI Sandbox Ecosystem for the Intelligent Age’, WEF and MeitY India, 2025.
- ‘Enterprise Generative AI & Value Creation’, Altman Solon, 2025.
- ‘The Codex MLOps Accelerator – AWS Approved’, Codex, 2025.
- ‘AI Code Guardrails: A Practical Guide for Secure Rollout’, Snyk, 2025.
- ‘The AI Maturity Framework’, ElementAI, 2025.
- ‘Enduring Ideas: The Three Horizons of Growth’, McKinsey Quarterly, 2009.
- ‘Azure Sandbox’, Microsoft Learn, 2025.

