If you’re a CIO in a board meeting getting hammered about why IT can’t deliver simple changes, you’re likely sitting between a rock and a hard place.
Your company’s big enough to carry decades of technical debt but not big enough to have modernized like Amazon or Walmart.
Worse yet, you’re also in the middle of an AI hype cycle, which adds further heat to every progress report you deliver in the boardroom. While the board demands AI miracles yesterday, you’re grappling with a talent shortage and the search for a killer use case to launch your AI Center of Excellence (AI CoE).
More than a lab, a true AI CoE is a strategic enabler: it connects the messy back end of legacy systems to the front-end promise of value creation. And that’s exactly what this company did.
The pressure is real and measurable: Recent analysis of over 8,300 cloud implementations shows that 22% now include AI elements, while Microsoft has captured 45% of new cloud AI case studies—far outpacing their 29% overall cloud market share.
The message is clear: competitors are moving fast on AI, and legacy infrastructure is becoming a competitive liability.
One major automotive distributor’s $22 billion operation was trapped in exactly this squeeze—too big to ignore an AS/400-era mainframe that required millions of dollars just to add database fields, yet too tangled in 200+ applications to swap it out without a multi-year fight.
“Our technology has become complex, out of date, and has inhibited our ability to innovate at speed.”
Transparent and forthright, the company essentially admitted they were held hostage by legacy systems, then took ambitious steps to escape mainframe hell and transform the entire operation on the Azure platform.
Knowing your strengths and weaknesses, as we shall see, remains a transformational edge—for cloud adoption, machine learning optimization, and ultimately enterprise-scale AI applications.
The Azure Foundation (2019–2022)
The transformation began at Microsoft Ignite, where the company’s technology team witnessed a dramatically improved Azure platform. Previously dismissed as inferior to competitors, Azure’s cognitive services demonstrations and polished user experience convinced them to run a pilot. Within weeks, they had built a functional application that proved the platform delivered on its promises.
Perfect timing struck when the company hired a new CIO with a “better, faster, cheaper” mandate and strong Microsoft affinity. This aligned with broader market trends—Microsoft’s aggressive AI positioning had captured 62% of new cloud GenAI case studies, driven by their early OpenAI partnership and Azure integration capabilities. By 2020, they had formally selected Azure as their platform for innovation and differentiation, establishing decision-making principles and embracing Microsoft’s Cloud Adoption Framework for structured migration.
The Azure migration delivered immediate surprises. Within three months of platform readiness, they built a fully automated 4-step machine learning process using Azure ML—a capability they hadn’t expected to adopt so quickly. This early AI success became central to their future strategy and laid the groundwork for their eventual AI CoE.
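The article doesn’t detail what the four steps were, but a fully automated multi-step ML process typically chains ingestion, preparation, training, and evaluation so each stage feeds the next. The sketch below is a minimal, stdlib-only illustration of that pattern; the step names, the synthetic delivery data, and the simple least-squares model are all assumptions for the example, not the company’s actual Azure ML pipeline.

```python
# Illustrative 4-step automated ML pipeline: each step's output feeds the next.
# Step names and data are hypothetical stand-ins for the Azure ML process.

def ingest():
    # Step 1: pull raw records (synthetic miles-vs-delivery-days pairs).
    return [(100, 2.1), (250, 3.9), (400, 6.2), (550, 8.0), (700, 10.1)]

def prepare(rows):
    # Step 2: split features/labels and drop invalid records.
    clean = [r for r in rows if r[1] > 0]
    return [r[0] for r in clean], [r[1] for r in clean]

def train(xs, ys):
    # Step 3: fit a one-variable least-squares line (slope, intercept).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def evaluate(model, xs, ys):
    # Step 4: report mean absolute error of the fitted model.
    slope, intercept = model
    return sum(abs(slope * x + intercept - y)
               for x, y in zip(xs, ys)) / len(xs)

def run_pipeline():
    xs, ys = prepare(ingest())
    model = train(xs, ys)
    return model, evaluate(model, xs, ys)

model, mae = run_pipeline()
print(f"slope={model[0]:.4f}, intercept={model[1]:.2f}, mae={mae:.3f}")
```

In a real Azure ML setup, each function would become a pipeline component and the chaining would be handled by the platform’s orchestration rather than a plain function call.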

This early win positioned them ahead of the curve. According to IoT Analytics, customer service automation has become the top GenAI use case, but most enterprises are still struggling with pilots while this company moved directly to production systems.
However, the journey included costly missteps. They initially chose Azure Blueprints for their foundational infrastructure, only to discover it required complete rebuilds for any changes—like “demolishing your entire house just to add a room,” as one technology architect explained. Microsoft helped them pivot to ARM templates and deployment pipelines, avoiding project delays and teaching them the importance of thoroughly vetting technology choices.
By 2022, they were using Azure Arc on 110 servers to build business cases for analytics migration. This six-month project created comprehensive roadmaps for “replatforming” rather than simple “lift and shift” operations, setting the stage for their massive ERP transformation.
The ERP Transformation Program (2023–2024+)
With Azure as their proven foundation, the company launched their most ambitious initiative: replacing the 50-year-old mainframe with Microsoft Dynamics 365. Their modernization program, named by associates, became the company’s most significant technology investment in history.
Azure became the foundation for this transformation, strategically and architecturally. The company’s distribution subsidiary dedicated over 100 IT and business staff to the program, recognizing the importance of thoroughly capturing and documenting their current state processes before implementing the new systems.
This approach laid the foundation for their AI CoE (launched around 2024) and its advanced capabilities like real-time analytics and process automation.
They were operating in a sector deemed more disrupted than healthcare, technology, and financial services. A recent survey of 3,200 CEOs (300 of them in automotive) found that one in two were concerned about their current strategies for attracting top talent—a higher share than in any other industry surveyed.
The Process Documentation Challenge
The company faced a critical challenge: how to capture 50+ years of mainframe-based intellectual property for system integrator partners without relying on “tribal knowledge and anecdotal sharing.”
A manager with 20+ years in business process management led the effort to find a centralized solution. Their selected business process management platform, integrated with Microsoft’s ecosystem, created what they called a “business process capability hierarchy.”
This replaced the typical enterprise approach of scattered Excel files, SharePoint documents, and individual drives, creating a single source of truth with standardized formatting where “if you’ve seen one process, you’ve seen them all.”
The modernization program enabled them to manage their entire automotive supply chain with greater efficiency and precision. By implementing this centralized process management platform during their ERP transformation, they achieved something rarely seen in enterprise modernization: 30–50% acceleration in their process documentation compared to traditional spreadsheet-based approaches.
Building the AI CoE (2024+)
By 2024, the company had cleared enough legacy thorns, including mainframe dependencies, siloed processes, and ERP bloat, to move into the execution phase of building an AI Center of Excellence that actually works.
Under their CTO, the team anchored their strategy on Responsible AI governance, a structured program led by an internal AI Council. This cross-functional group reviewed every initiative for alignment with enterprise strategy, ethical standards and associate experience.
They weren’t chasing headlines. One grassroots project emerged as a standout: a virtual assistant built in-house to help employees navigate HR and IT requests, reducing ticket volume and improving experience.
Their product teams adopted agent-based tools for writing requirements and test cases, which slashed documentation time by 30% and boosted consistency. A major cultural shift happened: AI moved from side project to core pipeline.
Today, over 250 use cases are in the AI CoE pipeline, prioritized by business value.
Most AI CoEs stall because they’re siloed, disconnected from business outcomes, or can’t move beyond prototypes. This company sidestepped all three, embedding AI into delivery pipelines and building a product-grade internal agent system now being commercialized.
Inside Their Multi-Agent AI System
At most large enterprises, writing requirements and test cases is a patchwork of spreadsheets, personal style, and institutional memory. The company tackled this head-on with an internal multi-agent AI system designed to systematize and accelerate documentation.
Agents are intelligent AI assistants that operate autonomously to perform tasks, such as answering questions or generating test cases. According to Microsoft, only 6% of leaders say their companies are using agents to automate workflows or processes, putting this company in rare territory.
Imagine, for example, an environment where business analysts start from AI-generated test cases that already pull together the datapoints needed to forecast car delivery, rather than starting from scratch. That is what they achieved with their agentic framework built on Azure AI Foundry.
Before their agent system, the quality of requirements varied wildly by individual. Test cases were either overbuilt or underscoped. Documentation was scattered across siloed tools, including Word documents, SharePoint folders and email threads.
“When you have a large project, everyone has their own way of doing things. Agents bring standardization—and that’s been huge.”
Because no one could trace why something was built a certain way or how it was supposed to behave, QA cycles became bloated. All of this added up to a silent tax: rework caused by inconsistent knowledge capture.
From this pain point, their multi-agent system was developed as a hands-on AI tool designed to support every step of the software delivery process.
Currently, the system includes:
- Intent Parser Agent: Converts loosely described business needs into structured feature intent.
- Requirements Synthesizer Agent: Generates formalized user stories and acceptance criteria aligned to internal templates.
- Test Case Generator Agent: Creates traceable, step-by-step test cases with expected results and edge conditions.
- Reviewer Agent: Applies LLM-based QA checks for clarity, completeness, and testability.
These agents communicate via shared memory using Azure OpenAI orchestration with Microsoft 365 connectors to pull from Jira tickets, past test plans, and documentation fragments.
Unlike basic copilots, this system isn’t reactive—it’s composable knowledge work with contextual memory and delivery cadence awareness.
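To make the four-agent flow concrete, here is a minimal sketch of agents passing work through a shared memory object. The class names mirror the list above, but the rule-based logic inside each agent is a deliberately simplified, hypothetical stand-in for the LLM-backed agents and Azure OpenAI orchestration the company actually uses.

```python
# Minimal four-agent pipeline sharing one "blackboard" memory.
# Logic is illustrative; real agents would call an LLM at each step.

class SharedMemory(dict):
    """Blackboard that every agent reads from and writes to."""

class IntentParserAgent:
    def run(self, memory):
        # Convert a loosely described need into structured feature intent.
        memory["intent"] = {"feature": memory["request"].strip().lower(),
                            "actor": "business analyst"}

class RequirementsSynthesizerAgent:
    def run(self, memory):
        # Generate a user story aligned to an internal template.
        intent = memory["intent"]
        memory["story"] = (f"As a {intent['actor']}, I want "
                           f"{intent['feature']} so that delivery "
                           "dates are predictable.")

class TestCaseGeneratorAgent:
    def run(self, memory):
        # Derive traceable test cases with expected results.
        memory["tests"] = [
            {"id": "TC-1", "step": "submit a delivery query",
             "expect": "forecast returned"},
            {"id": "TC-2", "step": "submit malformed input",
             "expect": "validation error"},
        ]

class ReviewerAgent:
    def run(self, memory):
        # QA check: every test case must state an expected result.
        memory["approved"] = all(t.get("expect") for t in memory["tests"])

def run_agents(request):
    memory = SharedMemory(request=request)
    for agent in (IntentParserAgent(), RequirementsSynthesizerAgent(),
                  TestCaseGeneratorAgent(), ReviewerAgent()):
        agent.run(memory)
    return memory

result = run_agents("Forecast car delivery dates")
print(result["story"])
print("approved:", result["approved"])
```

The design choice worth noting is the shared memory: each agent stays single-purpose, and the pipeline’s state is inspectable at every step, which is what makes the work traceable.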
At scale, the benefits include:
- 40% reduction in time spent on writing business requirements
- 60% acceleration in test case creation
- Institutional knowledge captured in the system rather than left undocumented
The system is now being commercialized for peers in automotive and beyond. They’re adding retrieval-augmented generation and feedback loops to make the tool self-improving over time.
Lessons for CIOs
The company tested frameworks like LangChain and AutoGen but found them too backend-heavy. They instead adopted Azure’s Agent Service—a plug-and-play model that focuses on business outcomes.
Tips from their AI team:
- Be clear about each agent’s purpose.
- Define strict rules for agent behavior.
- Test agents individually before combining them.
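The third tip—test agents individually before combining them—amounts to treating each agent as a unit under test: feed it a fixed memory state and assert on exactly what it writes back. The sketch below shows the idea with a simplified, hypothetical reviewer agent; the class and its rule are assumptions for illustration.

```python
# Sketch of testing one agent in isolation before pipeline integration:
# give it a fixed memory dict, then assert on what it wrote back.

class ReviewerAgent:
    def run(self, memory):
        # Approve only if every test case names an expected result.
        memory["approved"] = all(t.get("expect")
                                 for t in memory.get("tests", []))

def test_rejects_missing_expectation():
    memory = {"tests": [{"id": "TC-1", "step": "submit query"}]}  # no "expect"
    ReviewerAgent().run(memory)
    assert memory["approved"] is False

def test_approves_complete_case():
    memory = {"tests": [{"id": "TC-1", "step": "submit query",
                         "expect": "forecast returned"}]}
    ReviewerAgent().run(memory)
    assert memory["approved"] is True

test_rejects_missing_expectation()
test_approves_complete_case()
print("reviewer agent checks passed")
```

Because agents only communicate through the memory object, each one can be exercised with hand-built inputs—no LLM calls or other agents required—before the full pipeline is assembled.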
They succeeded because they avoided the traps of siloed teams, disconnected experiments, and lack of productization. They built infrastructure, governed data, and enforced a center-led, business-driven execution model.
The Real Takeaway
The transformation started by documenting 50 years of tribal knowledge, which then became the foundation for repeatable AI. As McKinsey notes, indirect costs—cloud, security, tooling—can consume up to 80% of a product’s full lifecycle cost.

While others are stuck with pilots and PowerPoints, this company prioritized infrastructure discipline and value creation. Microsoft’s 33-point lead in GenAI share over its cloud market footprint isn’t accidental—it’s driven by execution like this.
So next time you’re in the boardroom, you might say:
“Our decades-old core systems cost millions for simple changes while competitors deploy AI in weeks. Cloud unlocks 10x faster development and real-time analytics. The market shows 45% of successful AI happens on modern infrastructure. Without it, we can’t retain talent or win new partnerships.”
OpenNova specializes in the embedded talent and delivery teams that help enterprises move from AI ideas to impact. Let’s discuss your first use case. Reach out to Ryan Alfieri, CEO, on LinkedIn.