If you’re a CIO in a board meeting getting hammered about why IT can’t deliver simple changes, you’re likely sitting between a rock and a hard place.
Your company’s big enough to carry decades of technical debt but not big enough to have modernized like Amazon or Walmart.
Worse yet, you’re also in the middle of an AI hype cycle, adding further heat when you deliver progress reports in the boardroom. While the board demands AI miracles yesterday, you’re grappling with a talent shortage and the hunt for a killer use case to launch your AI Center of Excellence (AI CoE).
JM Family’s $22 billion operation was trapped in exactly this bind: too big to ignore the AS/400-era mainframe that required millions of dollars just to add database fields, yet too tangled in 200+ apps to swap it out without a multi-year fight.
“Our technology has become complex, out of date, and inhibited our ability to innovate at speed.”
Transparent and forthright, the company essentially admitted it was held hostage by legacy systems, even as it took ambitious steps to escape mainframe hell and transform an entire operation on the Azure platform.
Knowing your strengths and weaknesses, as we shall see, remains a transformational edge for cloud adoption, machine learning optimization, and ultimately enterprise-scale AI apps.
The Azure foundation (2019-2022)
The transformation began at Microsoft Ignite, where JM Family’s technology team witnessed a dramatically improved Azure platform. Previously dismissed as inferior to competitors, Azure’s cognitive services demonstrations and polished user experience convinced them to run a pilot.
Within weeks, they had built a functional application that proved the platform delivered on its promises.
Perfect timing struck when JM Family hired a new CIO with a “better, faster, cheaper” mandate and strong Microsoft affinity. By 2020, they had formally selected Azure as their platform for innovation and differentiation, embracing Microsoft’s Cloud Adoption Framework for structured migration.
The Azure migration delivered immediate surprises.

Within three months of platform readiness, they built a fully automated 4-step machine learning process using Azure ML—a capability they hadn’t expected to adopt so quickly. This early AI success became central to their future strategy and laid the groundwork for their eventual AI CoE.
However, the journey was not without missteps.
They initially chose Azure Blueprints for their foundational infrastructure, only to discover it required complete rebuilds for any changes, like “demolishing your entire house just to add a room,” as one technology architect explained.
Microsoft helped them pivot to ARM templates and deployment pipelines, avoiding project delays, but the episode underscored the importance of thoroughly vetting technology choices.
By around 2022, they were using Azure Arc on 110 servers to build business cases for analytics migration. The project created comprehensive roadmaps for “replatforming” rather than simple “lift and shift” operations, setting the stage for their massive ERP transformation.
The ERP transformation “DRIVE” program (2023-2024+)
With Azure as their proven foundation, JM Family launched their most ambitious initiative: replacing the 50-year-old mainframe with Microsoft Dynamics 365. The “DRIVE” program, named by associates, became the company’s most significant technology investment in history.
Azure became the foundation for this transformation, strategically and architecturally. Southeast Toyota Distributors, a subsidiary, dedicated over 100 IT and business staff to the program, recognizing the importance of thoroughly capturing and documenting their current state processes before implementing the new systems.
This approach laid the foundations for their AI CoE (launched around 2024) for advanced capabilities like real-time analytics and process automation.
It’s important to remember they are operating in a sector deemed more prone to disruption than healthcare, technology, and financial services. A recent survey of 3,200 CEOs (300 from automotive) found that 1 in 2 were concerned about current strategies for attracting top talent, surpassing any other industry surveyed.
The process documentation challenge
The company faced a critical challenge: how to capture 50+ years of mainframe-based intellectual property for system integrator partners without relying on “tribal knowledge and anecdotal sharing.”
A business process manager with 20+ years in the discipline led the effort to find a centralized solution. Their selected business process management platform, integrated with Microsoft’s ecosystem, created what they called a “business process capability hierarchy.”
This replaced the typical enterprise approach of scattered Excel files, SharePoint documents, and individual drives, creating a single source of truth with standardized formatting.
DRIVE enabled them to manage their entire automotive supply chain with greater efficiency and precision. By implementing this centralized process management platform during their ERP transformation, they achieved something rarely seen in enterprise modernization: 30-50% acceleration in their process documentation compared to traditional spreadsheet-based approaches.
The AI Evolution (2024–present): From use case chaos to operational discipline
By 2024, JM Family had cleared enough legacy thorns, including mainframe dependencies, siloed processes, and ERP bloat, to move into the execution phase of building an AI Center of Excellence that actually works.
Under their CTO, the team anchored their strategy on Responsible AI governance, a structured program led by an internal AI Council. This cross-functional group reviewed every initiative for alignment with enterprise strategy, ethical standards and associate experience.

What OpenNova admires about JM Family’s approach is that they build things for humans, not headlines. They show grassroots innovation with a mantra of “encouraging associates at all levels to think creatively and propose ideas that improve the business”.
For instance, we love Hubert AI, a virtual intranet assistant built in-house to help employees navigate internal systems without submitting support tickets. Born from the company’s internal DIBS innovation “shark tank-style” challenge, Hubert quickly became a daily tool, streamlining HR, IT and policy queries via Microsoft’s AI stack.
Whether associates are navigating HR policies or seeking IT help, Hubert AI pulls from internal knowledge bases to provide instant support.
That internal utility mindset spills into product teams, where AI-powered tools for requirements generation and test case design cut delivery time by 30%, while increasing output consistency. Many CIOs reading this would appreciate that this shift is less technical and more cultural.
Teams stopped treating AI as a side experiment and started embedding it into delivery pipelines.
And from what we see at OpenNova, it’s scaling fast. The CoE is now evaluating over 250 use cases mapped to operational needs and prioritized against business value.
But one particular project stands out inside their Microsoft ecosystem, which we examine below.
Inside BAQA Genie and a “team of AI (agentic) coworkers”
At most large enterprises, writing requirements and test cases is a patchwork of spreadsheets, personal style, and institutional memory. JM Family tackled this head-on with Genie, an internal multi-agent AI system designed to systematize and accelerate documentation.
Agents are intelligent AI assistants that operate autonomously to perform tasks, such as answering questions or generating test cases. According to Microsoft, 6 percent of leaders say their companies are using agents to automate workflows or processes, putting JM Family in rare company.
Imagine, for example, an environment where business analysts start from AI-generated cases that pull together all the necessary data points to forecast car delivery, rather than starting from scratch. This is what they have achieved with their agentic framework using the Azure AI Foundry.
Before Genie, the quality of requirements varied wildly by individual. Test cases were either overbuilt or underscoped. Documentation was scattered across siloed tools, including Word documents, SharePoint folders and email threads.
“When you have a large project, everyone has their own way of doing things. Agents bring standardization—and that’s been huge.”
Since no one could trace why something was built a certain way or how it was supposed to behave, QA cycles became bloated. All of this added up to the silent tax of rework caused by inconsistent knowledge capture.
From this pain point, Genie was developed as a hands-on AI tool designed to support every step of the software delivery process.
Currently, the system includes:
- Intent parser agent: Converts loosely described business needs into structured feature intent, using prompt-chaining against pre-approved requirement patterns.
- Requirements synthesizer agent: Generates formalized user stories and acceptance criteria aligned to JM Family’s internal taxonomy and templates.
- Test case generator agent: Pulls from the same requirement structure to create traceable, step-by-step test cases with expected results and edge conditions.
- Reviewer agent: Applies LLM-based quality assurance checks against a rubric (completeness, clarity, testability), flagging ambiguity and duplication across requirements.
These agents communicate across a shared memory structure, using Azure OpenAI orchestration with integrated Microsoft 365 connectors to pull historical Jira tickets, past test plans and documentation fragments to pre-populate context.
The agents learn contextually and are not limited to generation. Unlike a basic copilot, Genie is less a reactive assistant than a composable knowledge worker that understands the company’s standards, style and delivery cadence.
At scale, the benefits go beyond time savings:
- 40% reduction in time spent on writing business requirements
- 60% acceleration in test case creation
- Elimination of undocumented institutional knowledge as the system encodes internal best practices directly into its prompt design and response structure
Teams likely no longer argue over formatting or interpretation. The Genie provides a common framework that can align business analysts, developers and QA engineers from day one.
Genie has now reached a level of maturity where JM Family is preparing to commercialize it. The internal rollout has proven robust enough to operate beyond their own ERP transformation.
Conversations are underway to develop it into a formal offering, targeting peer automotive firms and other large enterprises facing similar documentation challenges.
They’re also layering in retrieval-augmented generation (RAG) capabilities and structured feedback loops, so the agents can self-improve based on project outcomes, testing anomalies, and retrospective corrections. With that, the toolset morphs into a product with memory, which is critical for knowledge work in deep-domain industries.
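The feedback-loop idea is worth making concrete. Below is a toy sketch of the pattern, not JM Family’s system: past artifacts and retrospective notes go into a store, and each new task retrieves the most relevant ones into the prompt. A real RAG setup would use embeddings and a vector index; here relevance is naive keyword overlap, and all names are hypothetical.

```python
def score(query: str, doc: str) -> int:
    """Naive keyword-overlap relevance (stand-in for embedding similarity)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

class FeedbackStore:
    """Past artifacts and outcome notes the agents retrieve from later."""
    def __init__(self) -> None:
        self.docs: list[str] = []

    def add(self, doc: str) -> None:
        self.docs.append(doc)

    def retrieve(self, query: str, k: int = 2) -> list[str]:
        # Return the k most relevant stored documents for this query.
        return sorted(self.docs, key=lambda d: score(query, d), reverse=True)[:k]

store = FeedbackStore()
store.add("test plan: delivery forecast edge cases around port delays")
store.add("retrospective: requirements missed dealer allocation rules")
store.add("style guide: acceptance criteria use Given/When/Then")

query = "generate test cases for delivery forecast"
context = store.retrieve(query)
# Retrieved context is prepended so the model grounds its output in
# prior project outcomes rather than generating from scratch.
prompt = "Context:\n" + "\n".join(context) + "\nTask: " + query
```

The loop closes when each project’s retrospectives and testing anomalies are written back into the store, so the next generation cycle starts from corrected knowledge.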
One of their AI & ML Research Scientists shared some of the challenges in reaching this agentic environment. For instance, they explored LangChain and AutoGen, but both required too much backend effort.
LangChain is a popular tool that enables AI apps to utilize memory and connect with services like databases using chain logic. AutoGen enables multiple AI agents to collaborate on tasks through structured communication.
JM Family first experimented with AutoGen early last year, demoing it to the C-suite. “It was an ‘aha’ moment that agents can communicate with each other and take action on humans’ behalf.”
Their preferred approach became Azure’s Agent Service. This plug-and-play platform enables their agents to collaborate, interact with humans, and focus on delivering business outcomes without the need for deep infrastructure overhead.
If you’re exploring a similar route, consider their advice: be very clear about the goal of the AI use case and define precise rules for each agent’s behavior. Finally, test each agent individually before having them interact; this avoids chaos in multi-agent environments.
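That last piece of advice, test each agent in isolation first, maps neatly onto ordinary dependency injection. A sketch of the idea, with all names hypothetical: the model call is injected so a deterministic stub can verify one agent’s rules before it ever joins a multi-agent conversation.

```python
def make_reviewer(llm):
    """Build a reviewer agent; `llm` is injected so tests can stub it."""
    def reviewer(requirement: str) -> dict:
        verdict = llm(f"Flag ambiguity in: {requirement}")
        return {"requirement": requirement,
                "flagged": "ambiguous" in verdict}
    return reviewer

def stub_llm(prompt: str) -> str:
    # Deterministic stand-in: flags vague speed claims, accepts the rest.
    return "ambiguous: 'fast' is not measurable" if "fast" in prompt else "ok"

# Exercise the agent alone, against precise rules, before wiring it
# into the multi-agent loop.
reviewer = make_reviewer(stub_llm)
assert reviewer("The page must load fast")["flagged"] is True
assert reviewer("The page must load within 2 seconds")["flagged"] is False
```

Only once each agent passes its own checks does it make sense to let them exchange messages; debugging rule violations inside a live multi-agent conversation is far harder than catching them here.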
From boardroom pitch to AI CoE
All these efforts show strong pipeline discipline when it comes to digital transformation: building the internal muscle to go from problem → evaluation → deployment with repeatability.
Finally, we can clearly see they are betting on talent, including a new initiative underway to roll out personalized AI training plans for software engineers and solution architects. The goal is to ensure AI embeds itself deep within business units and creates long-lasting value.
If you’re a CIO sitting in the boardroom getting grilled on AI progress, JM Family’s story might sting a little. They’re one of the rare few to crack the equation outlined in Microsoft’s AI CoE guidance: While 79% of leaders say AI is important, 60% still lack a clear strategy, and only 1 in 10 have reached maturity.
JM has steered itself into that 10% stratosphere: They built infrastructure, governed data and legacy sprawl and prioritized value. Now, while your enterprise may still be debating strategy, JM is already deploying, from HR chatbots to multi-agent tools, because they have moved with purpose.

It all started by documenting 50 years of tribal knowledge, which then served as a clean foundation to build an AI ecosystem that works. While most enterprises still treat technology as a series of short-term projects, with incentives misaligned from long-term value, JM Family took the harder path: confronting technical debt head-on. As McKinsey notes, indirect costs—cloud, security, tooling—can consume up to 80% of a product’s full lifecycle cost. By taming process sprawl and capturing institutional memory first, JM avoided that trap—and built a foundation that could support real AI scale.
With this in mind, we could imagine you, the CIO, phrasing a new approach the next time you enter the boardroom: “Our decades-old core systems cost millions for simple changes while competitors deploy AI in weeks. Cloud migration unlocks machine learning, real-time analytics, and 10x faster development cycles. Without modern infrastructure, we can’t compete for our industry partnerships or retain top talent.”