Citi has 4,000 AI champions spread across 182,000 employees in 84 countries. The result: over 70% adoption of firm-approved AI tools. PwC Netherlands scaled from 300 early enthusiasts to all 6,000 employees in under a year.
Meanwhile, most organisations are stuck. Only 5% of enterprise AI systems progress from evaluation to production, according to MIT research. Companies keep buying tools. Employees keep ignoring them.
The difference isn't technology. It's people. Specifically, it's whether you have trained, trusted peers in every team showing colleagues what AI actually looks like in their daily work. That's what an AI champions program does — and it's the highest-leverage move most organisations haven't made yet.
What Is an AI Champions Program?
An AI champions program is a structured peer network of employees who promote, support, and accelerate AI adoption across an organisation. Champions aren't IT specialists — they're trusted colleagues from every function who translate AI strategy into team-level behaviour change. Research consistently shows that peer influence drives AI adoption more effectively than top-down mandates.
Worklytics defines an AI champion as "an influential employee who promotes, supports, and accelerates the adoption of AI within an organization." OpenAI Academy frames the role around three outcomes: improving adoption rates, integrating AI into existing workflows, and establishing team norms around AI use.
The key insight: adoption is not a technology problem. It's a people problem. Employees experiment with new tools but don't integrate them deeply into how work actually gets done. Champions fix this by making AI tangible, practical, and safe to try — one team at a time.
GitHub's framework describes champions as "part coach, part translator, and part feedback loop." They demonstrate practical AI applications within daily workflows, mentor colleagues through initial adoption hurdles, and surface real-time feedback on what's working and what isn't.
Why Your Organisation Needs Champions Now
The AI adoption gap is widening: 88% of organisations use AI somewhere, yet only 1% consider their strategy mature, according to Gartner. Champions close this gap by converting isolated experimentation into systematic, team-level adoption — the step most organisations skip entirely.
The numbers tell a clear story. McKinsey's State of AI survey found 62% of organisations are at least experimenting with AI agents, but Deloitte's Tech Trends 2026 report reveals only 14% have deployable solutions. Meanwhile, 35% have no formal AI strategy at all.
| Metric | Figure | Source |
|---|---|---|
| Enterprise AI systems reaching production | 5% | MIT (2025) |
| Organisations with no formal AI strategy | 35% | Deloitte |
| Employees vs C-suite who believe AI adoption succeeded | 45% vs 75% | Writer (2025) |
| Organisations facing critical AI skills shortages by 2026 | 90% | IDC |
| Potential productivity gains missed due to no training strategy | 40% | EY |
That last row from EY is the one that should concern every executive. Up to 40% of the potential value of your AI investment is being left on the table because your people aren't equipped to use it.
A 2025 Writer survey found only 45% of employees believe their organisation has successfully adopted AI — compared to 75% of C-suite executives. Champions help close this perception gap by making adoption visible and measurable at the team level.
Champions solve this by doing what no top-down mandate can: building trust at the peer level. When a colleague in finance shows you how they use AI to cut their month-end close by two hours, that's worth more than any executive keynote about "digital transformation."
How to Structure Your Champion Program
The most effective champion programs follow a three-phase rollout: recruit volunteers in the first 30 days, build community over 90 days, then transition to a self-sustaining model. Start with 5-10% of your AI user base as champions, with one lead for every 10-20 champions. Every team should be able to reach a champion without filing a support ticket.
GitHub's playbook outlines a proven three-phase approach:
Phase 1: Recruit (First 30 Days)
Post open calls to action on high-visibility channels with clear deadlines. Don't select champions based on who's loudest about AI. PwC Netherlands used organisational network analysis to identify "naturally influential people" rather than just vocal AI fans — and the adoption results were, in their words, "magical."
The best champions often come from non-technical backgrounds. Finance analysts, operations managers, and marketing leads frequently outperform tech enthusiasts because they bring real workflow context. They can answer the question every sceptical employee asks: "But how does this help me do my job?"
Phase 2: Build Community (30-90 Days)
Create dedicated communication channels, run monthly check-ins, and publicly recognise contributions. The time commitment should be light: 30-60 minutes weekly, framed as optional and intellectually stimulating — not another mandatory meeting.
Champions focus on four activities:
- Demonstrating AI in real tasks during team meetings
- Providing contextual help on specific use cases
- Sharing working examples and lessons learned
- Flagging friction points to central teams
Phase 3: Sustain (90+ Days)
Transition to a self-sustaining model by identifying community leaders, distributing ownership, and creating lightweight roles. GitHub recommends templates and guides that let champions operate independently without constant central coordination.
Sizing Your Program
The recommended ratio from Lead with AI's research:
| Organisation Size | Champions Needed | Leads Needed |
|---|---|---|
| 500 employees | 25-50 | 2-5 |
| 1,000 employees | 50-100 | 5-10 |
| 5,000 employees | 250-500 | 25-50 |
| 10,000 employees | 500-1,000 | 50-100 |
The goal: every team can reach a champion without filing a support ticket.
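The ratios above reduce to simple arithmetic. Here is an illustrative sketch (not part of Lead with AI's framework) that applies the 5-10% champion ratio, with lead counts tracking roughly one lead per ten champions — the denser end of the 10-20 range, which is what the table's figures imply:

```python
def size_champion_program(ai_user_base: int) -> dict:
    """Estimate headcount from the article's ratios: champions are
    5-10% of the AI user base; roughly one lead per 10 champions."""
    champions_low = round(ai_user_base * 0.05)
    champions_high = round(ai_user_base * 0.10)
    leads_low = max(1, champions_low // 10)
    leads_high = max(1, champions_high // 10)
    return {
        "champions": (champions_low, champions_high),
        "leads": (leads_low, leads_high),
    }

# Matches the 1,000-employee row: 50-100 champions, 5-10 leads
print(size_champion_program(1000))
```

Note that "AI user base" may be smaller than total headcount; size against the population you actually expect to use the tools.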
What Champions Actually Need to Succeed
Champions fail when they're enthusiastic but untrained. The most effective programs pair daily skill-building with the champion role — ensuring advocates stay ahead of the tools they're promoting. Continuous AI fluency training is the foundation that makes everything else work.
Here's where most champion programs break down. You identify 50 eager volunteers, give them a Slack channel and a badge, and expect adoption to happen. It doesn't — because enthusiasm without skill decays fast.
Champions need three things to succeed:
1. Continuous AI skill-building. AI moves too fast for one-off training. Champions need to stay current on tools, techniques, and use cases relevant to their function. Daily microlearning — short, role-specific sessions — keeps champions sharp without overloading their schedule. Research shows 6-10 minute daily sessions achieve 80% completion rates versus 20% for traditional courses.
2. A structured knowledge path. Champions can't just "explore AI" — they need a progressive curriculum that builds from AI fluency fundamentals to advanced application. This means understanding the difference between AI literacy and AI fluency and building practical, daily competence, not just awareness.
3. Protection from burnout. Lead with AI's guide identifies burnout as the top program killer. Keep the role at 30-60 minutes weekly. Assign one lead per 10-20 champions for support. Rotate office hours. And critically — keep the work intellectually engaging by focusing champions on complex workflow challenges, not repetitive "where's the button?" queries.
How to Measure Champion Impact
Don't measure champion success by tool login counts. Measure behaviour change: are teams approaching work differently? Are AI workflows becoming team norms? Citi's 4,000-champion network achieved 70%+ adoption of firm-approved tools — measured by genuine workflow integration, not just activation clicks.
Usage metrics tell you people clicked. They don't tell you people changed. The right metrics for a champion program measure behaviour, not activity:
- Adoption by team: Which teams are integrating AI into standard workflows versus just experimenting? Champions should correlate with higher team-level adoption.
- Workflow changes: Are teams approaching work differently than six months ago? Are AI-assisted processes becoming the default?
- Repeatable patterns: Are champions surfacing use cases that other teams adopt? This is the compounding effect — one champion's insight scales across the organisation.
- Time-to-value: How quickly do new AI tools go from "deployed" to "used daily" in champion-led teams versus others?
Worklytics recommends workplace analytics to track adoption rates by team, role, and department — correlating champion presence with measurable productivity signals. The goal isn't vanity metrics. It's answering: "Are our people actually doing their jobs differently because of AI?"
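One way to operationalise "behaviour, not activity" is to count sustained weeks of AI-assisted work rather than logins, then compare champion-led teams against the rest. The sketch below uses entirely hypothetical data and team names — the threshold of four weeks is an assumption, not a cited benchmark:

```python
from collections import defaultdict

# Hypothetical records: (team, employee, distinct weeks with AI-assisted work).
# Weeks of sustained use proxy workflow integration, not activation clicks.
records = [
    ("finance", "ana", 10), ("finance", "ben", 8), ("finance", "cho", 1),
    ("legal", "dev", 2), ("legal", "eli", 0), ("legal", "fay", 9),
]
champion_led = {"finance"}  # teams with an embedded champion

def team_adoption(records, min_weeks=4):
    """Share of each team using AI workflows in at least min_weeks weeks."""
    totals, adopters = defaultdict(int), defaultdict(int)
    for team, _employee, weeks in records:
        totals[team] += 1
        if weeks >= min_weeks:
            adopters[team] += 1
    return {team: adopters[team] / totals[team] for team in totals}

rates = team_adoption(records)
for team, rate in sorted(rates.items()):
    tag = "champion-led" if team in champion_led else "control"
    print(f"{team} ({tag}): {rate:.0%} sustained adoption")
```

The same comparison extends naturally to the other metrics above: swap "weeks active" for time-to-value or repeat use of a shared pattern, and the champion-led versus control split stays the same.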
Common Mistakes That Kill Champion Programs
The most common champion program failures: selecting for enthusiasm instead of influence, one-and-done training, missing governance guardrails, and treating the program as IT's job. Each one is avoidable with the right structure.
1. Selecting for enthusiasm over influence. The person who won't stop talking about ChatGPT isn't necessarily your best champion. Look for people who already help colleagues informally, show curiosity about workflow improvements, and communicate across functions and seniority levels.
2. One-and-done training. A two-day bootcamp doesn't make a champion. AI capabilities change monthly. Champions who stop learning quickly become outdated advocates promoting yesterday's techniques. Daily, continuous skill-building is what separates effective programs from performative ones.
3. No governance layer. Champions without guardrails create risk. Pair your peer network with approved tools, explicit data boundaries, and clear use-case guidelines. The three-layer model works: top governance (policies), middle integration (embedded in existing tools), and culture building (recognition and collaboration).
4. Treating it as IT's job. Champion programs that live in IT departments fail. The whole point is peer influence from within business functions. IT enables the tools. Champions enable the people.
Start Small, Scale Fast
You don't need 4,000 champions on day one. Citi didn't start there either. Start with a pilot: 10-20 champions across 3-4 departments, a clear 30-day recruiting window, and a simple success metric — are champion-led teams using AI tools more effectively than others?
The organisations winning the AI adoption race aren't the ones with the biggest tool budgets. They're the ones investing in the people layer — trained, trusted peers who make AI practical for everyone around them.
Your champions don't need to be AI experts. They need to be curious, influential, and consistently learning. Give them the right training, the right support, and 30-60 minutes a week — and watch adoption compound.
Frequently Asked Questions
- What is an AI champions program?
- An AI champions program is a structured network of peer advocates who promote, support, and accelerate AI adoption within an organisation. Champions are influential employees — often from non-technical backgrounds — who demonstrate AI in real workflows, mentor colleagues, and create feedback loops between teams and leadership. The typical time commitment is 30-60 minutes per week.
- How many AI champions does an organisation need?
- The recommended ratio is 5-10% of your initial AI user base, with one lead for every 10-20 champions. A 1,000-person organisation would need 50-100 champions distributed across functions and departments. The goal is that every team can reach a champion without filing a support ticket.
- Why do AI rollouts fail without champions?
- Only 5% of enterprise AI systems progress from evaluation to production, according to MIT research. The primary reason isn't technology — it's that employees experiment with tools but never integrate them into how work actually gets done. Champions bridge this gap through peer influence, which drives behaviour change more effectively than top-down mandates.
- What does an AI champion do day-to-day?
- Champions spend 30-60 minutes weekly demonstrating AI in real tasks during team meetings, providing contextual help on specific use cases, sharing working examples and lessons learned, and flagging friction points to central teams. They act as part coach, part translator, and part feedback loop between frontline workers and AI leadership.
- How do you measure AI champion program success?
- Go beyond usage metrics to measure behaviour change: are teams approaching work differently? Are AI workflows becoming team norms? Are champions surfacing repeatable patterns others adopt? Citi achieved 70%+ adoption of firm-approved tools through their 4,000-champion network. Track adoption rates by team, correlation with productivity signals, and champion-led workshop attendance.
