Daniel J Glover

AI Enablement: Your 90-Day Roadmap


This is the final article in a 7-part series on Business AI Enablement for IT Leaders. The series covered why enablement matters, shadow AI risks, building an enablement framework, employee training, tool selection, and governance controls.


This series has covered the why, what, and how of AI enablement. This final article addresses the when and the who - a practical roadmap for implementing everything discussed.

Ninety days is long enough to establish genuine capability but short enough to maintain momentum. By the end of this period, you should have the foundations of an enablement programme that can scale and be sustained.

This is not a template to follow blindly. Adapt the timeline to your organisation's pace, existing capabilities, and specific constraints. But the sequence matters - earlier phases build foundations for later ones.

The 90-Day Transformation Framework

The roadmap divides into three phases:

Phase      | Focus                     | Key Outcomes
Days 1-30  | Assessment and Foundation | Clear understanding of current state; governance foundations; executive alignment
Days 31-60 | Pilot and Training        | Initial tools deployed; training developed and delivered; champions activated
Days 61-90 | Scale and Refine          | Broader rollout; feedback integration; sustained operation established

Each phase builds on the previous. Rushing through assessment undermines pilot effectiveness. Scaling before pilots prove value wastes resources on flawed approaches.

Days 1-30: Assessment and Foundation

The first month establishes the understanding and governance that enable everything that follows.

Week 1: Executive Alignment and Shadow AI Discovery

Objectives:

  • Secure executive sponsorship for AI enablement
  • Launch shadow AI assessment to understand current state

Executive alignment activities:

  • Schedule briefing with key executives (CIO, CISO, COO at minimum)
  • Present the business case for enablement versus restriction
  • Obtain commitment to sponsor the initiative
  • Agree on success metrics and review cadence
  • Establish steering committee or governance body

Shadow AI discovery activities:

  • Deploy anonymous employee survey about AI usage
  • Analyse network traffic for AI service connections
  • Review expense reports for AI-related subscriptions
  • Interview business unit leaders about team practices
  • Document initial findings
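The network-analysis step above can be sketched in code. This is a minimal illustration, not a complete discovery tool: the domain list and the `timestamp,user,domain` log format are assumptions — maintain your own domain list and adapt the parsing to your proxy's actual export format.

```python
from collections import Counter

# Illustrative list only -- maintain your own from vendor docs and threat intel.
AI_DOMAINS = {"chat.openai.com", "claude.ai", "gemini.google.com", "api.openai.com"}

def summarise_ai_traffic(log_lines):
    """Count connections to known AI services, grouped by user and domain.

    Assumes each line is 'timestamp,user,domain' (an assumption for this
    sketch); lines that do not match are skipped.
    """
    hits = Counter()
    for line in log_lines:
        parts = line.strip().split(",")
        if len(parts) != 3:
            continue
        _, user, domain = parts
        if domain in AI_DOMAINS:
            hits[(user, domain)] += 1
    return hits

logs = [
    "2025-01-06T09:14,alice,chat.openai.com",
    "2025-01-06T09:20,bob,claude.ai",
    "2025-01-06T10:02,alice,chat.openai.com",
]
print(summarise_ai_traffic(logs))
```

Even a rough summary like this turns "people are probably using AI" into named departments and services you can follow up with in the Week 2 interviews.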

Deliverables:

  • Executive sponsor confirmed
  • Steering committee established
  • Shadow AI survey launched
  • Initial discovery report drafted

Week 2: Current State Analysis

Objectives:

  • Complete shadow AI assessment
  • Map AI use cases across business units
  • Identify highest-priority needs and risks

Analysis activities:

  • Compile survey results and network analysis
  • Categorise shadow AI by tool, department, use case
  • Assess risk levels of identified shadow usage
  • Prioritise use cases by business value and risk
  • Identify quick wins and critical gaps
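One way to make the prioritisation step concrete is a simple sort: highest business value first, and among equal-value cases, lowest risk first. The 1-5 rating scale and the example use cases below are assumptions for illustration — your organisation may weight value and risk differently.

```python
def prioritise(use_cases):
    """Rank use cases: higher value first; ties broken by lower risk.

    Scores are assumed to be 1-5 ratings agreed with business unit leaders.
    """
    return sorted(use_cases, key=lambda uc: (-uc["value"], uc["risk"]))

use_cases = [
    {"name": "Meeting summarisation", "value": 4, "risk": 2},
    {"name": "Customer data analysis", "value": 4, "risk": 5},
    {"name": "Code review assistance", "value": 3, "risk": 3},
]
for uc in prioritise(use_cases):
    print(uc["name"])
```

The top of the ranked list is your quick-win candidates: high value, low risk. High-value, high-risk cases stay on the list but wait for the governance work in Week 3.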

Stakeholder engagement:

  • Meet with each business unit leader
  • Understand specific AI needs and pain points
  • Identify potential champions in each area
  • Gauge receptiveness to enablement programme

Deliverables:

  • Complete shadow AI assessment report
  • Use case prioritisation matrix
  • Business unit engagement summary
  • Champion candidate list

Week 3: Governance Foundation

Objectives:

  • Draft core governance documents
  • Establish data classification for AI
  • Define tool approval criteria

Policy development:

  • Draft AI acceptable use policy
  • Define data classification tiers for AI inputs
  • Establish output verification requirements
  • Create tool evaluation criteria
  • Document incident response procedures
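The data classification tiers can be encoded directly so that guidance is checkable rather than just written down. The tier names and rules below are hypothetical — substitute your organisation's actual classification scheme.

```python
# Hypothetical tiers and rules -- align with your organisation's scheme.
TIER_RULES = {
    "public": "any approved tool",
    "internal": "enterprise tools only",
    "confidential": "enterprise tools with contractual no-training clause",
    "restricted": "prohibited",
}

def ai_use_permitted(tier: str) -> bool:
    """Return whether data at this tier may be sent to any AI tool at all."""
    rule = TIER_RULES.get(tier)
    if rule is None:
        raise ValueError(f"Unknown classification tier: {tier}")
    return rule != "prohibited"
```

Failing loudly on unknown tiers matters: an unclassified document should trigger a question, not default to the most permissive path.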

Governance structure:

  • Define roles and responsibilities
  • Establish approval workflows
  • Create exception request process
  • Design monitoring approach

Stakeholder review:

  • Circulate drafts to legal, compliance, HR
  • Incorporate feedback
  • Prepare for steering committee review

Deliverables:

  • Draft AI policy
  • Data classification guide
  • Tool evaluation scorecard
  • Incident response procedures

Week 4: Tool Strategy and Training Planning

Objectives:

  • Evaluate and select initial approved tools
  • Design training curriculum structure
  • Prepare for Phase 2 launch

Tool selection:

  • Evaluate enterprise AI platforms against criteria
  • Assess specialised tools for priority use cases
  • Negotiate enterprise agreements
  • Plan technical deployment
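The evaluation against criteria can be reduced to a weighted scorecard. The criteria and weights here are illustrative assumptions — derive yours from the Week 3 tool evaluation scorecard, and treat the output as an input to the decision, not the decision itself.

```python
# Criteria and weights are illustrative -- derive yours from your own scorecard.
WEIGHTS = {"security": 0.35, "capability": 0.30, "cost": 0.15, "usability": 0.20}

def weighted_score(ratings):
    """Combine 1-5 ratings into one weighted score, rounded to 2 places.

    Raises if any criterion is unrated, so evaluation gaps surface early.
    """
    missing = WEIGHTS.keys() - ratings.keys()
    if missing:
        raise ValueError(f"Unrated criteria: {sorted(missing)}")
    return round(sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS), 2)

print(weighted_score({"security": 5, "capability": 4, "cost": 3, "usability": 4}))
```

Weighting security highest reflects the enablement-over-restriction argument earlier in the series: approving a weak-security tool undermines the credibility of the whole approved catalogue.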

Training development:

  • Design foundational AI literacy curriculum
  • Identify role-specific training needs
  • Select training delivery methods
  • Create training materials or identify vendors

Phase 2 preparation:

  • Identify pilot departments
  • Recruit and brief initial champions
  • Prepare communication plans
  • Finalise steering committee approval for Phase 2

Deliverables:

  • Approved tool catalogue (initial)
  • Enterprise agreements in progress
  • Training curriculum outline
  • Pilot plan approved

Month 1 Checkpoint

Before proceeding to Phase 2, validate:

  • [ ] Executive sponsor actively engaged
  • [ ] Steering committee meeting regularly
  • [ ] Shadow AI assessment complete with clear findings
  • [ ] AI policy drafted and reviewed
  • [ ] Initial tools selected and procurement started
  • [ ] Training approach defined
  • [ ] Pilot departments and champions identified
  • [ ] Success metrics established

If significant gaps exist, extend Phase 1 rather than proceeding with weak foundations.

Days 31-60: Pilot and Training

The second month activates enablement within a controlled scope, building capability and learning before broad rollout.

Week 5: Pilot Launch

Objectives:

  • Deploy approved tools to pilot groups
  • Activate champions in pilot departments
  • Begin baseline measurement

Tool deployment:

  • Complete technical provisioning
  • Configure access controls and monitoring
  • Provide tool-specific quick-start guides
  • Establish support channels

Champion activation:

  • Conduct champion training session
  • Clarify champion role and expectations
  • Provide champion resources and materials
  • Establish champion communication channel

Measurement baseline:

  • Document current state metrics in pilot areas
  • Establish tracking mechanisms
  • Begin collecting usage data

Deliverables:

  • Tools deployed to pilot users
  • Champions trained and active
  • Baseline metrics documented

Week 6: Training Delivery Begins

Objectives:

  • Deliver foundational AI training to pilot groups
  • Begin role-specific training development
  • Gather initial feedback

Training activities:

  • Conduct foundational AI literacy sessions
  • Deliver security awareness components
  • Start role-specific training for pilot roles
  • Collect participant feedback

Iterative improvement:

  • Review feedback after each session
  • Adjust content and delivery as needed
  • Document lessons for broader rollout

Champion engagement:

  • Hold first champion community meeting
  • Share early use case successes
  • Address emerging questions and issues

Deliverables:

  • Foundational training delivered to pilots
  • Feedback collected and analysed
  • Training materials refined

Week 7: Pilot Operation

Objectives:

  • Support pilot users in active AI use
  • Monitor for issues and opportunities
  • Refine governance based on experience

Support activities:

  • Respond to user questions and issues
  • Track and resolve problems quickly
  • Document common issues for training and FAQ

Governance refinement:

  • Adjust policies based on practical experience
  • Clarify ambiguities that emerge
  • Update data classification guidance as needed

Use case development:

  • Document successful use cases
  • Identify unexpected applications
  • Prepare case studies for broader rollout

Deliverables:

  • Support issues tracked and resolved
  • Policy refinements documented
  • Use case library started

Week 8: Pilot Evaluation and Scale Planning

Objectives:

  • Assess pilot outcomes against success criteria
  • Plan adjustments for broader rollout
  • Prepare scale resources

Evaluation:

  • Compare metrics to baseline
  • Gather qualitative feedback from pilots
  • Assess policy and training effectiveness
  • Identify what worked and what did not

Scale planning:

  • Define rollout sequence for remaining departments
  • Estimate resource requirements
  • Identify additional tools needed
  • Plan champion recruitment for new areas

Resource preparation:

  • Finalise training materials
  • Scale support capacity
  • Expand champion network
  • Update communication plans

Deliverables:

  • Pilot evaluation report
  • Adjusted policies and training
  • Scale plan approved
  • Resources prepared for Phase 3

Month 2 Checkpoint

Before broad rollout, confirm:

  • [ ] Pilot demonstrated value with measurable outcomes
  • [ ] Critical issues identified and addressed
  • [ ] Training effective based on feedback
  • [ ] Governance practical and followed
  • [ ] Champions capable and engaged
  • [ ] Support model sustainable
  • [ ] Scale plan resourced and approved

If the pilot revealed significant problems, address them before scaling. Scaling broken approaches wastes resources and damages credibility.

Days 61-90: Scale and Refine

The final month extends enablement across the organisation while establishing sustainable operation.

Week 9: Rollout Wave 1

Objectives:

  • Extend enablement to next priority departments
  • Activate additional champions
  • Scale training delivery

Deployment:

  • Provision tools for Wave 1 departments
  • Activate champions in new areas
  • Deploy training to new groups

Applying pilot lessons:

  • Apply pilot learnings to rollout
  • Use refined materials and processes
  • Anticipate issues based on pilot experience

Support scaling:

  • Expand help resources
  • Activate additional support capacity
  • Monitor for scaling issues

Deliverables:

  • Wave 1 departments enabled
  • Training delivered
  • Support functioning at scale

Week 10: Rollout Wave 2 and Monitoring

Objectives:

  • Continue rollout to remaining departments
  • Establish ongoing monitoring
  • Address emerging issues

Continued rollout:

  • Extend to Wave 2 departments
  • Complete training delivery
  • Activate remaining champions

Monitoring establishment:

  • Implement ongoing usage monitoring
  • Establish compliance checking processes
  • Create regular reporting mechanisms

Issue management:

  • Track and resolve emerging issues
  • Update FAQ and guidance
  • Refine support processes

Deliverables:

  • Rollout substantially complete
  • Monitoring operational
  • Support processes stable

Week 11: Optimisation and Documentation

Objectives:

  • Optimise based on experience
  • Complete operational documentation
  • Prepare for steady state

Optimisation:

  • Analyse usage patterns for improvement opportunities
  • Adjust governance based on accumulated experience
  • Refine training based on broad feedback
  • Update tool catalogue based on identified needs

Documentation:

  • Finalise all policy documents
  • Complete training materials
  • Document support procedures
  • Create operational runbooks

Transition planning:

  • Transition from project to operational mode
  • Clarify ongoing roles and responsibilities
  • Establish continuous improvement processes

Deliverables:

  • Optimised processes
  • Complete documentation
  • Transition plan

Week 12: Handover and Future Planning

Objectives:

  • Complete transition to steady state
  • Establish improvement mechanisms
  • Plan future development

Handover activities:

  • Transfer to operational owners
  • Confirm ongoing resource allocation
  • Establish performance review cadence

Continuous improvement:

  • Create feedback collection mechanisms
  • Establish regular policy review process
  • Plan training updates
  • Schedule tool catalogue review

Future planning:

  • Identify next capability developments
  • Plan for emerging tools and use cases
  • Anticipate regulatory changes
  • Build business case for continued investment

Deliverables:

  • Operational handover complete
  • Improvement processes established
  • Future roadmap drafted

Month 3 Checkpoint

At day 90, assess:

  • [ ] Enablement available to all target employees
  • [ ] Training completed by all active users
  • [ ] Governance functioning as designed
  • [ ] Champions active across departments
  • [ ] Support operating sustainably
  • [ ] Metrics showing value delivered
  • [ ] Improvement mechanisms in place
  • [ ] Ownership clear for steady state

Measuring Success and Demonstrating Value

Throughout the 90 days and beyond, measurement proves value and guides improvement.

Key Metrics

Category | Metric                                 | Target Direction
Adoption | Active users as percentage of eligible | Increasing
Adoption | Usage frequency per user               | Increasing
Adoption | Use cases addressed                    | Expanding
Value    | Time saved on AI-enabled tasks         | Increasing
Value    | Quality improvement in outputs         | Increasing
Value    | Employee satisfaction with AI tools    | Increasing
Risk     | Shadow AI incidents detected           | Decreasing
Risk     | Policy violations                      | Decreasing
Risk     | Security incidents involving AI        | Zero or decreasing
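The adoption metrics are simple enough to compute directly from usage data. These helper names are illustrative, not from any particular analytics tool; the trend check matches the "Increasing" target direction in the table.

```python
def adoption_rate(active_users: int, eligible_users: int) -> float:
    """Active users as a percentage of eligible users, to one decimal place."""
    if eligible_users <= 0:
        raise ValueError("eligible_users must be positive")
    return round(100 * active_users / eligible_users, 1)

def is_increasing(series):
    """True if each period's value is at least the previous period's."""
    return all(b >= a for a, b in zip(series, series[1:]))

print(adoption_rate(180, 400))            # 45.0
print(is_increasing([31.5, 38.0, 45.0]))  # True
```

Tracking the same calculation month over month, against the Week 5 baseline, is what turns the checkpoint question "is adoption growing?" into an evidenced answer.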

Demonstrating Value

Different stakeholders care about different outcomes:

For executives: Business impact - productivity gains, cost savings, competitive capability.

For business units: Practical value - tasks made easier, quality improved, time saved.

For security and compliance: Risk reduction - shadow AI controlled, governance functioning, incidents prevented.

For employees: Experience improvement - useful tools, clear guidance, effective support.

Tailor reporting to stakeholder interests while maintaining consistent underlying metrics.

Beyond 90 Days: Sustaining Enablement

The 90-day roadmap establishes capability. Sustaining it requires ongoing attention.

Continuous Improvement

AI enablement is not a project with an end date. Sustainable operation requires:

  • Regular policy review. Quarterly minimum, with interim updates for significant changes.

  • Training updates. New content for new tools, updated content for evolving capabilities.

  • Tool catalogue evolution. Adding new tools, retiring obsolete ones, renegotiating agreements.

  • Feedback integration. Systematically collecting and acting on employee input.

  • Industry monitoring. Tracking regulatory changes, emerging risks, and new capabilities.

Scaling Maturity

After establishing foundations, consider advancing to higher maturity:

  • Enhanced automation. Automated provisioning, compliance checking, reporting.

  • Advanced use cases. Moving from productivity to analysis to automation.

  • Deeper integration. AI embedded in workflows rather than used as separate tools.

  • Predictive governance. Anticipating needs and risks rather than reacting.

Building the Business Case for Continued Investment

The 90-day programme consumes resources. Continued investment requires demonstrated return:

  • Document productivity gains with before-and-after comparisons.

  • Calculate risk avoidance from shadow AI reduction.

  • Quantify efficiency improvements from streamlined processes.

  • Show capability development in employee skills and organisational knowledge.

This data supports budget requests and executive commitment.

Quick Reference: Week-by-Week Action Checklist

Phase 1: Assessment and Foundation

Week 1:

  • [ ] Secure executive sponsor
  • [ ] Establish steering committee
  • [ ] Launch shadow AI survey
  • [ ] Begin network analysis

Week 2:

  • [ ] Complete shadow AI assessment
  • [ ] Map use cases by business unit
  • [ ] Identify champion candidates
  • [ ] Prioritise needs and risks

Week 3:

  • [ ] Draft AI policy
  • [ ] Define data classification
  • [ ] Create tool evaluation criteria
  • [ ] Establish incident response procedures

Week 4:

  • [ ] Select initial approved tools
  • [ ] Design training curriculum
  • [ ] Identify pilot departments
  • [ ] Prepare for Phase 2

Phase 2: Pilot and Training

Week 5:

  • [ ] Deploy tools to pilot groups
  • [ ] Activate pilot champions
  • [ ] Establish baseline metrics
  • [ ] Open support channels

Week 6:

  • [ ] Deliver foundational training
  • [ ] Begin role-specific training
  • [ ] Collect initial feedback
  • [ ] Hold first champion meeting

Week 7:

  • [ ] Support active AI use
  • [ ] Refine governance based on experience
  • [ ] Document successful use cases
  • [ ] Address emerging issues

Week 8:

  • [ ] Evaluate pilot outcomes
  • [ ] Plan scale adjustments
  • [ ] Prepare rollout resources
  • [ ] Get approval for Phase 3

Phase 3: Scale and Refine

Week 9:

  • [ ] Deploy to Wave 1 departments
  • [ ] Activate additional champions
  • [ ] Scale training delivery
  • [ ] Apply pilot learnings

Week 10:

  • [ ] Deploy to Wave 2 departments
  • [ ] Establish ongoing monitoring
  • [ ] Complete training rollout
  • [ ] Stabilise support processes

Week 11:

  • [ ] Optimise based on experience
  • [ ] Complete documentation
  • [ ] Plan transition to operations
  • [ ] Update all materials

Week 12:

  • [ ] Complete operational handover
  • [ ] Establish improvement mechanisms
  • [ ] Draft future roadmap
  • [ ] Report final outcomes

Final Thoughts

This series has covered the complete journey from understanding AI enablement challenges to implementing solutions. The key themes throughout:

Enablement over restriction. The organisations succeeding with AI help employees use it well rather than trying to prevent use.

Balance over extremes. Effective programmes balance access with governance, speed with safety, standardisation with flexibility.

Systems over events. Sustainable enablement requires ongoing capability, not one-time projects.

Learning over perfection. Start, learn, improve. Waiting for perfect conditions means waiting forever.

The 95% AI ROI failure rate is not inevitable. It reflects organisations that deployed technology without developing capability. With systematic enablement, your organisation can be in the 5% that succeeds.

As I explored in my 2026 IT strategy review checklist, successful technology initiatives align with broader organisational strategy. AI enablement is among the most important strategic initiatives IT leaders will undertake this year.

The roadmap is here. The first step is yours.


Implementing Your AI Enablement Programme

Transforming AI adoption from chaos to capability requires experienced guidance and systematic execution. My IT management services help organisations plan and implement AI enablement programmes that deliver measurable value while managing risk.

Get in touch to discuss how to begin your AI enablement journey.


Previous: Part 6 - AI Governance: Controls That Work

This concludes the Business AI Enablement series. Return to Part 1 to start from the beginning.
