The System That Proves Value and Prevents “Did It Work?” Arguments
Why Most Projects Can’t Prove Success
The amateur consultant approach:
Project launches. Three months later…
Client: “Did this actually deliver the benefits you promised?”
Consultant: “Absolutely! The team is using it and things are better.”
Client: “But how much better? Can you show me the numbers?”
Consultant: “Well, we didn’t set up formal tracking, but feedback has been positive…”
Client: “So you can’t actually prove ROI?”
Consultant: [crickets]
Result: Client doubts value. Refuses to fund Phase 2. No reference. No repeat business.
The professional consultant approach:
Project launches. Three months later…
Client: “Did this actually deliver the benefits you promised?”
Consultant: “Yes. Let me show you the data. [Opens dashboard] Onboarding cycle time dropped from 35 days to 16 days—a 54% improvement against our 57% target. That’s 288 customers × 19 days saved = 5,472 days of faster value delivery. Manual data entry is eliminated—we’ve processed 72 integrations with 98.6% accuracy and zero manual entry. Customer satisfaction increased from 6.8 to 8.2. First-90-day churn dropped from 5% to 2.5%, saving $63K in retained LTV. ROI after 3 months: $216K in benefits against a $435K investment, tracking to full payback by month 6 as projected. Here’s the monthly trend showing continuous improvement.”
Client: “This is exactly what we needed. Let’s talk about Phase 3.”
Result: Proven value. Phase 3 approved. Glowing reference. Partnership deepened.
What “Success Metrics and Measurement Framework” Actually Means
A Success Metrics and Measurement Framework is the systematic approach to:
- DEFINING success (specific, measurable outcomes aligned to objectives)
- BASELINE measurement (document current state before changes)
- TARGET setting (realistic improvement goals with timelines)
- DATA COLLECTION (automated where possible, manual where necessary)
- TRACKING progress (regular measurement cadence)
- REPORTING results (dashboards, trends, insights)
- COURSE CORRECTION (using data to improve)
- PROVING value (ROI demonstration with evidence)
This is NOT:
- ❌ Vague goals like “improve efficiency” or “better customer experience”
- ❌ Metrics you measure once at project end
- ❌ Data you can’t actually collect
- ❌ Vanity metrics that don’t tie to business outcomes
- ❌ Metrics only you (the consultant) can access
This IS:
- ✅ Specific, quantified success criteria agreed upfront
- ✅ Continuous measurement from baseline through post-launch
- ✅ Metrics tied directly to original problem quantification
- ✅ Automated data collection (sustainable long-term)
- ✅ Accessible dashboards stakeholders can view anytime
- ✅ Leading indicators (predictive) and lagging indicators (outcomes)
- ✅ Regular review cadence with action on insights
The goal: Create an evidence-based feedback loop that proves value, identifies issues early, and drives continuous improvement.
The Success Metrics Framework
Layer 1: The Metrics Hierarchy
┌─────────────────────────────────────────────────────────────┐
│                 METRICS HIERARCHY PYRAMID                   │
│  (narrowest at the top; each layer aggregates the one below)│
├─────────────────────────────────────────────────────────────┤
│ LEVEL 1     STRATEGIC METRICS      (Executive Dashboard)    │
│             • Business outcomes                             │
│             • Financial impact                              │
│             • Customer impact                               │
├─────────────────────────────────────────────────────────────┤
│ LEVELS 2-3  OPERATIONAL METRICS    (Manager Dashboard)      │
│             • Process performance                           │
│             • Quality metrics                               │
│             • Efficiency gains                              │
├─────────────────────────────────────────────────────────────┤
│ LEVELS 4-5  ACTIVITY METRICS       (Team Dashboard)         │
│             • Usage/adoption                                │
│             • System health                                 │
│             • User actions                                  │
└─────────────────────────────────────────────────────────────┘
LEVEL 1 (TOP): Strategic Business Outcomes
- What executives care about
- Tied to original business case
- Reported monthly/quarterly
- Examples: Revenue impact, Cost reduction, Customer retention
LEVEL 2: Operational Performance
- What managers need to run the business
- Process-level metrics
- Reported weekly/monthly
- Examples: Cycle time, Throughput, Error rates
LEVEL 3: System & Quality Metrics
- Technical and quality performance
- Platform health indicators
- Reported daily/weekly
- Examples: Integration success rate, Data accuracy, Uptime
LEVEL 4: User Adoption & Activity
- Are people using the system?
- How are they using it?
- Reported daily/weekly
- Examples: Active users, Feature usage, Workflow completion
LEVEL 5 (BOTTOM): System Activity Logs
- Raw data underlying everything else
- Automated capture
- Reported real-time
- Examples: API calls, Logins, Transactions
═══════════════════════════════════════════════════════════════
PRINCIPLE: Metrics flow upward
Lower levels aggregate into higher levels
All strategic metrics traceable to activity data
═══════════════════════════════════════════════════════════════
Layer 2: Metrics Selection – What to Measure
═══════════════════════════════════════════════════════════════
METRICS SELECTION FRAMEWORK
Project: Customer Onboarding Transformation
═══════════════════════════════════════════════════════════════
PRINCIPLE: Measure what matters, not what's easy.
STEP 1: Link Metrics to Original Problem Statement
─────────────────────────────────────────────────────────────────
Original Problem:
"Onboarding takes 35 days, costs $1.5M/year, causes 5% first-90-day
churn, prevents scaling."
Therefore, PRIMARY metrics must measure:
✓ Onboarding cycle time (35 days → target)
✓ Cost of onboarding process ($ → target)
✓ First-90-day churn rate (5% → target)
✓ Capacity/scalability (customers per CSR → target)
If you're not measuring these, you can't prove you solved the problem.
─────────────────────────────────────────────────────────────────
STEP 2: Apply SMART Criteria to Each Metric
─────────────────────────────────────────────────────────────────
Each metric must be:
S - SPECIFIC: Precisely defined, no ambiguity
M - MEASURABLE: Can be quantified with data
A - ACHIEVABLE: Realistic to collect and improve
R - RELEVANT: Ties to business objectives
T - TIME-BOUND: Target date for achievement
BAD METRIC: "Improve customer satisfaction"
└─ Not specific (how much? measured how?)
└─ Not measurable (no baseline, no target)
└─ Not time-bound (by when?)
GOOD METRIC: "Increase onboarding CSAT score from 6.8/10 to 8.5/10
within 3 months of Phase 2 go-live, measured via post-onboarding
survey sent to all customers"
└─ Specific (CSAT score, 6.8 → 8.5)
└─ Measurable (10-point scale, survey)
└─ Achievable (realistic 1.7 point improvement)
└─ Relevant (ties to customer impact problem)
└─ Time-bound (3 months post-launch)
─────────────────────────────────────────────────────────────────
STEP 3: Balance Leading and Lagging Indicators
─────────────────────────────────────────────────────────────────
LAGGING INDICATORS (Outcomes - backward looking):
• Onboarding cycle time (measures what already happened)
• Churn rate (outcome after 90 days)
• Cost per customer (historical)
• Customer satisfaction score (post-onboarding survey)
Pros: Direct measure of business outcomes
Cons: Delayed (results take time to appear), hard to act on quickly
LEADING INDICATORS (Predictive - forward looking):
• Integration success rate (predicts future efficiency)
• Time to first contact (predicts customer experience)
• System adoption rate (predicts long-term success)
• Status inquiry volume (predicts customer frustration)
Pros: Early warning system, actionable (can intervene)
Cons: Indirect measure (correlation not guaranteed)
BALANCE: Need both
├─ Lagging indicators prove you achieved outcomes
└─ Leading indicators allow course correction during execution
Example Pairs:
• Leading: Integration success rate (daily)
Lagging: Manual data entry time eliminated (monthly)
• Leading: CSR system adoption (daily)
Lagging: CSR productivity improvement (monthly)
• Leading: Time to first contact (daily)
Lagging: Customer satisfaction score (post-onboarding)
─────────────────────────────────────────────────────────────────
STEP 4: Limit Metrics to Essential Set
─────────────────────────────────────────────────────────────────
TRAP: Measuring everything = managing nothing
TOO MANY METRICS:
• Overwhelm stakeholders
• Dilute focus
• Cost too much to collect
• Analysis paralysis
GUIDELINE:
• Strategic metrics: 5-7 (executive dashboard)
• Operational metrics: 10-15 (manager dashboard)
• Activity metrics: 20-30 (system logs, automated)
For the Customer Onboarding project: 5 primary + 12 secondary + 11 supporting = 28 tracked metrics
(Plus automated activity logs)
─────────────────────────────────────────────────────────────────
STEP 5: Ensure Data Collectability
─────────────────────────────────────────────────────────────────
Before finalizing metrics, validate:
CAN WE ACTUALLY MEASURE THIS?
For each metric:
1. Where does data come from? (system logs, surveys, manual)
2. How often can we collect it? (real-time, daily, weekly)
3. Who collects it? (automated, specific person)
4. What's the cost/effort? (hours per month)
5. How reliable is the data? (accuracy, completeness)
Example:
Metric: "Average onboarding cycle time"
Data source: Workflow platform timestamps
├─ Start: Deal marked "Closed Won" in Salesforce (automated)
├─ End: Onboarding project status = "Complete" in workflow tool
├─ Calculation: End date - Start date = cycle time (days)
Frequency: Real-time (calculate on completion)
Collection: Automated (workflow platform calculates)
Reporting: Dashboard updates nightly
Cost: $0 (automated)
Reliability: HIGH (system timestamps, no manual entry)
✓ FEASIBLE - include this metric
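To make “automated” concrete, here is a minimal sketch of that calculation, assuming both systems can be exported with a shared customer ID. The file and column names (`sf_closed_won.csv`, `close_date`, `completion_date`) are illustrative, not actual Salesforce or workflow-platform field names.

```python
import pandas as pd

# Hypothetical exports: Salesforce closed-won deals and workflow completions,
# joined on a shared customer identifier.
deals = pd.read_csv("sf_closed_won.csv", parse_dates=["close_date"])
done = pd.read_csv("workflow_complete.csv", parse_dates=["completion_date"])

merged = deals.merge(done, on="customer_id", how="inner")

# Cycle time in days: onboarding completion minus deal close.
merged["cycle_time_days"] = (merged["completion_date"] - merged["close_date"]).dt.days

print(f"Average onboarding cycle time: {merged['cycle_time_days'].mean():.1f} days")
```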
Counter-example:
Metric: "Time CSR spends on manual coordination per customer"
Data source: ???
├─ No system tracks this today
├─ Would require CSRs to manually log time
├─ Time tracking system doesn't exist
├─ Cultural resistance to time tracking
Frequency: Would need daily time logs
Collection: Manual (CSRs self-report)
Reporting: Weekly aggregation (someone compiles)
Cost: ~2 hours/week CSR time to log + 2 hours/week to aggregate
Reliability: LOW (self-reported time notoriously inaccurate)
⚠ QUESTIONABLE - requires significant overhead
Alternative: Track # of status inquiry tickets (automated, proxy metric)
═══════════════════════════════════════════════════════════════
STEP 6: Define Baseline, Target, and Threshold
─────────────────────────────────────────────────────────────────
For EACH metric, specify:
BASELINE: Current state (measured before changes)
TARGET: Goal state (what success looks like)
THRESHOLD: Minimum acceptable (failure if below this)
Example:
┌─────────────────────────────────────────────────────────────┐
│ Metric: Onboarding Cycle Time (Days) │
├─────────────────────────────────────────────────────────────┤
│ BASELINE: 35 days (average, last 6 months) │
│ Range: 22-58 days (min-max) │
│ Std dev: 8.2 days │
│ Measured: 144 onboardings (Jan-Jun) │
│ │
│ TARGET: 15 days (57% improvement) │
│ Rationale: Industry benchmark 15-20 days │
│ Achievable with automation + process │
│ Timeline: 3 months post-launch │
│ │
│ THRESHOLD: 25 days (minimum acceptable) │
│ Rationale: Even 25 days is 29% improvement │
│ If can't achieve <25, something wrong │
│ Triggers escalation and root cause analysis │
│ │
│ STRETCH GOAL: 12 days (66% improvement) │
│ Optimistic case with perfect execution │
└─────────────────────────────────────────────────────────────┘
Why all three?
• BASELINE: Proves starting point (can't claim improvement without it)
• TARGET: Sets clear success criteria (what we're aiming for)
• THRESHOLD: Defines failure (triggers intervention)
• STRETCH: Motivates excellence (but don't penalize for missing)
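The baseline/target/threshold triple maps directly onto the traffic-light status used on the dashboards later in this section. A minimal sketch with the cycle-time numbers from the box above; the `MetricSpec` class is illustrative, and `lower_is_better` handles metrics where improvement means a decrease:

```python
from dataclasses import dataclass

@dataclass
class MetricSpec:
    name: str
    baseline: float
    target: float            # goal state
    threshold: float         # minimum acceptable; breach triggers escalation
    lower_is_better: bool    # True for cycle time, churn, error rates

    def status(self, current: float) -> str:
        """GREEN at/past target, YELLOW between threshold and target,
        RED past the threshold (triggers root cause analysis)."""
        if self.lower_is_better:
            if current <= self.target:
                return "GREEN"
            return "YELLOW" if current <= self.threshold else "RED"
        if current >= self.target:
            return "GREEN"
        return "YELLOW" if current >= self.threshold else "RED"

cycle_time = MetricSpec("Onboarding Cycle Time (days)", 35, 15, 25, True)
print(cycle_time.status(16))  # YELLOW: beats the 25-day threshold, short of 15-day target
```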
═══════════════════════════════════════════════════════════════
Layer 3: Comprehensive Metrics Catalog
═══════════════════════════════════════════════════════════════
COMPLETE METRICS CATALOG
Project: Customer Onboarding Transformation
═══════════════════════════════════════════════════════════════
CATEGORY 1: STRATEGIC BUSINESS METRICS
═══════════════════════════════════════════════════════════════
METRIC 1.1: Total Onboarding Cycle Time
─────────────────────────────────────────────────────────────────
DEFINITION:
Days from Opportunity marked "Closed Won" in Salesforce to
Onboarding project status = "Complete" in workflow tool
CALCULATION:
Average of all completed onboardings in measurement period
DATA SOURCE:
Salesforce (Close Date) + Workflow Platform (Completion Date)
Automated calculation via dashboard query
BASELINE: 35 days (average, n=144 customers, Jan-Jun)
TARGET: 15 days (3 months post-Phase 2 launch)
THRESHOLD: 25 days (minimum acceptable)
STRETCH: 12 days (optimistic)
COLLECTION FREQUENCY: Real-time (calculated on each completion)
REPORTING FREQUENCY: Monthly (trending graph + table)
AUDIENCE: Executive team, Steering Committee
IMPORTANCE: ⭐⭐⭐⭐⭐ (PRIMARY SUCCESS METRIC)
This was the #1 problem identified. If this doesn't improve,
project failed regardless of other metrics.
─────────────────────────────────────────────────────────────────
METRIC 1.2: First 90-Day Customer Churn Rate
─────────────────────────────────────────────────────────────────
DEFINITION:
Percentage of new customers who cancel within 90 days of
onboarding completion
CALCULATION:
(Customers churned within 90 days / Total new customers) × 100%
Measured monthly cohort
DATA SOURCE:
CRM churn records + Onboarding completion dates
Manual monthly reconciliation by Finance
BASELINE: 5.0% (n=288 customers, last 12 months)
TARGET: 2.0% (match industry benchmark)
THRESHOLD: 3.5% (30% improvement minimum)
MEASUREMENT LAG: 90 days (can't measure until 90 days pass)
REPORTING FREQUENCY: Monthly (cohort analysis)
AUDIENCE: Executive team, CFO, Board
IMPORTANCE: ⭐⭐⭐⭐⭐ (PRIMARY SUCCESS METRIC)
Direct financial impact: Each 1-percentage-point reduction = ~$72K/year savings
Note: Attribution challenge (onboarding is one factor in churn)
Mitigation: Track cohorts before/after launch, control for other variables
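A sketch of the monthly cohort calculation, assuming a CRM extract with onboarding-completion and (nullable) churn dates; the column names are illustrative:

```python
import pandas as pd

# Hypothetical CRM extract: one row per customer, churn_date empty if retained.
df = pd.read_csv("customers.csv", parse_dates=["onboarding_complete", "churn_date"])

# A customer counts as first-90-day churn if they cancelled within
# 90 days of onboarding completion.
df["churned_90d"] = df["churn_date"].notna() & (
    (df["churn_date"] - df["onboarding_complete"]).dt.days <= 90
)

# Monthly cohorts keyed on onboarding-completion month.
cohort_churn = (
    df.groupby(df["onboarding_complete"].dt.to_period("M"))["churned_90d"]
      .mean().mul(100).round(1)
)
print(cohort_churn)  # % first-90-day churn per monthly cohort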
─────────────────────────────────────────────────────────────────
METRIC 1.3: Annual Onboarding Cost (Labor + Overhead)
─────────────────────────────────────────────────────────────────
DEFINITION:
Total annual cost of onboarding operations including labor,
software, overhead
CALCULATION:
(CSR hours × rate) + (Implementation hours × rate) +
(Manager hours × rate) + (Software licenses) + (Overhead)
DATA SOURCE:
Time tracking (if available) or estimated allocation
Software costs from Finance
Quarterly calculation
BASELINE: $1,526,901/year (detailed breakdown in business case)
TARGET: $583,070/year (62% reduction)
THRESHOLD: $900,000/year (41% reduction minimum)
REPORTING FREQUENCY: Quarterly (with annual reconciliation)
AUDIENCE: CFO, Finance, Executive team
IMPORTANCE: ⭐⭐⭐⭐ (HIGH - ROI VALIDATION)
Challenge: Some costs are opportunity costs, not cash costs
Solution: Track separately: Cash costs + Opportunity costs
─────────────────────────────────────────────────────────────────
METRIC 1.4: CS Team Capacity (Customers per CSR per Month)
─────────────────────────────────────────────────────────────────
DEFINITION:
Number of customers each CSR can onboard per month
CALCULATION:
Total customers onboarded / Number of CSRs / Number of months
DATA SOURCE:
Workflow platform (assignments + completions)
Automated monthly report
BASELINE: 3.0 customers/CSR/month (24 customers ÷ 8 CSRs)
TARGET: 5.0 customers/CSR/month (67% increase)
THRESHOLD: 4.0 customers/CSR/month (33% increase minimum)
REPORTING FREQUENCY: Monthly
AUDIENCE: VP Customer Success, COO
IMPORTANCE: ⭐⭐⭐⭐ (HIGH - SCALABILITY PROOF)
This metric proves we can grow without proportional hiring
─────────────────────────────────────────────────────────────────
METRIC 1.5: Realized ROI
─────────────────────────────────────────────────────────────────
DEFINITION:
Cumulative benefits realized vs. cumulative investment
CALCULATION:
ROI = (Cumulative Benefits - Cumulative Costs) / Cumulative Costs × 100%
DATA SOURCE:
Benefits: Calculated from operational metrics (see below)
Costs: Project actuals from Finance
Updated monthly
BASELINE: N/A (pre-project)
TARGET: Break-even by Month 6, 100%+ ROI by Month 12
THRESHOLD: Break-even by Month 9 (minimum)
REPORTING FREQUENCY: Monthly (cumulative trend)
AUDIENCE: Executive team, CFO, Board
IMPORTANCE: ⭐⭐⭐⭐⭐ (CRITICAL - PROVES PROJECT VALUE)
Detail: Benefits calculation methodology documented separately
Includes: Labor savings + Churn reduction + Capacity gains
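Once Finance supplies monthly actuals, the cumulative ROI line is a few lines of arithmetic. A sketch with illustrative numbers (chosen to echo the $216K benefits vs. $435K investment at month 3 from the opening example), not the project’s actuals:

```python
from itertools import accumulate

# Illustrative monthly figures in $K; real values come from Finance.
monthly_benefits = [40, 65, 111, 70, 90, 120]
monthly_costs = [300, 90, 45, 0, 0, 0]

for month, (b, c) in enumerate(
        zip(accumulate(monthly_benefits), accumulate(monthly_costs)), start=1):
    roi = (b - c) / c * 100  # ROI = (Cumulative Benefits - Costs) / Costs
    print(f"Month {month}: benefits ${b}K, costs ${c}K, ROI {roi:+.0f}%")
# Crosses break-even (ROI >= 0) at month 6, matching the target.
```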
═══════════════════════════════════════════════════════════════
CATEGORY 2: OPERATIONAL PERFORMANCE METRICS
═══════════════════════════════════════════════════════════════
METRIC 2.1: Time to First Contact (Days)
─────────────────────────────────────────────────────────────────
DEFINITION:
Time from deal close to first CSR contact with customer
LEADING INDICATOR for customer satisfaction
DATA SOURCE: Workflow platform timestamps (auto-logged)
BASELINE: 2.3 days average
TARGET: <1 day (24 hours)
THRESHOLD: <2 days
COLLECTION: Real-time, Report: Daily
IMPORTANCE: ⭐⭐⭐⭐ (Leading indicator)
─────────────────────────────────────────────────────────────────
METRIC 2.2: Time to Kickoff Call Scheduled
─────────────────────────────────────────────────────────────────
DATA SOURCE: Workflow platform + Calendar integration
BASELINE: 8.5 days
TARGET: <5 days
THRESHOLD: <7 days
COLLECTION: Real-time, Report: Weekly
IMPORTANCE: ⭐⭐⭐
─────────────────────────────────────────────────────────────────
METRIC 2.3: Time to Technical Setup Complete
─────────────────────────────────────────────────────────────────
DATA SOURCE: Workflow platform task completion
BASELINE: 14 days
TARGET: <5 days
THRESHOLD: <8 days
COLLECTION: Real-time, Report: Weekly
IMPORTANCE: ⭐⭐⭐
─────────────────────────────────────────────────────────────────
METRIC 2.4: Onboarding Completion Rate
─────────────────────────────────────────────────────────────────
DEFINITION:
Percentage of started onboardings that complete successfully
BASELINE: 95% (5% abandoned/stalled)
TARGET: 98%
THRESHOLD: 96%
COLLECTION: Monthly
IMPORTANCE: ⭐⭐⭐
─────────────────────────────────────────────────────────────────
METRIC 2.5: Manual Data Entry Time Eliminated
─────────────────────────────────────────────────────────────────
DEFINITION:
Hours per month saved by automation vs. manual process
CALCULATION:
Baseline: 20 min × 24 customers/month = 8 hours/month
Target: 0 hours (100% automated via integration)
BASELINE: 96 hours/year
TARGET: 0 hours (100% elimination)
THRESHOLD: <20 hours/year (80% elimination)
COLLECTION: Monthly (count manual overrides)
IMPORTANCE: ⭐⭐⭐⭐
═══════════════════════════════════════════════════════════════
CATEGORY 3: QUALITY & ACCURACY METRICS
═══════════════════════════════════════════════════════════════
METRIC 3.1: Integration Success Rate
─────────────────────────────────────────────────────────────────
DEFINITION:
Percentage of Salesforce "Closed Won" deals that successfully
auto-create workflow projects
CALCULATION:
(Successful integrations / Total closed deals) × 100%
DATA SOURCE:
Salesforce (closed deals) vs. Workflow platform (projects created)
Automated reconciliation report (daily)
BASELINE: N/A (new integration)
TARGET: 99.5%
THRESHOLD: 95%
COLLECTION: Real-time (logged per transaction)
REPORTING: Daily dashboard, Weekly report
IMPORTANCE: ⭐⭐⭐⭐⭐ (CRITICAL - integration health)
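The daily reconciliation can be as simple as a set comparison between deal IDs on each side. A sketch; the export step is left abstract since it depends on the platforms’ APIs:

```python
def reconcile(closed_deal_ids: set[str], project_deal_ids: set[str]) -> float:
    """Compare Salesforce closed-won deals against auto-created workflow
    projects; print alerts and return the integration success rate."""
    missing = closed_deal_ids - project_deal_ids    # deals with no project
    orphaned = project_deal_ids - closed_deal_ids   # projects with no deal

    rate = 100.0 if not closed_deal_ids else (
        (len(closed_deal_ids) - len(missing)) / len(closed_deal_ids) * 100
    )
    if missing:
        print(f"ALERT: {len(missing)} deal(s) missing projects: {sorted(missing)}")
    if orphaned:
        print(f"WARN: {len(orphaned)} orphaned project(s): {sorted(orphaned)}")
    return rate

print(f"{reconcile({'D-101', 'D-102', 'D-103'}, {'D-101', 'D-102'}):.1f}%")  # 66.7%
```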
─────────────────────────────────────────────────────────────────
METRIC 3.2: Data Accuracy Rate
─────────────────────────────────────────────────────────────────
DEFINITION:
Percentage of integrated customer records with zero data errors
MEASUREMENT:
Weekly audit: Sample 10 random workflow projects
Compare to Salesforce source data
Count discrepancies across 22 mapped fields
BASELINE: 85% accuracy (15% manual entry errors)
TARGET: >98% accuracy
THRESHOLD: >95% accuracy
COLLECTION: Weekly audit (manual sample)
REPORTING: Weekly (trend over time)
IMPORTANCE: ⭐⭐⭐⭐
─────────────────────────────────────────────────────────────────
METRIC 3.3: Configuration Error Rate
─────────────────────────────────────────────────────────────────
DEFINITION:
Percentage of technical setups requiring rework due to errors
BASELINE: 10% (29 of 288 setups had errors)
TARGET: <2%
THRESHOLD: <5%
COLLECTION: Monthly (from implementation team logs)
IMPORTANCE: ⭐⭐⭐
─────────────────────────────────────────────────────────────────
METRIC 3.4: SLA Compliance Rate
─────────────────────────────────────────────────────────────────
DEFINITION:
Percentage of onboarding milestones meeting defined SLAs
EXAMPLE SLAs:
• First contact: Within 24 hours (SLA: 95%)
• Kickoff scheduled: Within 5 days (SLA: 90%)
• Setup complete: Within 5 days of kickoff (SLA: 85%)
BASELINE: N/A (no SLAs existed)
TARGET: >90% SLA compliance across all milestones
THRESHOLD: >75%
COLLECTION: Real-time (workflow platform calculates)
REPORTING: Weekly dashboard
IMPORTANCE: ⭐⭐⭐
═══════════════════════════════════════════════════════════════
CATEGORY 4: USER ADOPTION & ENGAGEMENT METRICS
═══════════════════════════════════════════════════════════════
METRIC 4.1: System Adoption Rate (CSRs)
─────────────────────────────────────────────────────────────────
DEFINITION:
Percentage of CSRs actively using workflow platform for all
onboardings (vs. old personal systems)
MEASUREMENT:
Count CSRs who have:
• Logged in within last 7 days
• Created/updated projects in last 7 days
• Completed workflow tasks in last 7 days
BASELINE: 0% (system doesn't exist yet)
TARGET: >85% by Week 4 post-launch
THRESHOLD: >70% by Week 8
COLLECTION: Weekly (system usage logs)
REPORTING: Weekly (adoption trend)
IMPORTANCE: ⭐⭐⭐⭐⭐ (CRITICAL - adoption is top risk)
─────────────────────────────────────────────────────────────────
METRIC 4.2: Customer Portal Usage Rate
─────────────────────────────────────────────────────────────────
DEFINITION:
Percentage of customers who log into self-service portal
BASELINE: 0% (portal doesn't exist)
TARGET: >60% within first 30 days of onboarding
THRESHOLD: >40%
COLLECTION: Daily (portal login logs)
REPORTING: Weekly
IMPORTANCE: ⭐⭐⭐⭐
─────────────────────────────────────────────────────────────────
METRIC 4.3: Status Inquiry Volume
─────────────────────────────────────────────────────────────────
DEFINITION:
Number of "what's my status?" inquiries CSRs receive per day
This SHOULD DECREASE as portal provides self-service visibility
BASELINE: 28 inquiries/day (3.5 per CSR × 8 CSRs)
TARGET: <8 inquiries/day (70% reduction)
THRESHOLD: <15 inquiries/day (45% reduction)
COLLECTION: Daily (manual count or ticket system)
REPORTING: Weekly
IMPORTANCE: ⭐⭐⭐⭐ (Leading indicator for CSR time savings)
─────────────────────────────────────────────────────────────────
METRIC 4.4: CSR Satisfaction with Process
─────────────────────────────────────────────────────────────────
DEFINITION:
CSR self-reported satisfaction with onboarding process (1-10 scale)
MEASUREMENT:
Monthly pulse survey: "Rate your satisfaction with the onboarding
process" (1-10, anonymous)
BASELINE: Establish in Month 0 (pre-launch)
TARGET: +3 points improvement
THRESHOLD: +2 points improvement
COLLECTION: Monthly survey
REPORTING: Monthly (trend + comments)
IMPORTANCE: ⭐⭐⭐
═══════════════════════════════════════════════════════════════
CATEGORY 5: CUSTOMER EXPERIENCE METRICS
═══════════════════════════════════════════════════════════════
METRIC 5.1: Onboarding CSAT Score
─────────────────────────────────────────────────────────────────
DEFINITION:
Customer satisfaction with onboarding experience (1-10 scale)
MEASUREMENT:
Post-onboarding survey sent when status = "Complete"
Question: "How satisfied were you with the onboarding process?"
BASELINE: 6.8/10 average (n=120 responses, 42% response rate)
TARGET: 8.5/10
THRESHOLD: 7.5/10
COLLECTION: Per customer (automated survey)
REPORTING: Monthly average + trend
IMPORTANCE: ⭐⭐⭐⭐⭐ (PRIMARY - customer impact proof)
─────────────────────────────────────────────────────────────────
METRIC 5.2: Onboarding NPS
─────────────────────────────────────────────────────────────────
DEFINITION:
Net Promoter Score for onboarding experience
MEASUREMENT:
Post-onboarding survey: "How likely are you to recommend us to
a colleague based on your onboarding experience?" (0-10)
CALCULATION:
% Promoters (9-10) - % Detractors (0-6)
BASELINE: -18 (22% promoters, 40% detractors)
TARGET: +40
THRESHOLD: +10
COLLECTION: Per customer (same survey as CSAT)
REPORTING: Monthly
IMPORTANCE: ⭐⭐⭐⭐
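The promoter/detractor arithmetic in code, taking raw 0-10 responses from the survey platform:

```python
def nps(scores: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
    Scores of 7-8 are passives and count only in the denominator."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return (promoters - detractors) / len(scores) * 100

# Illustrative batch of responses, roughly the baseline mix of
# 22% promoters and 40% detractors.
print(f"NPS: {nps([9, 10, 8, 7, 6, 5, 3, 2, 8, 7]):+.0f}")  # NPS: -20
```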
─────────────────────────────────────────────────────────────────
METRIC 5.3: Time to First Value
─────────────────────────────────────────────────────────────────
DEFINITION:
Days from purchase to customer using product productively
MEASUREMENT:
Survey question: "When did you start getting value from the product?"
OR: Product usage analytics (first meaningful activity)
BASELINE: 42 days
TARGET: 20 days
THRESHOLD: 30 days
COLLECTION: Monthly survey or usage analytics
REPORTING: Monthly
IMPORTANCE: ⭐⭐⭐
═══════════════════════════════════════════════════════════════
CATEGORY 6: TECHNICAL & SYSTEM HEALTH METRICS
═══════════════════════════════════════════════════════════════
METRIC 6.1: Integration Uptime
─────────────────────────────────────────────────────────────────
DEFINITION:
Percentage of time integration is functioning (available)
TARGET: 99.5% uptime (max 3.6 hours downtime/month)
THRESHOLD: 99.0%
COLLECTION: Real-time monitoring
REPORTING: Monthly SLA report
IMPORTANCE: ⭐⭐⭐⭐
─────────────────────────────────────────────────────────────────
METRIC 6.2: Average Integration Processing Time
─────────────────────────────────────────────────────────────────
DEFINITION:
Time from Salesforce trigger to workflow project creation
TARGET: <3 minutes average
THRESHOLD: <5 minutes
COLLECTION: Real-time (logged per transaction)
REPORTING: Daily dashboard
IMPORTANCE: ⭐⭐⭐
─────────────────────────────────────────────────────────────────
METRIC 6.3: Error Rate (Integration Failures)
─────────────────────────────────────────────────────────────────
DEFINITION:
Percentage of integration attempts that fail
TARGET: <2%
THRESHOLD: <5%
COLLECTION: Real-time (error logs)
REPORTING: Daily dashboard, Weekly report
IMPORTANCE: ⭐⭐⭐⭐
─────────────────────────────────────────────────────────────────
METRIC 6.4: Mean Time to Resolution (MTTR) for Integration Issues
─────────────────────────────────────────────────────────────────
DEFINITION:
Average time to resolve integration errors
TARGET: <2 hours
THRESHOLD: <4 hours
COLLECTION: Issue tracking system
REPORTING: Weekly
IMPORTANCE: ⭐⭐⭐
═══════════════════════════════════════════════════════════════
METRICS SUMMARY
═══════════════════════════════════════════════════════════════
TOTAL METRICS: 28 tracked metrics across 6 categories
PRIMARY (Must-Have): 5 metrics
├─ Total Onboarding Cycle Time
├─ First 90-Day Churn Rate
├─ Realized ROI
├─ System Adoption Rate
└─ Onboarding CSAT Score
SECONDARY (Important): 12 metrics
SUPPORTING (Nice-to-Have): 11 metrics
COLLECTION:
├─ Automated (Real-time): 18 metrics
├─ Automated (Batch): 6 metrics
└─ Manual (Surveys/Audits): 4 metrics
REPORTING FREQUENCY:
├─ Real-time dashboards: System health, adoption
├─ Daily: Integration performance
├─ Weekly: Operational metrics, adoption trends
├─ Monthly: Business outcomes, CSAT, ROI
└─ Quarterly: Strategic review, cost analysis
═══════════════════════════════════════════════════════════════
Layer 4: Baseline Measurement – Before You Start
═══════════════════════════════════════════════════════════════
BASELINE MEASUREMENT PROTOCOL
═══════════════════════════════════════════════════════════════
CRITICAL PRINCIPLE: You can't prove improvement without baseline.
TIMING: Baseline must be established BEFORE solution launches
Ideally: During Phase 1 (weeks 1-4)
═══════════════════════════════════════════════════════════════
BASELINE DATA COLLECTION PLAN
═══════════════════════════════════════════════════════════════
METRIC CATEGORY 1: Cycle Time Metrics
─────────────────────────────────────────────────────────────────
DATA NEEDED:
• Last 6 months of completed onboardings (Jan-Jun)
• Salesforce: Opportunity Close Date
• Manual tracking: Onboarding completion date (email, spreadsheets)
• Calculate: Close Date → Completion Date for each customer
METHOD:
Week 1: Extract Salesforce closed deals (n=144 deals, 6 months)
Week 1: Interview CS Manager for completion dates (spreadsheet)
Week 2: Match deals to completions, calculate cycle times
Week 2: Analyze: Mean, median, std dev, min, max, percentiles
RESPONSIBLE: Project Manager
DURATION: 8 hours (over 2 weeks)
OUTPUT:
┌─────────────────────────────────────────────────────────┐
│ Onboarding Cycle Time Baseline │
├─────────────────────────────────────────────────────────┤
│ Sample size: 144 customers (Jan-Jun) │
│ Mean: 35.2 days │
│ Median: 34.0 days │
│ Standard deviation: 8.1 days │
│ Range: 22-58 days │
│ 25th percentile: 29 days │
│ 75th percentile: 41 days │
│ │
│ Distribution: │
│ <25 days: 12% |███ │
│ 25-35 days: 58% |█████████████████████ │
│ 35-45 days: 24% |████████ │
│ >45 days: 6% |██ │
│ │
│ Measured: January 1 - June 30 │
│ Documented: [Date] │
│ Validated by: Sarah Chen, VP Customer Success │
└─────────────────────────────────────────────────────────┘
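The statistics in that box reduce to a few calls once the 144 cycle times are assembled. A sketch, assuming the matched deal/completion data from Week 2 is held in a pandas Series:

```python
import pandas as pd

def baseline_summary(cycle_times: pd.Series) -> dict:
    """Summary statistics matching the fields in the baseline box."""
    return {
        "n": len(cycle_times),
        "mean": round(cycle_times.mean(), 1),
        "median": cycle_times.median(),
        "std_dev": round(cycle_times.std(), 1),
        "range": (cycle_times.min(), cycle_times.max()),
        "p25": cycle_times.quantile(0.25),
        "p75": cycle_times.quantile(0.75),
    }

def distribution(cycle_times: pd.Series) -> pd.Series:
    """Percentage of onboardings per bucket, as shown in the box."""
    buckets = pd.cut(cycle_times, [0, 25, 35, 45, float("inf")],
                     labels=["<25 days", "25-35 days", "35-45 days", ">45 days"])
    return buckets.value_counts(normalize=True).mul(100).round(0).sort_index()
```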
─────────────────────────────────────────────────────────────────
METRIC CATEGORY 2: Customer Satisfaction
─────────────────────────────────────────────────────────────────
CHALLENGE: No current survey exists
SOLUTION: Implement baseline survey NOW (before solution launches)
Week 1-2: Design post-onboarding survey
Week 3-4: Send to last 50 customers (retroactive baseline)
Week 4-8: Send to all new customers (establish ongoing baseline)
Survey Questions (2 minutes):
1. Overall satisfaction with onboarding (1-10)
2. NPS: Likelihood to recommend based on onboarding (0-10)
3. Time to value: When did you start getting value? (days)
4. Open-ended: What could have been better?
TARGET RESPONSE RATE: 40% (typical for post-onboarding surveys)
n=50 retroactive + 48 new = 98 surveys sent
Expected responses: ~40 responses
RESPONSIBLE: Change Manager
DURATION: 4 hours (survey design) + ongoing
OUTPUT: Baseline CSAT = X.X/10, Baseline NPS = XX
─────────────────────────────────────────────────────────────────
METRIC CATEGORY 3: Churn Rates
─────────────────────────────────────────────────────────────────
DATA NEEDED:
• All customers onboarded in last 18 months
• Churn status and churn date
• Calculate: % churned within 90 days of onboarding complete
METHOD:
Week 1: Extract from CRM (Finance has this data)
Week 1: Calculate first-90-day churn by monthly cohort
RESPONSIBLE: Project Manager (with Finance support)
DURATION: 4 hours
OUTPUT:
┌─────────────────────────────────────────────────────────┐
│ First 90-Day Churn Rate Baseline │
├─────────────────────────────────────────────────────────┤
│ Sample: 288 customers (last 12 months) │
│ Total churned: 14 customers │
│ Churn rate: 4.9% (rounds to 5%) │
│ │
│ By cohort: │
│ Q1 2024: 6.5% (higher - new product issues) │
│ Q2 2024: 4.2% (improvement) │
│ Q3 2024: 4.1% │
│ Q4 2024: 5.0% (seasonal uptick) │
│ │
│ Industry benchmark: 2.0% │
│ Excess churn: 3.0 percentage points │
└─────────────────────────────────────────────────────────┘
─────────────────────────────────────────────────────────────────
METRIC CATEGORY 4: Operational Efficiency
─────────────────────────────────────────────────────────────────
CHALLENGE: Much of this is not currently measured
SOLUTION: Conduct time study BEFORE solution launches
Week 2-3: Shadow 3 CSRs for 2 hours each (6 hours observation)
Week 2-3: Interview all 8 CSRs (1 hour each, 8 hours)
Week 3: Analyze and document
MEASURE:
• Manual data entry time per customer (timer during observation)
• Daily status inquiry volume (count emails/Slacks, 1 week)
• Time to first contact (sample 20 recent customers)
• CSR satisfaction (baseline survey, all 8 CSRs)
RESPONSIBLE: Change Manager
DURATION: 16 hours (over 2 weeks)
OUTPUT: Documented baseline for 12 operational metrics
═══════════════════════════════════════════════════════════════
BASELINE DOCUMENTATION TEMPLATE
═══════════════════════════════════════════════════════════════
For EACH baseline metric, document:
┌─────────────────────────────────────────────────────────┐
│ BASELINE DOCUMENTATION │
├─────────────────────────────────────────────────────────┤
│ Metric Name: [Specific metric] │
│ │
│ Baseline Value: [Number with units] │
│ Sample Size: [n=X] │
│ Measurement Period: [Date range] │
│ Measurement Method: [How collected] │
│ Data Source: [System, survey, observation] │
│ Measured By: [Name] │
│ Date Measured: [Date] │
│ Validated By: [Stakeholder name] │
│ │
│ Statistical Details: │
│ • Mean: [If applicable] │
│ • Median: [If applicable] │
│ • Std Deviation: [If applicable] │
│ • Range: [Min - Max] │
│ • Confidence: [HIGH/MEDIUM/LOW] │
│ │
│ Notes: │
│ • [Any caveats, limitations, or context] │
│ • [Seasonal variations] │
│ • [Data quality issues] │
│ │
│ Attached: │
│ • Raw data (Excel file) │
│ • Analysis methodology │
│ • Stakeholder sign-off │
└─────────────────────────────────────────────────────────┘
═══════════════════════════════════════════════════════════════
BASELINE VALIDATION & SIGN-OFF
═══════════════════════════════════════════════════════════════
CRITICAL: Get stakeholder agreement on baseline BEFORE launch
Why? Prevents post-launch arguments:
"That's not what it was before!"
"You're measuring it differently now!"
"The baseline was calculated wrong!"
SIGN-OFF PROCESS:
1. Complete all baseline measurements (Week 4)
2. Compile Baseline Report (Week 4)
3. Review with stakeholders:
- VP Customer Success (primary)
- CFO (financial metrics)
- Sales VP (customer/revenue metrics)
4. Incorporate feedback, finalize
5. Formal sign-off (email or document signature)
BASELINE REPORT includes:
• Executive summary (1 page): All primary metrics
• Detailed methodology (5 pages): How each was measured
• Raw data (appendix): Full datasets
• Validation: Stakeholder quotes/sign-offs
DELIVERABLE: "Baseline Measurement Report"
DUE: End of Week 4 (Phase 1)
SIGNED BY: Project Sponsor, VP CS, CFO
This becomes the "source of truth" for all future comparisons.
═══════════════════════════════════════════════════════════════
Layer 5: Data Collection and Dashboard Infrastructure
═══════════════════════════════════════════════════════════════
DATA COLLECTION & DASHBOARD INFRASTRUCTURE
═══════════════════════════════════════════════════════════════
PRINCIPLE: Automate data collection wherever possible
Manual data collection doesn't scale and won't be maintained
═══════════════════════════════════════════════════════════════
DATA COLLECTION ARCHITECTURE
═══════════════════════════════════════════════════════════════
TIER 1: AUTOMATED SYSTEM DATA (18 metrics)
─────────────────────────────────────────────────────────────────
SOURCE: Workflow Platform, Salesforce, Integration Logs
Examples:
• Onboarding cycle time → Calculated from timestamps
• Integration success rate → Logged per transaction
• System adoption → Login and activity logs
• Time to first contact → Task completion timestamps
COLLECTION METHOD:
Option A: Native platform reporting (if sufficient)
Option B: Database queries (nightly aggregation)
Option C: BI tool (Looker, Tableau, PowerBI) pulling from APIs
RECOMMENDED: Option C (BI Tool)
Why?
✓ Flexible (can customize reports)
✓ Automated (scheduled refreshes)
✓ Accessible (stakeholders can view)
✓ Historical (data warehouse for trends)
IMPLEMENTATION:
Week 5-6 (Phase 2): Set up data pipeline
- Connect BI tool to Salesforce (read-only API)
- Connect BI tool to Workflow Platform (API)
- Create data warehouse tables (simple star schema)
- Schedule nightly ETL (Extract, Transform, Load)
Cost:
- BI tool license: $3,000/year (10 users)
- Setup effort: 20 hours (consultant or IT)
- Ongoing: Minimal (automated)
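A sketch of the nightly ETL job under Option C, with placeholder extract functions standing in for the real Salesforce and workflow-platform API clients (those depend on the BI tool and connectors chosen), and SQLite as a stand-in warehouse:

```python
import sqlite3
import pandas as pd

def extract_deals() -> pd.DataFrame:
    """Placeholder: in production, call the Salesforce read-only API."""
    return pd.DataFrame({"customer_id": ["C-001"], "close_date": ["2025-01-05"]})

def extract_projects() -> pd.DataFrame:
    """Placeholder: in production, call the workflow platform API."""
    return pd.DataFrame({"customer_id": ["C-001"], "completion_date": ["2025-01-21"]})

def nightly_etl(db_path: str = "metrics_warehouse.db") -> None:
    # Extract from both systems
    deals, projects = extract_deals(), extract_projects()
    # Transform: join sources and derive the cycle-time fact
    fact = deals.merge(projects, on="customer_id")
    fact["cycle_time_days"] = (
        pd.to_datetime(fact["completion_date"]) - pd.to_datetime(fact["close_date"])
    ).dt.days
    # Load into a fact table the BI tool reads from
    with sqlite3.connect(db_path) as conn:
        fact.to_sql("fact_onboarding", conn, if_exists="replace", index=False)

nightly_etl()
```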
─────────────────────────────────────────────────────────────────
TIER 2: AUTOMATED SURVEY DATA (4 metrics)
─────────────────────────────────────────────────────────────────
SOURCE: Post-onboarding surveys, Pulse surveys
Examples:
• Customer CSAT → Survey sent on onboarding complete
• Customer NPS → Same survey
• CSR satisfaction → Monthly pulse survey
• Time to value → Customer survey question
COLLECTION METHOD:
Survey platform (SurveyMonkey, Typeform, Qualtrics)
AUTOMATION:
Trigger: Onboarding status = "Complete" in workflow tool
Action: Webhook → Survey platform → Send survey
Response: Auto-aggregated in survey platform
BI Tool: Pull survey results via API (nightly)
IMPLEMENTATION:
Week 4-5 (Phase 2): Set up surveys and automation
- Design surveys (2 hours)
- Configure triggers (4 hours)
- Connect to BI tool (4 hours)
Cost:
- Survey tool: $1,200/year
- Setup: 10 hours
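The trigger-to-survey chain might look like the sketch below; the webhook URL and payload shape are hypothetical, since each survey platform defines its own API:

```python
import requests

SURVEY_ENDPOINT = "https://surveys.example.com/api/send"  # hypothetical URL

def on_onboarding_complete(customer_id: str, contact_email: str) -> None:
    """Fired when the workflow tool marks an onboarding 'Complete';
    sends the post-onboarding CSAT/NPS survey."""
    payload = {
        "survey": "post-onboarding-v1",
        "recipient": contact_email,
        # Carry the customer ID so responses can be joined back in the BI tool.
        "metadata": {"customer_id": customer_id},
    }
    requests.post(SURVEY_ENDPOINT, json=payload, timeout=10).raise_for_status()
```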
─────────────────────────────────────────────────────────────────
TIER 3: SEMI-AUTOMATED DATA (3 metrics)
─────────────────────────────────────────────────────────────────
SOURCE: Manual tracking with structured input
Examples:
• Status inquiry volume → CSRs log in shared spreadsheet
• Manual data entry overrides → Log when integration bypassed
• Specific error types → Categorized in issue tracker
COLLECTION METHOD:
Structured logging (Google Sheet, Airtable, or custom form)
Why semi-automated?
- Activity happens outside systems (emails, calls)
- Requires human judgment (categorization)
- Low volume (sustainable to log manually)
IMPLEMENTATION:
- Create logging template (1 hour)
- Train team on logging (1 hour)
- Integrate with BI tool (simple CSV export/import)
Cost: Minimal (use existing tools)
Ongoing effort: 5 min/day per CSR (40 min total/day)
─────────────────────────────────────────────────────────────────
TIER 4: PERIODIC MANUAL DATA (3 metrics)
─────────────────────────────────────────────────────────────────
SOURCE: Manual audits, financial analysis
Examples:
• Data accuracy audit → Weekly sample audit (10 records)
• Annual cost calculation → Quarterly finance review
• Competitive benchmarking → Annual external research
COLLECTION METHOD:
Defined procedures, owner responsible
IMPLEMENTATION:
- Document audit procedures (2 hours)
- Assign owners (in RACI)
- Calendar reminders
Cost: Time cost only (budgeted in resource plan)
═══════════════════════════════════════════════════════════════
DASHBOARD DESIGN
═══════════════════════════════════════════════════════════════
THREE-TIER DASHBOARD APPROACH:
1. Executive Dashboard (Strategic)
2. Manager Dashboard (Operational)
3. Team Dashboard (Activity)
─────────────────────────────────────────────────────────────────
DASHBOARD 1: EXECUTIVE DASHBOARD
─────────────────────────────────────────────────────────────────
AUDIENCE: COO, CFO, VP Customer Success, Board
UPDATE FREQUENCY: Monthly (with real-time option)
CONTENT (Single Page):
┌─────────────────────────────────────────────────────────┐
│ EXECUTIVE DASHBOARD │
│ Customer Onboarding Performance │
│ Last Updated: [Date] │
├─────────────────────────────────────────────────────────┤
│ │
│ PRIMARY METRICS │
│ │
│ Avg Onboarding Time [16 days] ▼ 54% │
│ ████████████████▓▓▓▓▓▓▓▓▓ Target: 15d │
│ Baseline: 35d | Current: 16d | Trend: ↓ │
│ │
│ First 90-Day Churn [2.8%] ▼ 44% │
│ █████▓▓▓▓▓ Target: 2.0% │
│ Baseline: 5.0% | Current: 2.8% | Trend: ↓ │
│ │
│ CS Capacity [4.5 cust/CSR] ▲ 50% │
│ ████████████████▓▓▓▓ Target: 5.0 │
│ Baseline: 3.0 | Current: 4.5 | Trend: ↑ │
│ │
│ Customer Satisfaction [8.1/10] ▲ 19% │
│ ████████████████████▓▓ Target: 8.5 │
│ Baseline: 6.8 | Current: 8.1 | Trend: ↑ │
│ │
│ Investment Recovered [87%] │
│ Investment: $435K | Benefits YTD: $378K │
│ Projected Breakeven: Month 6 | On Track: ✓ │
│ │
├─────────────────────────────────────────────────────────┤
│ TREND GRAPHS (Last 6 Months) │
│ │
│ [Line graph: Onboarding cycle time trending down] │
│ [Line graph: Churn rate trending down] │
│ [Line graph: Cumulative ROI trending up] │
│ │
├─────────────────────────────────────────────────────────┤
│ KEY INSIGHTS │
│ • 54% cycle time improvement achieved (target: 57%) │
│ • Adoption rate strong: 89% of CSRs fully transitioned│
│ • Remaining gap: Technical setup still 6.5d (target 5d)│
│ • Action: Working with Implementation to optimize │
└─────────────────────────────────────────────────────────┘
VISUALIZATION PRINCIPLES:
✓ One page (no scrolling)
✓ Visual (graphs > tables)
✓ Trends (not just point-in-time)
✓ Context (baseline, target, current)
✓ Traffic lights (red/yellow/green status)
✓ Insights (not just data - so what?)
─────────────────────────────────────────────────────────────────
DASHBOARD 2: MANAGER DASHBOARD (OPERATIONAL)
─────────────────────────────────────────────────────────────────
AUDIENCE: VP Customer Success, CS Manager, Implementation Manager
UPDATE FREQUENCY: Weekly (with daily view option)
CONTENT (2-3 Pages):
PAGE 1: Current Onboardings (Pipeline View)
• All active onboardings (status, owner, days in progress)
• SLA compliance (green/yellow/red by milestone)
• Bottleneck identification (where are delays?)
• At-risk customers (stalled >X days)
PAGE 2: Performance Metrics
• Cycle time by stage (breakdown)
• Error rates and types
• CSR workload (current assignments)
• System health (integration success, uptime)
PAGE 3: Team Performance
• CSR-level metrics (anonymized or named, per policy)
• Adoption tracking (who's using the system)
• Quality metrics (error rates by CSR if appropriate)
INTERACTIVITY:
✓ Drill-down (click metric → see details)
✓ Filters (by CSR, by customer type, by date range)
✓ Alerts (email when SLA violated)
─────────────────────────────────────────────────────────────────
DASHBOARD 3: TEAM DASHBOARD (ACTIVITY)
─────────────────────────────────────────────────────────────────
AUDIENCE: CSRs, Implementation Team, Support
UPDATE FREQUENCY: Real-time
CONTENT:
• My active onboardings (personal view)
• My tasks due today/this week
• Recent completions (celebrate wins)
• System status (integration health)
• Quick stats (my personal metrics)
PRINCIPLE: Actionable (not just FYI)
- Shows what I need to do today
- Surfaces problems I need to address
- Tracks my performance (if helpful, not punitive)
═══════════════════════════════════════════════════════════════
DASHBOARD IMPLEMENTATION PLAN
═══════════════════════════════════════════════════════════════
WEEK 5-6 (Phase 2): Build Dashboard Infrastructure
├─ Connect data sources (Salesforce, Workflow, Surveys)
├─ Create data warehouse/pipeline
├─ Build Executive Dashboard (MVP)
├─ Build Manager Dashboard (MVP)
└─ User acceptance testing
WEEK 7-8: Refine and Launch
├─ Incorporate feedback
├─ Build Team Dashboard
├─ Training (how to use dashboards)
└─ Launch
POST-LAUNCH: Iterate
├─ Monthly review: Are dashboards useful?
├─ Add/remove metrics based on usage
└─ Continuous improvement
BUDGET:
BI Tool License: $3,000/year
Dashboard Development: 40 hours @ $150/hr = $6,000
Data Pipeline Setup: 20 hours @ $150/hr = $3,000
TOTAL: $9,000 setup (one-time) + $3,000/year license ($12K in year one)
═══════════════════════════════════════════════════════════════
The Measurement Cadence and Review Process
═══════════════════════════════════════════════════════════════
MEASUREMENT CADENCE & REVIEW PROCESS
═══════════════════════════════════════════════════════════════
PRINCIPLE: Regular rhythm of measurement → review → action
═══════════════════════════════════════════════════════════════
DAILY MONITORING (Real-Time)
═══════════════════════════════════════════════════════════════
WHO: IT Operations, Project Manager (during first month)
WHAT TO MONITOR:
• Integration health (success rate, errors, downtime)
• System performance (response times, API limits)
• Critical issues (any failures requiring immediate action)
WHERE: Technical dashboard (always-on display)
ACTION TRIGGERS:
⚠ Integration success rate <95% → Alert IT, investigate
⚠ System downtime >5 minutes → Page on-call
⚠ Error spike (>5 failures/hour) → Escalate to Technical Lead
DURATION: Heavy monitoring first 2 weeks post-launch
Then: Shift to weekly reviews with alerts for issues
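Those action triggers translate directly into a scheduled check. A sketch using the thresholds above; the print statements stand in for whatever paging or chat tool is in use:

```python
def check_integration_health(successes: int, failures: int) -> list[str]:
    """Evaluate the last hour of integration activity against the
    daily-monitoring action triggers; return any alerts raised."""
    alerts = []
    total = successes + failures
    if total and successes / total < 0.95:
        alerts.append("Integration success rate <95% -> alert IT, investigate")
    if failures > 5:
        alerts.append("Error spike (>5 failures/hour) -> escalate to Technical Lead")
    return alerts

for alert in check_integration_health(successes=90, failures=8):
    print("ALERT:", alert)  # in production: page on-call / post to Slack
```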
═══════════════════════════════════════════════════════════════
WEEKLY TEAM REVIEW (Tactical)
═══════════════════════════════════════════════════════════════
WHO: Project Manager, CS Manager, Tech Lead, Implementation Lead
WHEN: Every Friday, 10 AM, 30 minutes
AGENDA:
1. Review operational metrics (10 min)
- Onboarding cycle times this week
- SLA compliance
- Error rates and types
- System adoption
2. Discuss issues and blockers (10 min)
- What's not working?
- What patterns are we seeing?
- Any user complaints?
3. Action items (10 min)
- What do we need to fix this week?
- Who owns each action?
- When will it be resolved?
OUTPUT: Action item list, owners, due dates
═══════════════════════════════════════════════════════════════
MONTHLY BUSINESS REVIEW (Strategic)
═══════════════════════════════════════════════════════════════
WHO: Steering Committee (COO, CFO, VP CS, VP Sales, CTO, PM)
WHEN: First Tuesday of each month, 60 minutes
AGENDA:
1. Executive Dashboard Review (15 min)
- Primary metrics vs. targets
- Month-over-month trends
- ROI tracking
2. Deep Dive: One Metric (15 min)
- Rotate monthly (different metric each time)
- Example Month 1: Cycle time (why are we at 18d not 15d?)
- Root cause analysis
- Improvement plan
3. Wins and Challenges (15 min)
- What went well this month?
- What's concerning?
- User feedback highlights
4. Decisions and Actions (15 min)
- Any course corrections needed?
- Budget/scope changes?
- Next month priorities
PREPARATION:
Project Manager prepares:
- Updated Executive Dashboard (PDF)
- Month-over-month comparison
- 3 slides: Metrics, Insights, Recommendations
DELIVERABLE: Monthly Status Report (distributed 2 days before meeting)
═══════════════════════════════════════════════════════════════
QUARTERLY BUSINESS REVIEW (Strategic + Planning)
═══════════════════════════════════════════════════════════════
WHO: Executive Team + Board (if appropriate)
WHEN: End of each quarter, 90 minutes
AGENDA:
1. Quarter in Review (20 min)
- All primary metrics vs. baselines and targets
- Quarter-over-quarter trends
- Cumulative ROI and financial impact
2. Success Stories (15 min)
- Specific customer examples
- Team feedback and testimonials
- Quantified wins
3. Lessons Learned (15 min)
- What worked well?
- What didn't work?
- What would we do differently?
4. Benchmarking (10 min)
- How do we compare to industry?
- Where are we best-in-class?
- Where are we lagging?
5. Next Quarter Planning (20 min)
- Improvement priorities
- New initiatives
- Resource needs
6. Long-term Strategy (10 min)
- Is this sustainable?
- What's next? (Phase 3, adjacent problems)
DELIVERABLE: Quarterly Business Review (15-page report)
═══════════════════════════════════════════════════════════════
POST-IMPLEMENTATION REVIEW (6 Months)
═══════════════════════════════════════════════════════════════
WHO: Full project team + stakeholders
WHEN: 6 months post-go-live
PURPOSE: Comprehensive evaluation of project success
AGENDA (2-hour working session):
1. Metrics Review (30 min)
- Did we achieve targets?
- What exceeded expectations?
- What fell short?
2. Original Business Case Validation (30 min)
- Projected ROI vs. Actual ROI
- Projected benefits vs. Actual benefits
- Were assumptions correct?
3. Stakeholder Feedback (30 min)
- CSR team satisfaction
- Customer feedback
- Executive satisfaction
- What would they change?
4. Lessons Learned (20 min)
- Implementation lessons
- Adoption lessons
- Technical lessons
5. Sustainability Plan (10 min)
- Is solution being maintained?
- Is continuous improvement happening?
- Any concerns about regression?
DELIVERABLE: Post-Implementation Review Report
- Comprehensive success evaluation
- Validated ROI
- Lessons learned
- Sustainability recommendations
═══════════════════════════════════════════════════════════════
ACTION-ORIENTED REVIEW PRINCIPLES
═══════════════════════════════════════════════════════════════
Reviews should ALWAYS result in action:
❌ BAD REVIEW: "Cycle time is 18 days, target is 15 days. Moving on..."
✓ GOOD REVIEW: "Cycle time is 18 days vs 15 day target. Root cause:
Technical setup averaging 7 days vs 5 day target. Action:
Implementation team to analyze bottlenecks, report back next week."
EVERY METRIC WITH RED STATUS REQUIRES:
1. Root cause identification
2. Action plan with owner
3. Timeline for resolution
4. Follow-up scheduled
MEASUREMENT WITHOUT ACTION IS WASTE.
═══════════════════════════════════════════════════════════════
The Meta-Principle: What Gets Measured Gets Managed, What Gets Managed Gets Improved
The maxim famously attributed to Peter Drucker: “What gets measured gets managed.”
The corollary: “What doesn’t get measured gets ignored.”
Success metrics and measurement frameworks are the difference between:
❌ Projects that “seem like they helped”
- Vague claims of improvement
- No proof of value
- Can’t justify follow-on work
- Stakeholders skeptical
✓ Projects that demonstrably delivered value
- Specific, quantified improvements
- Proven ROI with evidence
- Clear path to continuous improvement
- Stakeholders confident
The best measurement frameworks:
- Start with the problem (metrics aligned to original pain)
- Establish baseline before (can’t prove improvement without it)
- Automate collection (sustainability requires automation)
- Review regularly (create rhythm of measurement → insight → action)
- Tell the story (data + context + insights, not just numbers)
- Drive action (measurement that doesn’t lead to improvement is waste)
Create comprehensive metrics. Measure rigorously. Review regularly. Act on insights. Prove value.
That’s how you turn projects into partnerships.
What concerns you most about success metrics? Choosing the right metrics? Collecting baseline data? Automating data collection? Building dashboards? Getting stakeholder agreement? Proving ROI? Handling metrics that don’t improve? Creating sustainable measurement systems?