Improving Team Performance with Metrics
Metrics illuminate team performance and guide improvement when used thoughtfully. GitScrum provides team analytics including velocity, cycle time, and quality metrics that help teams identify opportunities for improvement without fostering harmful competition.
Metrics Philosophy
Healthy Metrics Culture
METRICS PRINCIPLES:
┌─────────────────────────────────────────────────────────────┐
│ │
│ DO: │
│ ✓ Use metrics for team learning and improvement │
│ ✓ Focus on trends over time, not absolute values │
│ ✓ Let teams choose and own their metrics │
│ ✓ Discuss metrics in retrospectives │
│ ✓ Celebrate improvement │
│ ✓ Use multiple metrics (balanced scorecard) │
│ │
│ DON'T: │
│ ✗ Compare individuals using metrics │
│ ✗ Tie metrics directly to compensation │
│ ✗ Use metrics punitively │
│ ✗ Optimize for single metrics │
│ ✗ Ignore context when interpreting │
│ ✗ Set arbitrary targets │
│ │
│ GOODHART'S LAW: │
│ "When a measure becomes a target, it ceases to be │
│ a good measure." │
│ │
│ Example: If you measure lines of code, people will │
│ write more verbose code, not better code. │
└─────────────────────────────────────────────────────────────┘
Metrics Categories
BALANCED METRICS FRAMEWORK:
┌─────────────────────────────────────────────────────────────┐
│ │
│ DELIVERY: │
│ How much work is getting done? │
│ • Velocity (story points/sprint) │
│ • Throughput (items/week) │
│ • Sprint completion rate │
│ │
│ SPEED: │
│ How fast does work flow through? │
│ • Cycle time (start to done) │
│ • Lead time (request to done) │
│ • Review turnaround │
│ │
│ QUALITY: │
│ How good is the work? │
│ • Defect rate (bugs per feature) │
│ • Escaped defects (bugs in production) │
│ • Technical debt ratio │
│ │
│ TEAM HEALTH: │
│ How is the team doing? │
│ • Team satisfaction surveys │
│ • Turnover rate │
│ • Focus time (uninterrupted hours) │
│ │
│ BALANCE: Improving one shouldn't hurt others │
│ Example: Faster delivery shouldn't increase defects │
└─────────────────────────────────────────────────────────────┘
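The balance principle above can be automated as a simple cross-check. The sketch below is a minimal example, assuming hypothetical metric names (`velocity`, `escaped_defects`) pulled from whatever export your tool provides; GitScrum's actual field names may differ.

```python
def balance_check(prev, curr):
    """Flag when a delivery gain coincides with a quality drop.

    prev / curr: per-sprint metric dicts with 'velocity'
    (higher is better) and 'escaped_defects' (lower is better).
    Field names are illustrative, not a real GitScrum schema.
    """
    faster = curr["velocity"] > prev["velocity"]
    buggier = curr["escaped_defects"] > prev["escaped_defects"]
    if faster and buggier:
        return "warning: faster delivery, but more escaped defects"
    return "balanced"

# Velocity rose 34 -> 40 while escaped defects rose 2 -> 5: flag it.
print(balance_check({"velocity": 34, "escaped_defects": 2},
                    {"velocity": 40, "escaped_defects": 5}))
```

A check like this works best as a retro conversation starter, not an automated gate.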
Key Metrics
Velocity Tracking
VELOCITY ANALYSIS:
┌─────────────────────────────────────────────────────────────┐
│ │
│ SPRINT VELOCITY TREND: │
│ │
│ Sprint 20: ████████████████░░░░ 32 pts │
│ Sprint 21: █████████████████░░░ 34 pts │
│ Sprint 22: ███████████████░░░░░ 30 pts (holidays) │
│ Sprint 23: ██████████████████░░ 36 pts │
│ Sprint 24: ███████████████████░ 38 pts │
│ │
│ Average: 34 pts | Range: 30-38 | Trend: Stable ↑ │
│ │
│ INTERPRETING VELOCITY: │
│ │
│ ✓ GOOD USE: │
│ • Capacity planning for next sprint │
│ • Tracking team's own improvement │
│ • Identifying unusual sprints (investigate) │
│ │
│ ✗ BAD USE: │
│ • Comparing teams (points aren't universal) │
│ • Setting velocity targets (gaming) │
│ • Judging individual performance │
│ │
│ VARIATION IS NORMAL: │
│ • Holidays, illness, context switching │
│ • Complex vs simple work mix │
│ • Team composition changes │
└─────────────────────────────────────────────────────────────┘
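The summary line above (average, range, unusual sprints) is straightforward to compute from sprint history. A minimal sketch, using the sprint data from the chart; the 15% deviation threshold for "unusual" is an arbitrary example, not a GitScrum default.

```python
def velocity_summary(sprints):
    """Summarize velocity history for the team's own planning.

    sprints: list of (sprint_name, points) tuples. Sprints deviating
    more than 15% from average are flagged as worth discussing
    (e.g. holidays), not as failures.
    """
    points = [p for _, p in sprints]
    avg = sum(points) / len(points)
    unusual = [name for name, p in sprints if abs(p - avg) / avg > 0.15]
    return {
        "average": avg,
        "range": (min(points), max(points)),
        "unusual": unusual,
    }

history = [("S20", 32), ("S21", 34), ("S22", 30), ("S23", 36), ("S24", 38)]
print(velocity_summary(history))
# average 34, range (30, 38), no sprint more than 15% off average
```

Note that Sprint 22's dip (holidays) stays within the threshold here, which is the point: variation is normal and only large deviations need a conversation.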
Cycle Time
CYCLE TIME ANALYSIS:
┌─────────────────────────────────────────────────────────────┐
│ │
│ CYCLE TIME BREAKDOWN: │
│ │
│ Stage │ Average │ Target │ Status │
│───────────────┼──────────┼─────────┼──────────────────────│
│ In Progress │ 2.5 days │ 3 days │ ✅ Good │
│ Code Review │ 1.8 days │ 1 day │ ⚠️ Slow (bottleneck)│
│ QA Testing │ 1.2 days │ 2 days │ ✅ Good │
│ Staging │ 0.5 days │ 1 day │ ✅ Good │
│───────────────┼──────────┼─────────┼──────────────────────│
│ TOTAL │ 6.0 days │ 7 days │ ✅ On target │
│ │
│ CYCLE TIME HISTOGRAM (last 50 items): │
│ │
│ 0-3 days: ████████████ 12 items │
│ 3-5 days: ████████████████████████ 24 items │
│ 5-7 days: ████████ 8 items │
│ 7+ days: ██████ 6 items │
│ │
│ INSIGHT: Code review is longest stage │
│ ACTION: Discuss in retro - review capacity? │
│ │
│ OUTLIERS: Items taking 7+ days │
│ • #234: Blocked on external API (not team issue) │
│ • #256: Complex feature (expected) │
│ • #267: Unclear requirements (process issue) │
└─────────────────────────────────────────────────────────────┘
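The stage breakdown above reduces to averaging per-stage durations and comparing each against its target. A sketch, assuming per-item stage timings are available as plain dicts (the two sample items are hypothetical, chosen so the averages match the table):

```python
def cycle_time_report(items, targets):
    """Aggregate per-stage cycle times and flag bottlenecks.

    items: list of dicts mapping stage name -> days spent.
    targets: dict mapping stage name -> target days.
    A stage is a bottleneck when its average exceeds its target.
    """
    report = {}
    for stage, target in targets.items():
        avg = sum(item[stage] for item in items) / len(items)
        report[stage] = {
            "average": round(avg, 1),
            "target": target,
            "bottleneck": avg > target,
        }
    return report

targets = {"In Progress": 3, "Code Review": 1, "QA Testing": 2, "Staging": 1}
items = [  # hypothetical per-item timings in days
    {"In Progress": 2.0, "Code Review": 1.5, "QA Testing": 1.0, "Staging": 0.5},
    {"In Progress": 3.0, "Code Review": 2.1, "QA Testing": 1.4, "Staging": 0.5},
]
report = cycle_time_report(items, targets)
bottlenecks = [s for s, r in report.items() if r["bottleneck"]]
print(bottlenecks)  # only Code Review exceeds its target
```

As in the table, the overall total can be on target while one stage is still the bottleneck, which is why the per-stage view matters.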
Quality Metrics
QUALITY DASHBOARD:
┌─────────────────────────────────────────────────────────────┐
│ │
│ DEFECT METRICS: │
│ │
│ Bugs found in QA: 8 (vs 12 last sprint) ↓ Good │
│ Bugs found in prod: 2 (vs 4 last sprint) ↓ Good │
│ Bugs per feature: 0.4 (target: <0.5) ✅ │
│ │
│ BUG SEVERITY (Production): │
│ Critical: 0 | High: 1 | Medium: 1 | Low: 0 │
│ │
│ QUALITY TREND (6 sprints): │
│ Escaped defects per sprint: │
│ S19: ████ 4 │
│ S20: ████ 4 │
│ S21: ██████ 6 (new team member) │
│ S22: ███ 3 │
│ S23: ██ 2 │
│ S24: ██ 2 │
│ │
│ CODE COVERAGE: │
│ Overall: 78% (target: 80%) │
│ New code: 85% │
│ Critical paths: 95% │
│ │
│ TECHNICAL DEBT: │
│ Debt ratio: 12% (healthy: <15%) │
│ Sprint debt items: 3 completed, 2 added │
└─────────────────────────────────────────────────────────────┘
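The headline defect numbers can be derived from a handful of counts. A minimal sketch; the 25-features-shipped figure is a hypothetical count chosen so the ratio matches the dashboard, while the bug counts (8 in QA, 2 in production, 4 last sprint) come from the dashboard itself.

```python
def quality_snapshot(features_shipped, bugs_qa, bugs_prod, prev_bugs_prod):
    """Basic defect metrics for one sprint.

    bugs_per_feature counts every bug found (QA + production)
    against features shipped; escaped defects are production bugs.
    """
    total_bugs = bugs_qa + bugs_prod
    return {
        "bugs_per_feature": round(total_bugs / features_shipped, 2),
        "escaped_defects": bugs_prod,
        "escaped_trend": "improving" if bugs_prod < prev_bugs_prod else "watch",
    }

# 25 features shipped is a hypothetical count for illustration.
snapshot = quality_snapshot(25, 8, 2, 4)
print(snapshot)  # bugs_per_feature 0.4, 2 escaped, trend improving
```

Whether "bugs per feature" should count QA-caught bugs is a team decision; the point is to pick one definition and track it consistently.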
Using Metrics
Retrospective Integration
METRICS IN RETROSPECTIVES:
┌─────────────────────────────────────────────────────────────┐
│ │
│ SPRINT METRICS REVIEW (5-10 min of retro): │
│ │
│ 1. SHOW THE DATA: │
│ Display key metrics for the sprint │
│ Compare to recent average │
│ │
│ 2. NOTABLE PATTERNS: │
│ "Cycle time increased by 30%" │
│ "Zero escaped defects this sprint" │
│ "Review time doubled" │
│ │
│ 3. TEAM INTERPRETATION: │
│ "Why do we think review time increased?" │
│ - "Complex features this sprint" │
│ - "Two reviewers were out sick" │
│ - "New patterns needed more discussion" │
│ │
│ 4. ACTION IF NEEDED: │
│ "Should we do anything differently?" │
│ - Maybe: "Schedule review time blocks" │
│ - Maybe: "This was exceptional, no action" │
│ │
│ KEY: Team interprets their own data │
│ Management doesn't dictate meaning │
└─────────────────────────────────────────────────────────────┘
Improvement Experiments
DATA-DRIVEN EXPERIMENTS:
┌─────────────────────────────────────────────────────────────┐
│ │
│ PROBLEM: Code review taking too long (1.8 days avg) │
│ │
│ EXPERIMENT: │
│ "If we add daily 30-min review block, we expect │
│ review cycle time to drop to under 1 day." │
│ │
│ SETUP: │
│ • Try for 4 weeks │
│ • Measure: Review cycle time │
│ • Also watch: Overall velocity (shouldn't drop) │
│ │
│ RESULTS: │
│ │
│ Week 1: 1.5 days (↓ 0.3 days) │
│ Week 2: 1.2 days (↓ 0.6 days) │
│ Week 3: 1.0 days (↓ 0.8 days) │
│ Week 4: 0.9 days (↓ 0.9 days) ✅ Target achieved │
│ │
│ Velocity impact: Stable (no degradation) │
│ │
│ DECISION: Adopt practice permanently │
│ │
│ ANTI-PATTERN: Making changes without measuring │
│ "We think it's better" isn't data │
└─────────────────────────────────────────────────────────────┘
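The adopt-or-revisit decision above can be made explicit in code. A sketch using the weekly results from the experiment; the 10% velocity-stability rule is a hypothetical threshold, and the weekly velocity figures are illustrative (the source only states velocity was stable).

```python
def evaluate_experiment(weekly_results, target, velocities):
    """Decide whether an improvement experiment met its hypothesis.

    weekly_results: measured review cycle times (days) per week.
    target: the threshold from the hypothesis (e.g. under 1 day).
    velocities: weekly velocity, watched for side effects.
    """
    target_met = weekly_results[-1] < target
    # Hypothetical stability rule: no week more than 10% below average.
    avg = sum(velocities) / len(velocities)
    velocity_stable = all(v >= 0.9 * avg for v in velocities)
    return "adopt" if target_met and velocity_stable else "revisit"

decision = evaluate_experiment(
    weekly_results=[1.5, 1.2, 1.0, 0.9],  # from the experiment above
    target=1.0,
    velocities=[34, 35, 33, 36],          # hypothetical weekly figures
)
print(decision)  # "adopt": final week under target, velocity stable
```

Writing the hypothesis as a testable condition up front is what separates an experiment from "we think it's better".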