Creating Effective Definition of Done
When "done" means different things to different people, you get incomplete work, surprised stakeholders, and technical debt. A clear Definition of Done (DoD) creates shared understanding of what completeness looks like, ensuring consistent quality and predictable delivery.
Why DoD Matters
| Without Clear DoD | With Clear DoD |
|---|---|
| "I thought it was done" | Shared understanding |
| Works on my machine | Works everywhere |
| Missing tests | Quality built in |
| No documentation | Knowledge captured |
| Bugs found in production | Bugs caught early |
DoD Components
Categories to Cover
DEFINITION OF DONE CATEGORIES
═════════════════════════════
CODE QUALITY:
├── Code compiles without errors
├── No linting warnings
├── Follows coding standards
├── No obvious code smells
└── Self-documenting code
TESTING:
├── Unit tests written and passing
├── Integration tests passing
├── No decrease in coverage
├── Edge cases tested
└── Manual testing complete
CODE REVIEW:
├── PR created with description
├── Review requested
├── Feedback addressed
├── At least 1 approval
└── No unresolved comments
DOCUMENTATION:
├── Code comments where needed
├── README updated if needed
├── API docs updated
├── User docs updated (if UI)
└── Changelog entry added
DEPLOYMENT:
├── Merged to main branch
├── Deployed to staging
├── Smoke test passed
├── No errors in monitoring
└── Ready for production
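The categories above can also be kept in machine-readable form so every new task starts from the same checklist. A minimal sketch in Python (the data structure and function names here are illustrative assumptions, not part of any GitScrum API):

```python
# Hypothetical sketch: keep DoD categories as data so every new task
# gets the same checklist. Structure and names are illustrative.
DOD_CATEGORIES = {
    "Code quality": [
        "Code compiles without errors",
        "No linting warnings",
    ],
    "Testing": [
        "Unit tests written and passing",
        "No decrease in coverage",
    ],
    "Code review": [
        "PR created with description",
        "At least 1 approval",
    ],
}

def render_checklist(categories: dict[str, list[str]]) -> str:
    """Render the DoD as a markdown checklist for a task description."""
    lines = ["## Definition of Done"]
    for category, items in categories.items():
        lines.append(f"### {category}")
        lines.extend(f"- [ ] {item}" for item in items)
    return "\n".join(lines)
```

Keeping the list in one place means updating the DoD updates every future task template, instead of chasing stale copies.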
Sample Definitions
Basic DoD (Small Team)
DEFINITION OF DONE (Basic)
══════════════════════════
Before marking a task "Done":
CODE:
- [ ] Code compiles without errors
- [ ] No linting errors
- [ ] Follows team style guide
TESTING:
- [ ] Tests written for new code
- [ ] All tests passing
- [ ] Manual testing complete
REVIEW:
- [ ] PR approved by 1 reviewer
- [ ] Feedback addressed
DEPLOY:
- [ ] Merged to main
- [ ] Deployed to staging
- [ ] Verified working
Comprehensive DoD (Enterprise)
DEFINITION OF DONE (Enterprise)
════════════════════════════════
DEVELOPMENT COMPLETE:
- [ ] Feature complete per acceptance criteria
- [ ] Code compiles in CI pipeline
- [ ] No static analysis warnings
- [ ] Tech debt documented (if any)
- [ ] Feature flag configured (if applicable)
TESTING COMPLETE:
- [ ] Unit tests: >80% coverage on new code
- [ ] Integration tests written and passing
- [ ] E2E tests for user flows (if UI)
- [ ] Performance tested (if applicable)
- [ ] Security scan passed
- [ ] Accessibility tested (WCAG 2.1 AA)
REVIEW COMPLETE:
- [ ] PR description complete
- [ ] 2+ code reviews approved
- [ ] All review comments resolved
- [ ] Architecture approval (if major change)
- [ ] Security review (if data handling)
DOCUMENTATION COMPLETE:
- [ ] Inline code documentation
- [ ] API documentation (if API change)
- [ ] User documentation (if UI change)
- [ ] Runbook updated (if ops impact)
- [ ] ADR created (if architectural decision)
DEPLOYMENT COMPLETE:
- [ ] Merged to main branch
- [ ] CI/CD pipeline green
- [ ] Deployed to staging
- [ ] QA sign-off on staging
- [ ] Monitoring alerts configured
- [ ] Ready for production deployment
HANDOFF COMPLETE:
- [ ] Product owner verified
- [ ] Demo recorded (if major feature)
- [ ] Support team informed (if customer-facing)
- [ ] Release notes drafted
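Items like "Unit tests: >80% coverage on new code" are good candidates for mechanical enforcement in CI rather than manual ticking. A minimal sketch of such a gate (the threshold and the line-count inputs are assumptions, not any specific coverage tool's report format):

```python
# Hypothetical coverage gate for the ">80% coverage on new code" item.
# Inputs are plain line counts; real pipelines would extract these from
# a coverage report (format varies by tool).
COVERAGE_THRESHOLD = 0.80

def new_code_coverage(covered_lines: int, total_lines: int) -> float:
    """Fraction of changed/new lines exercised by tests."""
    if total_lines == 0:
        return 1.0  # nothing new to cover counts as fully covered
    return covered_lines / total_lines

def meets_coverage_dod(covered_lines: int, total_lines: int) -> bool:
    """True when new-code coverage exceeds the agreed threshold."""
    return new_code_coverage(covered_lines, total_lines) > COVERAGE_THRESHOLD
```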
DoD by Task Type
DoD BY TASK TYPE
════════════════
BUG FIX:
├── Root cause identified
├── Fix implemented
├── Regression test added
├── Related code checked for the same defect

└── Deployed and verified
FEATURE:
├── All acceptance criteria met
├── Comprehensive tests
├── Documentation updated
├── Reviewed and approved
└── Product sign-off
TECHNICAL DEBT:
├── Problem addressed
├── No functionality change
├── Tests still passing
├── Performance maintained
└── Documented learnings
DOCUMENTATION:
├── Content accurate
├── Spell-checked
├── Links verified
├── Reviewed by SME
└── Published
SPIKE/RESEARCH:
├── Question answered
├── Findings documented
├── Recommendation made
├── Team informed
└── Follow-up tasks created
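Because each task type gets its own checklist, a simple lookup keyed by type keeps the variants consistent. A minimal sketch (the type labels and fallback choice are illustrative assumptions):

```python
# Hypothetical mapping from task type to its DoD checklist items.
# Labels and the fallback-to-feature choice are illustrative.
DOD_BY_TYPE = {
    "bug": [
        "Root cause identified",
        "Fix implemented",
        "Regression test added",
        "Deployed and verified",
    ],
    "feature": [
        "All acceptance criteria met",
        "Comprehensive tests",
        "Documentation updated",
        "Product sign-off",
    ],
    "spike": [
        "Question answered",
        "Findings documented",
        "Follow-up tasks created",
    ],
}

def dod_for(task_type: str) -> list[str]:
    """Return the checklist for a task type, defaulting to the feature DoD."""
    return DOD_BY_TYPE.get(task_type.lower(), DOD_BY_TYPE["feature"])
```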
Implementing DoD
Creating Your DoD
DOD CREATION PROCESS
════════════════════
STEP 1: Team Workshop (60 min)
─────────────────────────────────────
- What problems have incomplete work caused?
- What do we always forget?
- What should "done" look like?
STEP 2: Draft Categories
─────────────────────────────────────
Group suggestions into:
├── Code
├── Testing
├── Review
├── Documentation
└── Deployment
STEP 3: Prioritize
─────────────────────────────────────
- Must have (non-negotiable)
- Should have (standard)
- Nice to have (aspirational)
STEP 4: Start Small
─────────────────────────────────────
Begin with 5-8 items. Expand as
team matures and capabilities grow.
STEP 5: Make Visible
─────────────────────────────────────
- Post in project wiki
- Link from GitScrum
- Reference in standup
- Review in retros
DoD in GitScrum
ENFORCING DoD IN GITSCRUM
═════════════════════════
TASK TEMPLATE:
─────────────────────────────────────
Include a DoD checklist in the task description:
## Definition of Done
- [ ] Code reviewed and approved
- [ ] Tests written and passing
- [ ] Documentation updated
- [ ] Deployed to staging
- [ ] Verified working
AUTOMATION:
─────────────────────────────────────
Rule: Prevent moving to "Done" if
DoD checklist is not complete
VISIBILITY:
─────────────────────────────────────
Display DoD completion % on task card
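Both the automation rule and the completion badge reduce to counting checklist items in the task description. A minimal sketch of that checking logic, assuming the description uses standard `- [ ]` / `- [x]` markdown (GitScrum's actual automation hooks are not shown; only the checking logic is):

```python
import re

# Matches markdown checklist items: "- [ ] item" or "- [x] item".
CHECKBOX = re.compile(r"^- \[([ xX])\] ", re.MULTILINE)

def checklist_completion(description: str) -> float:
    """Percentage of DoD checklist items ticked in a task description."""
    boxes = CHECKBOX.findall(description)
    if not boxes:
        return 0.0
    done = sum(1 for box in boxes if box.lower() == "x")
    return 100.0 * done / len(boxes)

def can_move_to_done(description: str) -> bool:
    """Block the 'Done' transition unless every checklist item is ticked.

    Conservative by design: a task with no checklist at all also
    returns False, so the DoD template cannot be silently omitted.
    """
    return checklist_completion(description) == 100.0
```

`checklist_completion` drives the on-card percentage; `can_move_to_done` is the guard a board automation rule would call before allowing the status change.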
Handling DoD Violations
WHEN DoD ISN'T MET
══════════════════
DON'T:
✗ Let it slide "just this once"
✗ Blame the person
✗ Add bureaucratic overhead
DO:
✓ Discuss in retro
✓ Understand why it happened
✓ Adjust process or DoD
✓ Provide support to meet it
VALID EXCEPTIONS:
├── Emergency production fix (document tech debt)
├── Experimental/prototype work
└── Explicitly agreed skip (documented)
EVEN THEN:
Create follow-up task to complete
the skipped DoD items
Evolution of DoD
Maturing Your DoD
DoD MATURITY LEVELS
═══════════════════
LEVEL 1: BASIC
├── Code compiles
├── Tests pass
├── Code reviewed
└── Merged
LEVEL 2: STANDARD
├── Everything in Level 1
├── Coverage requirements
├── Documentation
├── Staging deployment
└── Verification
LEVEL 3: MATURE
├── Everything in Level 2
├── Security scanning
├── Performance testing
├── Accessibility
├── Monitoring configured
└── Production ready
LEVEL 4: EXCELLENT
├── Everything in Level 3
├── Feature flags
├── A/B testing ready
├── Rollback plan
├── Support trained
└── Analytics configured
Progress through levels as team
capability and tooling improve.
Best Practices
For Definition of Done
- Start small — 5-8 items, not 20
- Make it visible — Everyone sees it constantly
- Enforce it — DoD is non-negotiable
- Review regularly — Quarterly at minimum
- Own it as a team — Everyone agrees
Anti-Patterns
DEFINITION OF DONE MISTAKES:
✗ DoD exists but nobody follows it
✗ Too many items (unrealistic)
✗ Not reviewed or updated
✗ Different teams, different DoDs (no standard)
✗ DoD violated "because deadline"
✗ DoD is just testing (ignores other aspects)
✗ No one knows where DoD is documented