Sprint Process Criteria
A diagnostic checklist for evaluating whether a team’s iteration process is working, independent of the specific methodology (Scrum, Kanban, or hybrid). Larson distills seven observable, measurable criteria from his experience across many engineering teams.
The Seven Criteria
1. Velocity is measured and stable: the team tracks throughput per sprint, and the number is consistent enough to be predictable. Wild variability signals broken estimation or planning.
2. Sprints finish what they start: carryover work is a symptom of overcommitment or unclear scope.
3. Bugs are addressed before new features: the team has a bug prioritisation policy and follows it. Accumulating bug debt compounds faster than technical debt.
4. The team can articulate its priorities: every engineer knows what the team is working on and why.
5. Meetings have clear owners and outcomes: sprint ceremonies produce defined outputs. Meetings without outcomes erode morale.
6. Retrospectives drive visible improvement: retros are held AND improvements are implemented. Retros without change are worse than no retros at all.
7. Oncall load doesn’t overwhelm sprint capacity: operational burden is tracked and factored into commitments.
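The first criterion can be made concrete with a simple statistic. A minimal sketch, assuming story points per sprint as the input and using a coefficient-of-variation threshold of 0.25 — the threshold is an illustrative assumption, not a number from Larson:

```python
from statistics import mean, stdev

def velocity_is_stable(points_per_sprint: list[float], max_cv: float = 0.25) -> bool:
    """Treat velocity as stable when the coefficient of variation
    (stdev / mean) over recent sprints stays under max_cv.
    The 0.25 threshold is illustrative, not from Larson."""
    if len(points_per_sprint) < 3:
        return False  # too little history to judge stability
    avg = mean(points_per_sprint)
    if avg == 0:
        return False
    return stdev(points_per_sprint) / avg <= max_cv

print(velocity_is_stable([21, 23, 20, 22, 24]))  # True: steady throughput
print(velocity_is_stable([8, 30, 12, 27, 5]))    # False: wild variability
```

The coefficient of variation normalises the spread by the mean, so the same check works for teams with very different point scales.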
Diagnostic Application
Use the criteria as a traffic-light scorecard:
- Score each criterion: green (working), yellow (degrading), red (broken)
- Address the reddest criterion first — fixing multiple things simultaneously dilutes focus
- Reassess after each sprint to track improvement trajectory
Connection to Team States
Each of the Four-States-of-a-Team correlates with characteristic criteria failures:
- Falling Behind: Criteria 1 and 2 fail first (velocity unstable, sprints don’t close)
- Treading Water: Criteria 3, 6, and 7 fail (bugs accumulate, retros produce no change, oncall overwhelms)
- Repaying Debt: Criteria 4 and 5 are hardest to restore (team alignment and meeting discipline)
- Innovating: All seven criteria are green
Common Root Causes
- Velocity instability → ticket sizing inconsistent or scope undefined at sprint start
- Bugs not addressed → no explicit policy; see Work-the-Policy-Not-the-Exception
- Retros without improvement → action owners not assigned, no follow-up mechanism
- Oncall load ignored → operational work invisible in planning tools
These criteria complement DORA-Four-Metrics: DORA measures delivery throughput, while these criteria diagnose why throughput is where it is.
Related Concepts
- Four-States-of-a-Team
- DORA-Four-Metrics
- Work-the-Policy-Not-the-Exception
- Team-Snippets-and-Directional-Metrics
- Larson-2019-An-Elegant-Puzzle
Sources
- Larson, Will (2019). An Elegant Puzzle: Systems of Engineering Management. Stripe Press. ISBN: 978-1-7322651-8-9.
  - Chapter 7.1 (Appendix): Seven criteria for effective sprint processes
- Schwaber, Ken and Jeff Sutherland (2020). The Scrum Guide: The Definitive Guide to Scrum — The Rules of the Game. Scrum.org.
  - Available: https://scrumguides.org/scrum-guide.html
  - Foundational Scrum definition; Larson’s criteria extend and operationalise these concepts
- Cohn, Mike (2005). Agile Estimating and Planning. Prentice Hall. ISBN: 978-0-13-147941-8.
  - Chapter 3: Velocity as a planning tool; research-backed treatment of velocity stability and sprint commitment
- Forsgren, Nicole, Jez Humble, and Gene Kim (2018). Accelerate: The Science of Lean Software and DevOps. IT Revolution Press. ISBN: 978-1-942788-33-1.
  - Four key metrics of software delivery performance; complementary measurement framework to Larson’s sprint criteria
- Denning, Stephen (2018). The Age of Agile: How Smart Companies Are Transforming the Way Work Gets Done. AMACOM. ISBN: 978-0-8144-3870-8.
  - Chapter 2: Research on why retrospectives without implementation produce negative team outcomes
Note
This content was drafted with assistance from AI tools for research, organization, and initial content generation. All final content has been reviewed, fact-checked, and edited by the author to ensure accuracy and alignment with the author’s intentions and perspective.