Sprint Process Criteria
A diagnostic checklist for evaluating whether a team's iteration process is working, independent of the specific methodology (Scrum, Kanban, or hybrid). Larson distills seven observable, measurable criteria from his observations of many engineering teams.
The Seven Criteria
- Velocity is measured and stable: The team tracks throughput per sprint and the number is consistent enough to be predictable. Wild variability signals broken estimation or planning, not just bad luck
- Sprints finish what they start: Carryover work is a symptom of overcommitment or unclear scope. Consistent sprint completion means the team is well-calibrated
- Bugs are addressed before new features: The team has a policy for bug prioritisation relative to new work — and follows it. Accumulating bug debt destroys trust and compounds faster than technical debt
- The team can articulate its priorities: Every engineer should know what the team is working on and why. If they can’t, planning and communication have broken down at the source
- Meetings have clear owners and outcomes: Sprint ceremonies (planning, retro, standup) produce defined outputs. Meetings without clear outcomes are waste that erodes morale
- Retrospectives drive visible improvement: Retros are held AND improvements are implemented. Retros that don’t produce change are demoralising — worse than no retros at all
- Oncall load doesn’t overwhelm sprint capacity: Unpredictable operational burden is tracked and factored into commitments. Ignoring oncall load in sprint planning guarantees consistent underdelivery
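As a rough illustration of the first criterion, velocity stability can be checked by computing the coefficient of variation (stdev / mean) of recent sprint throughput. This is a sketch, not Larson's method; the 0.25 threshold and minimum-history rule are illustrative assumptions:

```python
from statistics import mean, stdev

def velocity_is_stable(throughputs: list[float], max_cv: float = 0.25) -> bool:
    """Flag velocity as stable when the coefficient of variation
    (stdev / mean) of recent sprint throughput stays below max_cv.
    Both max_cv=0.25 and the 3-sprint minimum are assumed defaults."""
    if len(throughputs) < 3:
        return False  # too little history to call anything stable
    avg = mean(throughputs)
    if avg == 0:
        return False
    return stdev(throughputs) / avg <= max_cv

print(velocity_is_stable([21, 23, 20, 22]))  # consistent throughput -> True
print(velocity_is_stable([8, 30, 12, 25]))   # wild variability -> False
```

A team could feed this from whatever unit it already tracks (points, tickets); the point is only that "stable" becomes a testable claim rather than a feeling.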
Diagnostic Application
Use the criteria as a traffic-light scorecard:
- Score each criterion: green (working), yellow (degrading), red (broken)
- Address the reddest criterion first — fixing multiple things simultaneously dilutes focus
- Reassess after each sprint to track improvement trajectory
Connection to Team States
Each of the Four-States-of-a-Team correlates with characteristic criteria failures:
- Falling Behind: Criteria 1 and 2 fail first (velocity unstable, sprints don’t close)
- Treading Water: Criteria 3, 6, and 7 fail (bugs accumulate, retros produce no change, oncall overwhelms capacity)
- Repaying Debt: Criteria 4 and 5 are the hardest to restore (team alignment and meeting discipline)
- Innovating: All seven criteria are green — the team has slack to improve and deliver
Common Root Causes
- Velocity instability → ticket sizing is inconsistent or scope is undefined at sprint start
- Bugs not addressed → no explicit bug policy; see Work-the-Policy-Not-the-Exception
- Retros without improvement → action owners not assigned, no follow-up mechanism
- Oncall load ignored → operational work is invisible in planning tools
These criteria complement DORA-Four-Metrics: DORA measures delivery throughput, while these criteria diagnose why throughput is where it is.
Related Concepts
- Four-States-of-a-Team
- DORA-Four-Metrics
- Work-the-Policy-Not-the-Exception
- Team-Snippets-and-Directional-Metrics
- Larson-2019-An-Elegant-Puzzle
Sources
- Larson, Will (2019). An Elegant Puzzle: Systems of Engineering Management. Stripe Press. ISBN: 978-1-7322651-8-9.
  - Chapter 7.1 (Appendix): Seven criteria for effective sprint processes
  - Original source for this framework
- Schwaber, Ken and Jeff Sutherland (2020). The Scrum Guide: The Definitive Guide to Scrum — The Rules of the Game. Scrum.org.
  - Available: https://scrumguides.org/scrum-guide.html
  - Foundational Scrum definition; Larson’s criteria extend and operationalise these concepts for real teams
- Cohn, Mike (2005). Agile Estimating and Planning. Prentice Hall. ISBN: 978-0-13-147941-8.
  - Chapter 3: Velocity as a planning tool; research-backed treatment of velocity stability and sprint commitment
- Forsgren, Nicole, Jez Humble, and Gene Kim (2018). Accelerate: The Science of Lean Software and DevOps. IT Revolution Press. ISBN: 978-1-942788-33-1.
  - Four key metrics of software delivery performance; complementary measurement framework to Larson’s sprint criteria
  - Available: https://itrevolution.com/accelerate-book/
- Denning, Stephen (2018). The Age of Agile: How Smart Companies Are Transforming the Way Work Gets Done. AMACOM. ISBN: 978-0-8144-3870-8.
  - Chapter 2: Research on why retrospectives without implementation produce negative team outcomes
Note
This content was drafted with assistance from AI tools for research, organization, and initial content generation. All final content has been reviewed, fact-checked, and edited by the author to ensure accuracy and alignment with the author’s intentions and perspective.