Seeing is believing.




Iteration Review

The Iteration Review is a cadence-based event, where each team inspects the increment at the end of every Iteration to assess progress, and then adjusts its backlog for the next iteration.

During the iteration review, each Agile Team measures and then demonstrates its progress by showing working stories to the Product Owner (PO) and other stakeholders to get their feedback. Teams demo every new Story, Spike, Refactor, and Nonfunctional Requirement (NFR). The preparation for the iteration review begins during Iteration Planning, where teams start thinking about how they will demo the stories to which they have committed. ‘Beginning with the end in mind’ facilitates iteration planning and alignment, fostering a more thorough understanding of the functionality needed before iteration execution.


The iteration review provides a way to gather immediate, contextual feedback from the team’s stakeholders on a regular cadence. It serves three important functions:

  • It brings closure to the iteration timebox, to which many individuals have contributed to deliver new value to the business
  • It gives teams an opportunity to show the contributions they have made to the business and to take some satisfaction and pride in their work and progress
  • It allows stakeholders to see working stories and provide feedback to improve the product


The iteration review starts by going over the Iteration Goals and discussing their status. It then proceeds with a walk-through of all the committed stories. Each completed story is demoed in a working, tested system—preferably in a staging environment that closely resembles the production environment. Spikes are demonstrated via a presentation of findings. Stakeholders then provide feedback on the demoed stories; gathering this feedback is the primary goal of the review.

After the demo, the team reflects on which stories were not completed, if any, and why the team was unable to finish them. This discussion usually results in the discovery of impediments or risks, false assumptions, changing priorities, estimating inaccuracies, or over-commitment. These findings may lead to further study in the Iteration Retrospective about how the next iterations can be better planned and executed. Figure 1 illustrates an iteration review in action.

Figure 1. Showing a working, tested team increment at the Iteration Review

In addition to reflecting on how well it did in the latest iteration, the team also determines how it’s progressing toward its Program Increment (PI) Objectives. It finishes the event by refining the Team Backlog before the next iteration planning.


Attendees at the iteration review include:

  • The Agile Team, which includes the Product Owner and the Scrum Master
  • Stakeholders who want to see the team’s progress, which may include members of other teams

Although Agile Release Train (ART) stakeholders may attend, their interests and level of detail are usually better aligned with the System Demo.


Below are some tips for the iteration review:

  • Limit demo preparation by team members to about one to two hours.
  • Timebox the meeting to about one to two hours.
  • Minimize the use of slides. The purpose of the iteration review is to get feedback on working software functionality, hardware components, etc.
  • Verify completed stories meet the Definition of Done (DoD).
  • Demo incomplete stories, too, if enough functionality is available to get feedback.
  • If a significant stakeholder cannot attend, the Product Owner should follow up to report progress and get feedback.
  • Encourage constructive feedback and celebrate the team’s accomplishments.

Teams practicing Continuous Delivery or Continuous Deployment should also conduct more frequent story or Feature reviews. Once functionality reaches the ready-for-deployment state, key stakeholders should review it.


Last update: 2 October 2018