April 17, 2015

Agile Software Development Best Practices: Iteration Reviews

It is common for agile teams to hold a meeting on the last day of each iteration in order to be accountable to their stakeholders for the work accomplished during that iteration. Scrum calls this meeting the “Sprint Review.” Other common names are “Sprint Demo” or “Iteration Review.”

Run well, the Iteration Review has the potential to be a great source of learning and feedback for all involved. However, there are two common mistakes I see teams make that prevent them from getting the useful, actionable feedback their reviews could otherwise provide.

Mistake #1: Treating the Review as a Demo

The most common mistake I see is teams running their iteration reviews like a demo. One or more team members project their screen in the room (or share it over a video conference) and proceed to show the new functionality to the rest of the participants.

While this approach can generate some level of discussion and comments, my experience has taught me that you get a much deeper level of feedback if you set up the review to be more like a mini-User Acceptance Test. This means letting non-team members (i.e., other stakeholders) drive the software—ideally on their own computer and using their own data. The team already knows how the new features work and will likely cover only a couple of “happy path” scenarios in a demo. In contrast, you will learn a lot more if you let a customer exercise the new user stories by getting hands-on with the software.

The key to making a hands-on session work well—whether you are co-located or remote—is a well-crafted one- or two-page iteration review script. A good script doesn’t just walk users through the stories one by one; instead, it immerses them in using the new functionality in a couple of real-world scenarios. With a script in hand, a non-team member is free to explore the software at their own pace and to focus on the things that interest them most. Having team members close by (either physically or virtually) to answer questions and capture feedback is also useful, and it allows those team members to see the system through someone else’s eyes.
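For illustration, imagine a team that has just finished stories for a hypothetical expense-reporting feature. Rather than “Story 1: submit an expense; Story 2: attach a receipt,” the script might read: “You have just returned from a two-day business trip. Log in with your own account, create an expense report for the trip, attach your receipts, and submit the report to your manager for approval. Note anything that surprises or confuses you along the way.” The feature and wording here are invented, of course; the point is that the script describes a realistic task, not a feature checklist.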

Mistake #2: Discouraging Feedback

The second mistake I see is teams that say they want feedback, but then make it clear through their actions that they really don’t. It takes only a couple of argumentative/defensive responses to a question or comment before honest feedback gets shut down. Teams sabotage themselves in this way when they are afraid of feedback. I think that the primary source of this fear is a belief that feedback automatically means more work. (This fear is understandable for individuals who have worked in an environment where they were treated as order takers, with little say over their own workload.)

To deal effectively with the fear of feedback in iteration reviews, make an explicit distinction between gathering feedback and deciding what to do with it. The iteration review meeting is not a time to make new commitments—there are other agile rituals for that. Instead, make it clear that the review is an opportunity to listen, to understand, and to learn in an open and uninhibited environment. Explain that all feedback will be heard and documented, but that decisions about new or modified stories will come later, during regular planning and backlog grooming sessions.
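For example, when a stakeholder asks why a feature doesn’t also handle some additional case, the team can simply say “good point” and capture it on the feedback list, rather than defending the current scope or negotiating new work on the spot. The commitment being made in the review is only to record and consider the feedback, not to act on it.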

To summarize, the main purpose of an Iteration Review meeting is to obtain feedback. Yes, it is a time to show off your work, to celebrate success, and to demonstrate that the team kept its commitments. But all of these objectives and more can be accomplished if teams focus primarily on creating a setting that encourages high-bandwidth feedback and learning.
