Based on the Agile method your Lean | Agile Teams are using, the iteration timebox can vary. However, when the Lean | Agile Teams are working together to deliver the same release, there are two options to consider:

  1. All Lean | Agile Teams have the same cadence for their Iterations (e.g. 2 weeks).
  2. Some teams work in a smaller iteration timebox (e.g. 1 week) and others work in a larger timebox (e.g. 2 weeks); however, all teams are aligned with the larger timebox in terms of dependencies, goals, delivery, etc.

During the Iteration, Backlog Owners can review and accept work items like User Stories and defect fixes in real time as they are completed. This is considered more Agile than accepting at the end of the Iteration or on a cadence in an Iteration Review/Demo ceremony. But even if the Backlog Owner accepts in real time, there is still value in hosting an Iteration Demo at a planned interval with the team and other stakeholders. This ensures continued alignment between expectations and what was delivered, reinforces discipline, and creates an opportunity for feedback from stakeholders other than the Backlog Owner.

The Showcase

Conducting an integrated demo at the Feature level involves more coordination. We call this the Showcase. The Showcase provides an integrated and aggregated view of all the new Features and Stories that have been implemented and tested by all the teams over the last two Iterations. The importance of this demo is to get immediate, program-level feedback from senior stakeholders and customers. Backlog Owners will facilitate the demo from the QA/UAT environment.

Team representatives and program stakeholders are highly encouraged to attend. Even if Backlog Owner acceptance happens in real time throughout the Iterations, the purpose of the Showcase is to provide program visibility into the collective work of all Lean | Agile Teams.

The Showcase will have the following objectives:

  1. Product Managers summarize, at the Feature level, what was accomplished since the last Showcase.
  2. Backlog Owners demonstrate working software (Features) to stakeholders.
  3. Collect feedback.
  4. Validate that the right product is being built in the right way.
  5. State the results from any Spikes and Experiments, and how they will influence the Product.

Some guidelines to help prepare for the Showcase:


Schedule the Showcase

The Showcase represents at least 4 weeks' worth of work across all Lean | Agile Teams for a given Program Portfolio. However, since coordination and setup for the Showcase can take some time, it can be scheduled in the Iteration after the prior 4 weeks. For example, a Showcase for work accomplished in Iterations 1 and 2 across all Lean | Agile Teams may be scheduled in Iteration 3.


Timebox the Showcase

2 hours is recommended, but you can adjust this based on your context.

Prepare a script

The script should lay out a logical sequence in which the software is going to be demoed. The Product Manager/Backlog Owners should also be aware of potential bugs, workarounds, incomplete functionality, and things not to do (which might break the application). A deck with a couple of slides summarizing the sequence goes a long way in communicating this to people inside and outside the team. In addition, rehearse the demo with someone else in the Program Portfolio, using the same data you will use in the Showcase. This confirms that the script is logical and surfaces any unidentified deviations from expected behavior.

Set up meeting rooms/video conferences

Make sure that you have everything you need for the Showcase, and that everyone who needs to attend is invited. This might mean pulling in people who have an indirect influence on the software or are affected by it. Doing so helps ensure that the product gets the right kind of attention and that areas of concern are flagged immediately. Consider recording the meeting and the demo so they can be circulated to people who could not attend. Too many Showcases have failed to achieve their objectives because the conference facility was inadequate, the room was too small, the Wi-Fi connectivity was poor, or an important stakeholder who should have been there wasn't.

Set the ground rules

Be very explicit about not using phones or laptops: the timebox is only 2 hours, and attendees' attention is critical to the success of the release.

Make notes

If needed, have someone take notes and capture feedback. Depending on the context, you might want to quickly prioritize some of the feedback for discussion if time permits.

Surprises will happen

Be ready for surprises. It is rare for a software demo to go completely smoothly. What matters is whether you can explain to the stakeholders what happened and why. Following up on these surprises is critical to building trust.

Summarize & Learn

Ensure that all the important points that came up have been covered, along with what has to be done about them. Whatever you learn from a Showcase, carry it into the next one. This could include a different presentation format, better logistics, more time, etc.


Releasing Features

Features can be released on demand or on a cadence. What is important to understand is that each Feature, or set of Features, needs to meet the Definition of Done for Release, and your environment needs to be configured for deploying to production.


Retrospectives

Teams should reflect on their past performance and issues to see what is working well, what is not, and how to improve. Retrospectives can occur at the Team and Program Portfolio levels, on an agreed-upon cadence. Please refer to the Visibility & Insights section of the A2F Framework for more detail on facilitating these retrospectives.

Definition of Done

The Definition of Done (DoD) is a list of criteria used to determine when a Story, Feature, Release, or other product increment is considered done. Below is our Definition of Done at the Story, Feature, and Release levels:


Story

  • Acceptance criteria met.
  • Story acceptance tests written and passing.
  • Non-functional requirements met.
  • Unit tests (based on meaningful methods).
  • Cumulative unit tests passed.
  • Code checked in and merged, with successful build and deployment.
  • Coding standards followed.
  • Code peer reviewed.
  • No must-fix defects.
  • Story accepted by the Backlog Owner.


Feature

  • All stories for the feature done.
  • Code deployed to QA/staging and integration tested.
  • Functional regression testing complete.
  • No must-fix defects.
  • Non-functional requirements met.
  • Feature included in successful build deployment.
  • Feature documentation complete.
  • Feature accepted by Backlog Owner/Product Manager.

Releasable Feature Set

  • All features for the releasable set are done.
  • End-to-end integration and system testing done.
  • Full regression testing done.
  • Exploratory testing done.
  • No must-fix defects.
  • End-to-end system and performance testing done.
  • User, release, installation, and other documentation complete.
  • Localization/internationalization updated.
  • Feature set accepted by Product Management.