Wednesday, 22 November 2017

A format for Release Planning

Summary

Output: An estimated backlog. A likely release date for the plan.

Overview: Start with creating understanding and clarifying the big picture with everyone. Use large macro sizing to roughly size everything. Refine these sizes using a selection. Allocate to a future sprint using historical velocities as a predictor of future performance.

In this article I will outline a format for release planning that I have successfully used on a number of occasions.

Release Planning Day Agenda

The agenda is what people in the room see. Focus on the results of each part of the day. At the start, people want to know what they will achieve, not how they will achieve it.
  1. Introduction
  2. Vision
  3. Estimation - First pass
  4. Estimation - second pass
  5. Backlog prioritisation
  6. Release plan - first pass
  7. Review and agree
  8. Retrospective

Release Planning Plan

Release planning is a necessary part of scaled software development. The customer might take your software every sprint, but they want a clear indication about how much you will deliver over the next 6 to 12 months. You will probably find that sprint planning and backlog grooming alone won't scale to this length of time.

Output

  • A fully estimated and refined backlog
  • A likely release plan
  • Identified Risks, Issues, Assumptions and Dependencies (RAID), and what to do about each
  • A better understanding of what is required by the team in the medium term

Preparation

Required people 

  1. Whole team including SM x 2. This ceremony will scale up to many teams, but be mindful of venue capacity.
  2. Product Owner
  3. Project Manager
  4. Facilitator x 2. I have found it useful to use pair facilitators.

Required inputs

  1. A Vision
    Prepared by the PO. What would excite our users? What would delight our customers?
  2. Averaged velocities and predictability metrics.
    Prepared by scrum masters.
  3. Drop Plan
    Sprint names and dates. Prepared by the project manager, Scrum of Scrums or product owner.
  4. Printed out backlog
    Broken down by the product owner as best they can, each story on an A4 sheet. (Product Owner or Scrum Master)
  5. A1 flipchart paper
    Facilitator or scrum master should bring this
  6. Postits
    Facilitator or scrum master should bring these
  7. Markers
    Facilitator or scrum master should bring these
  8. Pens
    Facilitator or scrum master should bring these
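The "averaged velocities and predictability metrics" in input 2 can be prepared in many ways. As one illustrative sketch (the velocity numbers and the predictability formula here are my own assumptions, not something defined in this article):

```python
# One plausible way to prepare input 2: average a team's recent sprint
# velocities and derive a simple predictability figure. The velocity
# numbers and the predictability formula are illustrative assumptions.
from statistics import mean, pstdev

velocities = [18, 22, 19, 21, 20]  # points completed in the last 5 sprints

average_velocity = mean(velocities)
# Predictability as 1 - coefficient of variation: 1.0 means perfectly steady.
predictability = 1 - pstdev(velocities) / average_velocity

print(f"Average velocity: {average_velocity:.1f}")
print(f"Predictability:   {predictability:.2f}")
```

A steadier team (lower sprint-to-sprint variation) scores closer to 1.0, which supports planning further ahead with confidence.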

Room layout

  1. A large round table per team
  2. Facilitator table
  3. Projector
  4. Lots of Wall space
  5. 1-2 flipchart stands per team

Day plan

For the facilitators: this is how you will achieve your objectives in a step-by-step way.

09:00-09:15  Arrive at venue and arrange the room. Scones and fruit for warm-up. People mingle, chat and get into an open frame of mind.
09:15-09:45  Intro, set the scene. Two truths, one lie. Set the objective for the day. Talk about estimation at a high level and how we will do it. Talk about commitment. Set the ground rules.
09:45-10:00  Product owner presents the vision.
10:00-10:30  First pass on the backlog. Read each story. Clarify doubts, document RAID items.
10:30-10:45  Tea break, fresh air.
10:45-11:15  Read the backlog. Talk to the PO. Write any clarifications directly on the A4 pages. The output of this phase is a clarified backlog that everyone has read.
11:15-12:30  Size the backlog using relative estimation. Take each story and compare it to the ones on the table: smaller stories on the left, larger stories on the right. If it is approximately the same size, place it on an existing story. The output of this phase is a number of piles of stories, each pile holding stories of approximately the same size.
12:30-13:15  Lunch. Fresh air.
13:15-13:30  Reset exercise.
13:30-13:45  Review the piles of stories and the RAID items.
13:45-15:00  Select a story from each pile. Story point these 5-8 stories using planning poker or whatever method the team is used to. Points are cascaded to similar-sized stories automatically by writing on the A4 pages. We now have an estimated backlog.
15:00-15:15  Tea break. Fresh air.
15:15-16:00  Scrum master presents the team's sprint history: their previous velocity and predictability. This will be used for release planning. Review the current Definition of Done. Product owners re-organise the piles of stories into priority order, highest priority stories at the top.
16:00-16:45  Using A1 sheets to represent sprints, allocate the prioritised stories according to the sprint capacities suggested by history. Discuss RAID items. Plan prudently. Note any dependencies. Stories should be placed in the sprint where they will finish; if a story must start in an earlier sprint, note that in writing on the story.
16:45-17:15  Stand back and review the plan. Talk to the project manager and PO. Ask: can we commit to this plan? Is this a likely plan or a committed plan?
17:15-17:30  Wrap up. Thank everyone. Make sure POs and SMs take the stories with them, noting which sprint each story lands in. Tidy up the room.
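The afternoon allocation step (fitting prioritised stories into sprints using historical velocity as capacity) can be sketched as a simple capacity fill. The story names, point sizes and velocity below are made up for illustration:

```python
# Sketch of the allocation step: fit a prioritised, estimated backlog
# into sprints without exceeding the team's historical average velocity.
# Story names, point values and the velocity figure are hypothetical.

AVERAGE_VELOCITY = 20  # points per sprint, from the team's sprint history

backlog = [  # (story, points), already in priority order
    ("Login page", 8),
    ("Password reset", 5),
    ("Audit log", 13),
    ("Export to CSV", 8),
    ("Admin dashboard", 20),
]

def allocate(backlog, velocity):
    """Greedily place stories into sprints without exceeding capacity."""
    sprints, current, remaining = [], [], velocity
    for story, points in backlog:
        if points > remaining and current:
            sprints.append(current)          # close the full sprint
            current, remaining = [], velocity
        current.append(story)
        remaining -= points
    if current:
        sprints.append(current)
    return sprints

plan = allocate(backlog, AVERAGE_VELOCITY)
for i, sprint in enumerate(plan, 1):
    print(f"Sprint {i}: {sprint}")
```

In the room this is done physically with A4 stories on A1 sheets, which keeps the conversation (and the RAID discussion) in front of the whole team; the arithmetic above is all the "tooling" the step really needs.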

Friday, 10 November 2017

Testing definitions

There are lots of test terms bandied about. I find there is general inconsistency among software development practitioners in what each one means. Here are my definitions of the different types of test. I have tried to establish simple, clear definitions for each test term.

Manual test
One that is executed by a human against the target system.

Automated test
One that is run using some sort of scripting, with the scripting kicked off by a continuous integration system.

Unit test
A single test case where the target piece of functionality under test is running in a contained or mocked environment. Automation is implied.
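As a sketch of this definition in Python: the function under test and its `price_of` collaborator below are hypothetical, the point being that the dependency is mocked, so the test runs in a contained environment with nothing live behind it.

```python
# Minimal unit test sketch: the function under test runs against a mocked
# dependency, so the test is contained and needs no live environment.
# discount() and its price_of() collaborator are hypothetical examples.
import unittest
from unittest import mock

def discount(price_source, item, pct):
    """Return the item's price reduced by pct percent."""
    return price_source.price_of(item) * (1 - pct / 100)

class DiscountTest(unittest.TestCase):
    def test_ten_percent_off(self):
        source = mock.Mock()
        source.price_of.return_value = 100.0  # mocked, no real lookup
        self.assertEqual(discount(source, "book", 10), 90.0)
        source.price_of.assert_called_once_with("book")
```

Run with `python -m unittest`; since the mock stands in for the real price source, this can run in any CI system with no setup.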

Basic Integration test
The target code runs in a "basic" environment that includes live instances of all its immediate dependencies, such as databases, file systems, network I/O, messaging buses etc. Automation is implied.
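To illustrate the contrast with a unit test, here is a sketch where the code under test (a hypothetical `save_report` function) talks to a live file system rather than a mock:

```python
# Sketch of a basic integration test: the code under test exercises a live
# immediate dependency (the file system) instead of a mock. save_report()
# is a hypothetical example; a temp directory keeps the test self-cleaning.
import os
import tempfile

def save_report(directory, name, lines):
    """Write report lines to <directory>/<name>.txt and return the path."""
    path = os.path.join(directory, name + ".txt")
    with open(path, "w") as f:
        f.write("\n".join(lines))
    return path

with tempfile.TemporaryDirectory() as tmp:
    path = save_report(tmp, "sprint-1", ["story A done", "story B carried over"])
    with open(path) as f:
        content = f.read()
    assert content.splitlines() == ["story A done", "story B carried over"]
```

The same idea applies to databases or message buses: the test stands up (or points at) a real instance of the immediate dependency, but not the full production environment.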

Integration test
An integration test is one that runs against a software entity, where that entity is running in the actual environment it is meant to run on in production. An integration test may be automated or manual.

Acceptance test
These are tests that are understood by the user of the output of the system. That user could be human or machine.

White box test
A white box test is one that is written by someone with full knowledge of, and access to, the solution's source code and architecture.

Black box test
A black box test is one that is written by someone who has no, or limited, knowledge of the solution's source code and architecture.

Test Spec or Test Specification
A list of tests. It should specify the use case, the inputs and the expected behaviour or output. For automated tests, the specification is a description of the test function.

Test suite
A collection of automated tests. Should be equivalent to a test specification.

Test Report
This is a summary of the output of running a test suite or test specification.