
Process 1: Stabilize a Release Candidate

Published: April 25, 2008

 

The first process in stabilizing is for the team to find and resolve bugs in the solution and to prepare a release candidate.


Figure 3. Stabilize a release candidate

Activities: Stabilize a Release Candidate

During the initial parts of stabilizing, Test and Development work together to find and resolve bugs. They hold regularly scheduled bug meetings to triage bugs, and team members report and track the status of each issue by using the issue-tracking procedures developed during planning.
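MOF does not prescribe a particular issue-tracking tool, but the triage ordering described above can be sketched in code. The following Python fragment is illustrative only; the `Bug` fields, severity levels, and status values are assumptions, not part of the guidance.

```python
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    CRITICAL = 1
    MAJOR = 2
    MINOR = 3

class Status(Enum):
    NEW = "new"
    ASSIGNED = "assigned"
    RESOLVED = "resolved"
    CLOSED = "closed"

@dataclass
class Bug:
    bug_id: int
    title: str
    severity: Severity
    status: Status = Status.NEW

def triage(bugs):
    """Order open bugs for the triage meeting: most severe first,
    then oldest (lowest id) first."""
    open_bugs = [b for b in bugs
                 if b.status not in (Status.RESOLVED, Status.CLOSED)]
    return sorted(open_bugs, key=lambda b: (b.severity.value, b.bug_id))
```

In a real project the ordering would also weigh priority, affected feature risk, and age, as negotiated in the bug meetings.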

As the project progresses, the team begins to prepare release candidates. Building a release candidate involves verifying its fitness for release, including confirming that all of its components are present. Typically, teams create multiple release candidates, with each release candidate serving as an interim milestone. Testing after each release candidate indicates whether that candidate is fit to deploy to a pilot group.

To complete this process, teams must allow stakeholders to interact with the solution while it is in the development environment. For example, stakeholders can exercise the solution in a lab or training environment. However, Test still needs a separate environment dedicated to testing. Members of the deployment team, operations, support, and users are typical candidates for reviewing pilot readiness.

The following table lists the activities involved in this process. These include:

  • Writing the test specification document.
  • Identifying and resolving bugs until the solution stabilizes.
  • Testing the release candidate.
  • Completing user acceptance testing.

Table 4. Activities and Considerations for Stabilizing a Release Candidate

Write the test specification document

Key questions:

  • What is the typical path through the solution?
  • What are the key scenarios?
  • Does the solution work as expected?
  • How can the solution be tested?
  • What requirements must be tested?
  • What does success look like?

Inputs:

  • Functional specification
  • Test scenarios and test cases

Outputs:

  • Test specification document

Identify and resolve bugs until the solution stabilizes

Key questions:

  • Has the solution achieved bug convergence?
  • Has the solution achieved zero bug bounce?
  • What are the major responsibilities for Test in this project?
  • Do any features or functionality have higher risks or priorities than others?
  • Does the solution interact with other infrastructure or organizations?
  • Is a daily build process running that gives Test fresh code each day?
  • Is the issue-tracking database dynamic enough to allow for agile development? For example, is a notification system available that can alert the project team when new bugs are added to the issue-tracking database or if their status changes?
  • Which features or functionalities have the highest risk?
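Bug convergence (the point at which the team resolves bugs faster than it finds them) and zero bug bounce (the first time the active bug count reaches zero) can be estimated from daily counts in the issue-tracking database. The following Python sketch assumes simple daily reported/resolved totals; it is a proxy for these milestones, not a formal MOF definition.

```python
def active_counts(reported, resolved):
    """Cumulative active (open) bug count per day, from daily
    reported and resolved totals."""
    active, counts = 0, []
    for new, fixed in zip(reported, resolved):
        active += new - fixed
        counts.append(active)
    return counts

def bug_convergence_day(reported, resolved):
    """First day on which more bugs were resolved than reported --
    a simple proxy for bug convergence."""
    for day, (new, fixed) in enumerate(zip(reported, resolved)):
        if fixed > new:
            return day
    return None

def zero_bug_bounce_day(reported, resolved):
    """First day the active bug count drops to zero."""
    for day, active in enumerate(active_counts(reported, resolved)):
        if active == 0:
            return day
    return None
```

Because late bugs can "bounce" the count back above zero, teams usually watch for the count to stay at zero across several builds before declaring the milestone met.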

Inputs:

  • Test specification document
  • Master project plan, including the test plan
  • Test scenarios and test cases
  • Lab environment
  • Interim builds, including:
    • Solution deliverables.
    • Documentation.
  • Issue-tracking database
  • Issue-tracking policies and procedures

Outputs:

  • Release candidate

Best practices:

  • Use a formal issue-tracking system to track and report status of bugs.
  • Document issue-tracking and reporting procedures in planning.
  • Create a test matrix to identify testing tasks and assign them to testers.
  • Divide the test matrix by functional areas of the solution.
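The test-matrix practice above can be sketched as a simple assignment of tasks to testers, divided by functional area. This Python fragment is a minimal illustration; the area names, task names, and round-robin assignment policy are assumptions for the example.

```python
from collections import defaultdict
from itertools import cycle

def build_test_matrix(tasks_by_area, testers):
    """Assign each testing task to a tester, round-robin, keeping
    the matrix divided by functional area of the solution."""
    matrix = defaultdict(list)
    rotation = cycle(testers)
    for area, tasks in tasks_by_area.items():
        for task in tasks:
            matrix[area].append((task, next(rotation)))
    return dict(matrix)
```

In practice the matrix would also record test status per task, which feeds the bug-convergence tracking described earlier.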

Test the release candidate

Key questions:

  • Is the solution feature-complete, or are components still missing?
  • Is the release candidate pilot-ready, or does the team need to build another?

Inputs:

  • Release candidate
  • Master project plan, including the test plan
  • Test scenarios and test cases
  • Issue-tracking database
  • Issue-tracking policies and procedures

Outputs:

  • Pilot-ready release candidate
  • Master project plan updated, including:
    • Backup and recovery plan.
    • Deployment plan.
    • Support plan.
    • Monitoring plan.
    • Operations plan.
    • Training plan.

Best practices:

  • Focus Test and Development efforts on discovering bugs serious enough that they must be fixed before release.
  • Define and agree on success criteria for testing the release candidate.
  • Do not release the candidate until the entire team signs off on its suitability.
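The agreed success criteria can be treated as an explicit gate before sign-off. A minimal Python sketch, assuming criteria are simple named checks with boolean results (the criterion names are hypothetical):

```python
def pilot_ready(success_criteria, results):
    """Return whether the release candidate is pilot-ready and
    which agreed success criteria remain unmet. A criterion
    missing from the results is treated as unmet."""
    unmet = [c for c in success_criteria if not results.get(c, False)]
    return len(unmet) == 0, unmet
```

Listing the unmet criteria, rather than returning only a pass/fail verdict, gives the team a concrete agenda for the next build.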

Complete user acceptance testing

Key questions:

  • Who will do the user acceptance testing?
  • What scenarios are most critical to test?
  • Who determines whether the test passes?
  • Who represents the users?
  • Are all user categories represented?
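Checking that every user category is represented before acceptance testing begins can be automated. The following Python sketch is illustrative; the category names are invented for the example.

```python
def unrepresented_categories(user_categories, testers_by_category):
    """Return the user categories with no acceptance tester
    assigned, so gaps can be closed before testing starts."""
    return [c for c in user_categories
            if not testers_by_category.get(c)]
```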

Inputs:

  • Pilot-ready release candidate
  • Training or lab environment (non-production)

Outputs:

  • User acceptance
  • Solution archive in the definitive software library (DSL)

Best practices:

  • Conduct user acceptance testing to ensure that the solution meets customer needs before moving on to the pilot test.
  • Give support and users the chance to practice the new technology safely.
  • Use this opportunity to identify issues that prevent successful deployment.
  • Identify success criteria first to keep the focus on the agreed-to requirements and to prevent scope creep.
  • Have a way to collect ideas for future releases that come up during testing.

This accelerator is part of a larger series of tools and guidance from Solution Accelerators.
