Developing

Figure 4 shows the activities that occur during the Developing Phase. During this phase, the Test feature team prepares for the testing done in the Stabilizing Phase by reviewing the BDD 2007 implementation architecture and documents. If possible, key Test feature team members should attend reference architecture and core team meetings to fully understand the direction of the solution’s technological development. This preview also helps the team prepare its test cases.

Figure 4. Activities during the Developing Phase

On This Page

Roles and Responsibilities
Create the Test Lab
Milestone: Test Labs Complete
Define Test Case Types
Milestone: Test Cases Complete

Roles and Responsibilities

All six role clusters of the MSF Team Model participate in the Developing Phase of the initiative. Table 3 lists those roles and defines the focus areas for each role cluster.

For more information about MSF team role clusters, see Microsoft Solutions Framework at https://www.microsoft.com/technet/itsolutions/msf/default.mspx.

Table 3. Roles and Responsibilities During the Developing Phase

Role: Product Management
Focus:

  • Business requirements analysis

  • Communications plan

  • Conceptual design

Role: Program Management
Focus:

  • Budget

  • Conceptual and logical design

  • Functional specification

  • Project plan and project schedule

Role: Development
Focus:

  • Creating and testing the USMT control files

Role: User Experience
Focus:

  • Localization and accessibility requirements

  • Schedules

  • Training plans

  • Usage scenarios and use cases

  • User documentation

  • User requirements

Role: Test
Focus:

  • Test plan and schedule

  • Testing requirements definition

Role: Release Management
Focus:

  • Application and hardware inventory

  • Design evaluation

  • Network discovery

  • Operations requirements

  • Pilot and deployment plan and schedule

  • Working with IT Operations and the Security feature team

Create the Test Lab

Effective testing requires creating a lab in which data can be assembled, stored, analyzed, tested, and manipulated in an isolated, secure environment. Establishing and maintaining a lab environment early in the project is key to the solution’s success. The lab shown in Figure 5 has the following basic characteristics:

  • It is a permanent, dedicated space.

  • It is isolated from the production corporate network and the Internet by firewalls.

  • It has Internet access.

  • It uses a common, 100-megabits-per-second (Mbps) switched Ethernet infrastructure.

  • It provides common service components, such as:

    • Dynamic Host Configuration Protocol (DHCP).

    • Windows Internet Naming Service (WINS).

    • Domain Name System (DNS).

    • Microsoft Windows Deployment Services (Windows DS).

    • File share services.

    • Tape backups.

    • CD-ROM or DVD recorders.

    • Printers.

    • Optionally, a system and configuration management tool and an application monitoring tool.

Note These last two components, which the Infrastructure Remediation feature team identifies, depend on the scope of the BDD 2007 implementation.

  • It provides a dedicated workspace for each feature team.

  • It allows the production environment to be realistically simulated for services such as the Microsoft Active Directory® directory service.

  • It contains the same target hardware found in the production environment. (This hardware can be limited to the actual computers targeted for deployment. There is no reason for the actual server hardware to be duplicated here so long as all the services provided in production are duplicated.)

    Figure 5. Sample test lab

Isolating network shares and providing specifically allocated workstation equipment creates an environment that enables each feature team to work independently when appropriate. However, because the feature teams share a common infrastructure of network, servers, and services, each team can easily collaborate with other feature teams when necessary without having to create redundant infrastructures or affect the production network. For example, the Application Management and the Application Compatibility feature teams can potentially share workspaces, because their tasks are related to the same applications. A common lab also facilitates more timely communication, because all the developers are in the same physical area. Because the life cycle of each feature team may be slightly different, the equipment assigned to each team workspace can be dynamically allocated based on needs.

The number of workstations required differs for each organization and for each feature team based on its size and complexity. A key consideration is that these teams are often expected to work in parallel. Therefore, having workspaces for each team helps reduce the confusion and wasted time that may occur while one team waits for another to finish using shared equipment.

For its test lab, a fictional company called Woodgrove Bank chose to use two types of servers:

  • Lab server. The lab server is a computer placed in a workgroup. This computer holds the BDD 2007 package files. A typical server has several hundred gigabytes (GB) of disk space to hold the files that the various teams require. Additional space is required to host the simulated production servers.

  • Simulated production servers. The simulated production servers (which can be VMs) are similar to those in the production environment, so developers can test on a test network that matches the production environment, including such items as directory designs, policies, and security permissions. The entire lab is isolated from the production network to avoid collision of its services with the production servers.

Note The lab need not use the same domain names used in production. Identical domain names are not a required component of the testing strategy, but identical forest structures are.

Build the Test Lab

Test lab construction includes several activities: base server installation; creation of the file servers; installation of the VM software; and creation of the VMs that simulate production servers. Several of these activities are manual, but many can be automated. For example, after creating the base VM, team members can use scripts to generate the various VMs required to reproduce the production services.
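As a sketch of the kind of automation described above, the following Python fragment clones a golden baseline virtual hard disk into a dedicated folder for each simulated production server. The paths and server names are placeholders, and the script assumes the virtualization software can attach the copied .vhd files to new VMs; it is an illustration, not BDD 2007 tooling.

```python
import shutil
from pathlib import Path

# Hypothetical locations; adjust to the lab file server's layout.
BASE_VHD = Path(r"D:\Lab\Baseline\base-server.vhd")  # golden baseline disk
VM_ROOT = Path(r"D:\Lab\VMs")                        # per-VM storage root

# Simulated production servers to generate (placeholder names).
SERVERS = ["DC01", "DNS01", "DHCP01", "FILE01", "SMS01"]

def clone_baseline(server: str) -> Path:
    """Copy the golden baseline VHD into a dedicated folder for one VM."""
    vm_dir = VM_ROOT / server
    vm_dir.mkdir(parents=True, exist_ok=True)
    target = vm_dir / f"{server}.vhd"
    if not target.exists():  # never overwrite an existing lab VM
        shutil.copy2(BASE_VHD, target)
    return target

if __name__ == "__main__":
    for name in SERVERS:
        vhd = clone_baseline(name)
        print(f"Prepared disk for {name}: {vhd}")
```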

The Infrastructure Remediation feature team is responsible for the initial setup of the test lab. After setup is complete, the Test feature team finalizes the lab during the Developing Phase, which includes installing all software and test tools as well as bug-tracking and reporting software.

For example, in testing this solution, the Woodgrove Test feature team made extensive use of virtual server technology. Doing so reduced the physical size of the lab by consolidating a large number of servers on a small number of computers and, in some cases, on the same computer. In fact, the Microsoft BDD 2007 Test feature team used only two physical servers to host approximately 25 VMs.

As a rule for testing, each VM is allotted 512 megabytes (MB) of dedicated random access memory (RAM). In some cases, where the virtual server is expected to host several memory-intensive services, this amount can be increased appropriately.
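Applying these figures (roughly 25 VMs at 512 MB each spread across two physical hosts, as in the Microsoft lab described above), a quick capacity check such as the following sketch shows what each host needs; the 1 GB host operating system reserve is an assumption, not a BDD 2007 guideline.

```python
# Rough RAM sizing for virtual server hosts, based on the figures above.
VM_COUNT = 25            # total VMs in the lab
RAM_PER_VM_MB = 512      # default allotment per VM
HOSTS = 2                # physical host servers
HOST_OVERHEAD_MB = 1024  # assumed reserve for the host OS itself

vms_per_host = -(-VM_COUNT // HOSTS)  # ceiling division: 13 VMs on the fuller host
ram_needed_mb = vms_per_host * RAM_PER_VM_MB + HOST_OVERHEAD_MB

print(f"Each host needs about {ram_needed_mb / 1024:.1f} GB of RAM "
      f"for {vms_per_host} VMs")  # ~7.5 GB with these assumptions
```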

In addition to housing servers and client computers on which deployment is carried out, the test lab environment also requires a great deal of storage space. The Test feature team must plan for storage space, keeping in mind the following requirements:

  • Space for storing the BDD 2007 component files

  • Space for the volume-license media

  • Space for source files for various application software media

  • Space for the images built during testing

  • Space for backing up virtual hard disks (at a minimum, space for three sets of virtual server environment backups must be available: one for golden baseline, one for temporary backup, and one for a fully configured environment)

Depending on the scope of the BDD 2007 implementation, a considerable amount of disk space may be needed to satisfy the requirements mentioned above. On average, a minimum of 200 GB of space is required, but this number is affected by the number of disk images, VMs, and application packages the BDD 2007 project includes.
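To sanity-check the 200 GB figure for a specific project, a simple estimate along the following lines can be used. All item counts and sizes in this sketch are illustrative assumptions; substitute the project's own numbers.

```python
# Back-of-the-envelope lab storage estimate (all sizes in GB, illustrative).
image_count, gb_per_image = 4, 3         # disk images built during testing
vm_count, gb_per_vm = 25, 3              # virtual hard disks (dynamically expanding)
package_count, gb_per_package = 20, 0.25 # application packages and source media
bdd_components_gb = 10                   # BDD 2007 component files
backup_sets = 3  # golden baseline, temporary, and fully configured backups

vhd_gb = vm_count * gb_per_vm
total_gb = (image_count * gb_per_image
            + vhd_gb
            + backup_sets * vhd_gb       # three sets of virtual server backups
            + package_count * gb_per_package
            + bdd_components_gb)

print(f"Estimated lab storage: {total_gb:.0f} GB")
```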

A staging area that hosts all the source files needed for configuring the VM must be set up on a host computer that all the VMs requiring access can reach. Design the test lab in a way that allows testing as much of the proposed logical and physical production environment as possible—including computer hardware, network topology, wide area network (WAN) connections, domain architecture, services, databases, business applications, administrative tools, security model, application deployment methodology, and network server storage methods.

Design the client computer portion of the test lab to test the same functions and features currently in use or planned for use in the production environment. Include the same types of hardware, applications, and network configurations.

For more information about finalizing the test lab, see “Appendix: Finalize the Test Lab.”

Note When complete, the lab should be fully tested on its own to ensure that it does indeed reproduce the production environment and that it will support testing of the entire BDD 2007 implementation, including image and software deployment.
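One way to begin that verification is a scripted smoke test of the lab's common services. The following sketch checks TCP reachability of a few service ports; the host names and the choice of ports are placeholder assumptions for a hypothetical lab, not part of BDD 2007.

```python
import socket

# Placeholder lab hosts and service ports; substitute the lab's own names.
CHECKS = [
    ("labdc01", 53),     # DNS
    ("labdc01", 389),    # Active Directory LDAP
    ("labfile01", 445),  # file share services (SMB)
]

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for host, port in CHECKS:
    status = "OK" if port_open(host, port) else "UNREACHABLE"
    print(f"{host}:{port} {status}")
```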

Milestone: Test Labs Complete

Milestones are synchronization points for the overall solution. For more information, see the Plan, Build, and Deploy Guide. At this milestone, shown in Table 4, the test lab is in place and ready to accommodate testing.

Table 4. Deliverables

Deliverable ID: Test Labs

Description: These labs include all the required software, hardware, and services to simulate the production environment and integrate all the elements of the BDD 2007 project.

Define Test Case Types

A test case is a detailed procedure that fully tests a feature or an aspect of a feature. Whereas the test plan describes what to test, a test case describes how to perform a particular test. Develop a test case for each test listed in the test plan.

A test case includes:

  • The purpose of the test.

  • Special hardware requirements, such as a modem.

  • Special software requirements, such as a development or user state migration tool.

  • Specific setup or configuration requirements.

  • A description of how to perform the test.

  • The expected results or success criteria for the test.

Test cases must be written by a team member who understands the function or technology being tested, and each test case must be submitted for peer review.

Organizations take a variety of approaches to documenting test cases, ranging from developing detailed, recipe-like steps to writing general descriptions. In detailed test cases, the steps describe exactly how to perform the test. In descriptive test cases, the tester decides at the time of the test how to perform the test and what data to use. Most organizations prefer detailed test cases, because it is usually easier to identify pass or fail criteria and to reproduce this type of case.
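Because detailed test cases follow a fixed set of elements (as listed above), they map naturally onto a simple record structure. The following Python sketch shows one hypothetical way to capture them in machine-readable form; the field names mirror the list above, but the structure itself is an assumption, not part of BDD 2007 or its Test Cases Workbook.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestCase:
    """One detailed test case, mirroring the elements listed above."""
    case_id: str
    purpose: str  # the purpose of the test
    hardware_requirements: List[str] = field(default_factory=list)  # e.g., a modem
    software_requirements: List[str] = field(default_factory=list)  # e.g., USMT
    setup: List[str] = field(default_factory=list)  # setup or configuration steps
    steps: List[str] = field(default_factory=list)  # how to perform the test
    expected_results: str = ""                      # pass criteria
    peer_reviewed: bool = False                     # set after peer review sign-off

# Example: a detailed (recipe-like) test case, with illustrative content.
refresh_case = TestCase(
    case_id="LTI-017",
    purpose="Verify the LTI refresh computer scenario preserves user state",
    software_requirements=["USMT"],
    setup=["Deploy baseline image to the target computer",
           "Create a test user profile"],
    steps=["Run the LTI deployment wizard",
           "Select the refresh computer scenario",
           "Complete the deployment"],
    expected_results="Deployment succeeds and the test user profile is restored",
)
```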

Develop Test Scenarios

Developing the test scenarios is crucial to successful validation of the BDD 2007 implementation. In addition, the test scenarios form the basis of the test cases the team develops.

The BDD 2007 build process documentation helps determine the test scenarios and test cases the Test feature team develops. Typically, this documentation is ready to be released at the end of the Developing Phase; however, it is recommended that the Test feature team not wait to prepare the test scenarios. The team can also base the scenarios on the functional specification and the BDD 2007 guides. If the project’s functional specification is not available in the early stages of the project, the Test feature team can begin with the sample specifications in BDD 2007.

One of the advantages of basing high-level test case design on the functional specification document is that doing so exposes the functional specification document to critical interpretation from an entirely different perspective—the test perspective. Accordingly, when the Test feature team prepares the high-level test cases and works with developers to review and obtain their sign-off, differences in interpretation are revealed. These differences not only help the test engineers better understand the purpose of various elements of the functional specification but also give the developers an opportunity to reevaluate the purpose of other elements.

In the context of a BDD 2007 project, the following high-level scenarios were the most relevant for determining a valid solution design:

  • Lite Touch Installation (LTI) deployment

  • Zero Touch Installation (ZTI) deployment

  • Zero Touch Reporting

  • ZTI Administration Database

Develop Test Cases

To prepare relevant test cases, team members must first review all deliverables from the various feature teams. The test cases must address each high-level scenario identified in the Developing Phase. At this stage, the Test feature team may still introduce new scenarios; however, doing so is not recommended because of the risk it presents to the Test feature team’s schedule. (If some Test feature team members participated in the development of the solution or worked with the developers early on to understand the solution design, the team is now well positioned to develop test cases for the project.) These test cases are documented in the Test Cases Workbook. See the workbook provided with BDD 2007 for the detailed test cases that the BDD 2007 Test feature team at Microsoft developed and tested against the solution.

The following detailed description of the test scenarios listed in the section “Develop Test Scenarios” is based on the various activities addressed in the Computer Imaging Systems Feature Team Guide, Deployment Feature Team Guide, Lite Touch Installation Guide, and Zero Touch Installation Guide:

  • LTI deployment:

    • BitLocker™ Drive Encryption

    • Build server setup and configuration

    • Core application installation

    • Data migration

    • Image capture CD creation

    • Language pack installation

    • Locale and time zone configuration

    • Master image creation

    • New computer scenario

    • Operating system deployment

    • Post-deployment configuration

    • Refresh computer scenario

    • Removable media, such as DVD

    • Replace computer scenario

    • Security update

    • Solution installation

    • Testing the usage of the customsettings.ini file

    • Upgrade computer scenario

    • Using Microsoft Operations Manager (MOM) 2005 for monitoring

    • Windows DS preparation

  • ZTI deployment:

    • Application compatibility

    • Build server setup and configuration

    • Core application installation

    • Data migration

    • Deployment to VMs

    • Image capture CD creation

    • Image deploy CD creation

    • Language pack installation

    • Master image creation

    • MOM 2005 preparation

    • Monitoring

    • New computer scenario

    • Operating system deployment

    • Post-deployment configuration

    • Pre-operating system deployment tasks

    • Refresh computer scenario

    • Replace computer scenario

    • SMS Operating System Deployment (OSD) Feature Pack–based capture

    • SMS package execution

    • SMS 2003 preparation

    • Solution installation

    • Testing the usage of custom settings in the customsettings.ini file (see the validation sketch after this list)

    • Upgrade computer scenario

    • Windows DS preparation

  • Zero Touch Reporting:

    • Baseline attributes

    • Data source configuration

    • Migration of user state data

    • Reporting operations

    • Security testing to verify that user access rights are correctly enforced

    • Testing the reporting user interface (UI) functionality

  • ZTI Administration Database:

    • Configuring the database graphical user interface (GUI)

    • Installation of a new database

    • Using a preexisting database

    • Using the .csv file

    • Using the database GUI
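Several of the scenarios above exercise settings carried in the customsettings.ini file. As a minimal sketch of how a test case might verify such a file before a deployment run, the following Python fragment parses it and checks for an expected set of sections and keys; the section and key names shown are illustrative assumptions, not a definitive list of BDD 2007 properties.

```python
import configparser

# Hypothetical expectations for this project's customsettings.ini;
# the section and key names are illustrative assumptions.
EXPECTED = {
    "Settings": ["Priority"],
    "Default": ["OSInstall"],
}

def check_customsettings(path: str) -> list:
    """Return a list of problems found in the file, empty if all checks pass."""
    parser = configparser.ConfigParser()
    problems = []
    if not parser.read(path):
        return [f"could not read {path}"]
    for section, keys in EXPECTED.items():
        if not parser.has_section(section):
            problems.append(f"missing [{section}] section")
            continue
        for key in keys:
            if not parser.has_option(section, key):
                problems.append(f"missing {key} in [{section}]")
    return problems

if __name__ == "__main__":
    for issue in check_customsettings("customsettings.ini"):
        print("FAIL:", issue)
```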

Test Case Review

After the test cases are prepared, the respective feature teams must review them; their input is valuable in ensuring that each test case accurately reflects the feature it covers.

Milestone: Test Cases Complete

Milestones are synchronization points for the overall solution. For more information, see the Plan, Build, and Deploy Guide. The deliverables for this milestone are shown in Table 5.

Table 5. Deliverables

Deliverable ID: Test Scenarios

Description: This document outlines the various testing scenarios that the Test feature team covered.

Deliverable ID: Test Cases

Description: These documents outline step-by-step procedures for performing a test as well as the pass or fail criteria for the test.
