
Capability: Desktop, Device and Server Management

On This Page

Introduction
Requirement: Automated Operating System Distribution
Requirement: Automated Tracking of Hardware and Software for Desktops
Requirement: Latest Two OS Versions and Service Packs on Desktops
Requirement: Latest Versions of Microsoft Office on Desktops
Requirement: Compatibility Testing and Certification of Software Distributions
Requirement: Patch Management for Servers
Requirement: Guaranteed Secure Communications with Mobile Devices
Requirement: Access to Web Applications Using WAP or HTTP for Mobile Devices
Requirement: Server Consolidation and Virtualization
Requirement: Layered Imaging for Desktops

Introduction

Desktop, Device and Server Management is the second Core Infrastructure Optimization capability. The following table describes the high-level challenges, applicable solutions, and benefits of moving to the Rationalized level in Desktop, Device and Server Management.

Challenges

Business Challenges

  • Little protection from unauthorized mobile network access

  • Limited security options for mobile e-mail

  • Inability to delegate standard mobility-related support incidents to the help desk

IT Challenges

  • Deployments are partially manual, leaving PCs exposed to attacks or virus infections

  • No automated tools for desktop testing, deployment, and support

  • Inaccurate knowledge of hardware, software, and desktops increases maintenance costs

  • Securing servers, PCs, and mobile devices across wired and wireless networks with varying security levels

Solutions

Projects

  • Automate OS distribution and installation

  • Automate asset life-cycle management of hardware and software

  • Install the latest two OS versions on desktops

  • Implement standard compatibility testing and certification of new software

  • Extend automated patch management to servers

  • Guarantee secure communications with mobile devices

  • Provide access to Web applications via WAP or HTTP

  • Begin using virtualization to consolidate servers

  • Implement a layered-image approach for desktop deployment

Benefits

Business Benefits

  • A mobile, secure, centrally managed desktop environment

  • Reduced user downtime through maintained patch and operating system updates

  • Users spend less time with first-line support as a result of application testing

  • Highly automated IT services lead to lower costs and improved consistency

IT Benefits

  • Automated deployment of new desktops, desktop rebuilds, and user migrations

  • More effective desktop security

  • Consistent security and stability of desktop and mobile environments inside and outside the organizational firewall

The Rationalized level of optimization requires that your organization has procedures and tools in place to improve automation and flexibility of desktop, server, and device management tasks, while beginning to incorporate new technologies for virtualization and mobility.

Requirement: Automated Operating System Distribution

Audience

You should read this section if you do not have an automated software distribution solution for operating system (OS) deployment.

Overview

In the Core Infrastructure Optimization Resource Guide for Implementers: Basic to Standardized, you read about defining and deploying standard images for desktops. To move from the Standardized level to the Rationalized level, you need to automate the deployment of operating systems to desktops in your organization.

Phase 1: Assess

The goal of the Assess phase is to examine the current procedures the organization uses to deploy disk images to desktops. The Standardized level assumes that tools and procedures are already in place to perform fully functional image deployment to desktop and laptop computers with minimal interaction at the targeted computer; "fully functional" means installing appropriate applications, drivers, language packs, and updates, as well as migrating any existing user state, during a deployment sequence. Review the Lite Touch Installation (LTI) concepts in the Solution Accelerator for Business Desktop Deployment (BDD) 2007 Deployment Feature Team Guide for more details.

Phase 2: Identify

The objective in moving to the Rationalized level is to completely automate existing desktop deployment procedures, enabling a Zero Touch Installation (ZTI) of desktop images, role-based applications, required drivers, language packs, updates, and migration of user state without any interaction at the targeted computer. In this phase, you should identify what is necessary to enable ZTI in your desktop environment.

BDD 2007 is the recommended resource for identifying deployment options and end-to-end planning of deployment projects. BDD 2007 provides guidance for Zero Touch Installation (ZTI) using Systems Management Server (SMS) 2003 with the Operating System Deployment Feature Pack.

Phase 3: Evaluate and Plan

The Evaluate and Plan phase requires you to take into account all steps needed prior to performing fully automated deployments. Evaluating and planning for a Zero Touch deployment differs from most other themes in that it requires many action items typically associated with a Deploy phase. In order to successfully deploy to desktops, there are several required steps prior to kicking off an image installation. BDD 2007 outlines these steps as Feature Team Guides; they include:

  • Application Compatibility

  • Infrastructure Remediation

  • Application Management

  • Computer Imaging System

  • User State Migration

  • Securing the Desktop

Application Compatibility

Application compatibility is often the initial action to prepare for approved desktop deployment projects. In the Application Compatibility step, your organization should:

  • Collect and analyze the application inventory in your organization to build your application portfolio.

  • Test your mitigation strategies to create your application mitigation packages.

  • Resolve any outstanding compatibility issues to report compatibility mitigation to management.

  • Deploy compatibility mitigation packages with core application deployment or after core application deployment.

See the Application Compatibility Feature Team Guide in BDD 2007 for more details.
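To make the portfolio idea concrete, here is a minimal Python sketch of tracking compatibility status across an application portfolio. The record fields, application names, and package identifiers are illustrative inventions, not part of BDD 2007 or ACT.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AppRecord:
    """One application portfolio entry (field names are illustrative)."""
    name: str
    version: str
    compatible: bool            # outcome of compatibility testing
    mitigation: Optional[str]   # assigned mitigation package, if any

def apps_needing_mitigation(portfolio):
    """Return apps that failed testing and still lack a mitigation package."""
    return [app for app in portfolio if not app.compatible and app.mitigation is None]

portfolio = [
    AppRecord("LineOfBusinessApp", "2.1", compatible=False, mitigation=None),
    AppRecord("OfficeSuite", "2007", compatible=True, mitigation=None),
    AppRecord("LegacyReporting", "5.0", compatible=False, mitigation="shim-pkg-01"),
]
outstanding = apps_needing_mitigation(portfolio)
```

A report like `outstanding` is the kind of artifact you would carry into the "report compatibility mitigation to management" step.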

Infrastructure Remediation

Understanding the network environment is critical for any deployment project. As part of your planning and preparation, you must understand the current status of your organization's environment, identify other sources of change that may affect the project, and develop a risk-mitigation approach to the changes before incorporating them. In the Infrastructure Remediation step, your organization should:

  • Inventory hardware assets for centralized management and research of gathered data. Organizations may consider Systems Management Server 2003, Windows Vista Hardware Assessment, or the Application Compatibility Toolkit to help inventory client hardware and devices on a network.

  • Optimize the infrastructure for deployment of new operating systems and applications.

  • Employ technologies to provide for accurate and timely infrastructure-management information.

See the Infrastructure Remediation Feature Team Guide in BDD 2007 for more details.
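As a rough illustration of turning inventory data into a remediation plan, the following Python sketch checks inventoried machines against an assumed minimum hardware bar for the new OS image. The threshold values and machine records are hypothetical.

```python
# Hypothetical minimum hardware bar for the new OS image (illustrative values).
MINIMUM = {"ram_mb": 512, "disk_free_gb": 15, "cpu_mhz": 800}

def readiness_report(inventory):
    """Split inventoried machines into ready / needs-remediation lists."""
    ready, remediate = [], []
    for machine in inventory:
        if all(machine[key] >= floor for key, floor in MINIMUM.items()):
            ready.append(machine["name"])
        else:
            remediate.append(machine["name"])
    return ready, remediate

inventory = [
    {"name": "PC-001", "ram_mb": 1024, "disk_free_gb": 40, "cpu_mhz": 1800},
    {"name": "PC-002", "ram_mb": 256, "disk_free_gb": 10, "cpu_mhz": 700},
]
ready, remediate = readiness_report(inventory)
```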

Application Management

Application management focuses on the tasks required to package applications and enable scripted installation. Application packaging results in consistent, reliable, and more serviceable application deployment. In the Application Management step as defined by BDD 2007, your organization should:

  • Identify, inventory, and prioritize core and supplemental applications.

  • Develop and test deployment packages.

  • Add applications to the deployment sequence.

See the Application Management Feature Team Guide in BDD 2007 for more details.
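The prioritization step can be illustrated with a short Python sketch that orders packaged applications into a deployment sequence, core applications first. The tier labels, names, and priority values are invented for the example.

```python
def build_sequence(apps):
    """Order packaged apps for deployment: core tier first, then by priority."""
    return [app["name"]
            for app in sorted(apps, key=lambda a: (a["tier"] != "core", a["priority"]))]

apps = [
    {"name": "AntiVirus", "tier": "core", "priority": 1},
    {"name": "PdfReader", "tier": "supplemental", "priority": 2},
    {"name": "OfficeSuite", "tier": "core", "priority": 2},
    {"name": "MediaCodec", "tier": "supplemental", "priority": 1},
]
sequence = build_sequence(apps)
```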

Computer Imaging System

Standard desktop imaging is covered in the Core Infrastructure Optimization Implementer Resource Guide: Basic to Standardized. The Standardized level assumes that your organization already employs a standard and tested image for provisioning new users, upgrading operating systems, or refreshing corrupted PCs. The Rationalized level requires a layered-image approach using thin OS images and adding the drivers, updates, language packs, and applications as part of the deployment sequence. See the Layered Imaging for Desktops requirement as part of the Rationalized level.

User State Migration

Although User State Migration is a deployment preparation step that should be tested prior to the actual image deployment, it occurs during the deployment sequence. User state is the combination of users' data files and their operating system and application settings. Settings include items such as screen saver preferences, My Documents, Web browser favorites, Office Outlook data, and so on. Migrating users' data files and settings means that those users will have minimal interruption after the deployment process. In the User State Migration step, your organization should:

  • Capture and store the user data and application information.

  • Build the new desktop and install the company-standard image.

  • Restore the user data and application information to the new desktop.

See the User State Migration Feature Team Guide in BDD 2007 for more details.
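Conceptually, user state migration is a capture-and-restore operation around the image installation, as this minimal Python sketch shows. The profile fields are an illustrative subset of what a tool such as USMT actually migrates.

```python
def capture_user_state(profile):
    """Capture the settings and data that would be migrated (illustrative subset)."""
    return {key: profile[key]
            for key in ("documents", "favorites", "outlook_data", "screensaver")}

def restore_user_state(new_desktop, state):
    """Apply the captured state onto a freshly imaged desktop."""
    new_desktop.update(state)
    return new_desktop

old = {"documents": ["report.doc"], "favorites": ["intranet"],
       "outlook_data": "mail.pst", "screensaver": "blank", "os": "Windows 2000"}
state = capture_user_state(old)                       # before reimaging
new = restore_user_state({"os": "Windows Vista"}, state)  # after reimaging
```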

Securing the Desktop

For most organizations, securing the computing environment is the highest priority of the IT department. When deploying new operating systems or computers, making sure that the new deployments are at least as secure as the current environment is critical. In fact, any process for deploying new computers must include deploying secure systems. By using constantly updated baselines and images, you can keep the environment secure while still allowing quick deployment of new workstations. When securing the desktop for deployment, your organization should:

  • Choose and enable the security configuration of your desktops.

  • Manage security updates.

  • Maintain desktop security.

See the Security Feature Team Guide in BDD 2007 for more details or read the Windows XP Security Guide and the Windows Vista Security Guide.
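Checking desktops against a security baseline can be sketched as follows in Python. The baseline settings shown are illustrative placeholders, not an official security template from the guides above.

```python
# Hypothetical security baseline (illustrative settings, not an official template).
BASELINE = {"firewall_enabled": True, "autorun_disabled": True, "min_password_length": 8}

def compliance_gaps(desktop):
    """List the baseline settings this desktop does not meet."""
    gaps = []
    for setting, required in BASELINE.items():
        actual = desktop.get(setting)
        if isinstance(required, bool):
            if actual != required:
                gaps.append(setting)
        elif actual is None or actual < required:
            gaps.append(setting)
    return gaps

desktop = {"firewall_enabled": True, "autorun_disabled": False, "min_password_length": 6}
gaps = compliance_gaps(desktop)
```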

Phase 4: Deploy

After your organization has gone through all of the steps required for pre-deployment, you are ready to start testing and deploying desktop images. All of the pre-deployment steps mentioned above are necessary for a Lite Touch Installation (LTI) or Zero Touch Installation (ZTI). See the Zero Touch Installation Guide in BDD 2007 for more details. The following comparison highlights the differences between LTI and ZTI (with SMS 2003) deployment that you need to consider when upgrading from Lite Touch to Zero Touch.

  • LTI provides configuration settings that are common to a group of target computers; ZTI provides all necessary configuration settings for each target computer.

  • LTI requires less up-front configuration time; ZTI requires more up-front configuration time.

  • LTI can be used with slow-speed connections or where no network connectivity exists; ZTI requires a high-speed, persistent connection.

  • LTI requires little or no infrastructure to support deployment; ZTI requires an infrastructure sufficient to deploy operating system images by using the SMS 2003 OS Deployment (OSD) Feature Pack.

  • LTI supports deployment over the network or locally; ZTI supports only network deployments.

  • LTI does not require target computers to be managed by SMS 2003 (or other software management tools); ZTI requires that target computers be managed by SMS 2003.

  • LTI supports security policies where automatic software installation is prohibited; ZTI supports only environments where automatic software installation is allowed.

A BDD 2007 Zero Touch Installation process performs the following tasks:

  • Collects hardware and software inventory information by using SMS 2003 with Service Pack 2 (SP2).

  • Migrates existing user profile information by using User State Migration Tool (USMT) version 3.0.

  • Configures Windows Deployment Services to start the Windows Preinstallation Environment (Windows PE).

  • Installs an operating system image on target computers automatically by using the SMS OSD Feature Pack and the deployment scripts for ZTI.

  • Optionally, monitors the deployment process by using Microsoft Operations Manager (MOM) 2005 and Zero Touch Installation Management Pack.

  • When replacing or refreshing a computer, copies existing user data and preferences.

  • Optionally, creates a backup image of the user computer to a network deployment server.

  • When used for new users and replacement scenarios, repartitions and formats the existing primary hard drive.

  • Dynamically installs applications that are specific to the target computer.

  • Automatically installs previously packaged software specific to the user of the target computer.

  • When replacing or refreshing a computer, restores the user data and preferences.
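The task list above amounts to a conditional task sequence: some steps run in every scenario, others only for refresh, replace, or new-computer deployments. This minimal Python sketch models how such a runner might decide which steps to execute; the task names, conditions, and context fields are illustrative, not the actual ZTI scripts.

```python
def run_sequence(tasks, context):
    """Run deployment tasks in order, skipping those whose condition doesn't apply."""
    executed = []
    for name, condition, action in tasks:
        if condition(context):
            action(context)
            executed.append(name)
    return executed

context = {"scenario": "refresh", "user_state": None, "os_installed": False}

tasks = [
    ("capture_state", lambda c: c["scenario"] in ("refresh", "replace"),
     lambda c: c.update(user_state="saved")),
    ("format_disk", lambda c: c["scenario"] in ("new", "replace"),
     lambda c: None),
    ("install_image", lambda c: True,
     lambda c: c.update(os_installed=True)),
    ("restore_state", lambda c: c["user_state"] is not None,
     lambda c: None),
]
executed = run_sequence(tasks, context)
```

In a refresh scenario, the disk is not repartitioned but user state is captured and restored around the image installation, matching the task list above.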

All steps and processes should be thoroughly tested and validated in a pre-production lab environment and pilot program before being implemented in production.

BDD 2007 Zero Touch Installation is built and tested for Systems Management Server (SMS) 2003 with the Operating System Deployment Feature Pack. If your organization uses another technology for operating system deployment, that technology must provide equivalent functionality (zero touch image installation for new-user, computer-replacement, and refresh scenarios) to meet the requirement of the Rationalized level. The Rationalized level's Layered Imaging for Desktops requirement also calls for a task-sequencing mechanism for pre- and post-deployment configuration.

Further Information

For more information on automating OS distribution, visit Microsoft TechNet and search for “OS Deployment” or “Zero Touch Installation.”

To see how Microsoft uses SMS for OS distribution, go to http://www.microsoft.com/technet/desktopdeployment/depprocess/default.mspx.

Checkpoint: Automated Operating System Distribution

Requirements

  • Identified tools and technologies required to enable automated operating system deployment.

  • Performed necessary pre-deployment tasks for application compatibility and packaging, infrastructure remediation, imaging, user-state migration, and desktop security.

  • Tested and validated Zero Touch Installation in a lab environment and pilot program.

  • Performed automated OS deployment to end users.

If you have completed the steps listed above, your organization has met the minimum requirement of the Rationalized level for Automated Operating System Distribution capabilities of the Infrastructure Optimization Model. We recommend that you follow the guidance of additional best practice resources for automated OS deployment in the Solution Accelerator for Business Desktop Deployment 2007.

Go to the next Self-Assessment question.

Requirement: Automated Tracking of Hardware and Software for Desktops

Audience

You should read this section if you do not have automated tracking of hardware and software assets on 80 percent or more of your desktops.

Overview

Automated tracking of hardware and software assets addresses change and configuration needs. By building an understanding of the installed application base and its usage, automation helps lower software costs and improve configuration compliance. As hardware and software assets comprise an increasing portion of the IT budget, organizations are becoming more focused on finding ways to reduce these costs while staying compliant with licensing policies.

Phase 1: Assess

In the Assess phase, you are looking for what processes and tools are currently in place to help automate hardware and software tracking or asset life-cycle management. At the Standardized level, the Infrastructure Optimization Model assumes that your organization has implemented some form of configuration management and incorporates best practices and automation for software update management. See the Core Infrastructure Optimization Resource Guide for Implementers: Basic to Standardized for more details. If your organization currently has systems management software managing the majority of your desktops or an automated configuration management database in place, your organization most likely has the software available to implement automated tracking of assets as defined in the Infrastructure Optimization Model.

Phase 2: Identify

The goal of the Identify phase is to define what is necessary to expand your current capabilities in order to achieve automated tracking of desktop assets. The Core Infrastructure Optimization Resource Guide for Implementers: Basic to Standardized includes guidance on tracking hardware and software for mobile devices, desktop patch management, and the general configuration management required for moving to the Standardized level. These principles can be extended and applied to all IT assets in your organization. The following attributes are the basic high-level requirements for Automated Tracking of Hardware and Software for Desktops as defined by the Infrastructure Optimization Model:

  • Asset Inventory

  • Application and Operating System Deployment

  • Software Usage Tracking

  • Security Patch Management

  • System Status Monitoring

At the Standardized level, tools to automate asset inventory are required as part of the patching process, and software update or patch management itself is also required. Automated tracking of desktop assets therefore only introduces requirements to automate deployment of applications and operating systems, track usage, and report system status. The Rationalized level requires that all of these tasks be integrated into a common process methodology and toolset. The Evaluate and Plan phase discusses the specific requirements and recommended tools to implement automated deployment, usage tracking, and system status reporting.

Phase 3: Evaluate and Plan

The Evaluate and Plan phase identifies specific requirements needed to implement Automated Tracking of Hardware and Software Assets. The recommended solution for automating and integrating these tasks is Systems Management Server (SMS) 2003, and this requirement will focus on how SMS 2003 can be used. Alternatively, there are tools available from Microsoft’s partners to enable the concepts and requirements discussed for automated tracking of hardware and software assets.

System Center Configuration Manager 2007 is available as a beta release at the time of this publication. As the next version of a change and configuration management solution from Microsoft, it will add features for desired configuration management, integrated operating system deployment, network access protection for non-compliant computers, and improved mobile device support.  

Asset Inventory

Unless you have a baseline inventory of the hardware and software assets in your organization, you have very little chance of effectively managing those assets. Hardware and software can often be added or removed without appropriate authorization or without following defined acquisition workflows.

There are three primary solutions available from Microsoft that generate an inventory of hardware and software. SMS 2003 has built-in functionality to generate detailed hardware and software inventories. The Microsoft Application Compatibility Toolkit (ACT) is a free tool that can also generate detailed software and hardware inventories using an agent provided with the toolkit. Windows Vista Hardware Assessment is a free tool that provides an agentless inventory of computers across networks.

For more information on SMS 2003 hardware and software inventory, go to http://www.microsoft.com/technet/prodtechnol/sms/sms2003/opsguide/ops_7675.mspx.

For more information on the Application Compatibility Toolkit, go to http://technet.microsoft.com/en-us/windowsvista/aa905102.aspx.
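Whichever tool produces the inventory, comparing a current inventory against a stored baseline is the core of automated tracking: it surfaces unauthorized additions and unexpected removals. A minimal Python sketch, with invented product names:

```python
def inventory_diff(baseline, current):
    """Report software added or removed since the baseline inventory was taken."""
    return {
        "added": sorted(set(current) - set(baseline)),
        "removed": sorted(set(baseline) - set(current)),
    }

baseline = {"Office 2007", "AntiVirus 8", "LegacyReporting 5.0"}
current = {"Office 2007", "AntiVirus 8", "UnapprovedGame 1.0"}
diff = inventory_diff(baseline, current)
```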

Application and Operating System Deployment

Automated deployment of applications and operating systems along with usage tracking are the core attributes of automated tracking of desktop assets. This section briefly discusses application and operating system deployment.

Application Deployment

The Automated Operating System Distribution requirement discussed how applications are packaged for scriptable installation. Packaged applications can be deployed to desktops using software distribution technologies, such as SMS 2003, or via policy enforcement mechanisms, such as Group Policy in Windows Server 2003.

Automated application deployment eliminates the inefficient process of providing CD, DVD, or USB media and installation instructions to end users. By using deployment automation, desktops can successfully install applications remotely with minimal chance for user error. Automated deployment allows you to define and control how and when programs run on desktops.

For more information on how SMS 2003 automates application deployment (recommended), see http://www.microsoft.com/technet/prodtechnol/sms/sms2003/cpdg/plan47zc.mspx.

For more information on how software is installed using Group Policy in Windows Server 2003, see http://technet2.microsoft.com/WindowsServer/en/library/b238ecdb-cda5-402b-9b3d-f232045a30fa1033.mspx 

Operating System Deployment

Automated operating system deployment using Zero Touch Installation is discussed in the Automated Operating System Distribution requirement for the Rationalized level. The recommended approach uses the SMS 2003 Operating System Deployment Feature Pack to enable fully automated desktop deployment.

Software Usage Tracking

Software usage tracking or software metering allows you to monitor program usage on desktops. By using SMS 2003 software metering, you can collect data about software usage in your organization. Software metering data can be summarized to produce reports to help you monitor licensing compliance and plan software purchases in your organization. Software metering collects detailed information about the programs that you choose to monitor. This includes information about program usage, program users, the program start time, and the length of time it is used. The following diagram shows how the Software Metering feature in SMS 2003 captures and reports software usage information to a central site.

Figure 4. How the Software Metering feature in SMS 2003 captures and reports software usage information to a central site

When the Software Metering feature is enabled, SMS 2003 collects information about program activity on desktops. For more information on the Software Metering feature in SMS 2003, see http://www.microsoft.com/technet/prodtechnol/sms/sms2003/cpdg/plan931z.mspx.
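The metering data described above boils down to aggregating per-program usage events into totals for licensing and purchasing decisions. A minimal Python sketch, with event records invented for the example:

```python
from collections import defaultdict

def summarize_usage(events):
    """Aggregate metering events into per-program totals (minutes, distinct users)."""
    totals = defaultdict(lambda: {"minutes": 0, "users": set()})
    for program, user, minutes in events:
        totals[program]["minutes"] += minutes
        totals[program]["users"].add(user)
    return {program: {"minutes": data["minutes"], "users": len(data["users"])}
            for program, data in totals.items()}

events = [
    ("Visio", "alice", 30),
    ("Visio", "bob", 45),
    ("Project", "alice", 60),
]
report = summarize_usage(events)
```

A report like this would show, for example, whether the number of distinct users justifies the number of licenses purchased.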

Security Patch Management

Automated patch distribution for desktops was covered in the Core Infrastructure Optimization Resource Guide for Implementers: Basic to Standardized guide. To move to the Rationalized level, you need to automate tracking of patch compliance on your organization’s desktops. SMS 2003 can track vulnerabilities and report the status of deployed updates.

For more information on security patch management using SMS 2003, see http://www.microsoft.com/technet/prodtechnol/sms/sms2003/cpdg/plan7lo2.mspx.
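Tracking patch compliance amounts to checking each desktop's installed updates against a required set and reporting the compliance rate. A minimal Python sketch with hypothetical update identifiers:

```python
def patch_compliance(desktops, required_updates):
    """Return the percent of desktops with every required update, and their names."""
    compliant = [d for d in desktops if required_updates <= set(d["installed"])]
    return len(compliant) * 100 // len(desktops), [d["name"] for d in compliant]

desktops = [
    {"name": "PC-001", "installed": ["KB1", "KB2", "KB3"]},
    {"name": "PC-002", "installed": ["KB1"]},
]
rate, compliant = patch_compliance(desktops, {"KB1", "KB2"})
```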

System Status Monitoring

System status as defined by the Infrastructure Optimization Model monitors both product compliance and desktop system status. Product compliance ensures that software complies with your organization’s guidelines. If your organization mandates that a certain version of a product is used or has guidelines to restrict certain products, product compliance will help you ensure that software installed on desktops complies with these guidelines. Your organization should also centrally monitor desktop system status. SMS 2003 can be used to monitor product compliance and generate status messages to report the activity of components on desktops.

For more information on product compliance reporting using SMS 2003, see http://www.microsoft.com/technet/prodtechnol/sms/sms2003/cpdg/plan7zlk.mspx.

For more information on desktop system status reporting using SMS 2003, see http://www.microsoft.com/technet/prodtechnol/sms/sms2003/cpdg/plan1bzh.mspx.
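Product compliance checking can be sketched as comparing installed software against mandated versions and a restricted-product list. The guideline entries below are hypothetical examples of organizational policy, not SMS 2003 data structures.

```python
# Hypothetical organizational guidelines: mandated versions and restricted products.
MANDATED = {"AntiVirus": "8.0"}
RESTRICTED = {"FileSharingTool"}

def check_product_compliance(installed):
    """Flag products that are at the wrong version or on the restricted list."""
    issues = []
    for product, version in installed.items():
        if product in RESTRICTED:
            issues.append((product, "restricted"))
        elif product in MANDATED and version != MANDATED[product]:
            issues.append((product, "wrong version"))
    return issues

installed = {"AntiVirus": "7.5", "FileSharingTool": "1.2", "Office": "2007"}
issues = check_product_compliance(installed)
```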

System Center Configuration Manager 2007

System Center Configuration Manager 2007 is the next generation systems management solution for change and configuration management for the Microsoft platform, enabling organizations to provide relevant software and updates to users quickly and cost-effectively.

System Center Configuration Manager 2007 provides the following features:

  • Collecting hardware and software inventory.

  • Distributing and installing software applications.

  • Distributing and installing updates to software—for example, security fixes.

  • Restricting computers from accessing the network if they do not meet specified requirements—for example, having certain security updates installed.

  • Deploying operating systems.

  • Specifying what a desired configuration would be for one or more computers and then monitoring adherence to that configuration.

  • Metering software usage.

  • Remotely controlling computers to provide troubleshooting support.

All of these features require that System Center Configuration Manager 2007 client software be installed on the Windows-based computers you want to manage. System Center Configuration Manager 2007 client software can be installed on regular desktop computers, servers, portable computers such as laptops, mobile devices running Windows Mobile or Windows CE, and devices running Windows XP Embedded such as automated teller machines. Microsoft partners can write additional client software to manage computers running non-Windows operating systems.

System Center Configuration Manager 2007 sites provide a way to group clients into manageable units with similar requirements for feature sets, bandwidth, connectivity, language, and security. System Center Configuration Manager 2007 sites can match your Active Directory sites or be totally independent of them. Clients can move between sites or even be managed from remote locations such as home offices.

Phase 4: Deploy

In the Evaluate and Plan phase, you determined the additional attributes needed to enable the Automated Tracking of Hardware and Software on Desktops requirement of the Rationalized level. In the Deploy phase, you are tasked with implementing these additional tasks and integrating them into a common process.

We will again assume that your organization has already implemented automated asset inventory (Standardized level, Patch Management), automated desktop operating system deployment (Rationalized level), and security patch management (Standardized level). Hence, the Deploy phase focuses on automated application deployment, software usage tracking, and system status monitoring.

For information on how to perform automated application deployment using SMS 2003, see http://www.microsoft.com/technet/prodtechnol/sms/sms2003/opsguide/ops_51wl.mspx.

For information on how to deploy software usage tracking using SMS 2003, see http://www.microsoft.com/technet/prodtechnol/sms/sms2003/opsguide/ops_2019.mspx.

For information on how to deploy product compliance reporting as part of system status monitoring using SMS 2003, see http://www.microsoft.com/technet/prodtechnol/sms/sms2003/opsguide/ops_9u0n.mspx.

For information on how to deploy desktop system status monitoring using SMS 2003, see http://www.microsoft.com/technet/prodtechnol/sms/sms2003/opsguide/ops_4g8n.mspx.

Further Information

For more information on tracking hardware and software using SMS 2003, visit Microsoft TechNet and search for “hardware and software inventory.”

To see how Microsoft deploys SMS 2003, go to http://www.microsoft.com/technet/itshowcase/content/depsms03.mspx.

Checkpoint: Automated Tracking of Hardware and Software for Desktops

Requirements

  • Deployed tools and procedures to automate desktop asset inventory.

  • Implemented procedures and technologies to automate application and operating system deployment.

  • Implemented tools and procedures to perform and analyze software usage tracking reporting.

  • Implemented best practice automated software update management.

  • Deployed tools and procedures to monitor desktop system status, including product compliance and system status monitoring.

If you have completed the steps listed above, your organization has met the minimum requirement of the Rationalized level for Automated Tracking of Hardware and Software capabilities of the Infrastructure Optimization Model. We recommend that you follow the guidance of additional best practice resources for tracking of hardware and software.

Go to the next Self-Assessment question.

Requirement: Latest Two OS Versions and Service Packs on Desktops

Audience

You should read this section if fewer than 80 percent of your desktops are running one of the two most recent versions of the Microsoft operating system.

Overview

In the Core Infrastructure Optimization Resource Guide for Implementers: Basic to Standardized, you read about the importance of limiting the number of different operating systems in your organization to two. To move from the Standardized level to the Rationalized level, your organization’s two standard operating system images need to be the most recent versions, with the latest service packs installed. The general benefits of having the two most recent operating systems in production are supportability, ease of maintenance and troubleshooting, and reduced complexity of the desktop environment. More specific benefits and trade-offs are called out in the Evaluate and Plan phase.

Phase 1: Assess

During the Assess phase, you need to take inventory of the desktop operating systems your organization has in production. Automating desktop asset inventory is discussed in the Automated Tracking of Hardware and Software for Desktops requirement.

Phase 2: Identify

In the Identify phase you will use the results of the Assess phase to determine which desktop assets should be upgraded or refreshed. At this time, you should also examine your options for upgrading users to the most current operating system or the previous version.

Phase 3: Evaluate and Plan

The Evaluate and Plan phase discusses the advantages and trade-offs of each operating system and examines the hardware upgrade requirements to support your selected operating system strategy. The two latest desktop operating system versions available are Microsoft Windows XP with Service Pack 2 and Windows Vista™. The following sections discuss the advantages and trade-offs of the Windows desktop operating system versions from the perspective of desktop administration and deployment.

Windows XP and Windows Vista Advantages

Although earlier versions of Windows may still be functional for specific and limited applications within your organization, they do not contain the features that make the latest operating systems more secure, more efficient, and easier to manage. By moving your desktop computers to Windows XP or Windows Vista, you can realize the following core advantages over previous versions of Windows desktop operating systems:

Windows XP and Windows Vista

  • Improved wireless network support

  • Improved data protection and recovery

  • Enhanced Web security

  • Integrated firewall

Windows Vista

  • BitLocker™ drive encryption

  • Hardware abstraction layer (HAL) independence

  • Language neutrality

  • Improved deployment experience

Improved Wireless Network Support

Windows XP and Windows Vista simplify the tasks necessary for setting up a computer and joining it to a network. Universal Plug and Play is designed to support zero-configuration, "invisible" networking, and automatic discovery for a variety of device categories from a wide range of vendors. In addition, Windows Vista supports both Internet Protocol version 4 (IPv4) and Internet Protocol version 6 (IPv6) to meet all your future networking needs.

For more information on the networking capabilities of Windows XP and Windows Vista, visit the following Web sites:

Improved Data Protection and Recovery

Windows XP and Windows Vista provide significant advancements in data recovery and protection and private key recovery. Encrypting File System (EFS), available in both operating systems, supports the use of data recovery agents (DRA) to decrypt files that have been encrypted by other users.

For more information on EFS, visit the following Web sites:

Enhanced Web Security

Security features and enhancements in Windows XP and Windows Vista protect your organization’s desktops and servers in ways that earlier versions of the Microsoft operating system could not. Keeping the latest operating systems on your desktops helps ensure that you are protected against the latest attempts to breach your network. For more information on the security features of Windows XP and Windows Vista, visit the following Web sites:

Integrated Firewall

Windows XP and Windows Vista both provide an integrated firewall for desktops; not all earlier versions of the Microsoft operating system have this capability. The Core Infrastructure Optimization Resource Guide for Implementers: Basic to Standardized guide outlines the need for personal firewalls on desktop computers. The latest operating systems from Microsoft provide this capability out of the box, so implementation and setup are easier.

For more information on personal firewalls, visit the following Web sites:

BitLocker Drive Encryption (Windows Vista)

BitLocker drive encryption helps protect data on a client computer. The entire Windows volume is encrypted to help prevent unauthorized users from breaking Windows file and system protections or from viewing information offline on the secured drive. Early in the startup process, BitLocker checks the client computer's system and hardware integrity. If BitLocker determines an attempt has been made to tamper with any system files or data, the client computer will not complete the startup process.

BitLocker is available in the Windows Vista Enterprise and Ultimate editions of the operating system for client computers.

For more information on BitLocker Drive Encryption, visit the following Web sites:

Hardware Abstraction Layer Independence (Windows Vista)

Windows Vista introduces hardware abstraction layer (HAL) independence to make it easier to consolidate standard desktop images. For versions prior to Windows Vista, technical restrictions prevented the creation of a single image that could be deployed to all desktop hardware types. Different HALs meant you had to maintain multiple images. When managing Windows XP and Windows Vista, most organizations will need at least two or three images for the Windows XP desktops. With Windows Vista, you can consolidate your desktop images to one per hardware platform (x86 and x64). The Windows Vista operating system is also able to detect which HAL is required and automatically install it.

For more information on hardware abstraction layer independence in Windows Vista, visit the following Web sites:

Language Neutrality (Windows Vista)

In Windows Vista, the entire operating system is language-neutral. One or more language packs are added to this language-neutral core to create the image that is deployed. Servicing of Windows Vista is also language-neutral, so in many cases only one security update is needed for all languages. Configuration is also language-neutral, so one unattend.xml can be used for all languages.
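
To illustrate how configuration stays language-neutral, a single answer file can carry only the locale-specific values. The fragment below is a hypothetical sketch based on the Windows Vista unattend schema; the French locale codes are example values only, not part of the original guidance:

```xml
<!-- Hypothetical fragment of a language-neutral unattend.xml.
     Only the locale values below would change per language deployed. -->
<unattend xmlns="urn:schemas-microsoft-com:unattend">
  <settings pass="oobeSystem">
    <component name="Microsoft-Windows-International-Core"
               processorArchitecture="x86"
               publicKeyToken="31bf3856ad364e35"
               language="neutral" versionScope="nonSxS">
      <UILanguage>fr-FR</UILanguage>
      <SystemLocale>fr-FR</SystemLocale>
      <UserLocale>fr-FR</UserLocale>
      <InputLocale>040c:0000040c</InputLocale>
    </component>
  </settings>
</unattend>
```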

With Windows XP, you can either deploy localized versions of Windows XP, requiring a different image for each language, or you can deploy an English Multilanguage User Interface (MUI) version with additional language packs. There are advantages and disadvantages to each approach, but in most cases organizations that require support for multiple languages should use the MUI option.

For more information on language neutrality in Windows Vista and to find out which versions support multiple language instances per PC, visit the following Web sites:

Improved Deployment Experience (Windows Vista)

Windows Vista deployment tools are available for all stages of deployment, and several are integrated into a single deployment procedure using the Business Desktop Deployment (BDD 2007) Solution Accelerator, Microsoft's proven methodology and guidance on how to optimally deploy Windows Vista and the 2007 Office system. The tools available for Windows Vista deployment include the following:

For an in-depth discussion of these technologies and tools, read the “Windows Vista Deployment Enhancements” white paper as well as the step-by-step instructions for a basic Windows Vista install.

For more information on the Improved Deployment Experience, visit the following Web sites:

Trade-Offs

Along with all of the advantages highlighted for Windows Vista, including eased deployment, there are also a few trade-offs. From the desktop administrator's perspective, the primary trade-offs of Windows Vista compared to Windows XP are larger deployment images and increased system requirements.

Image Size

With Windows XP and Windows 2000, it is possible to create images that fit easily on a single CD (less than 700 MB). If your organization adds applications and drivers to that image, the size typically approaches 2–3 GB. With Windows Vista, image size begins at about 2 GB compressed, and adding applications will often expand compressed images to 4–5 GB. This has implications for the network used to deploy Windows Vista images. Fortunately, enhancements have been made to accommodate offline deployment in case your organization’s network is not capable of deploying larger images: you can now create self-contained Lite Touch DVD or USB media to manage the deployment process offline with minimal user interaction.

Hardware Requirements

Windows Vista can be installed to enable two levels of user experience: the standard experience on certified Windows Vista Capable PCs, and the premium experience on Windows Vista Premium Ready PCs. The standard experience delivers the Windows Vista core attributes, such as innovations in security and reliability and in organizing and finding information. Premium features may require advanced or additional hardware; Windows Vista Premium Ready hardware can deliver premium experiences, including Windows Aero, a productive, high-performing desktop interface. The recommended tools for assessing existing hardware for Windows Vista are the Windows Vista Hardware Assessment and Systems Management Server 2003 with Service Pack 3.

The following table outlines detailed requirements for Windows Vista Capable and Windows Vista Premium Ready hardware.

| Component | Windows Vista Capable PC | Windows Vista Premium Ready |
| --- | --- | --- |
| Processor | Modern processor (at least 800 MHz¹); CPU manufacturer information: Intel, AMD, Via | 1 GHz 32-bit (x86) or 64-bit (x64) processor¹ |
| System Memory | 512 MB | 1 GB |
| GPU | Microsoft DirectX® 9 capable (WDDM driver support recommended) | Windows Aero capable: DirectX 9-class GPU that supports a WDDM driver, Pixel Shader 2.0 in hardware, and 32 bits per pixel, with adequate graphics memory² |
| Graphics Memory | — | 128 MB |
| HDD | — | 40 GB |
| HDD Free Space | — | 15 GB |
| Optical Drive | — | DVD-ROM drive³ |

¹ Processor speed is specified as the nominal operational processor frequency for the device. Some processors have power management that allows the processor to run at a lower rate to save power.
² Adequate graphics memory is defined as:
  – 64 MB of graphics memory to support a single monitor at a resolution of 1,310,720 pixels or less.
  – 128 MB of graphics memory to support a single monitor at resolutions of 2,304,000 pixels or less.
  – 256 MB of graphics memory to support a single monitor at resolutions higher than 2,304,000 pixels.
  – Graphics memory bandwidth, as assessed by Windows Vista Upgrade Advisor, of at least 1,600 MB per second.
³ The DVD-ROM drive may be external (not built into the system).
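
For context, the pixel counts in footnote 2 correspond to common monitor resolutions (SXGA at 1280×1024 and WUXGA at 1920×1200); a quick arithmetic check:

```python
# Footnote 2's pixel-count thresholds map to familiar monitor resolutions.
sxga = 1280 * 1024    # 1,310,720 pixels -> 64 MB graphics memory tier
wuxga = 1920 * 1200   # 2,304,000 pixels -> 128 MB graphics memory tier
print(sxga, wuxga)    # 1310720 2304000
```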

Choosing the Appropriate Mix

The result of the Evaluate and Plan phase is to determine the appropriate desktop strategy and mix of Windows XP versus Windows Vista production operating systems for your organization.

Phase 4: Deploy

Automated deployment of operating system images is discussed in the Automated Operating System Distribution requirement of the Core Infrastructure Optimization Model Rationalized level.

Further Information

For more information on current desktop operating system versions, visit the following Web sites:

To see how Microsoft plans deployment of operating systems, go to the following Web sites:

Checkpoint: Latest Two OS Versions and Service Packs on Desktops

Requirements

  • Inventoried existing production operating systems.

  • Determined new computer and refresh strategies in order to phase out older operating systems.

  • Deployed the two most recent operating system versions to at least 80 percent of all desktops.

If you have completed the steps listed above, your organization has met the minimum requirement of the Rationalized level for Latest Two OS Versions and Service Packs on Desktops capabilities of the Infrastructure Optimization Model. We recommend that you follow the guidance of additional best practices for maintaining OS versions addressed in the Computer Imaging System Feature Team Guide found in BDD 2007.

Go to the next Self-Assessment question.

Requirement: Latest Versions of Microsoft Office on Desktops

Audience

You should read this section if 80 percent of your desktops are not running Microsoft Office 2003 or the 2007 Microsoft Office system.

Overview

Today, more than ever, people and organizations are using software tools to process information. These applications are being used to create materials critical to an organization’s success and therefore require careful consideration when configuring, deploying, securing, and managing them. Microsoft Office—along with supporting tools and System Center products—offers technology and guidance for IT professionals to perform these important tasks. This guide focuses primarily on the Office Resource Kit, underlying tools, and corresponding products to enable setup, deployment, and management of Office 2003 and the 2007 Office system.

Phase 1: Assess

In the Assess phase, you are primarily taking an inventory of your current environment and determining which versions of Microsoft Office are present on desktops. The versions present may differ from the versions initially deployed or those maintained as part of the standard desktop image. It is recommended that you use automation to centrally inventory your environment, such as Systems Management Server 2003, the Windows Vista Hardware Assessment, or the Application Compatibility Toolkit (ACT).

Phase 2: Identify

In the Identify phase, you begin determining which users should be updated. This process weighs business needs against related costs and applies to everyone in the user environment. In the Evaluate and Plan phase, you will look at all of the deployment, security, and management features and determine the appropriate mix given your organizational needs.

Phase 3: Evaluate and Plan

In the Evaluate and Plan phase, you examine the manageability features of both versions of Microsoft Office and determine from the IT perspective which users should receive each version of the software. This section will only discuss the features of each version as applicable to the IT department in your organization. For more detail on Microsoft Office usability and advantages for end users, please visit Microsoft Office Online.

The core implementer-focused considerations for Microsoft Office include:

  • Planning and Architecture

  • Security and Protection

  • Deployment

  • Operations

We will briefly discuss each of these considerations and how they are represented in the Office 2003 editions as well as the 2007 Microsoft Office system.

Planning and Architecture

When updating to a new productivity suite, the first area for evaluation—once all computers have been inventoried—is to examine the impacts of migration and what tools are available to successfully migrate users from one suite to another. Both Microsoft Office 2003 and the 2007 Microsoft Office system offer tools and guidance to evaluate and plan for the migration.

Migrating to the 2007 Office System

The 2007 Microsoft Office system offers many improvements and new features in response to customer needs. Changes such as the new file format and new Setup architecture require careful planning and preparation before upgrading. Your migration planning will include evaluating the files in your environment, identifying potential conversion issues, and reviewing migration considerations for each program within the 2007 Office release.

The Office Migration Planning Manager (OMPM) enables you to examine the files in your environment and decide whether to archive them, convert them in bulk with the Office File Converter available in OMPM, or convert them manually. You will also determine the approach to upgrade and migration within your organization.

Planning a migration to the 2007 Office release includes the following:

Migrating to Office 2003

Migrating users and files from Office 97, Office 2000, and Office XP to Office 2003 is a relatively straightforward process. These versions of Microsoft Office share many common file formats and a common version of the Office Converter Pack, which can be found on the Office 2003 Resource Kit Downloads page.

The Office Converter Pack bundles together a collection of file converters and filters that can be deployed to users. The Converter Pack can be useful to organizations that use Microsoft Office 2003 in a mixed environment with earlier versions of Office and other applications, including Office for the Macintosh and third-party productivity applications. Although these converters and filters have been available previously, they are packaged together here for convenient deployment.

For details about how to install and use the Office Converter Pack, visit the Office 2003 Resource Kit Toolbox: Office Converter Pack Web site.

Security and Protection

The security and protection of files is becoming increasingly scrutinized by organizational policies and government regulations. The 2007 Office system addresses this by offering new functionality aligned with Group Policy and improved setup using the new Office Customization Tool. Office 2003 also added functionality over legacy versions with improvements in file encryption, macro security, ActiveX controls, and Trusted Publisher management.

Security in the 2007 Office System

The 2007 Microsoft Office system has many new security settings that can help you mitigate threats to your organization's resources and processes. In addition, the 2007 Office release has many new privacy options that help you mitigate threats to users' private and personal information. Determining which new settings and options are appropriate for your organization can be a complex task involving numerous critical planning decisions. To help you minimize the time spent planning settings and options, use the four-step security planning process described in this guide. This systematic decision-making approach is designed to help you choose settings and options that maximize protection and productivity in your organization.

The new Office Customization Tool (OCT) replaces the Custom Installation Wizard and is the main deployment tool for configuring and managing security settings. Additionally, Administrative Templates (.adm files) can be loaded directly into the Group Policy Object Editor and applied to client computers as local policies or domain-based policies.

The Office Resource Kit outlines a four-step approach to securing the 2007 Office system:

  1. Determine which tools you need in order to deploy security settings and privacy options in your organization.

  2. Identify the threats that pose a risk to your organization.

  3. Evaluate the default settings and options that mitigate those threats.

  4. Determine which additional settings and options you need to deploy to minimize risks to your organization's resources and processes.

The following process diagram depicts these steps.

Figure 5. The four-step approach to securing the 2007 Office system

For more detail on these steps, visit the Office Resource Kit Security and Protection guidance on Microsoft TechNet.

Security in Office 2003

To help address growing concern about the security of information and systems, several new features were included in Office 2003 for administrators and users. Some of the new improvements are described in the following sections.

Revised Macro Security

Although the legacy macro security methods helped to address many security-related issues, a few subtle improvements have been made to how documents, attachments, and linked references are opened. For more information on the effects of these improvements on users, as well as how the administrator can configure security settings in the Custom Installation Wizard, see Macro Security Levels in Office 2003.

Revised Trusted Publishers Store Management

When administrators accept certificates of trust from external vendors, they can now more easily roll out those certificates to others by using Active Directory. It is also possible to remove an installed and trusted certificate if you no longer require it or suspect it has been compromised.

For more information on managing the Trusted Publishers store, see Working with Trusted Publishers.

Revised Microsoft ActiveX® Controls

The concern about how ActiveX controls start and run on users' computers is more important than ever. Office 2003 gives administrators more control to defend against unknown or ill-defined controls that may contain security flaws, allowing you to set the degree of risk you are willing to accept when an unknown ActiveX control starts. For more information on ActiveX controls as they relate to security, see ActiveX Controls and Office Security.

New Encryption Types

Office 2003 added new encryption types and the ability to set all Office applications to use a specific encryption type as their default. This does not mean that every document is encrypted when it is saved; it means only that if a password is set to encrypt the document, the user does not have to select an encryption type. For more information on configuring Office 2003 for encryption, see Important Aspects of Password and Encryption Protection.

Revised Core Office Programming Objects

As part of the security review of all Office applications, the core objects were updated to help eliminate classic buffer overflow attacks at any data entry points. Improvements were also made to the handling of user IDs and passwords stored within code. For more information on Office code objects as related to security, see Important Aspects of Password and Encryption Protection.

Determining the Appropriate Office Versions for Your Users

This guide highlights the configuration, security, and deployment considerations for Office 2003 and the 2007 Office system. Determining the appropriate version distribution strategy for your users will be consistent with any new software acquisition and should take into account the benefits and trade-offs for end users, file security and protection, file version compatibility, license costs, deployment costs, and operations costs. The result of this exercise will enable you to determine your organization’s strategy for managing a single Office version or multiple Office versions and determine which users will receive upgrades.

Phase 4: Deploy

The overall deployment methodologies for Office 2003 and the 2007 Office system are for the most part consistent with each other. The tools used to automate software distribution and installation of Microsoft Office are the same as those called out in the requirement for Automated Tracking of Hardware and Software for Desktops for the Core Infrastructure Optimization Rationalized level. The Rationalized level in Core Infrastructure Optimization requires that the organization has fully automated software and operating system distribution using such technologies as Systems Management Server 2003.

Office installations can occur as stand-alone application installations using technologies such as Systems Management Server 2003 (or, in limited cases, Group Policy) or at the time of desktop image deployment. All three methods require careful attention during the planning and configuration phases.

Business Desktop Deployment 2007 guidance generally treats Office installations as part of the core desktop image. Based on the image strategy in your organization, this can mean that the Office package is integrated as a component of the core image itself in a thick image strategy, or that it is installed after a thin operating system image is present on the targeted machine. With a thin image strategy, the Office deployment task is essentially the same as installing Office on a legacy system, with the notable exception of uninstalling the legacy Office version.

Deploying the 2007 Office System

Whether part of a desktop image deployment or as a stand-alone application deployment, the 2007 Office system deployment process should follow specific milestones and objectives. For more detailed guidance, visit the Business Desktop Deployment 2007 Office Deployment Guide. The key milestones and objectives are depicted in the following diagram and listed below:

Figure 6. Key milestones and objectives of the 2007 Office system deployment

  • Creating a project plan. As with any project, careful planning leads to greater chances of success. In this phase, the team analyzes current Microsoft Office deployments, plans migration of documents and settings, determines optimal placement of deployment servers, and acquires resources for completing the project.

  • Creating an installation point. The first phase of development, creation of the 2007 Office system installation point, creates a shared source location containing the 2007 Office release installation files.

  • Customizing installation. Most organizations require some changes to the default settings for the 2007 Office release. Consolidate these settings into a Microsoft Office customization file that can then be applied to the installation point.

  • Testing Microsoft Office deployment. Before releasing the 2007 Office system to production, careful testing of the 2007 Office system deployment process ensures that the deployment occurs as planned.

  • Deploying to production. Activating deployment procedures to provide the 2007 Office release to client computers.

  • Transitioning to IT Operations. After the deployment plan has been executed, the deployment infrastructure is handed off to IT Operations for long-term operation and management.

For more guidance on 2007 Office system deployment, visit the Desktop Deployment Center on Microsoft TechNet and read the Office Deployment Guide.

Deploying Office 2003

The Office 2003 deployment follows the same general milestones and guidelines as the 2007 Office system. The tools used to customize the installation are different from those in the 2007 release and provide a fair amount of flexibility for tailoring the installation to your organization’s needs.

Detailed guidance for Deploying Office 2003 Using Systems Management Server 2003 is available on Microsoft TechNet. If you are not managing application installations using SMS 2003, you can also use Group Policy to deploy Office 2003 to client computers. Using Group Policy software installation features, you can assign or publish Office 2003 to all the users or computers in a designated group.

For large or complex organizations, Systems Management Server 2003 offers more sophisticated functionality, including inventory, scheduling, and reporting features. However, using Group Policy to deploy Office 2003 can be a good choice in the following settings:

  • Small- or medium-sized organizations that have already deployed and configured the Active Directory directory service.

  • Organizations or departments that comprise a single geographic area.

  • Organizations with consistent hardware and software configurations on both clients and servers.

For detailed guidance about deploying Office 2003 Editions by using Group Policy, see http://office.microsoft.com/en-us/ork2003/HA011402011033.aspx.

Operations

Operations management for Microsoft Office is consistent with recommendations defined in the Automated Tracking of Hardware and Software for Desktops requirement at the Rationalized level in the Core Infrastructure Optimization Model. Operations specific to Office 2003 and the 2007 Office system for applying software updates (patches) and enforcing Group Policy configuration standards are highlighted below.

Updating Office (Office 2003 and the 2007 Office System)

Your organization should follow the desktop patch management guidelines discussed in the Core IO Implementer Resource Guide: Basic to Standardized. Verified updates are distributed directly to the client to ensure that existing Office installations have the latest software updates.

Enforcing Group Policy Settings (2007 Office System)

In a Microsoft Windows-based network, administrators can use Group Policy settings to help control how users work with the 2007 Microsoft Office system. Administrators can use Group Policy settings to define and maintain an Office configuration on users' computers. Unlike other customizations—for example, default settings distributed in a Setup customization file—policy settings are enforced and can be used to create highly managed or lightly managed configurations.

You can use the 2007 Office release policy settings to:

  • Control entry points to the Internet from the 2007 Office release applications.

  • Manage security settings in the 2007 Office release applications.

  • Hide settings and options that are unnecessary for users to perform their jobs and that might distract users or result in unnecessary calls for support.

  • Create a highly managed standard configuration on users' computers.

You can set policy settings that apply to the local computer and every user of that computer, or that apply only to individual users. Per-computer policy settings are set under the Computer Configuration node of the Group Policy Object Editor Microsoft Management Console (MMC) snap-in and are applied the first time any user logs on to the network from that computer. Per-user policy settings are set under the User Configuration node and are applied when the specified user logs on to the network from any computer. Group Policy is also applied periodically in the background after it is initially processed at startup and logon.

For detailed information about the Group Policy infrastructure, see Group Policy Technical Reference on the Microsoft TechNet site.
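
The per-computer versus per-user behavior described above can be sketched conceptually. This is not actual Group Policy code, and the setting names are hypothetical; it only illustrates that per-computer settings apply to every user of a machine and, when the same setting is defined in both nodes, the Computer Configuration value generally takes precedence:

```python
# Conceptual sketch (hypothetical setting names, not a real Group Policy API):
# combine per-computer and per-user policy settings into the configuration
# a given user effectively sees on a given computer.

def effective_policy(computer_settings: dict, user_settings: dict) -> dict:
    merged = dict(user_settings)       # start with per-user values
    merged.update(computer_settings)   # per-computer values win on conflict
    return merged

computer = {"DisableInternetEntryPoints": True}
user = {"DisableInternetEntryPoints": False, "HideAdvancedOptions": True}

print(effective_policy(computer, user))
# {'DisableInternetEntryPoints': True, 'HideAdvancedOptions': True}
```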

Further Information

For more technical information on Microsoft Office, go to the Microsoft Office System TechCenter on TechNet.

To learn how Microsoft deployed Office Professional Edition 2003 to desktops, go to http://www.microsoft.com/technet/itshowcase/content/deskdeployoffice2003.mspx.

Checkpoint: Latest Versions of Microsoft Office on Desktops

Requirements

  • Evaluated the latest versions of Office and defined a plan to consolidate Office versions on production workstations.

  • Deployed the latest versions of Office to desktops.

  • Defined a plan for managing Office configurations.

If you have completed the steps listed above, your organization has met the minimum requirement of the Rationalized level for Latest Versions of Microsoft Office on Desktops capabilities of the Infrastructure Optimization Model.

Go to the next Self-Assessment question.

Requirement: Compatibility Testing and Certification of Software Distributions

Audience

You should read this section if you do not test and certify application compatibility on 80 percent of new or updated applications before deploying them to your desktops.

Overview

In general, applications are highly optimized for a specific operating system or operating system version. Application compatibility problems can arise when you have applications that were designed to run under earlier versions of Microsoft Windows operating systems. The rationale for testing applications is to ensure that deployment of any new software component does not affect end-user productivity or result in downtime. Compatibility testing is also mentioned as a required process for patch management and operating system deployment in the Core Infrastructure Optimization Model.

Even with these advanced compatibility features included in Microsoft products, you need to ensure that all your applications function properly under the latest Microsoft Windows operating systems before you distribute those applications to your organization’s desktops. This guidance is based on the Application Compatibility Toolkit and the Application Compatibility Feature Team Guide in Business Desktop Deployment 2007, both free resources from Microsoft that assist in identifying and managing application compatibility issues.

Phase 1: Assess

The Assess phase is important for the deployment process whether you are deploying new operating systems and intend to overlay applications as part of the deployment, deploying the applications themselves to existing machines, or deploying updates to applications. In the Assess phase, your organization should gather an inventory of the applications and data components in the environment that affect application compatibility. These items include:

  • Operating system version

  • Service-pack level

  • Geographic location

  • Computer manufacturer model and type

  • Applications installed

  • Business unit in which the computer is used

  • Organizational role that the user performs
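
The data points listed above can be captured as one record per computer in the inventory. The sketch below is illustrative only; the field names and sample values are assumptions, not an actual ACT or SMS schema:

```python
# Illustrative inventory record covering the data points listed above.
# Field names and sample values are hypothetical, not a real ACT/SMS schema.
from dataclasses import dataclass, field

@dataclass
class CompatibilityInventoryRecord:
    os_version: str                 # operating system version
    service_pack: str               # service-pack level
    location: str                   # geographic location
    hardware_model: str             # computer manufacturer model and type
    applications: list = field(default_factory=list)  # applications installed
    business_unit: str = ""         # business unit using the computer
    user_role: str = ""             # organizational role that the user performs

record = CompatibilityInventoryRecord(
    os_version="Windows XP", service_pack="SP2",
    location="EMEA", hardware_model="Contoso D530",
    applications=["Office 2003", "LOB App 1.2"],
    business_unit="Finance", user_role="Analyst",
)
print(record.os_version, len(record.applications))
```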

Inventory and Collect Data

The Application Compatibility Toolkit provides a way to gather inventory data through the use of distributed compatibility evaluators and the developer and tester tools. Data can be collected for operating system changes of various magnitudes, from large events (such as an operating system upgrade), to medium events (such as a browser upgrade), to smaller events (such as a Windows Update release). The ability to collect compatibility data into a single centralized store significantly reduces organizational risk during platform changes. As a component of the Application Compatibility Toolkit, the Inventory Collector examines your organization's computers to identify the installed applications and system information.

Systems Management Server (SMS) 2003 Software Inventory can also be used to inventory or audit software installed on computers within the organization. Typically, you would use SMS 2003 to identify applications with known compatibility issues. However, it is possible to create scripts or other executables to perform customized application compatibility inventorying, and then use the SMS Software Inventory to report the results to SMS.

Phase 2: Identify

In the Identify phase, you collect the proposed applications or operating systems to be tested and validated. For example, if you are deploying the 2007 Office system, your organization will identify the edition and selected applications to be tested in a pre-production environment. You will also ensure that all targeted operating systems or dependent applications are present in your pre-production test environment.

If your organization does not currently have a test environment that emulates the production environment, the Identify phase will also include building the test environment. This environment can be built using an adequate sample of physical machines with appropriate hardware and software components installed; however, it is generally recommended that the environment incorporate the use of virtualization technologies. For guidance on building a virtual test environment, please see Windows Server System Reference Architecture Virtual Environments for Development and Test (WSSRA-VE).

Phase 3: Evaluate and Plan

Once you have inventoried your environment, identified all of the applications to be deployed, and built a test environment, you can begin evaluating your applications, certifying them, and planning for deployment. The Application Compatibility Toolkit includes several components to evaluate compatibility.

Common Compatibility Problems

There are several reasons why an application written specifically for a different version of Windows—especially the Microsoft Windows 2000 Professional, Windows Me, Windows NT® Workstation 4.0, Windows 98, and Windows 95 operating systems—may manifest problems when run under Windows XP or Windows Vista. Most problems fall into the following categories:

  • Setup and installation

  • Kernel-mode drivers

  • Permissions

  • Heap management

  • Firewall

  • Distributed Component Object Model (DCOM)

  • Internet Explorer

For detailed information about common compatibility problems, see the following:

Windows XP: http://www.microsoft.com/technet/prodtechnol/winxppro/deploy/appcom/default.mspx 

Windows Vista: http://www.microsoft.com/technet/windowsvista/appcompat/entguid.mspx 

Compatibility Evaluators

The Application Compatibility Toolkit and Application Compatibility Toolkit Data Collector (ACT-DC) use compatibility evaluators to collect and process your application information. Each evaluator, described in the following list, performs a set of functions, providing a specific type of information to ACT:

  • User Account Control Compatibility Evaluator (UACCE). Enables you to identify potential compatibility issues that are due to permission restrictions enforced by the User Account Control (UAC), formerly known as Limited User Accounts (LUA). Through compatibility logging, UACCE provides information about potential application permission issues and ways to fix the problems so that you can deploy a new operating system.

  • Update Compatibility Evaluator (UCE). Provides insight and guidance about the potential effects of a Windows operating system security update on your installed applications. The UCE dynamically gathers application dependencies and is deployable to both your servers and client computers in either a production or test environment. The compatibility evaluator collects information about the modules loaded, the files opened, and the registry entries accessed by the applications currently running on the computers and writes that information to XML files uploaded to the ACT database.

  • Internet Explorer Compatibility Evaluator (IECE). Enables you to identify potential Web application and Web site issues that occur due to the release of a new operating system. IECE works by enabling compatibility logging in Internet Explorer, parsing logged issues, and creating a log file for uploading to the ACT Log Processing Service.

  • Windows Vista Compatibility Evaluator. Enables you to identify issues that relate to the Graphical Identification and Authentication (GINA) DLLs, to services running in Session 0 in a production environment, and to any application components deprecated in the Windows Vista operating system.
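The evaluators above log their findings as XML files that are uploaded for central processing. As a rough illustration of how such a dependency report might be consumed, the sketch below parses a small XML document; the element and attribute names are hypothetical, not the actual ACT schema.

```python
# Sketch: parse a dependency report like the XML files a compatibility
# evaluator might upload. The schema here is hypothetical.
import xml.etree.ElementTree as ET

SAMPLE = r"""
<report computer="PC-001">
  <application name="LegacyLedger">
    <module>msvcrt.dll</module>
    <registryKey>HKLM\Software\Ledger</registryKey>
  </application>
</report>
"""

def summarize(xml_text):
    """Return (computer name, {app: modules/registry entries}) from a report."""
    root = ET.fromstring(xml_text)
    apps = {}
    for app in root.findall("application"):
        apps[app.get("name")] = {
            "modules": [m.text for m in app.findall("module")],
            "registry": [k.text for k in app.findall("registryKey")],
        }
    return root.get("computer"), apps

computer, apps = summarize(SAMPLE)
print(computer, apps["LegacyLedger"]["modules"])
```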

Test Plan

It is important to approach testing and certifying applications systematically. Develop a test plan in which you will exercise all the features of an application that your organization is likely to use. To simplify discovering potential conflicts between applications and the operating system, you can include Microsoft application compatibility technologies as a key part of your test plan. The Application Compatibility Toolkit provides tools for developers to test setup packages, Web sites and Web applications with Internet Explorer 7, and applications.

Compatibility Analysis and Mitigation

After collecting your compatibility data, you then analyze your findings and begin to plan for application compatibility mitigation, if needed. The Application Compatibility Toolkit also provides features and tools to help you organize, rationalize, and prioritize the data. The result of Compatibility Analysis should be to determine whether the applications are compatible on the user machines targeted for software distribution and, if necessary, which fixes need to be made to make them compatible.
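One way to rationalize and prioritize the collected data is to rank each finding by severity and by how many machines it affects. The severity scale and sample findings below are hypothetical:

```python
# Sketch: prioritize compatibility findings before mitigation.
# Severity weights and sample data are hypothetical.
SEVERITY_WEIGHT = {"blocker": 3, "major": 2, "minor": 1}

def prioritize(findings):
    """Order findings by severity weight, then by number of affected PCs."""
    return sorted(
        findings,
        key=lambda f: (SEVERITY_WEIGHT[f["severity"]], f["affected_pcs"]),
        reverse=True,
    )

findings = [
    {"app": "OldFax 1.0", "severity": "minor", "affected_pcs": 120},
    {"app": "LegacyLedger 4.2", "severity": "blocker", "affected_pcs": 35},
]

for f in prioritize(findings):
    print(f["app"], f["severity"], f["affected_pcs"])
```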

The key deliverables from the analysis and mitigation processes are application mitigation packages for the deployment. You can create these packages automatically by using application compatibility tools, such as ACT. For other application mitigation methods, you can create the packages and the installation scripts or executables (including .msi files) manually.

The Application Compatibility Manager from ACT packages the solutions for deployment in the test and production environments. The Solution Builder contains a component called the Packager, which can take as input any number of Analyzer files or Compatibility Administrator files and create a self-extracting executable package.

It is often necessary to manually create the packages to be installed. In some instances, this process may include creating scripts or .msi files to facilitate installation.

See the Application Compatibility Feature Team Guide in Business Desktop Deployment 2007 requirement for more information.

Application Certification

After all applications are tested and mitigation strategies are in place, you can record each application as certified in your test results. Any compatibility issues that you discover during testing must be well documented and reported to stakeholders and those responsible for application release in your organization.

Phase 4: Deploy

The Deploy phase covers the deployment of mitigation packages only. For information about application deployment and recommended tools, see the requirement for Automated Tracking of Hardware and Software for Desktops at the Rationalized level in the Core Infrastructure Optimization Model.

You can deploy the application mitigation packages much like deploying software updates, by using automated software distribution such as:

  • SMS 2003. Use the software distribution feature in SMS 2003 to deploy the mitigation packages when the organization has an existing SMS 2003 infrastructure.

  • Group Policy Software Installation. Use the Group Policy Software Installation feature in Active Directory to deploy the mitigation packages when the organization has an existing Active Directory infrastructure.

For each method, create an installation package (such as an .msi file) to automate the installation of the mitigation package. Use the defined application deployment process for your organization to deploy the installation packages.

Further Information

For more information on application testing, visit Microsoft TechNet and search for “application compatibility.”

To see how Microsoft does application compatibility testing, go to http://www.microsoft.com/technet/itshowcase/content/appcompattcs.mspx.

Checkpoint: Compatibility Testing and Certification of Software Distributions

Requirements

 

Collected and analyzed the application inventory in your organization to build your application portfolio.

 

Implemented standard testing of your mitigation strategies to create your application mitigation packages.

 

Implemented standard processes to resolve any outstanding compatibility issues to report compatibility mitigation to management.

 

Implemented automated deployment of all compatibility mitigation packages.

If you have completed the steps listed above, your organization has met the minimum requirement of the Rationalized level for the Compatibility Testing and Certification of Software Distributions capability of the Infrastructure Optimization Model. We recommend that you follow the guidance of additional best practice resources for software certification and testing so that applications are maintained to a known standard.

Go to the next Self-Assessment question.

Requirement: Patch Management for Servers

Audience

You should read this section if you do not have a patch management solution for 80 percent of your servers.

Overview

In the Core Infrastructure Optimization Resource Guide for Implementers: Basic to Standardized guide, you read about patch management and distribution to desktops. To move from the Standardized level to the Rationalized level, you need to extend patch management to your servers. The tools and procedures for updating Windows-based servers are largely the same as those used to update Windows-based desktops.

Although many of the processes are shared, there are several notable exceptions when patching servers, as well as dependencies on other requirements in the Core Infrastructure Optimization Model. Servers often provide mission-critical functions with service level agreements (SLAs) that depend on the server’s availability. Minimizing unplanned server downtime is a key operational and server patch management requirement because, unlike desktop downtime, server downtime can keep an entire IT service, or sometimes the entire organization, from running. The Rationalized level begins to introduce SLAs, which often stipulate allowable maintenance intervals, especially when maintaining servers.

Patch management guidance in the Core Infrastructure Optimization Model is based on Patch Management Solution Accelerator content. Specific guidance is available for patch management using Systems Management Server (SMS) 2003, SMS 2.0, and Software Update Services (SUS) 1.0. The core concepts in these guides also apply to Windows Server Update Services (WSUS) and System Center products. For the latest information on patch management, visit the Update Management Solution Center on Microsoft TechNet.

As with desktop patch management, the phases of Assess, Identify, Evaluate and Plan, and Deploy and corresponding deliverables of each phase are identical to those described in the Core Infrastructure Optimization Resource Guide for Implementers: Basic to Standardized guide as a requirement for the Standardized level in the Core Infrastructure Optimization Model.

Further Information

For more detailed information on patch management, see the “Automated Patch Distribution” section in Core Infrastructure Optimization Resource Guide for Implementers: Basic to Standardized. You can also visit Microsoft TechNet and search for “patch management.”

To see how Microsoft addresses patch management, go to http://www.microsoft.com/technet/itshowcase/content/sms03spm.mspx.

Checkpoint: Patch Management for Servers

Requirements

 

Implemented process and tools to inventory hardware and software assets.

 

Implemented process and tools to scan servers for software updates.

 

Established a process to automatically identify available patches.

 

Established standard testing for every patch.

 

Implemented patch distribution software.

If you have completed the steps listed above, your organization has met the minimum requirement of the Rationalized level for Patch Management for Servers in the Infrastructure Optimization Model. We recommend that you follow additional best practices for patch management addressed in the Patch Management Solution Accelerator for SMS 2003.

Go to the next Self-Assessment question.

Requirement: Guaranteed Secure Communications with Mobile Devices

Audience

You should read this section if you do not have a secured and guaranteed way to verify secure communications between your corporate network and mobile devices.

Overview

As organizations consider mobile enterprise solutions, a key evaluation point is security. Mobile communication solutions need to be safe and reliable, whether they involve personal information or confidential transactions in the workplace. Personal digital assistants (PDAs) and smart phones are as important as laptop PCs when it comes to an organization's security plan.

In the Core Infrastructure Optimization Resource Guide for Implementers: Basic to Standardized, you read about establishing security policies as part of your overall management of mobile devices. To move to the Rationalized level, you need to automate enforcement of those security policies, especially in the area of remote communications. When you have put passwords and data encryption in place, you have taken the first steps to secure communications between your corporate network and mobile devices.

Phase 1: Assess

As discussed in the Basic to Standardized guide, during the Assess phase it is important to take an inventory of the mobile devices connected to your infrastructure and how people are currently using mobile devices. Organizations need to track and manage several areas of mobile-device use.

Phase 2: Identify

In the Identify phase, your organization needs to determine an appropriate mobile-device security level. Depending on your business and data security needs, end users may connect to your network with loosely controlled personal devices or with managed, company-provided devices. The next steps primarily focus on mobile-device authentication and how to deploy that capability in your organization.

Phase 3: Evaluate and Plan

In the Evaluate and Plan phase, your organization should consider which tools or technologies can be used to guarantee secured communications with your mobile users. There are many mechanisms and solutions available to help administrators take the next step in guaranteeing secure mobile communications. Windows Mobile 5.0 and Windows Mobile 6.0 can be combined with Microsoft Systems Management Server (SMS) 2003 to provide a centralized mobile device provisioning, management, and policy enforcement solution. In addition, numerous third-party software providers offer systems management solutions. Using these solutions, a centralized IT organization can maintain an asset inventory of the devices that connect to the corporate network and can automatically fix configuration settings and distribute software updates as they become available. For a sample list of systems management solutions for Windows Mobile-powered devices, please visit the Windows Mobile Solutions Provider Web site and search the Software Solutions section under the Systems Management category. The IT Management category in the Vertical Market Solutions section might also provide valuable information.

The next step in securing information on a handheld device is to protect the device against unauthorized access.

Authentication

Various mechanisms are available to identify and authenticate users. Windows Mobile certificate authentication was developed to help mitigate security risks on your mobile device. Current versions of Windows Mobile software support X.509 certificates, which provide a means for authenticating applications, users, operators, and servers. The certificates may be protected, stored, managed, and deleted on the mobile device.

To achieve a higher level of protection, Microsoft recommends that you use two of the following three approaches (often referred to as two-factor authentication):

  • Something the user knows (for example, a password).

  • Something the user has (for example, a security certificate in a smartcard or a SecurID token).

  • Something that is part of the user (for example, a fingerprint).
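The two-factor rule above amounts to checking that at least two distinct factor types are presented. A minimal sketch, with illustrative factor labels:

```python
# Sketch: require at least two of the three factor types described
# above. The factor labels ("knows", "has", "is") are illustrative.
VALID_FACTORS = {"knows", "has", "is"}

def meets_two_factor(presented_factors):
    """True if at least two distinct factor types are presented."""
    return len(set(presented_factors) & VALID_FACTORS) >= 2

print(meets_two_factor({"knows"}))          # password alone: insufficient
print(meets_two_factor({"knows", "has"}))   # password plus smartcard
```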

In certain cases, additional authentication may be required. These include:

  • Applications that require user authentication before they will run. This requirement may apply if the application hasn’t been used for a certain length of time, or it may occur on a repeating basis (for example, every 15 minutes).

  • A data storage card that has its own authentication mechanism to decrypt its data.

  • Additional authentication for accessing an organization’s private network. For example, Exchange Server 2003 and Exchange Server 2007 use Active Directory authentication to provide access to corporate e-mail, calendar, and contacts from mobile devices. With Exchange Server 2003 Service Pack 2 (SP2) and newer, devices can use a digital certificate for authentication so that users do not need to provide their network credentials. This reduces the risk that users’ credentials will be compromised.

  • An additional logon step for accessing a protected shared file server.

  • Additional sign-on credentials to access certain Web sites.

Other protections against unauthorized access to the devices available in Windows Mobile software include several forms of physical device security technology:

  • Power-on passwords. Current versions of Windows Mobile devices support power-on passwords to help protect access to the device. Windows Mobile Pocket PC also supports strong alphanumeric power-on passwords—that is, passwords requiring at least seven characters, including a combination of uppercase and lowercase letters, numerals, and punctuation. A four-digit password can also be associated with the phone card (Subscriber Identity Module, or SIM) for GSM devices. For greater protection, all passwords are hashed (converted into a different form, making them harder to break) before being stored. When a user attempts to access the device with an incorrect password, the system imposes a time delay before allowing access again—a delay that increases exponentially with each attempt. In addition, Pocket PC File Explorer software requires user authentication for accessing shared Windows-based file servers. For further protection, an organization can set up automatic enforcement of its authentication policies using centralized management software.

  • Cabinet (.cab) file signing. This uses third-party software to digitally sign a file using an X.509 digital certificate. This provides a way to determine the origin of the file and whether the file has been altered after it was signed.

  • Device management security technology. This enables over-the-air changes to be made in a way that helps protect against hackers. Most Windows Mobile devices that include wireless data functionality include some degree of device management security to help prevent arbitrary applications from being downloaded and executed over the air.

  • Application-level security. Applies to applications such as Microsoft Internet Explorer Mobile, ActiveSync®, e-mail attachments, and infrared beaming. This type of security can use any of a number of approaches, ranging from requiring users to enter their passwords in order to gain access to applications, to authentication mechanisms such as biometrics or smart cards. The requirement could be set up to apply to each access attempt, to apply only when the application hasn't been used within a certain time period, or to require reauthorization at periodic intervals.
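Two of the protections above, hashed password storage and an exponentially increasing delay after failed attempts, can be sketched as follows. The PBKDF2 parameters and the delay schedule (doubling with each failed attempt) are illustrative assumptions, not the actual Windows Mobile implementation:

```python
# Sketch: salted password hashing plus an exponentially increasing
# lockout delay after failed attempts. Parameters are illustrative.
import hashlib
import os

def hash_password(password, salt=None):
    """Return (salt, digest); never store the plaintext password."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password, salt, digest):
    """Recompute the hash and compare against the stored digest."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000) == digest

def lockout_delay(failed_attempts):
    """Seconds to wait before the next attempt: doubles each failure."""
    return 0 if failed_attempts == 0 else 2 ** failed_attempts

salt, digest = hash_password("S7r0ng!pw")
print(verify("S7r0ng!pw", salt, digest))   # correct password
print(verify("guess", salt, digest))       # wrong password
print([lockout_delay(n) for n in range(4)])
```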

Phase 4: Deploy

Once your organization has determined the appropriate security controls and mechanisms to deliver and enforce the selected security strategy, the Deploy phase includes all of the processes to implement and maintain your security strategy.

Further Information

For more information on authentication and digital certificates for mobile devices, go to the following Web sites:

To see how Microsoft addresses secure mobile communications, go to http://www.microsoft.com/technet/itshowcase/content/trustmes.mspx.

Checkpoint: Guaranteed Secure Communications with Mobile Devices

Requirements

 

Inventoried mobile devices connecting to your network.

 

Determined a communication security strategy appropriate for your needs.

 

Implemented mobile device authentication to all connected devices.

If you have completed the steps listed above, your organization has met the minimum requirement of the Rationalized level for the Guaranteed Secure Communications with Mobile Devices requirement of the Infrastructure Optimization Model. We recommend that you follow the guidance of additional best practices for secure communications with mobile devices addressed in the Microsoft TechNet Windows Mobile Center.

Go to the next Self-Assessment question.

Requirement: Access to Web Applications Using WAP or HTTP for Mobile Devices

Overview

The integration of mobile devices, the Internet, and wireless connectivity provides an exciting opportunity for organizations to extend the reach of their information and services to mobile professionals. The potential results include improved productivity, reduced operational costs, and increased customer satisfaction. The ability to access the Internet and Internet applications from mobile devices is the key to this increase in productivity.

Phase 1: Assess

During the Assess phase, you inventory the mobile devices in your organization and the Web applications in your organization accessible via the Internet. The Web applications referred to at this level of the model are limited to those accessible via secured or unsecured Internet access, not LOB applications specific to your organization’s intranet.

Phase 2: Identify

In the Identify phase, you begin to look at the Web applications that would potentially benefit end-user productivity and at the devices discovered in the Assess phase. You will also identify the devices capable of supporting HTTP or WAP browsing, as well as meeting the requirements you previously defined for secured communication.

Phase 3: Evaluate and Plan

During the Evaluate and Plan phase, the goal is to determine which Web applications can be currently used with mobile device browsers, the usability of these applications per device, which devices perform the best when accessing these applications, and finally which investments should be made to either tailor Web applications to mobile devices or standardize mobile device hardware.

Standardizing Devices

As the use of mobile devices increases in your organization, the need to control types of mobile devices also increases. Without standardization, the mix of mobile devices connecting to your corporate network would be nearly impossible to manage. User authentication, standardization of operating systems, patch management, and other everyday administrative controls can only be effectively managed when you have established an organizational standard for each type of mobile device. For more information on managing mobile devices, go to http://www.microsoft.com/technet/solutionaccelerators/mobile/evaluate/mblmange.mspx.

You need to consider many issues and device features when planning a mobile device solution for your organization. For guidance in planning a mobile device solution, go to http://www.microsoft.com/technet/archive/itsolutions/mobile/deploy/mblwirel.mspx?mfr=true.

There are several operating systems available for mobile devices. Windows Mobile devices offer access to Web-based applications with extensive security and authentication features. For additional information on deploying, maintaining, and supporting Windows Mobile devices, visit http://www.microsoft.com/technet/solutionaccelerators/mobile/default.mspx.

Internet Access

There are numerous reasons for mobile devices to have the ability to access the Internet. Among these are:

  • Software upgrades and patches.

  • Corporate data access and synchronization.

  • Access to Web-based applications.

This section of the guide addresses access to Web-based applications through Hypertext Transfer Protocol (HTTP) and Wireless Application Protocol (WAP).

WAP

WAP is a communications protocol that can be thought of as similar to the combination of HTTP and HTML, but optimized for the low memory, low bandwidth, and limited resolution of mobile devices. For more information on WAP, go to http://www.wirelessdevnet.com/channels/wap/training/wapoverview.html.
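On the server side, a Web application can detect a WAP browser and return simplified markup. A minimal sketch using content negotiation on the Accept header follows; the markup and stock data are illustrative, while `text/vnd.wap.wml` is the standard WML content type:

```python
# Sketch: return WML for WAP browsers and HTML otherwise, based on the
# request's Accept header. Markup and data are illustrative.
def select_content(accept_header):
    """Choose a simplified WML card or a full HTML page."""
    if "text/vnd.wap.wml" in accept_header:
        return "<wml><card><p>Stock: MSFT 28.10</p></card></wml>"
    return "<html><body><p>Stock: MSFT 28.10</p></body></html>"

print(select_content("text/vnd.wap.wml"))
print(select_content("text/html"))
```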

Web-based Applications

Employees who use mobile devices in their day-to-day job functions often need access to information most readily available on the Internet. This information is usually dynamic (time-sensitive or constantly changing), or it is retrievable based on search criteria. Examples of applications that present this type of information are stock quotes and transactions, e-mail, sports scores, real estate listing services, and map services.

These services can be accessed through a mobile device if that device has a WAP-designed browser that simplifies the content to account for the restrictions of mobile devices. For more information on developing applications for mobile devices, read about Microsoft ASP.NET mobile controls on MSDN.

Phase 4: Deploy

Once your organization has determined the appropriate plan for providing Web application access to mobile device users, the Deploy phase includes all of the processes to implement and maintain your plans.

Further Information

For more information on mobile devices, visit Microsoft TechNet and search for “mobile device” or “WAP.”

Checkpoint: Access to Web Applications Using WAP or HTTP for Mobile Devices

Requirements

 

Inventoried mobile devices connecting to your network and Web applications currently consumed or potentially consumed by mobile device users.

 

Developed and implemented a strategy to optimize Web applications for mobile device users, update mobile device hardware, or both.

If you have completed the steps listed above, your organization has met the minimum requirement of the Rationalized level for the Access to Web Applications Using WAP or HTTP for Mobile Devices requirement of the Infrastructure Optimization Model. We recommend that you follow the guidance of additional best practices for secure communications with mobile devices addressed by the Microsoft TechNet Windows Mobile Center.

Go to the next Self-Assessment question.

Requirement: Server Consolidation and Virtualization

Audience

You should read this section if you do not have a plan for server consolidation with virtualization.

Overview

Consolidation of physical infrastructure, in general, is an effective business strategy. Consolidation of locally situated physical servers has proven effective in reducing server sprawl and, thereby, improving IT efficiency, enhancing flexibility, and reducing Total Cost of Ownership (TCO).

Virtualizing applications or services means installing and running an application or service using virtual machines on a physical computer; the physical computer is running a host operating system as well as a virtual or guest operating system to implement the virtual machines. The virtual machine runs its own operating system, which can either be migrated to a later operating system or, for short-term solutions, can be the same operating system as that used before virtualization.

Virtualization takes consolidation to a new level, breaking the 1:1 relationship between application and server. Virtualization is a consolidation technique that yields additional benefits by abstracting the applications from the physical server and placing them on virtual machines (VMs), many of which can reside on a single physical host. This requirement calls out the virtualization best practices highlighted in the Solution Accelerator for Consolidating and Migrating LOB Applications. Additional guidance for using virtualization in the context of development and test can be found in the Windows Server System Reference Architecture Virtual Environments for Development and Test guide.

Phase 1: Assess

The goal of the Assess phase again is to take an inventory of the applications, services, and infrastructure in your organization. You may have generated such an inventory already for other requirements in the Infrastructure Optimization Model. Systems Management Server (SMS) 2003 Inventory can also be used to inventory the server applications and infrastructure in your organization. Additionally, you will need to collect software usage information for use in the Identify phase.

Phase 2: Identify

After you have generated the inventory of applications and services with corresponding usage details, you analyze the information in the Identify phase to determine which applications or services are candidates for virtualization. Good candidates include applications or services that require limited system resources or that run on older hardware or non-standard operating systems and applications. Some of the key points to remember when identifying your initial virtualization candidates are described in the following sections.

Infrastructure Reduction

In this case, you are identifying where hardware can be consolidated by running multiple applications or services using virtual machines on a single host operating system and single piece of hardware. Virtual machines are still isolated from one another as with physical infrastructure, but you can increase overall system utilization, reduce the number of physical systems to manage, and reduce facility requirements, such as floor space, rack space, power, and cooling.
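Estimating how many hosts a set of candidate workloads would need can be approximated with a simple first-fit placement by average CPU utilization. The capacity threshold and workload figures below are hypothetical sample data:

```python
# Sketch: first-fit-decreasing placement of candidate VMs onto hosts,
# constrained by average CPU utilization. The 80% threshold and the
# workload figures are hypothetical.
HOST_CPU_CAPACITY = 80  # percent average CPU allowed per physical host

def consolidate(workloads):
    """Assign each (name, avg_cpu_percent) workload to the first host with room."""
    hosts = []  # each host: {"load": int, "vms": [names]}
    for name, cpu in sorted(workloads, key=lambda w: w[1], reverse=True):
        for host in hosts:
            if host["load"] + cpu <= HOST_CPU_CAPACITY:
                host["load"] += cpu
                host["vms"].append(name)
                break
        else:
            hosts.append({"load": cpu, "vms": [name]})
    return hosts

workloads = [("intranet", 15), ("print", 5), ("ledger", 40),
             ("fax", 10), ("reports", 35)]
for i, host in enumerate(consolidate(workloads), 1):
    print(f"Host {i}: {host['vms']} (load {host['load']}%)")
```

First-fit decreasing is a common packing heuristic; a real consolidation plan would also weigh memory, I/O, and peak rather than average utilization.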

Hardware Independence

If an application or service is running on hardware that is now obsolete, it may be a good candidate for virtualization. If the application or service does not warrant the investment required to update it for newer hardware, virtualization may be an excellent option to avoid this expense by running a virtual machine version of the obsolete system on a newer computer.

Software Independence

As with hardware independence, the virtual machine can continue to run an older operating system and older applications. This option should be used only when the investment to upgrade the operating system or applications is unwarranted or infeasible, as there are many security, maintenance, and support trade-offs in allowing this standardization exception to continue.

Phase 3: Evaluate and Plan

The Evaluate and Plan phase defines the planning goals, options, and processes for consolidating and migrating application servers onto virtual machines. The following sections briefly explain evaluation and planning relevant to your virtualization strategy; this guidance is derived from the Solution Accelerator for Consolidating and Migrating LOB Applications.

Development of Strategies for Consolidation Through Virtualization

The consolidation strategy should be based on your goals and a comprehensive assessment of your environment, requirements, options, and potential impacts, including the following factors:

  • Results of the analysis of each application and server, including all characteristics that are relevant to determining consolidation options.

  • Consolidation options available for each application, and the advantages and disadvantages of each option.

  • Any service-level impacts related to each application (including any mitigation techniques for minimizing the impact on the changed run-time environment).

  • Impacts of an application on the business unit, including mitigation techniques for minimizing impacts on application owners.

  • IT organization impacts related to each application (including those related to domains, networks, server, storage, applications, and clients).

  • End-state impacts for each consolidated application (focusing on the virtual machine environment).

  • The security impacts for each consolidated application (especially any mitigating factors related to running applications in virtual machines).

These factors are critical to determining the appropriate consolidation strategy for your organization.
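The assessment factors above can be captured in a simple candidate-selection sketch. The attribute names, threshold, and server data below are illustrative assumptions for this example only, not part of the Solution Accelerator or any Microsoft tool:

```python
from dataclasses import dataclass

@dataclass
class ServerProfile:
    """Illustrative attributes gathered during the assessment (names are assumptions)."""
    name: str
    avg_cpu_percent: float   # sustained CPU utilization from performance data
    hardware_obsolete: bool  # running on hardware that is no longer supported
    upgrade_justified: bool  # a business case exists to re-platform the application

def is_virtualization_candidate(server: ServerProfile, cpu_threshold: float = 30.0) -> bool:
    """A low-utilization server, or one tied to obsolete hardware with no
    upgrade budget, is a typical consolidation candidate."""
    if server.hardware_obsolete and not server.upgrade_justified:
        return True
    return server.avg_cpu_percent < cpu_threshold

servers = [
    ServerProfile("lob-app-01", avg_cpu_percent=12.0, hardware_obsolete=False, upgrade_justified=False),
    ServerProfile("db-01", avg_cpu_percent=78.0, hardware_obsolete=False, upgrade_justified=True),
]
candidates = [s.name for s in servers if is_virtualization_candidate(s)]
print(candidates)  # ['lob-app-01']
```

In practice the decision would also weigh the service-level, business-unit, and security impacts listed above; the sketch shows only the shape of a per-server evaluation.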

Operations Management Planning

Consolidation of application servers typically requires improvements in operational processes to offset the technical complexities and operational risks associated with consolidation, with special consideration given to the impacts of running applications in virtual machines. Several operational planning areas are directly affected by consolidation and virtualization. Ensuring proper management of these areas requires stringent analysis and frequently requires modifications to operational processes.

Backup and Restore Strategies

When applications or services are consolidated using virtual machines, you need to consider not only the backup and restore strategies for the application servers running on the virtual machine, but also the backup and restore strategies for the host operating system and Virtual Server.

Change and Release Management

The change and release management plan should address the physical server and host operating system, as well as the virtual machines being deployed. The goals of effective change and release management should include:

  • Minimizing service downtime and changes to the user configuration.

  • Providing a smooth transition to the production environment.

  • Meeting all business objectives.

A good design and management plan can help ensure that these goals are met.

Monitoring

Planning for effective monitoring of a virtual machine environment requires real-time assessment and response to application-specific issues. In general, this requires completion of the following planning tasks:

  • Agreeing on the monitoring requirements.

  • Identifying and deploying appropriate processes and technologies for monitoring the host operating systems and guest operating systems.

  • Deciding how to optimize the performance of the processes and technologies.

  • Putting plans in place that provide for performing continuous monitoring of the infrastructure, as well as turning monitoring off and on when necessary.

Service Support

Service support includes the processes, procedures, tools, and personnel that identify, assign, diagnose, track, and resolve incidents, problems, and requests within the approved SLA. During consolidation, the key objectives of support resources are to ensure that the business goals of consolidation are met and that the incidents and problems are resolved quickly.

Service Optimization

Service optimization includes defining optimization requirements to be included in the SLA, managing capacity and availability, managing service continuity, managing staff, and financial planning to ensure that the cost of consolidation is justified and budgeted.

Deployment Planning

Effective planning for a consolidation solution using virtualization requires not only establishing appropriate consolidation and virtualization strategies and operations plans, but also defining processes for testing, piloting, and rolling out the solution. This effort includes deciding how, when, and where to implement the solution, as well as how to roll it back, if required, at any phase of the implementation. Deployment planning should include:

  • Testing

  • Pilot program

  • Rollout programs

  • Rollback methods

Creation of Logical and Physical Designs

The design of each instance of the virtualization technology and the platform on which it resides should include designing for reliability, availability, security, and scalability. It should address both the operations environment and infrastructure considerations. The creation of logical and physical designs should include:

  • Domain infrastructure design

  • Server sizing and performance

  • Virtual machine design

  • Server mapping

Evaluation and Planning Summary

The evaluation and planning guidance included a high-level overview of the tasks required to plan for the consolidation of application servers onto virtual machines; these are typically the initial services that are consolidated and migrated using virtualization technologies.

Virtual Server 2005

Microsoft Virtual Server 2005 provides the server virtualization features described above, supporting the complex requirements of enterprise server applications and administration.

For more information about Microsoft Virtual Server 2005, visit: http://www.microsoft.com/technet/prodtechnol/virtualserver/default.mspx 

Phase 4: Deploy

The intent of this requirement in the Core Infrastructure Optimization Model is to have the plan in place for consolidation of IT services and applications using virtualization technologies. For guidance on deploying virtualization to consolidate applications using Microsoft Virtual Server 2005, see the Solution Accelerator for Consolidating and Migrating LOB Applications: Implementation Guide for the Virtual Server 2005 Solution.

Further Information

For more information, visit Microsoft TechNet and search for “virtualization.”

To see how Microsoft implements virtualization, go to http://www.microsoft.com/technet/itshowcase/content/virtualserver2005twp.mspx.

Checkpoint: Server Consolidation and Virtualization

Requirements

  • Inventoried all IT services and applications in your organization, including performance and traffic data.

  • Developed a plan to consolidate server infrastructure by implementing virtual machine technologies.

If you have completed the steps listed above, your organization has met the minimum requirement of the Rationalized level for Server Consolidation and Virtualization capabilities of the Infrastructure Optimization Model. We recommend that you follow the guidance of additional best practice resources for server consolidation and virtualization addressed in the Solution Accelerator for Consolidating and Migrating LOB Applications.

Go to the next Self-Assessment question.

Requirement: Layered Imaging for Desktops

Audience

You should read this section if you do not have a layered-image strategy for managing your desktop images.

Overview

In the Core Infrastructure Optimization Resource Guide for Implementers: Basic to Standardized, you read about creating, deploying, and maintaining standard images for desktops, and about the three approaches to creating disk images: thick, thin, and hybrid. Creating a thick image is a viable approach for standardizing image deployment, but using a thin or hybrid image can increase efficiency and reduce deployment and maintenance costs.

The layered-image approach advocates the thin and hybrid image strategies, meaning that only the OS itself, or the OS with a limited set of standard core applications, is deployed to target machines. Supplemental applications, drivers, or language packs are added at deploy time through an installation sequence separate from the main image. As a result, there are fewer core images to maintain and more flexibility to add components outside the core image at deployment. To help determine and implement image strategies, the Business Desktop Deployment (BDD) 2007 Computer Imaging System Feature Team Guide discusses the options for desktop imaging and goes into detail on creating desktop images using Microsoft technologies.
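The deploy-time composition this describes, a thin core image followed by a sequence of separately installed layers, can be sketched as follows. The image name, layer names, and `deploy` function are hypothetical illustrations, not BDD 2007 tooling:

```python
# Minimal sketch of layered-image deployment: apply a thin core image,
# then run an installation sequence for supplemental layers at deploy time.
# All names here are illustrative assumptions, not real deployment tooling.

CORE_IMAGE = "corp-core.wim"  # thin image: OS plus standard core applications

def deploy(machine: str, layers: list[str]) -> list[str]:
    """Return the ordered actions taken for one target machine."""
    actions = [f"apply {CORE_IMAGE} to {machine}"]
    # Drivers, language packs, and departmental apps stay outside the core
    # image, so only one core image needs to be maintained.
    actions += [f"install {layer} on {machine}" for layer in layers]
    return actions

for step in deploy("desktop-042", ["nic-driver-pack", "lang-pack-fr", "finance-lob-app"]):
    print(step)
```

The point of the sketch is the separation of concerns: the core image changes rarely, while the layer list can vary per department or location without touching the image.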

For more information on the cost benefits of consolidating images, read the Optimizing Infrastructure: The Relationship between IT Labor Costs and Best Practices for Managing the Windows Desktop white paper.

Phase 1: Assess

The Standardized level in Core Infrastructure Optimization requires a standard image strategy and a maximum of two desktop operating system versions. In most cases, organizations that have achieved standard images use sector-based imaging techniques, taking a snapshot of a known, well-configured machine with all the required applications, components, and settings.

In the Assess phase, you determine the number of standard images actually maintained by the organization. This includes anything developed for foreign-language operating systems, for various hardware abstraction layer (HAL) types, or for unique services in the organization, such as call centers or media labs. You should catalog the components of every image and, during the Identify phase, seek threads of commonality among these images to determine how to reach a thinner, more consolidated set of desktop images. To automate the gathering of software, hardware, and OS information on the machines currently used for standard image capture, we recommend the Application Compatibility Toolkit, SMS 2003 hardware and software inventory, or the Windows Vista Hardware Assessment.

Phase 2: Identify

In the Identify phase, you should look at the common elements of the images cataloged. Thin or layered images may not be the best solution in every part of your organization, but thinner images have several distinct benefits; these are discussed in the next phase. Certain limiting factors, such as platform type (x64 or x86) or multiple HAL types when using Windows XP or earlier operating systems, will force you to maintain multiple images. These limitations are relieved somewhat with Windows Vista, where one image can support multiple HAL types, although unique images are still required for x86 and x64 platforms.

The requirement for layered images drives further image consolidation, which reduces the overall maintenance burden for managed images. Using your catalog of standard images, identify opportunities to consolidate operating system images as much as possible. This can start with steps such as standardizing on a single-language core OS with Multilingual User Interface (MUI) packs, or opting for a language-neutral OS with add-on language packs.

After consolidating language versions, examine the applications, if any, required by every user in the organization; these can include items such as antivirus applications, productivity suites, or special LOB applications required as standard on every user’s machine. Then look for the applications that can be packaged and made available for installation sequences after the core thin image is installed; these can be LOB applications unique to certain departments, such as foreign-language software used in limited locations. Once you have isolated the candidates for the core image and the components that can be added as layers after the core image installation, you are ready to begin evaluating how to incorporate a layered-image strategy into your organization.
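This analysis, separating the applications common to every image from those needed only in places, amounts to set intersection and difference over the image catalog. The catalog contents below are invented examples:

```python
# Hypothetical catalog of the applications baked into each current thick image.
image_catalog = {
    "sales-image":      {"antivirus", "office", "crm-client"},
    "finance-image":    {"antivirus", "office", "ledger-app"},
    "callcenter-image": {"antivirus", "office", "softphone"},
}

# Applications present in every image belong in the thin/hybrid core image.
core_apps = set.intersection(*image_catalog.values())

# Everything else becomes a packaged layer installed after the core image.
layered_apps = {name: apps - core_apps for name, apps in image_catalog.items()}

print(sorted(core_apps))            # ['antivirus', 'office']
print(layered_apps["sales-image"])  # {'crm-client'}
```

Running the same analysis against a real inventory (for example, data gathered with SMS 2003) yields the candidate core image and the per-department layers directly.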

Phase 3: Evaluate and Plan

Now that you have identified the commonalities among your managed images and isolated opportunities to consolidate them further, you are ready to evaluate and plan where the layered-image approach makes the most sense in your organization and where it is not feasible. After evaluating the appropriate use of layered images, you can begin planning the transition. The following sections define the possible image types and discuss the advantages of layered images to help you evaluate the best strategy for your organization.

Thick Images

Thick images contain the operating system, applications, and other corporate standard files. When you use a thick image, everything is loaded onto the client computer in one step. The disadvantage of thick images is that all computers receive the same configuration.

Layered Images

Thin images contain few, if any, applications; applications are instead installed individually on the client computers. Thin images allow flexibility in customizing each client computer, but deployment time increases significantly.

A hybrid image approach combines the benefits of thick and thin image approaches. You create a baseline image consisting of the operating system and any company-standard applications and corporate data files that are used on a majority of desktops within your organization. You then create secondary images that contain applications and data files that are specific to the various organizations that you support. For more information on image types, go to http://technet2.microsoft.com/WindowsServer/en/library/b5a36970-0de1-4386-a824-529b0272a3171033.mspx?mfr=true.

Advantages of Layered Images

Layered images have several advantages over thick images. These are:

  • Deployment time.

  • Maintenance.

  • Flexibility.

Deployment Time

Deployment of layered images represents a compromise between thick and thin image deployment. Loading two layered images takes slightly longer than a single thick image, but it is considerably faster than loading a thin image and numerous applications. Planning the deployment is also simplified. Your focus is on organizational deployment rather than on individual computers.

Maintenance

Image maintenance involves several areas. The first is image creation and storage; another is patching and updating image content.

Depending on the number of operating systems and application versions you must support, you could have numerous distinct thick images to create, store, and maintain. You need to rebuild an entire image each time there is an update to any single component of the image.

When you use a layered-imaging strategy, you create and store fewer thin images, and updates are simplified because only the operating system and a few applications are in the image to patch. Applications outside the image must otherwise be updated individually on each computer on which they are installed, which is far more time consuming than updating and redeploying a layered application image.

Hybrid images combine the benefits of thick and thin images. You need to build fewer overall images, and updates need be applied only to the images that contain the application to be updated. You can then deploy a smaller image to fewer systems.
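The maintenance savings can be made concrete with a back-of-the-envelope count; the numbers below are purely illustrative:

```python
# Illustrative comparison of the number of images to maintain per strategy.
os_versions   = 2   # e.g. two supported desktop OS versions
languages     = 3   # language variants
dept_app_sets = 5   # distinct departmental application bundles

# Thick: every OS/language/app-set combination is its own image to rebuild
# whenever any single component changes.
thick_images = os_versions * languages * dept_app_sets

# Thin/layered: one core image per OS version; languages and departmental
# apps become add-on packages applied at deploy time, each patched once.
thin_images    = os_versions
addon_packages = languages + dept_app_sets

print(thick_images, thin_images, addon_packages)  # 30 2 8
```

Even with these small illustrative numbers, a patch to one departmental application forces a rebuild of six thick images (one per OS/language pair) versus one package update in the layered model.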

Flexibility

You have far greater flexibility in planning and deploying layered images than you do with the thick image approach. You can customize your disk images based on organizational structure, system configurations, data security needs, and many other criteria that match your organization’s needs.

Planning and Building Standard Images

Once you have determined the right set of core applications and components to include in the core operating system image, and which applications will be installed outside that image, it is time to begin building standard desktop images. The process outlined in BDD 2007 contains four primary steps for building images using the BDD 2007 Deployment Workbench console in conjunction with the ImageX command-line imaging utility. Desktop imaging is an iterative process of creating, testing, and revising images until they are determined to be stable for deployment. The four key steps in this process are described below, in line with the Computer Imaging System Feature Team Guide in BDD 2007.

Add Applications to a Distribution Share

As a first step, you add applications, including hardware-specific applications, to a distribution share and specify dependencies between them. These components are used to create or configure builds in the following step.

Configure Builds

Builds are operating system configurations that include unattended-setup answer files and, in the case of layered images, a task sequence. Builds associate an operating system with a configuration and contain the tasks that must take place outside the operating system deployment. A task sequence should be defined in a task-sequencing engine or, in some cases, through scripting for required pre- and post-installation routines.
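A task sequence in this sense is simply an ordered set of pre-installation, installation, and post-installation steps. The tiny runner below is a generic sketch of that idea, with hypothetical step names; it is not the BDD 2007 task-sequencing engine:

```python
from typing import Callable

def run_sequence(steps: list[tuple[str, Callable[[], bool]]]) -> list[str]:
    """Execute named steps in order, stopping at the first failure."""
    completed = []
    for name, action in steps:
        if not action():
            break  # a failed step halts the sequence so it can be retried
        completed.append(name)
    return completed

# Illustrative sequence: a pre-installation check, the OS image deployment,
# and a post-installation routine that installs the application layers.
sequence = [
    ("validate-disk", lambda: True),
    ("apply-os-image", lambda: True),
    ("install-app-layers", lambda: True),
]
print(run_sequence(sequence))  # ['validate-disk', 'apply-os-image', 'install-app-layers']
```

Real task-sequencing engines add error handling, logging, and conditional branching, but the ordered pre/post structure is the same.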

Configure Deployment Points

Deployment points contain a subset of the distribution share’s source files and builds; they also specify how to install the builds they contain. Updating a deployment point generates the images necessary to connect to the deployment point and begin installation.

Capture Operating System Images

The final step in desktop imaging is capturing an image of the reference computer. When creating a build in Deployment Workbench, you specify whether to prompt for image capture; the deployment process then asks during the initial interview whether to capture an image of the reference computer.

Phase 4: Deploy

Image deployment is not covered as part of the imaging process; refer to the Automated Operating System Distribution requirement at the Rationalized level of Core Infrastructure Optimization.

Further Information

For more information on layered images, visit Microsoft TechNet and search for “layered imaging.”

To see how Microsoft has simplified disk imaging with Vista, go to http://www.microsoft.com/technet/itshowcase/content/vistadeploy_twp.mspx.

Checkpoint: Layered Imaging for Desktops

Requirements

  • Inventoried and rationalized the current set of managed desktop images in your organization.

  • Developed and implemented a strategy to consolidate desktop images by using thin or hybrid layered imaging for desktop deployment.

If you have completed the steps listed above, your organization has met the minimum requirement of the Rationalized level for Layered Imaging for Desktops capabilities of the Infrastructure Optimization Model. We recommend that you follow the guidance of additional best practices for layered disk images addressed in the Computer Imaging System Feature Team Guide found in BDD 2007.

Go to the next Self-Assessment question.
