Desktop Image Management: Build a Better Desktop Image

Building and maintaining corporate desktop images is a core IT task, but it doesn’t have to be overwhelming. Here are some ways to streamline the process.

Mitch Tulloch

Most large organizations have created hundreds of desktop images that they use for deployment. There are a number of reasons why a single organization would create so many images. Often it’s due to factors beyond anyone’s control, like changes in platforms and hardware configurations. The lack of a standardized image-engineering methodology can also lead to image proliferation.

Then there’s internal politics—competing IT fiefdoms demanding control over what gets deployed on their systems. Some other considerations may be unavoidable, like the specific requirements of high-security environments. Whatever the cause, maintaining a massive library of corporate desktop images can be difficult, time-consuming and expensive. How can you mitigate the problem?

The Single-Image Goal

There’s no one-size-fits-all solution, and there are many tradeoffs to consider. However, new features in Windows 7 and enhancements in Microsoft Deployment Toolkit (MDT) 2010 and System Center Configuration Manager (SCCM) 2007 R2 simplify the task of building, maintaining and deploying numerous Windows images. Choose your strategy for building and maintaining Windows 7 images based on your organization’s business needs, budget, hardware and level of IT sophistication. There are also a number of common “gotchas” to watch out for.

The Windows Imaging (WIM) file-based disk image format, first introduced in Windows Vista, has made the goal of a single corporate desktop image more attainable. The new Deployment Image Servicing and Management (DISM) tool introduced in Windows 7 simplifies maintaining images by letting you service them offline. This can greatly decrease the time required to update a stale image. For example, you can use DISM to quickly add software updates and packages, add or remove Windows features, and add third-party INF-based device drivers to an existing image.
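
Here’s a minimal sketch of an offline servicing session with DISM; the image, mount and package paths are placeholders for your own:

    :: Mount image index 1 from the WIM file for offline servicing
    Dism /Mount-Wim /WimFile:C:\images\install.wim /Index:1 /MountDir:C:\mount

    :: Add third-party INF-based drivers, recursing through the folder
    Dism /Image:C:\mount /Add-Driver /Driver:C:\drivers /Recurse

    :: Add a software update supplied as a .cab package
    Dism /Image:C:\mount /Add-Package /PackagePath:C:\updates\update.cab

    :: Enable a Windows feature in the offline image
    Dism /Image:C:\mount /Enable-Feature /FeatureName:TelnetClient

    :: Commit the changes and unmount the image
    Dism /Unmount-Wim /MountDir:C:\mount /Commit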

While the Windows Automated Installation Kit (WAIK) for Windows 7 includes tools for manually building and maintaining images, the new features and enhancements found in MDT 2010 make your job easier. The new Sysprep and Capture task sequence lets you capture an image of an existing, fully customized reference computer on your network. MDT 2010’s Windows PowerShell support also lets you write your own scripts to automate the task of building and maintaining reference images. You can organize your OSes, drivers, packages and applications by creating custom folders in the Deployment Workbench folder hierarchy. You can then use Selection Profiles to control which drivers and packages you’ll inject into your boot images and deploy using your task sequences.
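
As a rough sketch of what that automation can look like, the following Windows PowerShell fragment uses the cmdlet and provider names the Deployment Workbench itself generates; the installation path, deployment share and folder names are illustrative:

    # Load the MDT 2010 module (default installation path)
    Import-Module "C:\Program Files\Microsoft Deployment Toolkit\bin\MicrosoftDeploymentToolkit.psd1"

    # Connect to an existing deployment share as a PowerShell drive
    New-PSDrive -Name "DS001" -PSProvider "MDTProvider" -Root "D:\DeploymentShare"

    # Create a custom folder to organize operating systems
    New-Item -Path "DS001:\Operating Systems" -Name "Windows 7 x64" -ItemType "folder" -Enable "True"

    # Import Windows 7 source files into the new folder
    Import-MDTOperatingSystem -Path "DS001:\Operating Systems\Windows 7 x64" -SourcePath "E:\" -DestinationFolder "Windows 7 x64"

    # Import out-of-box drivers for use with a selection profile
    Import-MDTDriver -Path "DS001:\Out-of-Box Drivers" -SourcePath "D:\Drivers\Dell"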

These improvements virtually eliminate the technical need to create multiple images; you only need additional images for specific business reasons. Most organizations now need just a single corporate desktop image per architecture—one for deploying x86 Windows and one for deploying x64 Windows.

The ultimate goal is to have a single image that’s easy to maintain, will install on any corporate-supported hardware, will work for users in any region of the world, and will provide the desktop, applications and customizations they need in order to perform their jobs. Before you can move your organization toward this particular version of utopia, however, you’ll need to consider whether your desktop images should be thick, thin or hybrid.

Thick Images

Using thick images is the “all-in” approach to desktop images. The traditional way of building a thick image is to install Windows on a reference computer; install all the drivers you’ll need, all the applications your users will need, and the latest software updates; customize everything; and then use Sysprep to generalize the installation. You then capture the image and deploy it to users’ computers using MDT, SCCM or custom in-house deployment tools.
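
Under the hood, the generalize-and-capture step amounts to something like the following, run first on the reference computer and then from Windows PE using ImageX from the WAIK (drive letters and share names are illustrative):

    :: On the reference computer: strip machine-specific data and shut down
    c:\windows\system32\sysprep\sysprep.exe /generalize /oobe /shutdown

    :: After booting the reference computer into Windows PE,
    :: capture the system volume into a WIM file
    imagex /capture C: D:\captures\win7-thick.wim "Windows 7 Thick" /compress fast

    :: Copy the captured image to the deployment share
    net use Y: \\deployserver\captures
    copy D:\captures\win7-thick.wim Y:\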

“In building an image, careful consideration needs to be made on how the image will ultimately be deployed,” says Jeff Stokes, a premier field engineer at Microsoft who has been active in Windows 7 and Windows Server 2008 R2 deployments with MDT 2010.

For example, one problem with thick images is that they tend to be big—sometimes too large to fit on a single DVD if you’re doing media-based deployments, and large enough to consume valuable bandwidth when deployed over the network. Some users may also end up with software they don’t need, which can cause confusion. Unused applications must still be licensed on each computer where they’re installed, which can quickly drive up your costs. Thick images can also become outdated quickly, especially if you’re trying to keep them fully patched. Finally, testing thick images is more time-consuming because of the additional components they contain.

In any event, image size is often an issue with thick images. “You may want to consider creating custom multi-image WIM files as it can lead to large space savings and reduce the number of task sequences for deployment. See my blog post for more information,” says Michael Murgolo, a senior consultant with Microsoft Consulting Services.
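
For instance, ImageX can append or export additional captures into one WIM file; because WIM uses single-instance storage, files common to the images are stored only once. A sketch, with illustrative paths and image names:

    :: Append a second captured image to an existing WIM file
    imagex /append C: D:\captures\corp.wim "Windows 7 x86 Sales"

    :: Consolidate an image from another WIM into the same file
    imagex /export D:\captures\win7-mktg.wim 1 D:\captures\corp.wim "Windows 7 x86 Marketing"

    :: Confirm what the multi-image WIM now contains
    imagex /info D:\captures\corp.wim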

Thin Images

Thin images are the “ante-up” approach: you keep everything to a minimum for simplicity. A thin image may include only boot-critical drivers, a service pack if one is available, a few essential customizations and nothing else. The ultimate thin image is the install.wim file from your Windows 7 product media.

When you deploy thin images, you’re basically deploying only the OS. This means you have to deliver additional components, like software updates and applications, separately from the OS image. Doing things this way requires additional infrastructure.

For example, you can use Windows Server Update Services (WSUS) to deploy software updates and Group Policy Software Installation to deploy applications. For larger environments, use SCCM to package and distribute applications and software updates to users. Organizations that deploy virtualized applications to end users with Microsoft Application Virtualization (App-V), or that let users run RemoteApp programs through Remote Desktop Services, are also good candidates for thin images.

While you can use WSUS to distribute software updates soon after deployment instead of building them into your images, this is not a good idea: it leaves a brief window of vulnerability during which your newly deployed computers are less secure. A better approach is to add all available software updates to your base image when you build it. That way, when you deploy the base image to your target computers, they’re secure from the start.

If you’re using MDT to build your base image, add WSUSServer=https://wsus_server_name to the CustomSettings.ini file for your deployment share. You can then use MDT to deploy a fully updated version of Windows to your reference computer, Sysprep the reference computer and capture its image, and upload the captured image to your deployment share. MDT makes all this easy by letting you automate the entire process.
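
In practice, the relevant portion of CustomSettings.ini might look like the following sketch; the DoCapture property shown here is a common companion setting that tells the build task sequence to Sysprep and capture the reference computer:

    [Settings]
    Priority=Default

    [Default]
    ; Install approved updates from your WSUS server during the build
    WSUSServer=https://wsus_server_name
    ; Sysprep the reference computer and capture its image after the build
    DoCapture=YES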

“Use a dedicated WSUS server with automatic approval for installing software updates during the base image build process,” says Alexey Semibratov, a consultant II with Microsoft Consulting Services who works in the public-sector and state and local government areas and has completed several Windows 7 projects.

Organizations that don’t have WSUS also have options. “Configure your Task Sequence to point directly to the Windows Update site for installing software updates during the base image build process because most organizations don’t have enough resources and time to review each and every patch coming from Microsoft every month,” Semibratov explains. Either way, he says, “By having the image updating process 100 percent automated like this, you’ll save a lot of time—and even more so if you use WSUS.”

Hybrid Images

Most of the action in the deployment game these days is in hybrid images. A hybrid image is one that is lightly customized. It may include drivers, software updates, key applications every user needs, and essential customizations required either for business productivity or to ensure compliance with corporate policies. For example, you might create a base image for all your organization’s desktop computers that includes a productivity suite like Microsoft Office 2010 together with malware protection software like Microsoft Forefront Client Security. You could then use MDT to deploy the image to all users and SCCM for post-deployment provisioning of other line-of-business (LOB) applications to specific users.

With hybrid images, there are usually only a few basic customizations baked into the image. Most customizations are applied post-deployment using Group Policy or scripts to configure Windows Firewall, enable Remote Desktop, map network drives and so on. The hybrid approach also helps with third-party applications that don’t support unattended installation or are known to have compatibility issues with the Sysprep imaging process.
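
A post-deployment configuration script along those lines might look something like this minimal batch sketch (the server and share names are placeholders):

    :: Enable Remote Desktop by clearing the "deny connections" flag
    reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server" /v fDenyTSConnections /t REG_DWORD /d 0 /f

    :: Open the built-in Windows Firewall rule group for Remote Desktop
    netsh advfirewall firewall set rule group="remote desktop" new enable=Yes

    :: Map a departmental network drive
    net use H: \\fileserver\home /persistent:yes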

Multinational organizations can use this approach to deploy language packs to users on an as-needed basis. Because each language pack takes time to install, however, baking a dozen different language packs into a single image can slow deployment considerably.
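
If you do need to add a language pack offline, DISM handles that as well. Assuming the image is already mounted under C:\mount as in the earlier example (the package path is illustrative):

    :: Add a German language pack to the mounted image, then verify
    Dism /Image:C:\mount /Add-Package /PackagePath:C:\langpacks\de-de\lp.cab
    Dism /Image:C:\mount /Get-Intl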

When deciding between thick, thin and hybrid images, you should also consider how often you’ll need to update your image.

“What goes into the image will determine whether a thin, thick or hybrid image is needed. If large parts of the image will need to change frequently, then a thin image might be the right choice. If packing everything into one file with no post-deployment tasks is the goal, a thick image may be best,” says Howard Carter, a Microsoft consultant specializing in Windows desktop image development and deployments with more than eight years of experience helping various government customers. “Typically, a combination of these techniques is used, resulting in a hybrid image where larger/non-changing items are included in the image, while smaller/frequently changing items are installed at deploy time,” he adds.

“In building a base image, some thought should be put into how often applications are changed or upgraded,” says Stokes. “After all, you don’t want to cook apps into your base image that will cause you to have to re-cook the image on a regular basis.”

Using MDT 2010

Regardless of the type of desktop image you use, you should use MDT 2010 to build and maintain that image. Semibratov says you can fully automate the image building process with MDT, and by automating that process, you get a consistent, stable result. Carter concurs. “Microsoft Deployment Toolkit provides an interface that provides a consistent and repeatable image creation sequence every time,” he says. “This removes the possibility of human error from the image-building process.”

Semibratov also points out that you no longer need a physical computer for building images. You can use Microsoft Hyper-V to build Windows 7 images. For this purpose, Stokes highly recommends Michael Niehaus’ Image Factory, which leverages MDT 2010, System Center Virtual Machine Manager (SCVMM) 2008 R2 and Hyper-V for maintaining Windows 7 images. Semibratov has also created a “set it and forget it” script that uses Hyper-V and MDT to automatically build Windows images and save the captured WIM file to any location.

If you plan to deploy your image with SCCM instead of MDT, you should still use MDT 2010 to build and maintain your image. Semibratov suggests using MDT 2010 to build your “golden image.” Then use SCCM with MDT Extensions (including “Modena” in the next release of MDT) to distribute the image to end users. You can use both MDT and SCCM to deploy the image, but using MDT to create your reference image lets you configure the look and feel of the user’s desktop. SCCM can’t do this because it runs under the Local System account.

Tips and Gotchas

Here are some additional tips and “gotchas” to keep in mind when building and maintaining corporate desktop images:

  • You can use DISM to audit your offline images by getting a list of the OS editions, device drivers, software updates and other packages, international settings and application patches in your image. Auditing your images regularly is an important part of any ongoing image maintenance plan. For more information, see the DISM command-line options. (A sketch of these inventory commands appears after this list.)
  • If you use thick images, consider adding some applications to the image as pre-staged files (like application .msi or setup.exe files). Then configure the deployment process so that once Windows is installed, a script kicks off local installation of the pre-staged application files. The advantage of this approach is that because the pre-staged applications haven’t actually been installed in the image, they can easily be updated while the image is offline (for tasks like replacing an older application with a newer version). A sample installation script appears after this list.
  • Thick images are helpful for organizations that need to deploy new (or refreshed) fully customized desktop environments to users as quickly as possible—such as call centers. In this case, deploying pre-staged applications can slow the deployment process and thus run counter to business needs.
  • Semibratov suggests that you turn off automatic updates for products like Adobe Acrobat Reader. “In a locked-down environment, the ability to install updates is blocked for standard users,” he says. “So if the software is trying to update itself, it will cause prompts asking for credentials, which may result in calls to the help desk.”
  • “If your organization is waiting on an SCCM deployment that might take three to six months to roll out, take a serious look at building a small MDT 2010 repository so you can get started building task sequences and base images. All the work you’re doing can later be brought into SCCM when you’re ready to do so, and building an MDT 2010 environment is a piece of cake,” explains Stokes.
  • Michael Murgolo has an excellent post on The Deployment Guys blog that describes different methods of configuring default user settings for deployment, so the user gets a consistent, known experience the first time he logs on to his computer.
  • Carter recommends that you always use the WIM format rather than third-party sector-based imaging solutions. “This is because WIM provides significant flexibility since data can remain on the disk during deployment versus having to be moved to a remote location such as a network share or external USB drive,” he says. “By contrast, sector-based deployment requires that all data to be saved be moved off the hard drive prior to installing the image, and this transfer can add significant time to deployment.”
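
As a quick reference for the auditing tip in the first bullet, these are the relevant DISM inventory commands, shown against a WIM file and an image mounted under C:\mount (paths are illustrative):

    :: List the editions stored in a WIM file
    Dism /Get-WimInfo /WimFile:C:\images\install.wim

    :: Inventory a mounted image: drivers, packages, features,
    :: international settings, applications and application patches
    Dism /Image:C:\mount /Get-Drivers
    Dism /Image:C:\mount /Get-Packages
    Dism /Image:C:\mount /Get-Features
    Dism /Image:C:\mount /Get-Intl
    Dism /Image:C:\mount /Get-Apps
    Dism /Image:C:\mount /Get-AppPatches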
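
And for the pre-staged application technique in the second bullet, the script that runs after Windows is installed can be as simple as this batch sketch (C:\PreStaged is a hypothetical staging folder):

    @echo off
    :: Silently install every pre-staged Windows Installer package in turn
    for %%i in (C:\PreStaged\*.msi) do start /wait msiexec /i "%%i" /qn /norestart
    :: Remove the staging folder once everything is installed
    rd /s /q C:\PreStaged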

To drive home that last point about sector-based imaging, Carter tells this story: “I once worked with a customer who selected a deployment tool without considering the implications of using an image file format other than the Windows Imaging format. The selected deployment tool used a file format that leveraged sector-based structure, thereby overwriting all data on the drive during the deployment. They also wished to use the User State Migration Tool (USMT) to preserve user data during the migration process.

“Because the user data could not remain on the drive, however, it had to be moved to a remote network store and then moved back to the drive once the image was installed. This approach resulted in very long deployment times, which could have been made much shorter if the WIM format had been used.”

Just another example of the importance of considering all your options when building, deploying and maintaining desktop images.


Mitch Tulloch is a Microsoft MVP (Windows Server–Setup/Deployment) and lead author of “Windows 7 Resource Kit” (Microsoft Press 2009). Tulloch also maintains an Unofficial Support Site for the Resource Kit on his Web site mtit.com. Thanks also to Alexey Semibratov, Howard Carter, Jeff Stokes and Michael Murgolo of Microsoft Consulting Services for their contributions, and Keith Garner and Tim Mintner of Xtreme Consulting Group.