Choosing an Image Strategy and Building Windows 7 System Images

Now that we’ve introduced deployment, data migration, and application compatibility, let’s focus on imaging. This is not the imaging that involves photos and cameras, but the imaging of computer disks.

Quick History Lesson for System Imaging

Imaging tools have been around for a long time, and the most basic tools essentially back up an entire hard drive, sector by sector. We can then restore that drive, if desired, to the same computer or another computer. This is basically a form of drive cloning, and it was popularized in the 1980s and 1990s. Like I said in the first section, this type of imaging is fairly archaic when used for deployment in the enterprise, because you need to maintain an image per Hardware Abstraction Layer (HAL) type. For people managing Windows XP, you’ll often see an image per language or region as well. What does this mean? For many organizations, it means tens or hundreds of images to manage, all requiring maintenance when “Patch Tuesday” or a similar update event comes around.

But sector-based imaging can’t be that bad, right? Well, let’s say you have everything centralized, with 20 images to manage and up to 20 computers in your lab. When that critical update hits, you’ll spend an hour rebuilding each of those computers, maybe an hour configuring them, and then up to three hours recapturing the image with sector-based imaging tools. That means 100 hours per month if you maintain images monthly, and 1,200 hours per year. To be fair, you aren’t clicking and configuring things manually the entire time, but it’s probably fair to say you’ll spend two hours of hands-on time per system across all tasks, and you’ll eventually have terabytes of system images to find space for. If you are using the System Preparation Tool (Sysprep) to generalize the image for installation on other computers, you get only three passes of the tool per system image over its lifetime. You generally need to capture each image both before and after running Sysprep, so that next month you can start from the pre-Sysprep capture; otherwise you need to start completely from scratch each time and reapply all the changes since the last service pack.

Fast forward to 2003, when engineers were determining the future of system imaging, and along comes the Windows Imaging Format, or “WIM” file. At the time, I was working with the Systems Management Server (SMS) and Solution Accelerator teams at Microsoft, and WIM was a prerequisite for the Operating System Deployment Feature Pack released in 2004. WIMs are compressed, file-based images that can also store the contents of a drive. WIMs used with Windows XP were a pretty good option from a deployment standpoint, given the reduced image size and the ability to send that package over the network, but they were still tied to one Hardware Abstraction Layer (HAL) type.

Fast forward to around 2006 and the early iterations of Windows Vista…

WIMs used for Windows Vista and Windows 7 imaging and deployment take on a whole new meaning. Remember those tens or sometimes hundreds of images to maintain, and the up to five hours per month per image? With Windows Vista and Windows 7, you can get down to a single image per operating system architecture (for example, a 32-bit image and/or a 64-bit image). As an example, right now I am on an airplane writing on a single-processor Fujitsu U820 ultra-mobile Tablet PC that I built using the same image I’ve applied to my bigger and less airplane-friendly multiprocessor Lenovo T60P 15-inch laptop, in addition to countless other hardware types.

But it gets even better than a single image to manage for all hardware (and all languages, by the way). Remember the five hours or so we would spend building, configuring, and recapturing that old sector-based image? We can mount the file-based images of Windows Vista or Windows 7 to a file folder and service them offline. In other words, I need only one computer in my lab to use as a reference machine for all computers, I can use a free tool in the Windows Automated Installation Kit called ImageX to capture and apply system images, and I don’t necessarily even need to touch that reference computer to service my one image on Patch Tuesday. I can mount the image in a folder on my image storage server, use the Windows 7 and Windows Server 2008 R2 in-box tool dism.exe (“Deployment Image Servicing and Management,” in case you’re wondering), and enumerate the contents of the image to see packages, updates, drivers, and features. I can then modify those contents offline by using dism.exe, again without rebuilding that reference lab computer. The five hours it took to apply three critical patches on Patch Tuesday become as little as about two minutes to mount the image, ten minutes to service it, and two minutes to unmount it. I’m usually pretty happy if I can save four hours and 45 minutes on an otherwise boring but necessary task. And instead of doing it 20 times on 20 physical computers, I do it once. Makes sense, right?

To show some of that, here is a video of Sysprep and ImageX that shows how to generalize and capture a custom image: Preparing an Image using Sysprep and ImageX
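If you’d rather see the commands than watch them, here is a minimal sketch of that same generalize-and-capture flow. The image name, paths, and drive letters below are placeholders for illustration; you run Sysprep inside the installed operating system, then boot into Windows PE to capture with ImageX.

    rem On the reference computer: generalize the installation and shut down.
    c:\windows\system32\sysprep\sysprep.exe /generalize /oobe /shutdown

    rem After booting into Windows PE: capture the generalized drive to a WIM file.
    rem D: is assumed here to be an external drive or mapped network share.
    imagex /capture C: D:\images\win7custom.wim "Windows 7 Custom" /compress maximum /verify

    rem Later, apply the captured image to a formatted drive on a target computer.
    imagex /apply D:\images\win7custom.wim 1 C: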

Here is a video that shows dism.exe servicing an offline mounted Windows 7 image: Deployment Image Servicing and Management
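And here is roughly what that offline servicing looks like at the command line. This is only a sketch: the WIM path, mount folder, and update file name are hypothetical, and the .msu or .cab you add would be whatever that month’s updates happen to be.

    rem Mount image index 1 to an empty folder.
    dism /Mount-Wim /WimFile:D:\images\win7custom.wim /Index:1 /MountDir:C:\mount

    rem Enumerate what is already in the image.
    dism /Image:C:\mount /Get-Packages
    dism /Image:C:\mount /Get-Drivers
    dism /Image:C:\mount /Get-Features

    rem Add an update and a folder of drivers offline (file names are placeholders).
    dism /Image:C:\mount /Add-Package /PackagePath:C:\updates\Windows6.1-KB000000-x86.msu
    dism /Image:C:\mount /Add-Driver /Driver:C:\drivers /Recurse

    rem Commit the changes and unmount.
    dism /Unmount-Wim /MountDir:C:\mount /Commit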

I had to take a brief excursion from the deployment task at hand to give this history lesson, because in all my recent interactions and talks with IT pros and my desktop admin friends, I see two common issues when it comes to imaging:

  1. The majority of people I talk to are still using the sector-based imaging tools they’ve been using for decades.

  2. The majority of people aren’t maintaining Windows Vista or Windows Server 2008 images, so they aren’t able to perform offline image management.

Even more troubling are the situations where Windows Vista or Windows Server 2008 is in place, but people are using 20-year-old tools and processes to manage them. They aren’t using, or aware of, Sysprep, so they need an image per HAL type, or a lot of luck that an image that has not been prepared with Sysprep will install on foreign hardware. (This scenario, albeit unsupported, is still somewhat common.)

Building Your Image

Windows Vista and Windows 7 are delivered by a file-based WIM image and image-based setup. That DVD you might have or the ISO file you downloaded contains a 2+ GB file called install.wim in the Sources directory. The amazing thing about this WIM is that it actually can contain multiple operating system captures. In fact, the Windows Server 2008 R2 Enterprise image contains eight operating system variants and the 32-bit edition of Windows 7 Ultimate contains five variants.
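You can see those variants for yourself with either ImageX or dism.exe. The drive letter below assumes the DVD or mounted ISO is on D:.

    rem List the operating system images contained in install.wim.
    imagex /info D:\sources\install.wim

    rem Or, using the in-box servicing tool on Windows 7 / Windows Server 2008 R2:
    dism /Get-WimInfo /WimFile:D:\sources\install.wim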

You might expect a WIM with that many variants to be much larger than the Windows 7 Enterprise install.wim with its one variant, or a custom captured image with a single operating system, right? Not really. WIMs use single instancing of shared files, so you can have multiple operating systems available in an image that is about the same size as one captured operating system.

This is important as you determine your image strategy, because you could, for example, pack multiple operating systems in differing languages into a single WIM file, and even with multiple languages that file should be only marginally larger than a single-language WIM image. WIMs can also be used to compress and deliver data, so you can package multiple applications, drivers, and packages into a data WIM, then mount it and call its contents at install time from a scripted operating system installation.
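As a rough example of the data-WIM idea (the folder names and paths below are made up for illustration), you could capture a folder of application installers and drivers into a WIM, then mount it read-only during a scripted installation and run the installers from the mount point:

    rem Capture a folder of installers and drivers into a compressed data WIM.
    imagex /capture C:\BuildShare\Apps D:\images\apps.wim "Core Applications" /compress maximum

    rem At install time, mount the data WIM read-only and call the installers from a script.
    imagex /mount D:\images\apps.wim 1 C:\mount
    rem (run your silent installers from C:\mount here)
    imagex /unmount C:\mount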

Now that you know a bit about WIM files, let’s cover the basics of imaging strategy. There are three primary strategies used for imaging and all are valid depending on the use case:

  1. Thick Image. I like to refer to this as the “old school” approach to imaging: you build a reference machine and install all possible applications to ensure users have every application they could ever possibly need, and usually more. After that is done, you apply software updates for the operating system and all the applications, run Sysprep on the computer to generalize it, and capture the image. Then you make sure everything works and that Sysprep didn’t affect any applications.

  2. Thin Image. This approach takes things to the other extreme. Little or nothing is installed on the reference computer, and you run Sysprep before capturing that image. Or some will just use the image as shipped on the Windows 7 retail DVD or ISO with zero customization. This strategy assumes that you’ll customize the installation with applications and other necessary data dynamically at deploy time. It also means that all of your applications are packaged for an unattended installation, or you are willing to prestage them for users to install when they want, or you use something like Application Virtualization (App-V) so application profiles follow users regardless of the device they log on to.

  3. Hybrid Image. In between Thick and Thin is a Hybrid Image, where applications that everyone uses or needs are captured in the base image (perhaps your VPN software, your antivirus software, your version of Microsoft Office, and the App-V client). Aside from those core applications, additional applications are layered on at deploy time based on user needs.

All three of these strategies can be justified, though I personally tend to favor thin images. The thick image approach is useful where the company has a homogeneous environment, uses a single language, and all users need exactly the same set of applications. When using thick images in larger organizations, the trade-offs are that you pay for applications that not all users need, images are larger, the extra applications can affect performance, the image is more difficult to maintain, and flexibility is greatly reduced.

Thin images are the most flexible and the easiest to maintain, but customizations need to happen at deploy time, which means that applications must be packaged for silent installation and application updates must be installable silently as well. Installation can be slower compared to thick images, because each application needs to install itself one by one at deploy time, and more automation is required. Hybrid images include many of the components of thick images without the wasted licensing costs, extra disk space, and performance hit that often come with multiple unused applications.

Getting to Thin Images

If you currently use thick images, you might be asking, “What tools are there to move to thinner images then?”

Enter deployment task sequencing. Recognizing the limits of thick images, many people have developed task sequencing engines that not only install applications, but also perform the other common operating system deployment tasks in an automated way. Task sequences are extremely important for the Computer Refresh and Computer Replacement scenarios, because they do the following:

  1. Validate that the target hardware can install the operating system

  2. Capture user files and settings

  3. Invoke an installation environment like the Windows Preinstallation Environment (Windows PE)

  4. Customize the installation environment

  5. Apply the operating system image

  6. Apply drivers required by the hardware and connected devices

  7. Apply software updates

  8. Apply applications based on your selections

  9. Join the computer to a domain

  10. Reapply user files and settings

  11. Configure additional attributes such as BitLocker™ Drive Encryption or server roles

All of this is completely automated by using deployment task sequencing: you spend a minute launching it, or schedule it centrally if you’re using System Center Configuration Manager, and the rest just happens without you needing to touch the computer. For someone new to the space, it sounds difficult to configure, but these are standard in-box task sequences, available in the Microsoft Deployment Toolkit 2010 and in System Center Configuration Manager.
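For example, with the Microsoft Deployment Toolkit, kicking off a Lite Touch task sequence from the currently running Windows installation is typically just a matter of running the LiteTouch script from your deployment share (the server and share names below are placeholders):

    rem Start a Lite Touch deployment from the deployment share; the task sequence
    rem wizard, or your CustomSettings.ini automation, handles the rest.
    cscript \\DEPLOYSRV\DeploymentShare$\Scripts\LiteTouch.vbs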

Here’s a video of what preparing a build looks like by using the Deployment Workbench in the Microsoft Deployment Toolkit 2010: Deployment Workbench in Microsoft Deployment Toolkit 2010

The task sequence brings together the tools we need for the deployment, end to end. I like to think of everything we’re using in terms of music: if you think of unattend files, the User State Migration Tool, Windows PE, applications, and drivers as instruments, then the task sequence is the conductor and the sheet music. The end product is a symphony of automation that you have complete control over. When everything is finished and ready for automation, you can pick how you want to deliver your builds.

I’ll conclude this section with a video that shows a fully automated migration, including user data, from Windows XP to Windows 7 that I built myself (but did not narrate) using the free tools described previously: Windows XP to Windows 7 Migration