
LOB Application Compatibility

Using Microsoft IT's LOB Application Compatibility Testing Processes

Technical White Paper

Published: March 18, 2008



Situation

Microsoft IT maintains a software portfolio of approximately 1,500 applications. These applications must be tested prior to software deployments. Testers belong to different business units within the company and are distributed around the world.

Solution

Microsoft IT developed an application-tracking method that simplified the process of selecting applications for sample-based testing. By identifying groups of applications that have similar data processing, controls, underlying technology, and methods, Microsoft IT is able to test approximately 4 to 6 percent of the total applications and gain a reasonable assurance of compatibility for all.

Benefits

  • Improved employee satisfaction with the solution
  • Streamlined and automated testing processes and workflow
  • Ability to gain reliable results while testing fewer applications in each pass
  • Reduced effort to administer and support the solution

Products & Technologies

  • Microsoft Office SharePoint Server 2007
  • Microsoft Virtual Server 2005
  • Internal software portfolio tool
On This Page

Executive Summary
Introduction
Portfolio Management
Application Targeting
Planning
Communication
Testing
Reporting
Pilot Program: Dogfooding
Conclusion
For More Information

Executive Summary

Application compatibility is a critical blocking issue during the deployment of new technologies or applications in an enterprise environment. Microsoft Information Technology (Microsoft IT) faces the same challenge, with the added complexity of multiple beta releases of new applications and technologies, while maintaining an application portfolio that consists of more than 1,500 applications. Microsoft IT tests approximately 4-6 percent of the application portfolio prior to each internal product release. This group of key applications consists of the most critical line-of-business (LOB) applications used within Microsoft.

Microsoft IT's application compatibility testing program achieves the following goals:

  • Minimize investment in application testing by targeting only a subset of the most important and representative applications.
  • Ensure that critical applications do not break during software deployment.
  • Minimize the cost associated with application compatibility testing by using virtual machine (VM) images instead of hardware with multiple configurations.
  • Minimize employee dissatisfaction resulting from application compatibility issues.

Testers from various business units within Microsoft IT, located around the world, perform the testing on selected critical LOB applications. They perform the testing using VM images hosted on a centralized server computer running Microsoft® Virtual Server 2005 Release 2 (R2), the second release of Virtual Server 2005. By using VM images, the testers can avoid hardware or software configuration issues that can delay the testing process.

All of Microsoft IT's testing data and application data is stored within a database-driven, internal, Web-based software portfolio tool. Microsoft IT can query the data to generate detailed reports of application compatibility status throughout the testing process, and it can use the data to identify applications for the key test group. Combining individual data results with historical trends enables Microsoft IT to predict the compatibility results of the entire portfolio based on the 4-6 percent of applications tested.

The process that Microsoft IT uses is separated into phases:

  • Portfolio management. Track all applications present in the enterprise environment, with appropriate contact information for personnel responsible for testing and support.
  • Application targeting. Identify applications that are critical to the business flow of the organization based on portfolio data and historical trends. Create a list of core, critical applications to be tested for each software deployment.
  • Planning. Create application testing schedules that permit enough time to fully test the core group of applications before each deployment milestone.
  • Communication. Frequently communicate schedule and testing status with the test team and stakeholders. Clearly communicate testing configurations and goals with testers.
  • Testing. Run the tests, including limited pilot deployments.
  • Reporting. Record test results in the application portfolio tool to provide accurate historical data.


Introduction

Application compatibility is a concern any time an organization deploys new software to an existing computer environment. Whether the organization is deploying a new operating system or a new Web browser, it must ensure that the applications that users need to perform their daily tasks continue to function properly. Microsoft IT shares this challenge by being responsible for deploying and maintaining more than 1,500 internal LOB applications in an environment of mixed and changing operating systems, productivity applications, Web browsers, and development tools.

With such a large portfolio of applications and a rapidly changing environment, Microsoft IT must identify commonalities between the applications in the portfolio as a means of grouping applications for testing. Microsoft IT has found that applications can be grouped by common components or underlying technologies, such as Microsoft Office Excel® spreadsheets, Microsoft Office Word documents, or the Microsoft .NET Framework. Alternatively, applications can be tracked over time to build a historical sense of their similarity to other applications. Based on this technology-dependency mapping, Microsoft IT can provide advance warning to application owners when changes may affect their application. This methodology, combined with substantial historical testing data, has enabled Microsoft IT to require testing only on approximately 60 key applications among the more than 1,500 present in the organization.

For purposes of this paper, Microsoft IT defines an application as a piece of software that more than one person uses. This broad definition encompasses Web pages, databases, and client/server tools. An LOB application is defined here as an application used to perform a key business function, such as purchasing, staffing, or expense reporting, and is most likely an application developed within the company. The LOB applications referenced in this paper are nearly all Web-based applications used for internal business purposes.

Because of the broad variety of applications that internal employees use, the task of tracking them is monumental; however, having to test every one of those applications individually would require a vast number of testers. Microsoft IT has instead placed the emphasis on testing key "indicator" applications. Just as indicator species in wildlife can point out a looming danger in nature, a problem with a key indicator application can signal likely issues with other, similar applications. The key indicator applications that Microsoft IT tests tend to be large, complex business solutions. By conducting a successful test pass on one of these applications, Microsoft IT gains confidence that the other applications with similar design, dependencies, and underlying technology will also pass if tested.

Microsoft IT tests applications for several key reasons. The primary reason is to ensure that the internal LOB applications are compatible with new software before it is deployed to the corporate desktop environment. This means more than just performing tests in preparation for deploying a new operating system. It also means performing standardized testing when a new user application, such as a new version of Microsoft Office, is deployed. The testing process that Microsoft IT uses identifies issues with key functionality in the LOB applications and focuses on enterprise user scenarios.

Jim LaBreck, a senior program manager in Microsoft IT, describes his worldwide virtual team of testers as the "enterprise quality bar" of Microsoft. "The issues that we find and get the product teams to fix prior to release are issues that our external customers won't have to face," LaBreck says.

A goal of the testing process is to identify issues early in the product cycles for new applications or new versions of existing applications. By identifying issues early in the process and providing clear feedback to the product teams responsible for the applications, Microsoft IT can minimize the support impact on users and help the product teams create high-quality applications.

Portfolio Management

The first step in testing for application compatibility is knowing the applications present in an environment. Microsoft IT gains this knowledge through an internal software portfolio-management tool that maintains a database of all applications deployed in the organization for general use. Microsoft IT has found it important to track not only the basic information of application name, function, and location, but also details that identify:

  • The number of people who use the application.
  • The people responsible for supporting the application.
  • The dependencies that the application may have on other software or specific technologies.

Microsoft IT requires that any new application created for internal use be entered into the portfolio tool. It further requires that, at a minimum, the person responsible for the application enter the basic identifying information, such as name and version, and the contact information for the people responsible for the application's compatibility impact. Typically, that contact is the person dedicated to testing the LOB applications for that business unit.

Microsoft IT tracks the following information for each LOB application:

  • Application name. The name by which the application is known in the network environment. The application name is useful mostly in terms of describing an issue when Microsoft IT contacts the product group.
  • Application version. Microsoft IT tracks each version of an application as if it were a completely separate program. The reason is that not only can underlying technologies change from one version to the next, but the contact information and responsible parties can also change.
  • Version identification. Because Microsoft IT tracks individual versions as separate entries, proper version identification is vital. Each version of an application may have different dependencies or contact information.
  • Estimated number of users. This item is an estimate on the part of application owners. The main importance of this information is to help assess the impact on the company if the application fails. This is also an important criterion for selecting applications to be part of the key test group.
  • Whether the application is a non-Microsoft application. Because Microsoft IT has no control over the development of non-Microsoft applications and because Microsoft customers likely use non-Microsoft applications, these applications receive a high priority in testing.
  • Business priority (for example, mission, business, external). Applications that affect Microsoft as a whole, such as an expense tool or an ordering tool that employees across the entire company use, may be considered either mission critical or business critical. Other factors include the nature of the application's function or even the visibility the application has because a senior executive uses it.
  • Technology dependencies. The technology dependencies field is one of the most important, because it enables Microsoft IT to group the application with other applications based on the underlying technology. Knowing that a change is coming in a technology dependency, such as a new version of Microsoft .NET Framework, enables Microsoft IT to query the portfolio tool to find applications that depend on that technology and build a test group. Based on this information, Microsoft IT can notify users of the applications that such changes will potentially affect, and can provide these users with details of the change and the expected impact.
  • Test results. Microsoft IT uses the test results to analyze compatibility trends over time and to generate reports. Tracking the results over time enables more-accurate predictions of compatibility issues and more-focused testing. Applications that do not present any failing marks during testing might even be bypassed in future testing.
  • Contact information. This is a critical data item for testing program success. Having accurate contact information enables Microsoft IT to pass along vital information efficiently. Without this information, Microsoft IT would lose valuable time trying to find the responsible party.

Tracking this information reduces the time and effort required in subsequent testing and identifies risks. For example, if one Web-based application that depends on a common underlying technology fails, the portfolio tool enables Microsoft IT to determine other applications that depend on the same underlying technology so that the team members can quickly evaluate whether those applications have the same issue.
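The dependency lookup described in the paragraph above can be sketched as a small relational query. This is a minimal illustration only: the schema, table name, and sample applications below are our assumptions, not the actual internal portfolio tool, which stores far more detail per application.

```python
import sqlite3

# Illustrative schema: one row per (application, dependency) pair.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE app_dependency (app TEXT, version TEXT, dependency TEXT)")
conn.executemany(
    "INSERT INTO app_dependency VALUES (?, ?, ?)",
    [
        ("ExpenseTool", "2.0", ".NET Framework 2.0"),    # hypothetical apps
        ("ProcurePortal", "1.1", ".NET Framework 2.0"),
        ("StaffingSite", "3.0", "Excel"),
    ],
)

def apps_sharing_dependencies(failed_app: str) -> list:
    """Other applications that share any dependency with the failed one,
    and so should be evaluated for the same issue."""
    rows = conn.execute(
        """SELECT DISTINCT b.app
           FROM app_dependency a
           JOIN app_dependency b ON a.dependency = b.dependency
           WHERE a.app = ? AND b.app <> ?""",
        (failed_app, failed_app),
    ).fetchall()
    return [r[0] for r in rows]

print(apps_sharing_dependencies("ExpenseTool"))  # ['ProcurePortal']
```

A real portfolio database would add contact information and test results to the same query so that the affected owners can be notified in the same pass.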

A software portfolio tool can take many forms—from a simple spreadsheet to a Web-based interface to a centralized database. The important factors to consider in choosing a portfolio solution include ease of long-term use, the ability to find information quickly as the number of applications increases, and the completeness of data stored. A database is highly useful because of the ability to group records by various relationships in the data, such as the technology dependencies.

Application Targeting

Perhaps the most remarkable aspect of the LOB testing that Microsoft IT performs is the success of the testing program that results from application targeting. Because the group already has a complete list of the applications present in the organization, Microsoft IT can focus on testing a small group of high-priority applications that will serve as a litmus test for the remaining applications.

Microsoft IT maintains a list of approximately 60 applications each for the Windows® operating system and Microsoft Office in its key priority test group. These applications are crucial to business functions across the company. Microsoft IT cross-references the applications by technology dependency within this group. Microsoft IT asks individual business groups within the company to volunteer information about the importance of the LOB applications that they use. The information enables Microsoft IT to make better decisions on the relative priority of the applications. Microsoft IT also uses historical data from past testing cycles to identify applications that had issues in the past. For instance, an application that always breaks when Excel is upgraded would be identified as an application to include in the key test group.
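The selection criteria above (user reach, business priority, history of past failures) can be combined into a simple ranking. The field names, weights, and sample data below are illustrative assumptions on our part; the paper does not publish Microsoft IT's actual scoring formula.

```python
# Hypothetical scoring sketch for selecting the key test group.
def target_score(app: dict) -> float:
    priority_weight = {"mission": 3.0, "business": 2.0, "external": 1.0}
    score = priority_weight.get(app["priority"], 0.0)
    score += app["users"] / 10_000           # reach across the company
    score += 2.0 * app["past_failures"]      # history of breaking on upgrades
    return score

portfolio = [
    {"name": "ExpenseTool", "priority": "mission", "users": 60_000, "past_failures": 2},
    {"name": "TeamWiki", "priority": "external", "users": 500, "past_failures": 0},
]

# Take the ~60 highest-scoring applications as the key test group.
key_group = sorted(portfolio, key=target_score, reverse=True)[:60]
print([a["name"] for a in key_group])  # ['ExpenseTool', 'TeamWiki']
```

The point of any such formula is not the exact weights but that the ranking is reproducible from portfolio data, so the key group can be re-derived as the portfolio changes.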

According to Jim LaBreck, "Based on historical data, we know which applications are more likely to have trouble and which applications basic business (payroll, expense reporting, procurement, etc.) simply cannot continue without."

Inclusion in the key test group has benefits to the business groups responsible for the applications in the group. The product groups give extra effort to resolving any bugs filed against applications in the key test group, which directly benefits Microsoft IT and other business groups. If Microsoft IT finds an issue in one of the key test applications, it may stop the deployment of the related product or technology until the issue is resolved. If Microsoft IT determines that an issue is related to the underlying technology, such as a new release of Microsoft .NET Framework, it works with the product group to resolve the issue, directly benefiting all customers who will use the new version of the technology in the future.

Microsoft IT expects applications in the key test group to go through testing for every scheduled test pass. The business units with applications in the key test group make a commitment to test on each pass but are permitted to skip a test pass if business activities require it. This system helps to maintain the working relationship between Microsoft IT and the product groups, and it enables Microsoft IT to respond with valuable feedback during the application development phase.


Planning

Planning for application compatibility testing is crucial to the success of a deployment. For the testing to be successful, enough time must exist to schedule testing, perform the tests, gather results, and forward all data to the product groups so that they can resolve the bugs before deployment. Microsoft IT schedules test passes before key product milestones and internal interim production builds (often referred to as dog food releases at Microsoft). The intent of this scheduling is to be certain that potential compatibility issues are properly identified while there is time to resolve the issues before the customer is affected.

"Dogfooding" is a way of life at Microsoft. The term originally came from an internal communication that expressed the idea that "we need to use our own products to prove that they're good enough for our customers." In real terms, the dog food concept means that long before a software product enters a public beta test phase, Microsoft employees have been using it in their daily work. Using beta applications in the daily workflow is a clearly stated internal expectation at Microsoft, and every employee participates.

For major test plans, Microsoft IT conducts one-week test passes approximately one month before scheduled major release milestones. At the end of that time, Microsoft IT gathers the results and communicates the feedback to the stakeholders, including the Microsoft IT desktop deployment team, the Helpdesk, and the product groups.

When Microsoft IT finishes defining the schedule, it communicates the schedule to the project stakeholders. In some cases, the stakeholders have additional concerns that may cause a reevaluation of the schedule. The LOB application testing is an iterative process that may require numerous passes before Microsoft IT deems a product ready for general consumption. Scheduling the test passes within Microsoft IT typically means an initial discussion with the product team or Technology Adoption Partner (TAP) team and coordinating with other, concurrent testing schedules. Microsoft IT publishes the proposed schedule to a Windows SharePoint® Services team site, which serves as the central repository for team communication, scheduling, and reporting. Microsoft IT also communicates schedules through e-mail to distribution lists that contain testers and stakeholders.


Communication

Preparing for a deployment can be a very busy time. By providing multiple channels of communication between itself and the various project stakeholders, Microsoft IT ensures prompt delivery of feedback, sharing of suggestions for features and functionality with the product teams, and sharing of the latest support information with users. Microsoft IT uses a combination of the following communication channels:

  • LOB testing aliases. E-mail is one of the primary modes of communication in the organization because of the wide geographic distribution of the testers. By creating e-mail group aliases, Microsoft IT can share status messages and other communication directly with the people involved in the project. E-mail communications tend to be timely and immediate; they communicate the current status and any information the stakeholders need to know at that time.
  • LOB testing Web site. Maintaining a Web site on the organization's intranet is another convenient mode of communication. The LOB Windows SharePoint Services site for Microsoft IT shares information about the current testing status (duplicating the schedule information sent in e-mail), the current application list for testing, known issues with the applications being tested, and any supporting documentation for the applications and testing process that may be pertinent. The Web site is an effective archive of past communications and current information.
  • Schedule. The testing schedule itself is an effective form of communication among all members of the project—from testers to stakeholders. Keeping the testing schedule updated and communicating any changes to the entire team sets expectations for all parties and lets them coordinate with their own internal test schedules.
  • Testing instructions. If there are specific scenarios to be tested in a particular test pass, Microsoft IT will communicate these instructions both during the scheduling and in specific e-mail communications at the beginning of the test pass.
  • Daily and final status. Sharing the status of the LOB testing on a daily basis helps to identify areas of concern in the process. If the daily status indicates that an application is failing the test pass, the test pass can be canceled for that application until the product team can improve the build stability. The final status is the core piece of the project postmortem; it is a chance to list all results, successes, and elements to be improved for the next round of testing.


Testing

Testing should be automated where possible, but automation may not be feasible for all applications, particularly those that experience frequent changes to the user interface (UI). LOB application testing at Microsoft is a two-phase practice. In the first phase, testing occurs within the business unit responsible for the application. Microsoft IT's involvement at this stage is primarily observation and coordination between stakeholders. In the second phase, the application is made available for dogfood testing, and a wide audience of users in the company uses the application in normal tasks. In the dogfood phase, Microsoft IT monitors all bugs filed and coordinates communication with the stakeholders.

VMs are a good choice for application testing, because an administrator can restore them quickly simply by copying a file. One physical computer can host numerous VM images, each representing a specific software configuration. Microsoft IT has found that the most efficient process involves using pre-configured VMs hosted in Virtual Server 2005 to perform the testing. With this method, the testers can begin working with their applications as soon as they are assigned a VM rather than spending days configuring a computer prior to the test pass. According to Jim LaBreck, "Providing 25 VMs for one week is equivalent to 125 ‘machine days' for testers. This is why we can test our 60+ critical applications in one week." LaBreck notes that the number of applications tested in a single week has reached as high as 80 to 100.
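LaBreck's capacity figure follows directly from a five-business-day test pass; the five-day assumption is ours, implied by the one-week pass described above:

```python
# 25 concurrently hosted VMs over a 5-day test pass yield 125 machine-days,
# enough to cover the 60+ key applications in a single week.
vms = 25
business_days = 5
machine_days = vms * business_days
print(machine_days)  # 125
```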

By implementing dedicated test hardware, either by using multiple computers or by implementing VMs, an organization can gain a higher level of control over the testing process. If the organization performs application compatibility testing on a computer being used for production work, it runs the risk of making daily work impossible because of testing results or because a user cannot relinquish time on the computer for testing. Microsoft IT uses a virtual test lab that consists of multiple VMs hosted on Virtual Server, both for efficiency and to accommodate the widely dispersed geographic locations of its testers. This technique saves time because no VM or deployment images need to be transferred to remote locations: the testing hardware is hosted centrally and accessed across the corporate network from any location around the world. Table 1 shows the typical server configuration.

Table 1. Typical Server Configuration for Hosting the Test VMs

  • Processors: Four dual-core processors
  • Random access memory (RAM): 48 gigabytes (GB); supports 25 to 30 VMs concurrently
  • System drives: Two 10,000-RPM serial attached SCSI (SAS) drives in redundant array of independent disks (RAID) 1 for the operating system
  • Data drives: Six 10,000-RPM SAS drives in RAID 5 for VMs
  • Network adapters: Two dual-interface network adapters (four total network connections)
  • Power supply: Redundant power supplies
  • Operating system: Windows Server® 2003
  • Virtualization software: Virtual Server 2005 R2


A problem that arises from testing new software on non-standard or older computers is incompatibility caused by hardware configuration and limitations. Using a variety of computers that have different components can lead to hardware incompatibilities with applications or operating systems. By using a dedicated test lab that has standardized hardware configurations, an organization can minimize the potential for hardware-compatibility issues. Using Virtual Server to accomplish this enables Microsoft testers to focus on the application instead of the hardware configuration. A final issue to deal with if an organization uses non-standardized computers—especially if the computers are not dedicated to the test process—is the time spent configuring the computers to work similarly during the test passes. Having identical computers has the benefit of a standard configuration that can be applied during an automated deployment.

For each test pass on a particular application, Microsoft IT logs the results directly in the software portfolio tool. The testers update the application's record in the portfolio with final testing results (such as pass, fail non-critical, fail, or blocked) at each step of the test pass. Updating the application's record provides the added benefit of enabling queries to be run against the portfolio database and enabling status reports to be compiled easily for any or all applications in the test process.

If Microsoft IT identifies any bugs during application compatibility testing, it reports them to the product team responsible for the application being tested. Microsoft IT enters these results into a database as soon as it identifies the bugs and assigns them to the appropriate contact in the product team to be investigated. In this way, the product team has the opportunity to triage all bugs, identify fixes for the issues quickly, and allow the application owners to verify the fixes in a timely manner.


Reporting

One of the major design criteria for a software portfolio tool is its reporting ability. Reports on application testing status, application version in use, application team contacts, and related information can save large amounts of time and enable in-depth analysis of applications in the organization. Microsoft IT's portfolio tool has a Web-based UI and a Microsoft SQL Server® database back end to store the data. By storing the test data in a SQL Server database, Microsoft IT can create and run complex reports to provide testing status information, to create target application lists for ad hoc testing passes, and to provide other useful combinations. By pre-configuring the report queries, Microsoft IT can easily accommodate new applications that enter the testing and provide stakeholders with robust data from the beginning.

A typical data view includes a complete list of the applications being tested with all test results broken down by test pass. In this way, Microsoft IT can track the progress of an application as it moves through the test process. The data view includes the most recent comments from the tester and can be used to pass suggestions to the product team or to document the user experience that the tester encountered during testing, better describing what the tester saw.

The bugs associated with the application are also tracked in the data view, providing the first breadcrumb in the trail to determining responsibility for resolution. If the testers simply filed an entry stating that the application failed, Microsoft IT would have no way to verify that the issue had been identified and communicated to the responsible parties for investigation. Having the bug information enables Microsoft IT to track the progress of the bug and ensure that it is addressed. Before testers can enter a fail test result, they must enter the bug number related to the issue, which promotes entering bugs into the test process for timely investigation.
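The rule described above (a fail result cannot be recorded without its related bug number) is easy to enforce at data entry. This sketch uses the status values the paper lists; the function name and record shape are our own illustrative assumptions, not the portfolio tool's API.

```python
# Statuses mirror those named in the paper: pass, fail non-critical, fail, blocked.
VALID_STATUSES = {"pass", "fail non-critical", "fail", "blocked"}

def record_result(app, status, bug_number=None):
    """Validate and build one test-result record for the portfolio database.

    Both "fail" and "fail non-critical" require a bug number, so every
    recorded failure can be traced to a tracked, assignable bug.
    """
    if status not in VALID_STATUSES:
        raise ValueError("unknown status: %s" % status)
    if status.startswith("fail") and bug_number is None:
        raise ValueError("a fail result requires the related bug number")
    return {"app": app, "status": status, "bug": bug_number}

record_result("ExpenseTool", "pass")                      # accepted
record_result("ProcurePortal", "fail", bug_number=10423)  # accepted, traceable
```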

Finally, the data view of the applications in the testing report includes a list of compatibility contacts for each application. If an urgent issue appears during testing, Microsoft must be able to contact, as quickly as possible, the people responsible for supporting the application.

Pilot Program: Dogfooding

Pilot programs are familiar to anyone who has worked on a deployment project. In a pilot program, an application or product is deployed to a small control group of users, typically skilled users in multiple affected groups. The small number prevents costly downtime to a business group if issues are encountered, yet the number must be large enough to establish statistically meaningful field test data. The initial pilot deployment is small, but subsequent deployments are larger over time and include a broader cross-section of users. Microsoft IT typically deploys to a pilot group of 2,500 to 5,000 users for the initial pilot around the Beta 1 milestone and scales upward through subsequent builds until adoption is essentially company-wide.

It is a typical pattern for the first dogfood phase to include only the product team itself; later phases include the IT staff, and then the general users. Microsoft IT does not deploy applications to the general user base until:

  • It has a sufficient level of confidence in the application's stability and readiness.
  • It files a report on the Microsoft IT Windows SharePoint Services site to show which applications are compatible so that users can make an informed decision about when to move to the new version.

Pilot programs are keyed to major milestones in the product development cycle. For example, if a beta release is expected on a certain date, the Microsoft IT desktop deployment team plans the first pilot deployment as soon as the product team announces a finished build, and works up to general availability when the beta is released. This schedule may slip day by day if outstanding issues still must be resolved with the application.

LOB application testing at Microsoft is thorough, but nothing compares to real-world user experiences. With this in mind, Microsoft IT uses pilot programs to gather reports of possible issues that did not appear during lab testing. Microsoft IT gathers user reports through the bug-tracking system and disperses them to the appropriate team contacts, just as it does during the early test phases. The decision to deploy to the next larger group of users occurs only when the "bug-bar" is met—that is, when the number of bugs is below a predetermined number and severity. The pilot program provides a view into real-world use of the applications.
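The "bug-bar" decision described above can be sketched as a simple gate: expand the pilot only when open bugs are below predetermined counts at each severity level. The thresholds, severity scale, and function name here are illustrative assumptions; the paper states only that the bar combines number and severity.

```python
def bug_bar_met(open_bugs, max_by_severity):
    """True when open bug counts are within the predetermined limits.

    open_bugs: list of dicts, each with a numeric "severity" (1 = most severe).
    max_by_severity: maximum allowed open bugs for each severity level.
    """
    counts = {}
    for bug in open_bugs:
        counts[bug["severity"]] = counts.get(bug["severity"], 0) + 1
    return all(counts.get(sev, 0) <= limit for sev, limit in max_by_severity.items())

bugs = [{"severity": 1}, {"severity": 2}, {"severity": 2}]
# One open severity-1 bug blocks expansion under a zero-tolerance severity-1 bar.
print(bug_bar_met(bugs, {1: 0, 2: 5}))  # False
```

In practice the gate runs against the bug-tracking database at the end of each pilot phase, so the decision to widen the deployment is mechanical rather than judgment-based.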

Application compatibility testing is the final gate through which an application must pass before being deployed to general availability. Microsoft IT will not authorize the deployment of any application or new technology for general use until it has passed LOB testing and has mitigations in place for any outstanding compatibility issues.


Conclusion

Application compatibility testing is crucial for the successful deployment of new software in an enterprise environment. Microsoft IT has developed a methodical approach to LOB application compatibility testing. This approach uses multiple aspects, including product team testing, pilot programs, and accurate data collection over time. One key to success is a software portfolio tool that tracks every application deployed within the company and in which testers can directly add data on test passes, identify bugs and tracking information, and store contact information for responsible parties. Microsoft IT has implemented this portfolio tool as a Web-based application with a SQL Server back end, which enables complex queries that generate detailed reports.

Microsoft IT maintains a portfolio of more than 1,500 applications but is able to accurately predict compatibility results by testing a key group of approximately 60 critical applications. By comparing the underlying technology and components that LOB applications use, Microsoft IT selects applications for the key group that not only are critical in business function but also represent larger groups of applications that use similar technology and components. By combining volunteered information from product teams with historical data on compatibility testing, Microsoft IT can make accurate predictions of other applications that will have an issue if one of the key group of applications encounters an issue.

To further enhance LOB application compatibility testing, Microsoft IT hosts multiple VMs on a single Virtual Server computer. In this way, Microsoft IT can deploy a pre-configured VM for the testers to access immediately without requiring the testers to configure their own test computer before each pass.

By combining all these factors, Microsoft IT successfully deploys pre-release products within the enterprise while testing only 4-6 percent of the total LOB application portfolio.

For More Information

For more information about Microsoft products or services, call the Microsoft Sales Information Center at (800) 426-9400. In Canada, call the Microsoft Canada Information Centre at (800) 563-9048. Outside the 50 United States and Canada, please contact your local Microsoft subsidiary. To access information through the World Wide Web, go to:



The information contained in this document represents the current view of Microsoft Corporation on the issues discussed as of the date of publication. Because Microsoft must respond to changing market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information presented after the date of publication.


Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation.

Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

© 2008 Microsoft Corporation. All rights reserved.

Microsoft, Excel, SharePoint, SQL Server, Windows, and Windows Server are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries.

All other trademarks are property of their respective owners.
