Microsoft Windows 2000 Server: ROI Impacts for Corporate Customers

Arthur Andersen was asked to assess the high-level return on investment (ROI) impacts for organizations considering the migration from Microsoft Windows NT® Server 4.0 with Service Pack 5 (SP5) to Microsoft Windows® 2000 Server. Our findings are covered in detail in the attached report and appendices. The key top-level findings are presented later in this article.

On This Page

  • Disclaimer

  • I. Executive Summary

  • Potential Benefits

  • Larger versus Smaller Environments

  • Potential Implementation Issues

  • Summary

  • II. Objectives, Scope and Approach of the Study

  • III. Basic ROI Framework and Methodology

  • IV. Analysis

  • V. Appendices

  • Contact Information

Disclaimer

Statements, information, or opinions contained in Arthur Andersen's (Andersen) following evaluation of the Windows 2000 operating system should not be construed as an endorsement of it. Andersen hereby disclaims all responsibility and liability for any actions taken in reliance on its evaluation of Windows 2000.

Use of Windows 2000 in a specific environment is dependent upon factors unknown to Andersen and, therefore, not considered by Andersen in connection with its analysis. For more complete information related to Windows 2000, including functionality and risks that are associated with a particular environment, an organization-specific analysis should be performed.

Due to hardware and device differences, results obtained from an organization's equipment may differ from the scenario test results obtained using Arthur Andersen's lab equipment.

The Rapid Economic Justification framework, as outlined herein, was developed by Microsoft and a group of consulting firms. Andersen's use of this framework should not be construed as an endorsement or indication that Andersen has thoroughly tested its usefulness. Individual organizations should assess the validity of this framework for themselves.

Andersen has performed short, limited interviews with IT personnel from various organizations. Where applicable, Andersen has included the results found by these organizations. The information provided by these organizations is believed to be reliable, but has not been verified. No warranty is given as to the accuracy of such information. We have not compiled, reviewed, or audited the information referred to in this report (as those items are defined and used by the American Institute of Certified Public Accountants). We express no opinion or other form of assurances on such information.

The business initiative section includes a list of seven current business initiatives that could be encountered by individual organizations. This list should not be construed to be complete. Nor is any assurance provided that these initiatives will continue in the future in their present form. Further, because each organization will participate in these initiatives in varying degrees, the results obtained in a specific situation could differ significantly from the situation discussed in this report.

I. Executive Summary

Windows 2000 brings fundamental changes and a wide variety of new or improved features to a Windows-based environment. Through integrating features relevant to web-enabled business throughout the operating system, and by improving reliability, scalability and management, Microsoft is creating a platform geared to electronic commerce, as well as improved capabilities for its traditional markets in network operating systems and departmental line of business computing.

These significant changes will have different degrees of value to different organizations due to their size, complexity and business mission. For some organizations, the greater security, new services (such as Active Directory® directory service), or greater scalability of Windows 2000 will be sufficient justification for the adoption of this operating system. For others, Directory Enabled Applications or Directory Enabled Networking may require Windows 2000–based directory features. For such organizations, direct, short-term administrative savings are not the driving consideration in Windows 2000 adoption. Such scenarios are not the primary focus of this study.

The greatest value of the Windows 2000 Server Family is realized when one views the technology as a new paradigm where advanced features are incorporated into a directory-enabled, distributed environment that encourages communication and ease-of-use for all users in the organization. The integration of these features into a cohesive operating system, as well as improved reliability and scalability, highlights this paradigm shift. Windows 2000 is also an environment where the client and server are designed to work closely together as a platform in areas like end-to-end security and desktop management (IntelliMirror management technologies).

Arthur Andersen was asked to assess the high-level Return on Investment (ROI) impacts for organizations considering the migration from Windows NT Server operating system version 4.0 with Service Pack 5 to Windows 2000 Server. Our findings are covered in detail in the attached report and appendices. The key top-level findings are:

  • Windows 2000 can have a significant positive impact on overall cost and control of IT assets

  • Windows 2000 appears to be more scalable, reliable and manageable than Windows NT Server 4.0 with Service Pack 5

  • Windows 2000 has enough new features to have a learning curve, or front-end training requirement, which some organizations will find significant, and against which a future stream of cost savings and benefits should be weighed

  • The Beta 3 version of Windows 2000 Server is functionally complete and sufficiently robust to initiate pilot projects and other hands-on analysis. Windows 2000 Server deserves evaluation to assess exactly how its benefits can create value within a particular organizational or enterprise environment.

Potential Benefits

The decision criteria for migrating to Windows 2000 should include a number of potential benefits specific to the IT and business environment. From an IT perspective, cost savings can be realized in infrastructure management and support early in an implementation. This should also translate into better service to the business unit in system availability and recovery of information and systems. These cost savings are magnified in widely distributed environments where the user and client system environments have become more complex.

Further, by enabling the IT environment to closely reflect the business environment, through the hierarchical structure supported by the Active Directory and the integration of a number of business-critical features, IT will have a greater ability to support business initiatives and capture their associated value. Moreover, the availability of information and organizational resources can affect a business's ability to respond to its changing environment. Windows 2000 Server takes great steps to facilitate the availability of information and applications, not only from the perspective of system availability, but also from the perspective of being able to locate resources through the directory services.

A number of the benefits of Windows 2000 Server are fully realized only with the implementation of Windows 2000 Professional on the desktop. In this regard, the server and desktop versions of the operating system are more closely linked than before. Together, they offer simple solutions to complex, and sometimes previously insurmountable, problems.

Overall, the most critical features along with their corresponding benefits are as follows:

  • Enhanced Distributed Services – including Active Directory™ directory services, standards-based security, and management tools (such as IntelliMirror) that allow the organization to create policy and perform more centralized management and oversight of a distributed environment.

  • Enhanced Reliability and Availability – through many system enhancements, including the Distributed File System, enhanced Clustering Services, and a reduction in the number of scenarios that require system reboots or service restarts, to ensure greater system uptime.

  • Enhanced Scalability – through improved Symmetric Multi-Processing, Network Load Balancing, and Enterprise Memory Architecture (EMA).

  • Enhanced Application Support – through the integration of a number of services into the COM+ component model, enabling integrated Web-based applications.

  • Enhanced Interoperability – through adherence to standards, including an X.500-style hierarchical naming structure for directory information and the Lightweight Directory Access Protocol (LDAP) for access to and from the directory, giving the organization greater ease of communication within heterogeneous environments.
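The hierarchical, X.500-style naming mentioned above can be made concrete with a short sketch. The snippet below assembles an LDAP-style distinguished name from an organizational path; the user, organizational units, and domain are hypothetical examples, not drawn from any particular deployment.

```python
# Sketch: composing an X.500/LDAP-style distinguished name (DN) from a
# hierarchical organizational path. All names are hypothetical examples.

def build_dn(common_name, org_units, domain_components):
    """Assemble a DN with the most specific component first, root last."""
    parts = [f"CN={common_name}"]
    parts += [f"OU={ou}" for ou in org_units]
    parts += [f"DC={dc}" for dc in domain_components]
    return ",".join(parts)

# A user in the Sales OU of a (hypothetical) example.com domain:
dn = build_dn("Jane Doe", ["Sales", "NorthAmerica"], ["example", "com"])
print(dn)  # CN=Jane Doe,OU=Sales,OU=NorthAmerica,DC=example,DC=com
```

The most-specific-first ordering is what lets the directory mirror an organization chart: each `OU=` component corresponds to one level of the hierarchy.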

Larger versus Smaller Environments

In larger environments, the savings in IT management and administration alone may be sufficient to justify rapid migration to Windows 2000 from earlier versions of Windows NT Server. This is especially true in IT organizations where IT personnel and computing resources are widely distributed and loosely connected. The benefits of the integrated Active Directory, along with improved availability of resources and information, can be evaluated within this business context.

Despite the numerous benefits, the associated costs should also be considered. In smaller environments, the cost of learning the new technology, planning the implementation and performing the conversion, along with the risks that come with the implementation of any new technology, may outweigh the immediate cost savings-related benefits. Other aspects of Windows 2000, however, may still make implementation compelling. These aspects could include: security, application support, higher availability and new internet features.

Potential Implementation Issues

Since this new technology constitutes a considerable change to the fabric of the IT infrastructure environment, an organization must be willing to spend the required resources to plan the fundamental restructuring of the IT infrastructure. Planning, especially in larger organizations, will pay dividends.

Planning and implementation may include the involvement of scarce, high-level human resources that are involved in the day-to-day management of the current infrastructure. While the move to the new technology may start shortly after the availability of Windows 2000, the implementation timeframe may be longer for large organizations. The prioritization and phasing of the new environment must be well-planned to capture and fully realize key benefits.

Summary

In essence, greater value will be more readily apparent to the larger enterprise, especially one that is already committed to a Windows NT Server 4.0–based environment. When implemented properly, the ability to organize the IT infrastructure in a manner similar to the business organization will potentially improve manageability and information availability. In today's fast changing business environment, the bottom line of an organization could be positively impacted by taking early advantage of the potential of Windows 2000.

II. Objectives, Scope and Approach of the Study

A. Objectives and Scope

The traditional technology study focuses on Total Cost of Ownership (TCO) in determining whether an organization should migrate to a specific technology. Arthur Andersen's study on Windows 2000 Server has expanded this approach, enabling customization of the study to develop a more meaningful ROI assessment that is specific to an organization.

This report makes generic high-level ROI observations about the impact of Windows 2000 Server that may be used by corporate customers as a starting point in assessing the financial impact Windows 2000 Server may have on their organization.

Specific ROI calculations are not provided in this study because there are too many organization-specific factors that need to be considered in the migration decision. However, through each step of the ROI process provided in the study, an organization may attain an increasingly refined ROI answer. The steps are illustrated below:

[Figure: The ROI refinement steps]

The focus of this report is on the qualitative returns that will assist potential customers in evaluating Windows 2000 Server. As input to the study, Arthur Andersen utilized Microsoft's internal product technical specifications and marketing materials to arrive at relevant test criteria. This information forms the basis of the scenario testing that was conducted in our technology lab.

The scope of the study is limited to performance and benefits associated with Microsoft Windows 2000 Server compared to Microsoft Windows NT Server 4.0 with Service Pack 5. In order to test the benchmarking criteria, we developed two or more test scenarios for each of the following four key IT areas:

  1. Manageability

  2. Availability

  3. Scalability and Performance

  4. Interoperability and Security

B. Approach

In conducting the study, the 4x7 Benefit Matrix from the Rapid Economic Justification (REJ) Framework (developed by Microsoft and a group of consulting firms) was utilized. The report focuses on the impact Windows 2000 Server will have on the IT Area (left side of the matrix). In order to arrive at a more complete financial analysis, companies are strongly encouraged to perform a broader analysis, similar to the one reflected in the 4x7 Benefit Matrix (see below). However, much of the report's analysis, and many of its conclusions, will help companies analyze and assess the Business Area of the matrix as well.

[Figure: The REJ 4x7 Benefit Matrix]

In conducting this study, a critical path approach was used to develop the conclusions relating to the product itself and its potential ROI applicability to non-test environments. Throughout the study, the focus of the analysis is on those areas of Windows 2000 Server that have potential financial implications for an organization. That is, while an exhaustive technical study of the product has not been performed, the focus is on areas that would have a material financial impact.

The underlying analysis relates the ongoing costs and benefits of the product to the one-time investment required to implement it. This principle has been used throughout in order to determine the value of Windows 2000 Server to an organization.

The first step was to identify high-impact ROI features or components of an operating system based on four key IT areas: Manageability; Availability; Scalability and Performance; and Interoperability and Security.

Next, a list was developed containing criteria and specific scenario tests that would highlight the differences between the Windows 2000 Server and Windows NT Server 4.0 operating systems. The results of this scenario testing have been applied to the IT Profile Matrix (see below), a tool that captures the highest-level characteristics of an organization, enabling organizations to see how the feature-specific ROI impacts would affect their particular environment.

Since different organizations focus on different business initiatives, we analyze seven current initiatives that organizations should review in order to qualify the findings in the IT profile matrix.

[Figure: Windows 2000 Server IT Profile Matrix]

Windows 2000 Server IT Profile Matrix

III. Basic ROI Framework and Methodology

A. Introduction

Organizations differ greatly in their use of PC technology, resulting in widely varying potential ROIs. In order to accommodate a more complete range of organizational technology characteristics, we utilize the Windows 2000 Server IT Profile Matrix detailed in this section. The matrix represents a way to visualize the differing impacts based on three fundamental criteria:

  • the role of the server,

  • the distribution of the IT infrastructure, and

  • the complexity of the environment.

The business initiatives described in Section IV-C may provide additional areas to review applicable to a specific business environment.

Finally, the detailed appendices in Section V enable an organization to apply the specific comments related to key IT areas to its own situation.

B. Summary of ROI Assessment Steps

Potential corporate customers may make use of the tools described below to better understand the impact of Windows 2000 Server on their organizations. The overall assessment of the ROI impact for each area includes the following steps:

[Figure: Overview of the four ROI assessment steps]

  1. The organization determines which areas of the Windows 2000 Server ROI impact matrix apply to it by identifying the role of servers in the organization, the distribution of the IT infrastructure, and the complexity of the IT environment.

  2. For each matrix area chosen, the company assesses four key IT areas. As input, the company uses our ratings and the ROI Impacts of Specific Features.

  3. Business initiatives are used to analyze how Windows 2000 Server may apply, given a particular circumstance. The ROI may be further refined.

  4. The detail of this paper is provided to assist the organization in refining or drawing further conclusions specific to its situation.

The following pages detail the ROI assessment steps.

C. Using the ROI Assessment Tool

[Figure: Step 1 – Identify relevant segments of the IT Profile Matrix]

1. Identify Relevant Segments of the IT Profile Matrix

As a first step in its ROI assessment, the organization determines its place in the IT Profile Matrix and compiles a preliminary list of potentially high-impact ROI areas.

Key Initial Characteristics for Organizations

The following three key characteristics are captured in the IT Profile Matrix. The full ROI picture will contain these three key characteristics, along with the analysis of the ROI Impacts of Specific Features, comparison to the Business Initiatives, and relevant detail from the test results.

i. Role of Server: This is the primary function a server provides to its network community. The four roles defined for this paper are:

  • Infrastructure Server - Infrastructure Servers deliver the facilities that desktop, network, and data center administrators need to manage an enterprise environment. They are made up of services built into Windows 2000 Server, such as Active Directory services, DNS, DHCP, Windows Management Services, and Networking and Communications services.

  • File and Print Server - File Servers provide network data storage, allowing remote users to share files. Print Servers provide printing services across the intranet and internetwork. Windows 2000 Server provides integrated services for file sharing and printing.

  • Application Server - Application Servers typically offer faster processors; wider, faster buses; and very large memory models. They provide back-end resources to client-side applications requiring heavy disk I/O and processor-intensive computations. Application Servers can also integrate applications with different data sources and support applications built on the n-tier client/server model.

  • Web Server - Web Servers provide corporations with key services for developing three-tiered applications that take advantage of Web-based technologies. Windows 2000 Server provides integrated services for Web serving and streaming media.

ii. Environment Distribution: This is the extent to which the environment is dispersed from a central location and accessible only by slow (56 Kbps–1.5 Mbps) or intermittent (dial-up, ISDN) communications lines.

  • An organization with two sites linked via a T1 (1.544 Mbps) or faster communications line is considered narrowly distributed.

  • An organization with five or six sites connected by both slow and fast communications lines may be considered moderately distributed.

  • An organization with 15 or more sites with at least some of those sites connected by slow communications lines is considered widely distributed for the purpose of the analysis in this study.
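As a rough illustration, the distribution guidelines above can be expressed as a small classification function. The thresholds mirror the text; the function itself is only a sketch, not part of any Microsoft tooling.

```python
# Sketch: classifying environment distribution per the guidelines above.
# Thresholds follow the text; this is illustrative only.

def classify_distribution(num_sites, has_slow_links):
    """has_slow_links: any links below T1 speed or intermittent (dial-up, ISDN)."""
    if num_sites >= 15 and has_slow_links:
        return "widely distributed"
    if num_sites >= 5:
        return "moderately distributed"
    return "narrowly distributed"

print(classify_distribution(2, False))   # narrowly distributed
print(classify_distribution(6, True))    # moderately distributed
print(classify_distribution(20, True))   # widely distributed
```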

iii. Complexity of Environment: Increasing the complexity of the organization's IT environment increases the TCO. Three factors contribute to complexity, and each increases the cost to the organization. It does not matter which of the three factors makes the environment complex; what matters is which matrix segment the company falls into as a result of these factors. The factors are:

  • The number of servers in the environment. A large number of servers is generally more difficult to manage and administer. Typically, the number of domains grows and the utilization of servers decreases as the server environment becomes more complex.

  • The heterogeneity of the operating system environment, both between various versions of Microsoft Windows and also between other server and client systems. In general, the less standardization existing across the organization, the more difficult (and hence expensive) it is to install, maintain, and support technologies.

  • The number of users in the organization. As a greater number of users are supported in the environment, the overall application requirements grow in terms of variety, complexity, and performance.

    In this study, we evaluate each intersection of these parameters, and assess the potential ROI impact.

Instructions for Completing the Windows 2000 Server IT Profile Matrix

To place the IT organization in the Windows 2000 Server IT Profile Matrix:

  1. The organization determines the role of servers in its environment and the relative importance of these server roles in the environment to prioritize the evaluation. A single organization may evaluate the criteria in multiple segments (one for each server role) and weigh the results according to priorities.

  2. The organization assesses the level of distribution of servers in the IT environment.

  3. The organization assesses the complexity of the IT environment based on the following guidelines:

    Number of Servers                               Complexity
    Greater than 300                                Complex
    Between 20 and 300                              Average
    Less than 20                                    Simple

    Number of Users                                 Complexity
    Greater than 1,500                              Complex
    Between 500 and 1,500                           Average
    Less than 500                                   Simple

    Operating System Mix                            Complexity
    Windows NT Server 4.0 with Unix and/or Novell   Complex
    Windows NT Server 4.0 and 3.51                  Average
    Windows NT Server 4.0 only                      Simple
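The complexity guidelines can likewise be sketched as simple lookup functions. The boundary handling (exactly 20 or 300 servers, exactly 500 or 1,500 users) follows the wording of the guidelines; treat this as an illustrative reading rather than an official rule.

```python
# Sketch: applying the complexity guidelines. Boundary values (20, 300,
# 500, 1,500) are assigned per the "Between ... and ..." wording.

def server_complexity(n):
    if n > 300:
        return "Complex"
    if n >= 20:
        return "Average"
    return "Simple"

def user_complexity(n):
    if n > 1500:
        return "Complex"
    if n >= 500:
        return "Average"
    return "Simple"

def os_mix_complexity(has_unix_or_novell, has_nt351):
    if has_unix_or_novell:
        return "Complex"
    if has_nt351:
        return "Average"
    return "Simple"

# A 450-server, 800-user shop running NT 4.0 alongside NT 3.51:
print(server_complexity(450), user_complexity(800),
      os_mix_complexity(False, True))  # Complex Average Average
```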

Initial Assessment using the IT Profile Matrix

Andersen tested the four key IT areas and arrived at a list of high-level ROI impacts for each of them. For example, the first key IT area is Manageability. As a result of our testing, we determined the following to be potentially high-impact ROI areas of Manageability:

  • Disk and Printer Resource Management

  • Centralized Management

  • Terminal Services with Multi-Lingual/Language

Based on our testing, we provide for each of the four key areas the following indicators: an up arrow (↑) for a potential positive ROI, a down arrow (↓) for a potential negative ROI, or a flat line (–) for no significant ROI impact. Other factors specific to an organization may impact the ability to achieve the full effect of the anticipated ROI impact.

2. Review ROI Analysis

[Figure: Step 2 – Review ROI analysis]

In Step 2, the organization reviews the analysis of the four ROI impact areas to determine what specific Windows 2000 Server options and features apply to its IT environment. The ROI impact areas are:

  • Manageability

  • Availability

  • Scalability and Performance

  • Interoperability and Security

Some business activity illustrations are included throughout the text, indented within vertical borders, to facilitate understanding.

3. Review Business Initiatives

[Figure: Step 3 – Review business initiatives]

Andersen outlines seven broadly based business initiatives that apply to many or all organizations. If an organization has characteristics similar to those outlined in this section, it should make an effort to review the impact of the Windows 2000 Server family. The Business Initiatives section includes some of the ways this technology may enable a business to achieve its goals. However, the answers are organization-specific and should not be treated as general conclusions of this study.

4. Review Detail Test Results

[Figure: Step 4 – Review detail test results]

If an organization wants a more detailed view of the ROI impacts, it should complete step 4. This review of the Detail Test Results may assist the organization in refining the assessments made in steps 1 through 3 above, and generate other organization-specific conclusions.

IV. Analysis

A. IT Profile Matrix

The pre-filled IT Profile Matrix appears below. The results shown in this table are based on scenarios that were constructed using the criteria described in Appendix D. "Criteria for Scenario Testing".

To use the tables, an organization must know the role and number of servers as well as the distribution and complexity of its environment. With these three variables, the organization can obtain a preliminary assessment of the impact of the Windows 2000 Server family, based on the four key IT areas.

[Figure: Pre-filled Windows 2000 Server IT Profile Matrix]

Windows 2000 Server IT Profile Matrix

B. ROI Analysis

1. Manageability

A number of additions to the Windows 2000 platform improve the ability to manage enterprise resources. The primary conceptual hurdle that needs to be overcome is the structure and organization of enterprise information in the Active Directory. Those familiar with an X.500 style directory like Novell's NDS will easily see how the hierarchical structure can be organized. However, understanding how Group Policy works, and how the rules of inheritance and all of its nuances are implemented, will take some effort. In a small environment with few servers and a single domain, the Windows NT Server 4.0 paradigm is still an appropriate method of managing resources. Once an organization expands into multiple domains in a Windows NT Server 4.0–based environment (which may translate into Organizational Units in Windows 2000 Server), the Windows NT Server 4.0 method quickly becomes difficult and archaic in comparison to Windows 2000 Server.

Active Directory

Features such as Multi-Master Active Directory data stores and Delegation of Authority provide the tools required to perform centralized management and oversight of a distributed environment, and allow local administrators or help desk personnel the security and functions required to solve local issues in a timely manner. One company estimates that this facilitates a reduction in domain servers from 168 to 8 in a single domain environment, resulting in annual cost savings of more than $700,000. While the inheritance model is complex, it is also effective in that certain security, application and desktop settings can be established at an enterprise level and then either forced at lower levels or allowed to be overridden. There are some limitations to this strategy, since there is no ability to enforce whether an overridden parameter is more or less restrictive than the inherited parameter.

The enterprise security setting allows five logon attempts before the user is locked out of the system; the account is reset after 15 minutes. If a group within the company, defined as an Organizational Unit (OU), requires greater security, the security parameters at the higher level have to be overridden. In this case, however, other OUs could override the parameter with less restrictive policies. The work-around solution is to establish the security parameters at a lower level in the hierarchy, but this reduces some of the advantage of the containership paradigm. While such issues may be a nuisance, they are not terribly problematic and are certainly outweighed by the advantages of inheritance and group policy.
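The inheritance-and-override behavior described above can be modeled in a few lines. This is a conceptual sketch only, not the Group Policy engine; notice that nothing in it can enforce whether an override is more or less restrictive than the inherited value, which is exactly the limitation noted in the text.

```python
# Sketch: inherited enterprise settings overridden at the OU level.
# Conceptual model only; not an implementation of Group Policy.

ENTERPRISE_POLICY = {"lockout_attempts": 5, "lockout_reset_minutes": 15}

def effective_policy(inherited, override=None):
    """An OU-level override replaces inherited values; nothing checks
    whether the override is tighter or looser than what it replaces."""
    result = dict(inherited)
    if override:
        result.update(override)
    return result

# A high-security OU tightens the lockout threshold...
secure_ou = effective_policy(ENTERPRISE_POLICY, {"lockout_attempts": 3})
# ...while another OU can just as easily loosen it.
lax_ou = effective_policy(ENTERPRISE_POLICY, {"lockout_attempts": 10})

print(secure_ou["lockout_attempts"], lax_ou["lockout_attempts"])  # 3 10
```

Unspecified settings (here, the 15-minute reset) flow down unchanged, which is what makes the containership model efficient at enterprise scale.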

Centralized Management

One challenge faced by IT managers is how to control the desktop from a centralized management system. This issue has been addressed in a number of ways. In Windows NT Server 4.0, policies are server-centric so that in order to enforce a policy, the template file must be placed on a server in a specific file location and that location has to be referenced in the user setup. With Windows 2000 Active Directory, these policies and additional security settings are managed in a central directory store. This allows for instantaneous (discounting the replication schedule) and easy implementation of changes to policy, saving IT managers from having to reconfigure all servers that implement the policy and reducing IT resource requirements.

Policies are applied at the OU, where all objects in the OU container will inherit the policy, but are filtered by group membership. In this way, control of which users are affected by what policy is easily manageable. The centralized nature of user information combined with control over local group information and synchronization to other directories (currently only Microsoft Exchange and Novell NDS - both with some limitations) makes removal of a user from the system a simple one-step process. To the extent that user existence within the enterprise is determined by another system (for example, Human Resources), the Active Directory Services Interface (ADSI) can be used to interface into the Active Directory and programmatically remove the user.
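The one-step removal the text describes can be illustrated with a toy in-memory directory. Real deprovisioning would go through the Active Directory Services Interface (ADSI); the structures and function below are hypothetical stand-ins that show only the cascading effect of a single centralized store.

```python
# Sketch: why a central directory makes user removal a one-step operation.
# A toy in-memory "directory"; real code would use ADSI, not this.

directory = {
    "users": {"jdoe": {"groups": {"Sales", "AllStaff"}}},
    "groups": {"Sales": {"jdoe"}, "AllStaff": {"jdoe"}},
}

def remove_user(d, user):
    """Deleting the user object also clears every group that references it,
    so no per-server cleanup pass is needed."""
    entry = d["users"].pop(user)
    for group in entry["groups"]:
        d["groups"][group].discard(user)

remove_user(directory, "jdoe")
print(directory["users"])            # {}
print(directory["groups"]["Sales"])  # set()
```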

Disk and Printer Management

In addition to user and system management, enhancements have been made in disk and printer management. The incorporation of Disk Quotas (previously available from a third party in Windows NT Server 4.0) and Hierarchical Storage Management (HSM) allows much greater flexibility in managing centralized disk storage resources. Even with disk capacity becoming less expensive, having disk capacity when and where it is needed and being able to control the availability of storage space in critical areas can resolve many user problems proactively. With HSM, parameters can be set to move older, less-used files off the disk onto a multi-tape storage device, providing a very large storage space for files even when disk capacity is somewhat limited.

The Remote Storage and Removable Storage management consoles are produced by third parties, and certain limitations are built into the system. The primary limitation is that the remote storage function, part of the HSM solution, can only manage disks directly connected to the same system as the remote storage device; it cannot manage disks attached to other servers, even if those disks are shared. If that capability is required, a third-party application must be acquired to provide it. The system infrastructure for these add-ons is incorporated into Windows 2000 Server, so third-party developers can extend it.
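The HSM idea of migrating older, less-used files once free disk space falls below a threshold can be sketched as a selection policy. The function, file list, and thresholds below are illustrative assumptions, not the actual Remote Storage implementation.

```python
# Sketch: a least-recently-accessed migration policy, in the spirit of HSM.
# Illustrative only; not the Windows 2000 Remote Storage service.

def select_for_migration(files, free_bytes, low_water_mark):
    """files: list of (name, size_bytes, last_access_ts).
    Migrate the oldest files until free space reaches the low-water mark."""
    to_move = []
    for name, size, last_access in sorted(files, key=lambda f: f[2]):
        if free_bytes >= low_water_mark:
            break
        to_move.append(name)
        free_bytes += size  # space reclaimed once the file moves to tape
    return to_move

files = [("q1.xls", 400, 100), ("notes.txt", 100, 300), ("old.doc", 500, 50)]
print(select_for_migration(files, 200, 900))  # ['old.doc', 'q1.xls']
```

Nothing is migrated when free space is already above the mark, which is why the system can present a very large logical store while disturbing active files as little as possible.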

Setting up a print server in Windows NT Server 4.0 took considerable effort to configure and make available to users, and it took comparable effort for users to create a printer connection that was convenient and efficient for their business functions. With the ability to publish printers in the Active Directory, Windows 2000 Server makes all shared printers in a domain available in the directory. This enables users to quickly locate the most convenient printer resource. In an environment with high user mobility, this may reduce the time required to find and configure a new printer by two-thirds, saving hours of collective user time and allowing users to spend time doing the work for which they were hired. Printer management is also improved by allowing print administrators to manage printers using the Internet Explorer browser. The Windows 2000 family supports the Internet Printing Protocol (IPP), which lets users print directly to a URL over the intranet or Internet if they know the printer's URL.

Consider a new user who has just joined XYZ company as a temporary employee. On his first day, he receives an e-mail from the printer administrator stating the URL of the printer he should use. The new temporary employee launches Internet Explorer and clicks the URL of the print device, which connects him to the printer and downloads any required device drivers. Because little time is spent configuring network settings or printer drivers, the temporary employee can get right to work.

Terminal Services with Multi-Lingual/Multi-Language Support

The Terminal Services add-on in a Windows 2000 Server–based environment, like Windows NT 3.1 with a Citrix solution, allows the use of thin clients to access one or more sessions. Windows 2000 Server provides all of the same capabilities as Windows NT Server 4.0, Terminal Server edition, but two differences expand on the capabilities of Windows NT Server 4.0 Terminal Server to provide a more complete package for serving thin clients.

The first is that the Terminal Services add-on in Windows 2000 Server is an extension of the basic executable operating system. No separate installation of the operating system is required; the service is added through the normal installation process and then the required licenses are purchased to activate the service (the beta 3 version did not require licensing). This means that only one operating system will need to be managed and all versions can be updated to the same service pack levels when those service packs are available in a Windows 2000–based environment. In Windows NT Server 4.0, a different executable operating system with separate service packs is required.

The second factor is that all Windows 2000–based systems are multi-lingual out of the box. This means that the entry of data into any application supporting Unicode can be changed to the user's character set without requiring a separate version of the server or a reboot of the system. Also, if supported by the application, multiple languages can be displayed simultaneously. The multi-language features, which present menus, help and dialog boxes in a different language, may be acquired separately and are installed alongside one another so that a single operating system can support multiple languages simultaneously. Together, these features provide an easy and quick implementation to support global users from a single server on legacy client systems.

An American company has entered into a joint venture with a Korean firm to provide engineering support in its product design. In order to incorporate the new Korean users into the American company's systems, standard e-mail and office productivity applications will be introduced requiring an upgrade to Office 2000 overseas. In addition, a number of the engineers will cycle through the U.S. facility in order to get acquainted with the company, culture and other engineers. There is a capable systems administrator in Korea who understands little English, but is obviously fluent in Korean. In order to get the facility in Korea up and running quickly, Windows 2000 Server with Terminal Services is installed with multi-lingual English and Korean. In addition, the Korean language files are added to the server so that the local administrator can manage it. When the engineers visit the U.S., a Terminal Services enabled Windows 2000–based server is similarly configured to allow them access to their required applications and files in their native language.

Backup and Recovery

The function of backup and recovery is critical to all organizations. New features in Windows 2000 Server can significantly reduce the time required to bring a system back on line after failure of a disk drive. The integrated backup utility allows access to a wider variety of backup devices than the Windows NT Server 4.0 version of the same utility. Most notably, a backup can be made to another disk device on another system on the network. This allows a quick restore of all of the settings should an operating system restore be required.

Secondary Logon

Typically, a system administrator will be logged onto a system as a user with reduced privileges while performing routine system management functions. A request to perform an administrative task may then require the administrator to log on as a different user or use another machine. With the secondary logon feature of Windows 2000 Server, an application can be executed under another user ID and password, so the administrator does not have to log off and back on again to perform the task. Similarly, an administrator can troubleshoot another user's security issue by executing an application as that user and determining where the problem occurs.

Group Policy and Dynamic Configuration

The combination of Group Policy, folder redirection, roaming profiles and other platform components can be utilized to allow limited administrator interaction when moving a user from one system to another, or when setting up a system for the first time. Since Group Policy provides administration of the user and the computer independently, new users can move to computers that have not been preconfigured and logon to establish their environment, including their desktop settings, requisite applications and access to data files. A design and manufacturing company spends an average of 2 ½ hours configuring a new system or moving a user to another client system. With the features of the Windows 2000 Server and Windows 2000 Professional platform, this potentially could be reduced to one hour.

Folder redirection with the synchronization manager maintains user data files on the server and replicates them to the local drive when the user logs off of the domain. These files, which appear to the user as local files, are backed up on a regular basis to reduce the possibility of file loss through user error or client system failure.

ROI Issues

Many of the enhancements described above result in labor efficiencies. These savings accrue not only to the IT administrator but also to the end user. Therefore, to calculate a particular value metric, one would certainly take into account the following: (1) time saved relative to the previous technology (some figures may be obtained from this white paper, while others require an organization-specific analysis), (2) the experience level of the person involved, (3) the cost per labor hour and (4) the activities the person could have pursued had he or she not been dealing with IT issues. This last factor is more difficult to quantify. It entails examining that individual's tasks and exploring what value they add. For some employees, the time conflict means they have to work longer hours to accomplish their objectives, with ramifications in employee dissatisfaction and lower morale. For some executives, time is their greatest constraint and they simply cannot work longer because there are not more hours in the day. For these individuals, time spent on internal IT issues instead of business issues can result in lost intellectual capital.
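As one hedged illustration of the value metric above, the first three factors can be combined into a simple annual figure. Every number in this sketch is a hypothetical placeholder, not data from this study; an organization would substitute its own hours, rates and overhead multiplier.

```python
# Illustrative sketch of the labor-savings value metric described above.
# All figures are hypothetical placeholders, not data from this study.

def labor_savings(hours_saved_per_year, cost_per_labor_hour, loaded_multiplier=1.0):
    """Annual value of time saved, at a (fully loaded) hourly labor cost."""
    return hours_saved_per_year * cost_per_labor_hour * loaded_multiplier

# Example: an administrator saves 120 hours/year at $45/hour, with a 1.3x
# multiplier standing in for benefits and overhead.
admin_savings = labor_savings(120, 45, 1.3)
print(f"${admin_savings:,.2f}")  # $7,020.00
```

The fourth factor, opportunity cost, resists a formula of this kind and is better handled qualitatively, as the text notes.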

2. Availability

Whenever systems are not available for their intended purpose, the effects can range far beyond the simple effort required to restore them to full operation. The cost to an organization of lost productivity and data can be considerable. One of the key areas where Windows 2000 Server has been improved over Windows NT Server 4.0 is in the area of availability.

Driver Signing

Unexpected downtime is the most costly availability issue. Resources becoming unavailable during critical business functions or operations may impact a large population, depending on the role of the server. To address this issue, Windows 2000 Server introduces the concept of driver signing. While Microsoft itself does not create the majority of device drivers used in a typical installation, it has instituted a new program to rigorously test device drivers' compliance with resource usage standards. Once a driver is approved, it is signed. The administrator then has the option of allowing only signed drivers on the system. Predicting the actual effect of this option on the frequency of system crashes is difficult, but we would expect a marked decrease in those server failures caused by memory or other resource abuses by a device driver.

Clustering

The clustering capabilities previously available only with an add-on product are now incorporated into the operating system and have been enhanced. Both Fail-Over and Load Balancing are clustering features made available through selections from the installation program. The fail-over functionality has been enhanced to support multiple fail-over systems. Instead of the previous limit of two systems, four systems can be clustered to provide multiple system fail-over capability in the event the primary system fails. Critical database applications may lose connections, but the relatively short time it takes to switch to the backup system (10 to 15 seconds) will not be significant to most users.2

The load balancing features have been extended to include not only network load balancing, but also COM+ object balancing. This allows the middle layer of a three-tiered architecture to be distributed to multiple systems both for reliability and performance improvements.

Recovery Time

When systems do fail, whether unexpectedly or as planned, the time required to bring the system back to full function can be critical. In addition to improvements in this recovery time, Windows 2000 Server has been designed to reduce the number of reboots required for maintenance upgrades. It will be some time before the familiar message that the system must be rebooted after an installation disappears from all of the appropriate places, but once software developers have made the adjustment, fewer reboots will be required.

Additional features address the issue of Mean Time To Recover (MTTR). The Check Disk (CHKDSK) process, a utility that is run by default after a system crash, has been retooled to complete mission critical functions in a much shorter time. One company estimates that CHKDSK on a Windows 2000 Server–based system will execute about 10 times faster than on a Windows NT Server 4.0–based system. The addition of Plug and Play significantly reduces the recovery time after adding a component to the system. In addition to the reduced reboots of the server, Plug and Play enables much quicker configuration of devices relative to Windows NT Server 4.0. Also, Service Pack slipstreaming removes the requirements to apply service packs that have already been applied.3

Consider the installation of an additional adapter card or a modem card on a Windows NT–based server. One must first turn the system off to install the card, reboot the machine after the necessary network bindings are configured and then reboot again after re-applying any required service packs. Service packs have to be re-applied to prevent major problems when the system state has been changed (for example, after a RAS installation). On a Windows 2000 Server–based system, the only time the system would be down is during the installation of the physical hardware itself.4 No reboot is necessary after network binding configuration and, because of Service Pack slipstreaming, the service pack no longer needs to be re-applied.

The combination of these features will significantly reduce downtime and the costs associated with servers being unavailable. A company with 12,000 users predicts reduced-downtime savings of approximately $3 million annually. The savings stem from the ability of Windows 2000 Server to run CHKDSK faster than Windows NT Server 4.0 and from its Offline Folders, which allow some work to continue even while the system is recovering. The company calculates this savings by determining the saved incremental system downtime, in hours, and multiplying it by the fully loaded hourly labor cost per employee.
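The company's calculation can be sketched as follows. The hours, headcount and hourly rate below are illustrative stand-ins chosen only so the arithmetic is visible; they are not the company's actual inputs.

```python
# Sketch of the downtime-savings calculation described above.
# The figures are hypothetical, chosen only to make the math visible.

def downtime_savings(hours_saved, employees_affected, loaded_hourly_cost):
    """Saved incremental downtime (hours) x affected headcount x fully
    loaded hourly labor cost per employee."""
    return hours_saved * employees_affected * loaded_hourly_cost

# Example: 5 hours of avoided downtime per year across 12,000 users at $50/hour.
print(f"${downtime_savings(5, 12_000, 50):,}")  # $3,000,000
```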

Data and Application Availability

Loss of data can be much more expensive than loss of the hardware that stores it. A number of enhancements in Windows 2000 Server provide the ability to reduce the risk of losing data stored on a server. The Distributed File System (DFS) is available in Windows NT Server 4.0, but has been enhanced in Windows 2000 Server to allow a level of fault tolerance by replicating files stored on one root share to another. If one of the shares fails or is unavailable for a time, users who normally use files on the failed share will automatically be directed to the replica share. A DFS share can also be expanded to volumes on different servers so that one logical drive is presented to users. One weakness is that only one DFS share can be created on a server, which may limit this feature's utility in some instances.

While DFS contributes to the ability to expand the size of logical volumes, Disk Quota and Remote Storage Services can be utilized to control the amount of disk capacity available. Both of these features are available through third party add-ons in Windows NT Server 4.0. Disk quotas can be extended to apply to the client computer so that limits can be set on local as well as central resources. Unfortunately, Windows 2000 Server does not allow the application of disk quotas to users through Group Policy. The association of users to volume quotas is still maintained on the server.

The Remote Storage Services (RSS) feature is configured through the common Microsoft Management Console (MMC) interface to provide hierarchical data management. Parameters can be set on each disk on a target server so that minimum availability on the critical disk system can be maintained by caching less-used files to the remote storage device. As long as the required free space is not entirely consumed between caching events and enough files remain eligible to be moved, space will be available for general use. When a migrated file is then requested from disk, RSS brings it back from tape.
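The caching behavior described above can be modeled with a short sketch. This is only an illustration of the policy (a minimum file age plus a free-space target), not the actual RSS implementation; the file names, sizes and thresholds are made up.

```python
# Illustrative model of the caching policy described above: when free space on
# a managed volume drops below a target, migrate the least recently used files
# (older than a minimum age) to remote storage until the target is met.
# This models the behavior only; it is not the actual RSS implementation.

def select_files_to_migrate(files, free_space, target_free, min_age_days, now):
    """files: list of (name, size, last_access_day). Returns names to migrate."""
    eligible = [f for f in files if now - f[2] >= min_age_days]
    eligible.sort(key=lambda f: f[2])  # least recently accessed first
    to_move, reclaimed = [], 0
    for name, size, _ in eligible:
        if free_space + reclaimed >= target_free:
            break
        to_move.append(name)
        reclaimed += size
    return to_move

files = [("old.dat", 400, 10), ("recent.doc", 300, 95), ("stale.log", 200, 30)]
# 500 units free, target 1000; only files untouched 60+ days qualify (today = day 100).
print(select_files_to_migrate(files, 500, 1000, 60, 100))  # ['old.dat', 'stale.log']
```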

Quality of Service (QoS)

All of the hardware components required by an application may be up and running, yet application performance can still be limited by network bandwidth. The goal of integrated QoS control is to provide appropriate bandwidth when and where it is required. This makes network resource availability more efficient and ultimately benefits business operations and process uptime. The function requires networking hardware that supports the dynamic assignment of bandwidth to specific ports, but QoS is administered through policy-based management and is therefore easier to apply to the appropriate users or functions. QoS also provides for allocation of planned amounts of bandwidth to support applications that require a minimum quality of service to be functional.

A meeting to define technical specifications for a device must be held between engineers in Germany and Detroit. Typically, the network traffic between these two sites spikes every fifteen minutes or so with the replication of directory data, making a session using NetMeeting® conferencing software difficult to maintain. QoS policies can assign the users a minimum bandwidth, so the meeting can proceed undisturbed for maximum effectiveness. Once the meeting is concluded, the bandwidth is released and other processes may use it as required.

ROI Issues

The impact of availability, or the lack thereof, can be dramatic for all types of organizations. Research firms have found that one hour of downtime can cost from thousands up to millions of dollars per hour, depending upon the industry and business environment. Two organizations, Contingency Planning Research and Gartner Group, have performed availability studies. Their results show an average range from $14,500 per hour of downtime for ATM fees to $6.5 million per hour of downtime for brokerage operations. Calculations to assess the impact of availability may include some of the following factors: (1) idle labor costs, (2) idle machine time or other capital resources resulting in higher production costs, and (3) lost sales for those organizations involved in eCommerce solutions or mass transaction processing.
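The three factors above can be combined into a simple per-incident estimate. The figures below are hypothetical and would be replaced by an organization's own numbers; real assessments would also weight factors differently by industry, as the cited studies suggest.

```python
# A hedged way to combine the three factors listed above into a per-incident
# cost of downtime. All inputs are placeholders an organization would supply.

def downtime_cost(hours, idle_labor_per_hour, idle_capital_per_hour,
                  lost_sales_per_hour):
    """Outage duration x (idle labor + idle machine time + lost sales) per hour."""
    return hours * (idle_labor_per_hour + idle_capital_per_hour + lost_sales_per_hour)

# Example: a 2-hour outage with $8,000/hr idle labor, $1,500/hr idle machine
# time, and $5,000/hr in lost on-line sales.
print(f"${downtime_cost(2, 8_000, 1_500, 5_000):,}")  # $29,000
```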

3. Scalability and Performance

In the past, many viewed the Windows NT Server 4.0 operating system as limiting with respect to serving the needs of the enterprise. The mind-share at the enterprise level belonged to non-Windows operating systems. Windows NT Server 4.0 did not fully utilize available physical resources, such as additional processors or very large memory (VLM). Symmetric MultiProcessing (SMP) on Windows NT Server 4.0 Enterprise Edition was licensed only up to eight processors; an SMP configuration above eight processors, up to 32 processors, required support from individual vendors. In many cases, the incremental benefit per processor of adding more than four processors quickly decreased. Although Windows NT Server 4.0 can provide two gigabytes of virtual address space to every application, Windows NT Server Enterprise Edition is limited to recognizing only up to four gigabytes of physical memory. The four gigabyte tuning functionality extends this capability by allowing large memory-aware applications to use up to three gigabytes, leaving one gigabyte of available memory for the operating system.

Microsoft has addressed these specific issues in Windows 2000 Datacenter Server and Windows 2000 Advanced Server with improved Symmetric Multiprocessing and Enterprise Memory Architecture (EMA). By utilizing Intel's Physical Address Extension (PAE) on Intel 32-bit (IA-32) platforms, Windows 2000 Datacenter Server supports up to 64 gigabytes of physical memory. Key SMP improvements include: 1) better management of CPU memory allocation, which improves locking and balances contention between processors more effectively, 2) a new unit of processing that reduces CPU overhead by using "fibers," which consume fewer resources than threads, and 3) the Windows Job Object API, which interacts more efficiently with the CPU to increase overall scalability.

Other scalability factors include standard clustering capabilities in Windows 2000 Datacenter Server and Windows 2000 Advanced Server. Clustering technology that directly impacts scalability includes Component (COM+) and Network Load Balancing (NLB). COM+ makes it possible to distribute the load on specific applications in an N-Tier architecture while NLB distributes and balances TCP/IP client requests across up to 32 servers transparently to users.

The Windows 2000 Server family includes IP stack optimizations as well as APIs to allow hardware acceleration of key networking functions including checksums, DES/3DES encryption, and IPSec session re-keying. These features increase system throughput, off-loading the system processor for application performance gains.

Memory

Windows 2000 Datacenter Server and Windows 2000 Advanced Server extend the maximum amount of accessible server memory to 64 gigabytes. This improves scalability and performance by allowing more users per server, and allowing applications to store and load data in memory rather than on disk. These improvements are most beneficial to enterprise applications such as databases, Web-based applications and applications accessed through Terminal Services.

Industry experience indicates that enterprise applications typically become I/O bound, regardless of the operating system and hardware architecture. This has caused a resurgence in demand for mainframe systems due to their I/O subsystem bandwidth advantages compared to smaller, less costly SMP systems, since the cost per megabyte of memory has historically been much higher than that of disk storage. However, the cost of memory has fallen dramatically for Intel-based systems. Windows 2000 Datacenter Server and Windows 2000 Advanced Server enable the less expensive Intel-based SMP systems to leverage the price/performance advantages of accessing data from memory. This enables customers to run larger enterprise-class applications on a less costly architecture.

In addition to improved application performance, servers can be consolidated to reduce overall management and hardware costs.

Symmetric Multi-Processing

Windows 2000 Datacenter Server extends the maximum number of CPUs to 32 (64 through OEMs). In Windows NT Server 4.0, there was little performance gained by going beyond four processors due to the SMP affinity mechanism used in the Windows NT Server 4.0 kernel. As a result, the three leading hardware manufacturers for Windows NT Server 4.0–based computers (Compaq, IBM and Hewlett-Packard) have not offered servers with more than four processors for the Windows NT Server 4.0 market segment. The SMP affinity mechanism has been rewritten in the Windows 2000 Server Family to enable effective use of larger scale SMP servers and to achieve better performance from smaller scale SMP servers. Our tests indicate a 63% improvement in transactions per second with Windows 2000 Datacenter Server compared to Windows NT Server 4.0 Enterprise Edition on a four-processor system. Our results and the fact that hardware manufacturers are investing in the production of servers with more than four processors for the Windows 2000 Server family market segment suggest this feature will lead to benefits to customers through SMP scalability beyond four processors.

Network Load Balancing

With the dramatic growth of the Internet and intranets, Web servers are increasingly serving as the front ends to multi-tiered applications. Scalability in such cases has been addressed by the implementation of Network Load Balancing that enables organizations to cluster up to 32 servers. The server cluster functions as one system regardless of the number of servers available within the 32-server cluster. Windows 2000 Advanced Server evenly distributes incoming traffic, providing a single system image to the client. This lets mission-critical servers, through which Web-based applications connect, scale their performance as needed to mirror demand. When a computer fails or goes offline for maintenance, Network Load Balancing automatically reconfigures the cluster to direct client requests to other servers and maintains continuous availability of network services.
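The request distribution described above can be sketched briefly. Real Network Load Balancing runs as a driver on every cluster host and filters traffic in parallel; the model below only illustrates the essential idea of deterministically mapping clients across a cluster that presents a single address, and remapping them when a host goes offline. Server names and addresses are invented.

```python
# Minimal sketch, in the spirit of Network Load Balancing: clients addressing
# a single cluster address are deterministically spread across the live hosts,
# and are redistributed among the survivors when a host fails. This is a model
# of the mapping only, not how the actual NLB driver works.

import hashlib

def pick_server(client_ip, servers):
    """Deterministically map a client to one of the live servers."""
    digest = hashlib.md5(client_ip.encode()).hexdigest()
    return servers[int(digest, 16) % len(servers)]

cluster = [f"srv{n:02d}" for n in range(1, 5)]  # up to 32 hosts in a real cluster
chosen = pick_server("10.0.0.17", cluster)
print(chosen)

# If that host fails, it is removed and the client lands on a surviving host.
cluster.remove(chosen)
print(pick_server("10.0.0.17", cluster))
```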

Another key application of NLB is virtual private networking (VPN). For a mobile workforce, access to the corporate network is critical. Implementing NLB in conjunction with VPN servers offers two key benefits. First, the end user can connect using a single server address as long as there is at least one load-balanced VPN server running. Second, with a large number of users, VPN session load can be distributed across multiple servers to increase session performance and overall throughput.

Component Load Balancing

Similar to Network Load Balancing clusters, COM+ clusters provide scalability by distributing the load at the application level. In an N-Tier architecture solution, the application object can be distributed to multiple servers to balance the load on each application as the number of users increases.

ROI Issues

The Windows 2000 Server family, when compared to the Windows NT Server 4.0 family, allows an organization the flexibility to grow without having to adopt another operating system. A single server could support more users and additional applications while still allowing those applications to function efficiently. This expanded scalability not only saves the purchase of additional software and hardware, but also reduces the need for IT support staff to administer additional operating systems. Further, the diminishing returns of adding more than four processors in previous Windows NT Server 4.0–based servers are no longer a factor in the Windows 2000 Server–based environment.

4. Interoperability and Security

Windows 2000 Server implements a number of industry standards, both within the fabric of the operating system itself and in the communication protocols and security mechanisms used in the basic communications infrastructure, to provide a more secure network. In addition, the directory synchronization options, Lightweight Directory Access Protocol (LDAP) support and the Active Directory Services Interface (ADSI) open up methods of interoperating with diverse systems at several levels in the IT infrastructure.

Directory Standards

The most significant standards that Windows 2000 Server adopts are an X.500-style hierarchical naming structure and LDAP for access to and from the directory. The hierarchical organizational structure allows the corporate structure to be reflected within the IT management framework. Within this hierarchy, rules and policies can be inherited to apply global policy to groups of objects and users inside an organizational unit (OU). Along with Group Policy, the inheritance model and the ability to run multiple logon and logoff scripts per session enable administrators to manage the user environment at the desktop from a centralized store of information. In addition, locating people within the organization, as well as disk, printer and client system resources, is enabled through a search interface on both the server and desktop that is simple to access and use.
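To make the hierarchical naming concrete, the sketch below builds an X.500-style distinguished name from an object's position in the organization. The user, OU and domain names are invented for illustration.

```python
# Illustrative only: how an X.500-style hierarchical name encodes an object's
# place in the organization. The names and domain are made up.

def distinguished_name(common_name, org_units, domain):
    """Compose a DN from a common name, nested OUs (innermost first) and a DNS domain."""
    parts = [f"CN={common_name}"]
    parts += [f"OU={ou}" for ou in org_units]
    parts += [f"DC={label}" for label in domain.split(".")]
    return ",".join(parts)

print(distinguished_name("J. Smith", ["Engineering", "Detroit"], "example.com"))
# CN=J. Smith,OU=Engineering,OU=Detroit,DC=example,DC=com
```

Policies applied at OU=Detroit would, under the inheritance model described above, flow down to objects such as this one inside OU=Engineering.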

Directory Synchronization

There are currently two directory synchronization add-ons available with Windows 2000 Server. The first one, currently shipped with the beta 3 version, is for synchronizing with an Exchange 5.5–based environment.5 Since Exchange 5.5 follows a flat structure versus the hierarchical model of Active Directory, multiple synchronization sessions must be configured to create an effective synchronization implementation in a complex environment. The other add-on interacts with a Novell 4.X/5.X NDS structure to synchronize the two hierarchical directories. In this case, the structure in both systems can be identical as long as Windows 2000 Server is the source directory.6 Both of these add-ons are managed through the MMC interface and communicate via the LDAP protocol through the ADSI interface. These open standards can easily be built upon to develop other directory-specific synchronization tools.

Security and Standard Protocols

By default, all Windows 2000 platform communications are authenticated by the standard Kerberos 5 security protocol. This most recent version of the Kerberos standard is not yet widely implemented. In addition to a greater level of security in all communications and a mild performance benefit due to the nature of the authentication mechanism, this standard will eventually7 provide the ability to authenticate to non-Windows 2000–based systems.

In order to secure sensitive information within a contiguous non-firewalled network (for example, a campus network or privately managed WAN), Windows 2000 Server includes IPSec Transport Mode encryption. Active Directory maintains policies to define IPSec use, and these policies can eventually be received by non-Windows-based systems through Active Directory interfaces. For Virtual Private Networking over the Internet, a number of standard secure communications protocols have been implemented: IPSec Tunnel Mode for IP-only site-to-site connections; Layer 2 Tunneling Protocol encapsulated in IPSec Transport Mode (L2TP/IPSec) for multi-protocol site-to-site connections; and, for remote access, Point-to-Point Tunneling Protocol as a less complex alternative to L2TP/IPSec. IPSec, the Layer 2 Tunneling Protocol, SSL to protect Internet transactions and the ability to authorize, distribute and store X.509 public keys combine to create an easily implemented, full-blown security solution covering internal and external web-based communications.

Windows 2000 Server also includes RADIUS authentication services to allow companies to grant and deny access to remote users through ISP connections. This lets companies outsource remote access infrastructure through VPNs without having to purchase ISP accounts for each user. It also allows companies to monitor use through RADIUS accounting and consolidate billing for ISP use to negotiate reduced rates.

ROI Issues

With the adoption of industry standards and communication protocols, Windows 2000 Server will enable easier access to strategic partners. What may have been an enormous integration task in the Windows NT Server 4.0–based environment is now accomplished more easily due to standardization and the integration of these technologies into the core product.

  • To assess the financial impact, one might assess the following: (1) administrator's time, (2) administrator's experience level, (3) the importance of a consistent flow of information between strategic partners, and (4) the ramifications of not being able to obtain a partner's data.

  • Security is an extremely difficult factor to assess. The ultimate questions are, (1) What would it cost if competitors were able to obtain proprietary information? and (2) How likely is this scenario based upon the current environment? The answers to these questions will be different for each organization, but must be seriously considered.

C. Current Business Initiatives

In today's business environment, advances in technology, communications, and transportation have enabled organizations to become more effective through a myriad of business initiatives. Some of these initiatives have been evolving for decades, while others are more recent phenomena. Nevertheless, these initiatives change the way organizations operate and serve customers. Some of the initiatives include:

  1. Globalization

  2. eCommerce

  3. Outsourcing

  4. Knowledge Management

  5. Telecommuting

  6. Increased Merger and Acquisition Activity

  7. Increased Use of Teams

For each of these initiatives, Andersen assessed how Windows 2000 Server can enable the organization to improve its effectiveness. The following section outlines the global business initiatives and demonstrates how Windows 2000 Server can facilitate their adoption within the organization. Additionally, we have conducted several short, limited interviews with multinational companies to gain their perspective on the benefits of Windows 2000 Server as it pertains to them. The estimated savings are based on their operating environments. Andersen provided no assistance in formulating the cost savings for this report, and has not performed any verification of the data.

1. Globalization

Conducting business outside one's borders has been proliferating for decades. Today, the competitive landscape dictates that organizations not only operate all aspects of their business globally, but also compete globally for business. This global economy can be illustrated by the fact that world merchandise exports between countries have more than doubled since 1980, as shown in the following graph:

[Graph: world merchandise exports between countries, 1980 to the present]

People must collaborate on projects across borders, work remotely and have access to data. Technology has been the catalyst for reducing a company's dependence on physical location, allowing information to be shared across continents as easily as across town. Language translation services will continue to be a critical issue, especially over the Internet. One estimate indicates that the amount of non-English language material on the Web is growing so quickly that by 2003, more than half of the content will be in a language other than English, up from 20 percent today.8 Globalization has also shifted how managers view their organizations. As long as technology progresses, the trend toward enterprise globalization will continue to flourish.

The Windows 2000 platform helps companies break down "language barriers" using multi-lingual/multi-language functionality. When these services are applied in conjunction with Terminal Services, administrators are able to set up user profiles in Windows 2000 Server, using Group Policy for specific users, based on their regional location and native language. The use of roaming profiles enables each user to access his personal settings and applications from any machine on the enterprise network with minimal time and effort, regardless of geographic location. For example, French-speaking users can log on using a terminal server in Spain and retain their language settings without additional time and effort. This frees end users from dealing with technical IT issues and allows them to focus on what they do best.

In one instance, the management of an international organization is required to submit reports to the foreign governments of the countries in which it operates, in each government's native language. Management sees opportunities to streamline the cumbersome translation process through the use of this technology. Not only will the company save time in completing reports, it will also be able to submit them more promptly.

2. eCommerce

Started in the late sixties as a government-initiated project to provide a means of communication that could survive a nuclear explosion, the Internet has grown into a company's gateway to new markets and competitive advantages. Arising from the Internet, eCommerce has emerged as a location-independent medium for conducting business transactions, advertising services, and creating synergies with other organizations. According to a recent U.S. Department of Commerce report, at the beginning of 1999, 37% of the U.S. population had Internet access at home or at work. A 1999 CommerceNet/Nielsen Media Research study demonstrates the migration to on-line purchasing and shopping:

[Figure: CommerceNet/Nielsen Media Research data on the migration to on-line purchasing and shopping]

Companies simply cannot afford to ignore this new medium of commerce if they are to remain competitive.

The following technologies in Windows 2000 Server address eCommerce project issues such as scalability and availability:

  • Network Load Balancing Clusters

  • Server Clusters

  • Component Load Balancing Clusters

  • Integrated Public Key Infrastructure Support

  • Encrypted Network Access

In any eCommerce environment, the lack of availability can translate into lost sales and erosion of customer satisfaction.

3. Outsourcing

As the rate of change quickens, many organizations find they are unable to master all aspects of their manufacturing and distribution process (that is, the value chain). Further, these same organizations find that certain links within this chain are key drivers for meeting and exceeding customers' expectations, while others are not, so they subcontract specific functions to other entities. Technology has made it easier for businesses to find and collaborate with these outside specialists, allowing companies to focus on what they do best. For example, companies that compete on knowledge capital often delegate administrative tasks such as procurement and accounting so they can focus strictly on their business. However, as sensitive information is exchanged between parties, security becomes a key factor in assessing the risk level of the outsourcing relationship.

Windows 2000 Server supports a Public Key Infrastructure (PKI) that enables companies to safely exchange specific business information over the Internet, with access controlled through Group Policy settings. For example, a company can reliably provide limited access to its network yet still enable vendors to maintain supply levels on a real-time basis. With enhanced security, companies may be more willing to outsource key business functions in the future.

4. Knowledge Management

Companies, like individuals, compete on the basis of their ability to create and utilize knowledge; therefore, managing company knowledge is as important as managing company finances. Knowledge collects in employees' minds and is codified in physical assets, software, and routine organizational processes. The ability to access and contribute to this knowledge base remotely and on a real-time basis will continue to be important to knowledge management. According to a recent study, 27 percent of the companies surveyed are implementing some kind of knowledge management system.9 As a company grows, management of these knowledge assets will often determine the company's ability to survive, adapt, and compete in today's dynamic marketplace.

Windows NT Server 4.0 limits indexed searches to file names and locations. The Windows 2000 Server Indexing Service, together with Active Directory, allows users to search any document on the network by author or subject matter. The company network therefore also serves as a means of sharing the company's knowledge, making key intellectual property more accessible and facilitating greater communication. For organizations that compete on knowledge, the ability to access information quickly could be the difference in exceeding customer expectations.
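The difference between the two search models can be sketched in a few lines. This is a purely illustrative contrast, not the Indexing Service API: the documents, field names, and helper functions below are hypothetical, and the real Indexing Service uses its own query language over catalogued document properties.

```python
# Hypothetical in-memory "catalog" of documents with indexed metadata and body text.
docs = [
    {"name": "q3_forecast.doc", "author": "Dupont", "subject": "sales forecast",
     "text": "European revenue projections for Q3"},
    {"name": "eu_plan.doc", "author": "Garcia", "subject": "market entry",
     "text": "Sales forecast assumptions for the Spanish market"},
]

def search_by_name(term):
    """Filename-only lookup (the NT 4.0 model): misses documents whose names omit the term."""
    return [d["name"] for d in docs if term in d["name"]]

def search_indexed(term):
    """Indexed lookup (the Windows 2000 model): matches author, subject, or body text too."""
    term = term.lower()
    return [d["name"] for d in docs
            if any(term in d[field].lower() for field in ("name", "author", "subject", "text"))]

print(search_by_name("forecast"))   # ['q3_forecast.doc']
print(search_indexed("forecast"))   # ['q3_forecast.doc', 'eu_plan.doc']
```

The second document is invisible to a filename search but is found once subject and body text are indexed, which is the manageability gain the paragraph above describes.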

5. Telecommuting

As technology advances, employers find it advantageous to allow employees to work from locations other than the traditional office. The number of telecommuters in the U.S. rose to 15.7 million as of mid-year 1998, according to research conducted by Cyber Dialogue, a New York–based research and consulting firm. Based on estimates in the federal Washington-area Pilot Telecommuting Center Initiative, private sector office cost savings (including space, furniture, equipment, service and maintenance, and recurring telecommunications charges) ranged from $10,000 to $15,000 per workstation, annually. Hence, with the potential cost savings and the opportunity to empower employees with the option of working remotely, telecommuting will become more prevalent.

Windows 2000 Server Remote Access Service (RAS) and Virtual Private Network (VPN) enhancements allow effective and secure communication for remote users with significantly less hardware. For example, a Windows NT Server 4.0–based network allows a maximum of 256 user connections per server and one modem for each user connection (that is, supporting 12,000 remote users accessing the network simultaneously requires 47 RAS servers and 12,000 modem connections). Within a VPN, a Windows NT Server 4.0–based server can authenticate and maintain 256 simultaneous connections, but still requires the same number of RAS servers, even though the sessions are connected via the Internet. Windows 2000 Server has enhanced VPN support that removes this fixed session limit; the number of sessions a single server can handle depends on its hardware configuration.

Consider a company with 2,400 users who require remote access or employ some form of telecommuting, and that needs to support at least 50% of those users simultaneously. Using RAS with Windows NT Server 4.0, this would require 5 RAS servers and 1,200 modem connections; using Windows 2000 Server, one or two RAS servers would suffice. Similarly, using a VPN solution, the number of RAS servers in a Windows NT Server 4.0–based environment would remain at five, while the Windows 2000 Server environment would require only one. In addition, because of the increased scalability and performance of Windows 2000 Server, other functions could be consolidated onto the same server. The users who are allowed access can also be managed easily via Group Policy and the hierarchical structure, in combination with groups, to reduce the management overhead of the telecommuting function.
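The sizing arithmetic above can be checked with a quick calculation. This is a simplified sketch: the 256-connections-per-server figure is the Windows NT Server 4.0 limit cited in the text, the function name is ours, and real capacity planning would also account for hardware and bandwidth.

```python
import math

NT4_MAX_CONNECTIONS = 256  # per-server connection limit cited for Windows NT Server 4.0 RAS

def nt4_ras_servers(total_users, concurrency):
    """Servers and modems needed under the one-modem-per-connection NT 4.0 RAS model."""
    sessions = math.ceil(total_users * concurrency)
    servers = math.ceil(sessions / NT4_MAX_CONNECTIONS)
    return servers, sessions  # modems required == simultaneous sessions

# The report's example: 2,400 users, 50% connected at once.
servers, modems = nt4_ras_servers(2400, 0.50)
print(servers, modems)  # 5 servers, 1,200 modem connections

# The 12,000-simultaneous-user example from the text.
servers, modems = nt4_ras_servers(12000, 1.0)
print(servers)  # 47 servers
```

Both of the report's figures (5 servers for 1,200 sessions; 47 servers for 12,000) fall out of the same ceiling division.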

6. Increased Merger and Acquisition Activity

The 1990s have seen a dramatic increase in the level of merger activity. This has been driven by the need to increase market share, achieve economies of scale, attain a global presence and expand product offerings. The chart below shows how the number of mergers in the United States has almost quadrupled in the past decade:

[Figure: Number of mergers in the United States over the past decade]

As these organizations integrate their merged operations, the need to have fully functional information systems is critical to achieve success.

In a Windows 2000 Server–based environment, domain consolidation is made easier with Active Directory and directory synchronization. The ability to synchronize directory services to and from Active Directory allows domain administrators to bring a newly acquired domain infrastructure under management in a relatively short time. Further, Terminal Services allows the acquiring company to use the target's older, less capable computers to run applications that require heavy processing. In the short term, an acquiring company may not need to replace all of the target's computers for a successful integration, and the organization can begin reaping the benefits of the merger immediately, rather than waiting for the technology to catch up to the business.

7. Increased Use of Teams

Organizing people into self-managed teams is a critical component of virtually all high-performance management systems. Research at the Center for the Study of Work Teams indicates that 80% of Fortune 500 organizations will have half of their employees on teams by the year 2000. Today's teams are often made up of individuals with a variety of skills and disciplines. It is not uncommon to find marketing or finance individuals working alongside the product development or manufacturing team. In these team environments, effective communication is critical. Individuals in these environments demand authorized access to key company information and the tools necessary to collaborate with other team members.

Through the Windows 2000 Server Distributed File System (DFS), administrators can now link various servers into one distributed share, allowing users to access a variety of files located across the enterprise network. For an organization with many teams, finding data can often be time consuming and tedious. With DFS, newly formed teams can find each other's data faster, streamlining the team formation process. Furthermore, successful team collaboration requires dedicated bandwidth to allow continuous, uninterrupted communication between the members of the group. Quality of Service (QoS) allows administrators to guarantee network bandwidth to ensure network resource availability.
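The DFS idea above, many physical shares presented as one logical tree, can be sketched as a simple mapping. The server and share names here are made up for illustration; in real DFS the server resolves links for the client, and referrals can point at replicated shares.

```python
# One logical namespace backed by shares on different (hypothetical) servers.
dfs_root = {
    r"\Engineering\Specs":   r"\\srv-eng1\specs",
    r"\Engineering\Builds":  r"\\srv-eng2\builds",
    r"\Marketing\Campaigns": r"\\srv-mkt1\campaigns",
}

def resolve(logical_path):
    """Map a logical DFS path to the physical share that actually holds the data."""
    for prefix, share in dfs_root.items():
        if logical_path.startswith(prefix):
            return share + logical_path[len(prefix):]
    raise KeyError(f"no DFS link covers {logical_path}")

print(resolve(r"\Engineering\Specs\widget.doc"))  # \\srv-eng1\specs\widget.doc
```

Users browse a single tree; which server a folder physically lives on is invisible to them, which is why newly formed teams can find each other's data without knowing the underlying server layout.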

V. Appendices

A. Research Methodology and Design

The qualitative framework is derived from information collected during the scenario testing conducted in Andersen's Business Consulting Technology Lab. Seven key stages of our approach are illustrated below and outlined in detail in the next section.

[Figure: The seven stages of the study approach]

As illustrated, the study's stages were conducted in parallel. Key management checkpoints ensure quality control throughout the study.

Stage 1. Identify Potential ROI Model Inputs

Andersen determined key qualitative criteria for inclusion in the study. Criteria for consideration were generated via interviews with Microsoft developers and Andersen experts. Andersen Cost Management experts assisted to ensure that the criteria included both direct and indirect performance metrics. The criteria take into account the following elements:

  • Hard and soft costs

  • Flexibility/Interoperability

  • Robustness/Reliability

  • Scalability

  • Security

  • Ease of management

  • New features and capabilities

Stage 2a. Develop Test Scenarios

Andersen set up and tested benchmarking scenarios for each evaluation area. We measured and/or observed improvements or weaknesses of Windows 2000 Server in areas consistent with the criteria selected for the study.

An example scenario is Management of Users. Test evaluation criteria of the Management area include:

  • Time, effort, and skill level required to add users to a Windows 2000 Server–based Domain environment

  • Time, effort, and skill level required to alter the user configuration

Stages 2b–4b. Utilize Microsoft's REJ Model and ROI Principles

We utilize Microsoft's REJ (Rapid Economic Justification) model framework. Since the information gathered in the study is primarily qualitative, the potential ROI impact contains many assumptions and will only provide high-level ROI conclusions.

Stage 3a. Scenario Test Preparation

The objective of this stage is to set up the simulation tests for the scenario areas selected in the prior stage. The intent of lab testing is to derive and/or validate qualitative returns.

Stage 4a. Scenario Testing

In this stage of the study, the selected scenario areas are tested and the results analyzed and compared.

Stage 5a. Evaluate Scenario Test Results

Andersen analyzes and interprets the scenario test results. Findings are divided into four result areas: ROI impacts, high-level technical observations, summary detailed analysis, and detailed technical analysis.

Stage 5b. Perform High-Level ROI Analysis using Scenario Test Results

Using the REJ framework and ROI methodology, Andersen performs a high-level ROI analysis based on the results of the Scenario Testing.

Stage 6. Summarize and Present Findings including ROI Model

Andersen creates an executive-level report outlining:

  1. Research Methodology and Design: Details the approach taken in designing the study with respect to the simulations, including how they were administered and conducted.

  2. Findings: Provides a synopsis of the information gathered during the study.

  3. Conclusions: Provides conclusions substantiated by the study, including statements relating to ROI and qualitative performance enhancements.

  4. ROI Model/Methodology: Describes the ROI methodology developed by Microsoft and its link to simulation results.

  5. Appendix: Contains a detailed description of the technical environment utilized for the scenario testing.

Stage 7. Develop and Write White Paper

This White Paper is based on the test results and the ROI analysis.

B. PRESENTATION and USE OF THE ANALYSIS

1. Presentation of the Test Results and Analysis

The analysis is divided into five logical areas. The illustration below shows the five areas and where they appear in this report. The test results and the technical analysis lead to the summary of technical analysis, which in turn leads to the ROI impacts of specific Windows 2000 Server features. The cumulative test results and the ROI impacts lead to the high-level technical observations and high-level ROI observations.

[Figure: Windows 2000 Server Family ROI Impacts for Corporate Customers: Presentation of Results]

A definition of the five levels is as follows:

High-Level Observations

  1. ROI

    High-level ROI observations that assist a CIO/CFO in determining whether to implement Windows 2000 Server.

  2. Technical

    High-level technical observations based on the technical summaries (section 4. below). These represent some of the potential technical implications that should be considered by prospects. While the technical analyses below are based on specific features, these high-level technical observations are based on our examination of common themes (for example, the potential for reduced support calls that would result from the relationship between Active Directory and Group Policy).

    Feature-Based Testing and Analysis

  3. ROI Impacts of Specific Features

    Feature-based ROI impacts in each of the four key IT areas may be potentially positive, neutral, or negative.

  4. Summary of Test Results and Technical Analysis

    Summaries of the combined technical merits or shortcomings of specific features of Windows 2000 Server and, where there is a transitive implication, Windows 2000 Professional in conjunction with Windows 2000 Server (platform).

  5. Test Results and Technical Analysis

Detailed test results and technical analysis of Windows 2000 Server organized by the four key IT areas.

2. Description of the ROI Assessment Tools

The following tools will assist organizations or individual departments in determining, at a high level, the potential ROI impacts Windows 2000 Server will have on their specific organization. Although each tool can provide a fairly consistent high-level analysis, it is the use of the tools collectively that will provide the optimal analysis. For detailed descriptions and instructions for the use of these tools, please refer to Section III, Microsoft Basic ROI Framework and Methodology.

  1. IT Profile Matrix

    ROI Matrix divided into the four IT areas, with a definition of terms and instructions to guide organizations in completing it.

  2. ROI Analysis

    A list of factors that may influence the ROI impact for organizations. These are business, process, and infrastructure-related issues that assist an organization in refining its high-level view. Andersen's analysis was augmented by information obtained via brief interviews with Microsoft customers.

  3. Business Initiatives

    The greatest challenges facing businesses today, drawn from a survey of business literature. Based on interviews with Microsoft customers and our consulting experience, a number of Windows 2000 Server features could potentially be applied to address these challenges.

  4. Detail of the Test Results

    Detailed test results that may yield additional information for an organization's specific circumstances. These results are, where appropriate, measurable with the metric of human resource time expended on a task, time that a function will be available or unavailable, or the speed with which a system responds. In other places where metrics were not possible or practical, specific technical observations are included.

C. REJ 4x7 benefit matrix

Information Technology (IT) Managers are increasingly being asked to provide an economic justification for IT investments. Some corporations are even asking IT Managers to explain how a project creates shareholder value. This can be difficult at times. The economic justification of IT projects has been researched extensively in the past decade. While the models and techniques developed as a result of this research provide a high degree of precision and mathematical certainty, they require extensive amounts of data and time to prepare. IT Managers need a tool that will allow them to rapidly understand the value of an IT investment. In response, Microsoft's framework, Rapid Economic Justification (REJ), builds a bridge of common language between IT professionals and senior executives to demonstrate how investments in IT benefit the business.

REJ performs an assessment of an IT project's economic impact on an enterprise, giving a decision-maker the information necessary to make rational IT investment decisions. Portions of the REJ framework are used in this White Paper; however, REJ is best tailored to individual organizations due to the significant differences between them. Therefore, Andersen uses only the generic portion of this framework, since no particular company or situation is fully analyzed (that is, no conclusive financial metric figures are calculated). Applying the tailored version of the framework will yield greater accuracy and detail for individual organizations.

REJ Methodology

REJ is a results-based methodology. At each step, a specific deliverable is produced. While each deliverable has its own value, it is also used to produce the deliverable in the next step. The final report integrates the deliverables resulting from each of these steps, logically linking the IT project and the business objectives written in business terms, as illustrated in the 4x7 Matrix in Step Three. The five intermediate deliverables are as follows:

[Figure: The five intermediate REJ deliverables]

  1. Business Assessment roadmap outlining the business's goals, objectives and measurements.

  2. Description of the business solution and its supporting technology.

  3. Solution cost/benefit analysis.

  4. Organizational impact study in which the cost/benefit analysis results are adjusted to show each economic effect in present-day terms and its impact on the organization's bottom line.

  5. Risk management assessment in which the risks related to the project are identified and accounted for in the context of overall economic impact on the organization.

Step 1: Business Assessment Roadmap

The Business Assessment Roadmap, shown below, identifies the key stakeholders, their Critical Success Factors (CSF), the strategy to achieve them, and the Key Performance Indicators (KPI) used to determine success. The map leads the organization to the final destination of this step: identifying the owners of the key processes that can be improved by information technology to reach the desired state.

[Figure: Business Assessment Roadmap]

Step 2: Business Solution

In conjunction with the owners of these key business processes, a business solution is identified using flow charts, fishbone diagrams, and process analysis, describing ways of applying technology solutions to increase alignment with the organization's CSF.

Step 3: Solution Cost/Benefit Analysis

The cost/benefit calculation in REJ goes beyond the itemized list of benefits for IT budget owners usually found under the Total Cost of Ownership (TCO) umbrella. Such analyses often fail to relate benefits to the organization's current business initiatives. In order to make that connection, the team analyzes, profiles, and quantifies benefits related to the current initiatives. The key is to express them in terms that match the needs of the business with respect to the total cost to achieve those benefits.

REJ provides a framework to assess the cost/benefit analysis. The 4x7 Matrix, shown below, expresses this framework visually.

[Figure: The REJ 4x7 Matrix]

Step 4: Organizational impact analysis

A forecast of the financial impact of the IT initiative on the bottom line is expressed in marginal Earnings Per Share (EPS) and Economic Value Added (EVA). In order to maximize value, various options may be compared, such as lease versus buy, outsource versus hire, timing and size of implementation, and ROI from other opportunities. These adjustments take into account the cost of funds, tax laws, and so forth.
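The present-day adjustment described above can be illustrated with a minimal net-present-value calculation. This is a generic finance sketch, not part of the REJ framework itself: the cash flows and the 10% cost of funds are hypothetical, and a real analysis would also handle taxes and EPS/EVA effects.

```python
def npv(rate, cash_flows):
    """Discount a series of yearly net cash flows (year 0 first) to present value."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

# Hypothetical project: $100k cost up front, $40k net benefit in each of 4 years,
# discounted at a 10% cost of funds.
flows = [-100_000, 40_000, 40_000, 40_000, 40_000]
value = npv(0.10, flows)
print(round(value, 2))  # 26794.62 -- positive, so benefits exceed the cost of funds
```

A positive NPV is the "present-day terms" signal the step above produces; comparing NPVs across lease-versus-buy or timing options is how the alternatives are ranked.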

Step 5: Risk management

Many IT projects successfully build an economic justification identifying benefits and costs, only to fail to live up to the expectations of senior management or stakeholders. This step focuses the organization on identifying various forms of risk, developing risk mitigation solutions, and adjusting estimates of benefits and costs accordingly. Risks are profiled to adjust the solution and the economic impact of the project.

Summary

With this approach, an IT professional in concert with Line of Business managers can plan and develop effective justifications for projects that provide the maximum business value by directly supporting the organization's strategy.

D. CRITERIA FOR SCENARIO TESTING

The original table columns were: Category, Sub-Category, Scenario, Inputs (Based on IT Profile Matrix), Inputs (Based on scope of options/features), Anticipated Output, and Answers. They are presented below as labeled entries, grouped by category.

1. Manageability

  • Create Users; Create Groups; Apply Policies
    Scenario: Create 55 users, groups, and policies. Test with ADC in Windows 2000 Server.
    Inputs (IT Profile Matrix): Environment Distribution; Environment Complexity
    Inputs (options/features): Active Directory Services Interfaces (ADSI); Microsoft Management Console (MMC); Group Policy; Security Configuration Manager
    Anticipated Output: Metric: Time. Reduce administrator hours required.
    Answer: Administration costs will be reduced by integrating all user administration functions into one user interface. Active Directory can be accessed from any server or any client machine for administration and maintenance.

  • Create Domain Trusts
    Scenario: Create trust relationships between multiple domains.
    Inputs (IT Profile Matrix): Role of Server; Environment Distribution; Environment Complexity
    Inputs (options/features): Directory Management Resource Kit tools; Active Directory; MMC
    Anticipated Output: Metric: Time. Reduce administrator hours required.
    Answer: Administration costs will be reduced because of the simple domain structure Active Directory provides.

  • Manage User Data; Manage User Settings
    Scenario: Test user settings and desktop control.
    Inputs (IT Profile Matrix): Role of Server; Environment Complexity
    Inputs (options/features): Active Directory; Group Policy; client-side caching; Security Configuration Manager; user data and settings management; Roaming User Profiles
    Anticipated Output: Observation: Yes/No. Metric: Time. Reduce help desk and/or administrator time.
    Answer: The ability to centrally manage desktops will reduce administrator and help desk support time by obviating the need to configure and change each user's system.

  • Restrict File-Server Storage
    Scenario: Test a size-restricted share used by multiple users for their saved data.
    Inputs (IT Profile Matrix): Role of Server; Environment Complexity
    Inputs (options/features): Disk Quota; Group Policy
    Anticipated Output: Observation: Yes/No. Potential impact on resources required.
    Answer: The ability of Windows 2000 Server to manage and limit disk space per client improves resource allocation by saving storage space.

  • Protect Against Loss of Data
    Scenario: Back up and restore the system partition to/from remote storage devices.
    Inputs (IT Profile Matrix): Environment Complexity
    Inputs (options/features): Enhanced Backup utility
    Anticipated Output: Metric: Downtime. Reduce time to back up/recover.
    Answer: The backup software is flexible enough to work with non-tape media.

  • Execute Admin Functions on Demand
    Scenario: Run administrative tools and execute admin tasks while logged in as a domain user.
    Inputs (IT Profile Matrix): Environment Complexity; Environment Distribution
    Inputs (options/features): Secondary Logon
    Anticipated Output: Metric: Time. Reduce administrator hours required.
    Answer: Reduces administrative service time by providing the ability to respond to admin requirements without logging in again.

  • Delegate Admin Control
    Scenario: Delegate control of a test OU to a domain user account.
    Inputs (IT Profile Matrix): Environment Distribution; Environment Complexity
    Inputs (options/features): administrative delegation; Group Policy; Organizational Units (OUs)
    Anticipated Output: Metric: Time; Experience Level. Shift admin hours to lower-level administrators.
    Answer: Lowers administrative cost by delegating admin tasks to others with sufficient and reasonable control.

  • Provide for Legacy Clients
    Scenario: Implement Terminal Services. Review multi-lingual/multi-language features.
    Inputs (IT Profile Matrix): Role of Server; Environment Complexity; Environment Distribution
    Inputs (options/features): Terminal Services; Multi-Language Support; Multi-Lingual Support
    Anticipated Output: Metric: Reduce client upgrade requirements.
    Answer: Terminal Services can delay the requirement to upgrade clients.

  • Lock Down Desktop
    Scenario: Apply desktop constraints and ensure applicable policies are implemented correctly.
    Inputs (IT Profile Matrix): Environment Complexity
    Inputs (options/features): Group Policy
    Anticipated Output: Observation: Yes/No. Centralized administrative control.
    Answer: The administrator can protect and control desktop systems through a centralized management interface.

2. Availability

  • Maintain File-Server Availability
    Scenario: Create, maintain, and test various DFS configurations.
    Inputs (IT Profile Matrix): Role of Server; Environment Complexity; Environment Distribution
    Inputs (options/features): Distributed File System (DFS)
    Anticipated Output: Metric: Downtime. Reduce downtime for lost disk resources.
    Answer: DFS reduces resource downtime by providing higher availability across file servers.

  • Maintain HSM Configuration
    Scenario: Configure and test removable storage and HSM features.
    Inputs (IT Profile Matrix): Role of Server; Environment Distribution
    Inputs (options/features): Remote Storage Services (RSS)
    Anticipated Output: Metric: Media Quality. Reduce requirements for high-performance media.
    Answer: Minimizes administration overhead when server storage space reaches its capacity limit.

  • Software Installation
    Scenario: Create and apply policies to test publication of applications.
    Inputs (IT Profile Matrix): Role of Server; Environment Complexity
    Inputs (options/features): Active Directory; Windows Installer; Group Policy
    Anticipated Output: Metric: Time. Fewer help desk support hours required.
    Answer: Provides automated repair of deleted or corrupt applications.

  • Provide Printing Availability
    Scenario: Test setup of, and printing from, a Web browser and remote locations.
    Inputs (IT Profile Matrix): Role of Server; Environment Distribution; Environment Complexity
    Inputs (options/features): Internet Printing Protocol (IPP)
    Anticipated Output: Metric: Time. Fewer help desk support hours required. Observation: Yes/No.
    Answer: Users can find and connect to the required print device quickly and with little effort.

  • Dedicate Network Bandwidth as Required
    Scenario: Test an application in a heavily trafficked environment with and without QoS active.
    Inputs (IT Profile Matrix): Environment Distribution; Environment Complexity
    Inputs (options/features): Windows 2000 Server Quality of Service (QoS)
    Anticipated Output: Metric: Response Time. Critical applications consistently respond at the same level.
    Answer: Guaranteed network bandwidth will decrease production slowdowns due to network traffic.

  • Plug and Play
    Scenario: Test the difference in changing out a Plug and Play component.
    Inputs (IT Profile Matrix): Role of Server; Environment Complexity
    Inputs (options/features): Plug and Play
    Anticipated Output: Observation: Downtime.
    Answer: Recovering a server to full functionality after a planned or unplanned shutdown will reduce server downtime.

  • Kill an Application's Entire Process Tree
    Scenario: Create a process tree and compare the effect of killing the entire tree versus only a single process within the tree.
    Inputs (IT Profile Matrix): Environment Complexity; Role of Server
    Inputs (options/features): Enhanced process management
    Anticipated Output: Observation: Effort. Less time to kill a process while leaving the application running.
    Answer: The ability to kill an application's entire process tree may lead to more application uptime or fewer reboots.

3. Scalability and Performance

  • Physical Memory Utilization Above Four Gigabytes
    Scenario: Compare transactions served by a memory-intensive application in a large-memory configuration between Windows 2000 Datacenter and Windows NT Server 4.0, Enterprise Edition.
    Inputs (IT Profile Matrix): Role of Server; Environment Complexity; Environment Distribution
    Inputs (options/features): Enterprise Memory Architecture (EMA)
    Anticipated Output: Metric: Response Time. Reduced time to execute a query.
    Answer: With improved EMA, Windows 2000 Datacenter improves scalability. Memory-intensive applications will respond better, given greater physical memory.

  • Multi-Processor Support
    Scenario: Compare the volume of transactions processed by an Internet Information Server (IIS)–based application on each operating system.
    Inputs (IT Profile Matrix): Role of Server; Environment Complexity; Environment Distribution
    Inputs (options/features): Enhanced symmetric multi-processing (SMP)
    Anticipated Output: Metric: Response Time; Greater Capacity. Reduce the time required to complete a function; increase the number of records that can be processed in a reasonable timeframe.
    Answer: Improved multi-processor performance can be extended using a greater number of processors.

  • File Server Performance
    Scenario: Compare read/write times to transfer a large file in Windows NT Server 4.0– and Windows 2000 Server–based environments.
    Inputs (IT Profile Matrix): Role of Server; Environment Complexity; Environment Distribution
    Inputs (options/features): More efficient use of server resources
    Anticipated Output: Metric: Response Time.
    Answer: Improved management of resources in Windows 2000 Server should produce faster file transfer times than Windows NT Server 4.0.

  • Balance and Distribute Client TCP/IP Connections
    Scenario: Compare functionality between Windows 2000 Server and Windows NT Server 4.0, Enterprise Edition.
    Inputs (IT Profile Matrix): Role of Server; Environment Complexity; Environment Distribution
    Inputs (options/features): Network Load Balancing (NLB)
    Anticipated Output: Observation: Yes/No. Metric: Downtime.
    Answer: Improved responsiveness when processing critical thin-client applications.

4. Interoperability and Security

  • Novell Directory Synchronization
    Scenario: Synchronize directory information between Active Directory and NDS.
    Inputs (IT Profile Matrix): Role of Server; Environment Distribution; Environment Complexity
    Inputs (options/features): DirSync
    Anticipated Output: Observation: Yes/No.
    Answer: Windows 2000 Server should interact with other directories through services running on the server.

  • Public Key Administration
    Scenario: Compare authentication and certificate administration.
    Inputs (IT Profile Matrix): Role of Server; Environment Distribution; Environment Complexity
    Inputs (options/features): Certificate Server; IIS 5.0; PKI
    Anticipated Output: Metric: Time. Quicker delivery of extranets.
    Answer: By taking advantage of an integrated Certificate Server and Web server, extranet scenarios should be easier to implement.

E. ASSUMPTIONS and TEST ENVIRONMENT

  1. General Assumptions

    • No assessment is made on Windows 2000 Server compatibility with third-party products.

    • Due to hardware and device differences, results obtained from an organization's equipment may differ from the scenario test results obtained using Andersen's lab equipment. Therefore, it is recommended that organizations conduct their own testing in order to determine results specific to their environment.

  2. Test Environment

    • The following is a description of our test environment. Each of the tests may have additional test variables and assumptions.

    • Build 2031 of Windows 2000 Advanced Server Beta 3 was used for all tests, except where noted. A patch, supplied by Microsoft, was required to complete certain tests.

      The original table listed system/model, quantity, processor, memory configuration, hard disk capacity, network card, and RAID level for each machine:

      Servers

        • Dell PowerEdge 4350 (qty 3): single PIII 450 MHz; 256 MB SDRAM; 2 x 9.1 GB Ultra-Wide SCSI; 3COM 3c905B TX 100 Mbps; RAID 0
        • Compaq ProLiant 1850R (qty 2): dual PII 450 MHz; 256 MB SDRAM; 2 x 9.1 GB Ultra-Wide SCSI; Compaq 100 Mbps; RAID 0
        • Compaq ProLiant 2500 (qty 1): dual Pentium Pro 200 MHz; 480 MB RAM; 4 x 8.4 GB Wide SCSI; Compaq 100 Mbps; RAID 5

      PC Clients

        • Dell Precision 610 (qty 3): single PIII Xeon 550 MHz; 256 MB SDRAM; 1 x 17 GB Ultra-Wide SCSI; 3COM 3c905B 100 Mbps
        • Dell Optiplex GX1P (qty 1): single PII 450 MHz; 128 MB RAM; 1 x 14.4 GB Ultra-Wide SCSI; 3COM 3c905B 100 Mbps
        • Gateway E5250 (qty 2): single PIII Xeon 500 MHz; 256 MB SDRAM; 1 x 9.1 GB Ultra-Wide SCSI; 3COM 3c905B 100 Mbps

      CE Devices

        • Hewlett-Packard Jornada 820 (qty 1): StrongARM RISC 190 MHz; 16 MB RAM; Socket Comm. LP-E network card
        • NCD ThinStar 300 (qty 1): Intel Pentium; 16 MB RAM; 8 MB flash memory; 100BaseT

      Laptops

        • IBM ThinkPad 380XD (qty 2): Pentium 233 MHz MMX; 64 MB RAM; 4 GB HD; 3COM 3c589 PCMCIA 100 Mbps
        • IBM ThinkPad 380ED (qty 1): Pentium 166 MHz MMX; 81 MB RAM; 6 GB HD; 3COM 3c589 PCMCIA 100 Mbps

      Devices

        • U.S. Robotics 56K Voice FaxModem Pro, external (qty 2)
        • Spectra 10000 (Spectra Logic) (qty 1)

F. Scenario Test Results

MANAGEABILITY

Summary

Implementation of Microsoft's Windows 2000 Server offers significant advantages over Windows NT Server 4.0 in the area of manageability. Windows 2000 Server adds a number of functions missing from Windows NT Server 4.0 that can reduce overall operating costs. For example, management of users, groups, and security for network resources has been centralized, allowing administrators to gain a holistic view of their network environment.

  1. User Setup – The initial creation of users takes longer in Windows 2000 Server than in Windows NT Server 4.0. However, Windows 2000 Server offers tools and functions that improve the manageability of users and desktop resources that, in the long run, can save a company money through reduced labor and administrative costs.

  2. Group Policy – Windows 2000 Server offers advantages in management of groups and access rights over Windows NT Server 4.0. In a Windows 2000 Server environment, network applications can be made available to selected users via Group Policy.

  3. Delegation of Control – Windows 2000 Server simplifies control of specific functions that can be assigned to qualified heads of department or designees. This cannot be done as effectively or easily in Windows NT Server 4.0.

  4. Domain Trusts - The Domain model has changed substantially in Windows 2000 Server compared to Windows NT Server 4.0. The addition of the organizational unit, coupled with the transitive nature of Windows 2000 Server trusts, increases manageability by allowing departmental administrators (rather than a central administrative authority) to manage their resources.

  5. Disk Quota - Disk Quota enables administrators to quickly shift computing storage resources from one business function to the next, without significant labor or disk storage expense. The ability to manipulate storage capacity on an "as-needed" basis helps companies avoid downtime caused by exhausted storage capacity. Windows 2000 Server comes with a disk quota solution out of the box.

  6. Hierarchical Storage Management (HSM) – Works with Remote Storage Services (RSS) to automatically migrate file data that has not been recently accessed to less expensive storage media (such as tape). The system keeps the directory entry and property information for a file online so those files are automatically accessed when recalled. Windows 2000 Server provides this functionality as a standard option.

  7. Printing and IPP – Internet Printing Protocol (IPP) expands the scope of resources available to end-users. By implementing IPP along with Active Directory, Windows 2000 Server provides such functionality as managing print devices across an enterprise via Web browsers, rather than the small domain or server-centric paradigm of Windows NT Server 4.0.

  8. Terminal Services – Windows NT Server 4.0 offers terminal services only through a separate operating system, Windows NT Server 4.0, Terminal Server Edition, while Windows 2000 Advanced Server includes this functionality as an optional component. Terminal Services in Windows 2000 Advanced Server also offers additional functionality.

  9. Backup Comparison Testing - During this test, the backup utilities that come with Windows NT Server 4.0 and Windows 2000 Server are compared using a DEC DLT 2000 15/30 GB SCSI device. Major improvements are observed in Windows 2000 Server, including the ability to select a backup medium other than tape.

  10. Secondary Logon Testing – A specific user with special permission provides increased security and prevents errors. In Windows NT Server 4.0, the use of a secondary logon requires logging off and logging on as that user. In the Windows 2000 Server environment, the secondary logon service allows the user to access privileges by providing the required user ID and password without logging off the system.

Key IT Area: Manageability
Test Number: 1
Test Name: User Setup (serv18)

Test Description

This test involves creating users and their associated information. The Windows NT Server 4.0 SP5– and Windows 2000 Advanced Server–based environments are configured in separate domains. A group of 55 users is divided into four functional departments. Each department has designated account administrators who are delegated (see the Account Delegation test below) to perform specific tasks. They monitor user logon and user password problems, create and delete user accounts, and manage user server storage space using disk quota. The user breakdown is as follows:

User Breakdown

  • Computer Technology Services: 2 Network/Domain Administrators

  • Human Resources Department: 10 HR employees; 1 HR Account Administrator

  • Finance Department: 15 Account Representatives; 1 Finance Account Administrator

  • Customer Service Department: 25 Customer Service Representatives; 1 CSR Administrator

Each department belongs to a Global group in the Windows NT Server 4.0–based environment. Organizational Unit containers are created for each department in the Windows 2000 Server domain.

The users are created with login user name, user's full name, profile path, home directory drive, and general password, and added to the membership of the appropriate Global groups in the Windows NT Server 4.0 domain. The same configuration is created in the Windows 2000 Server domain with the exception of the Global group creation, which is replaced with Organizational Units.

We create email accounts for all users in an Exchange Server in both domain environments and utilize the Active Directory Connector snap-in that comes with Windows 2000 Server to update Exchange information from the Active Directory.
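The connector's effect can be pictured as a synchronization pass that creates or updates a mailbox entry for every directory account. The sketch below is a toy model only; the account names and the dictionary-based "directories" are invented for illustration, not the Active Directory Connector's actual data model:

```python
# Toy model of directory synchronization: propagate Active Directory
# accounts into an Exchange-style mailbox directory, as the Active
# Directory Connector does for us in this test.
ad_accounts = {"asmith": "Alice Smith", "bjones": "Bob Jones"}
exchange_mailboxes = {"asmith": "Alice Smith"}  # bjones has no mailbox yet

def sync(ad, exchange):
    """Create or update a mailbox entry for every directory account."""
    changed = []
    for login, full_name in ad.items():
        if exchange.get(login) != full_name:
            exchange[login] = full_name  # create or update the mailbox entry
            changed.append(login)
    return changed

print(sync(ad_accounts, exchange_mailboxes))  # creates the missing mailbox
```

A second pass over an already-synchronized directory makes no changes, which is why the connector removes the manual mailbox-maintenance step described above.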

We test account delegation on two users who belong to two different Organizational Units (OU). There are four phases in this test: the first three phases involve setting up OU Administrator settings by the Domain Administrator and the fourth phase involves the OU Administrator installing the administrator tool components. The following information shows the approximate amount of time an administrator spends on these options.

Features Having Impact

  • Active Directory Users and Groups

Metrics

Windows 2000 Server–based Environment:

Task | Time to Perform
Windows 2000 OU creation | 3 minutes
Setup of ADC with Exchange | 14 minutes
11 Users setup in HR | 13 minutes
16 Users setup in Finance | 19 minutes
26 Users setup in CSR | 21 minutes
2 Users setup in CTS | 3 minutes
55 Users in Exchange | Automatic creation through ADC
Total Setup Time | 1:05 hours

Windows NT Server 4.0–based Environment:

Task | Time to Perform
Windows NT 4.0 Global group creation | 3 minutes
11 Users setup in HR | 9 minutes
16 Users setup in Finance | 13 minutes
26 Users setup in CSR | 16 minutes
2 Users setup in CTS | 3 minutes
55 Users setup in Exchange | 73 minutes
Total Setup Time | 2:00 hours

Account Delegation Time Log

Phase | Time to Perform (min:sec)
Delegate control in Active Directory OU | 0:40
"Managed by" account assigning | 0:35
Administration Tool Components (.msi) package setup | 3:20
Administration Tool Components install by OU Admin | 2:10
Total time to perform account delegation | 6:45

Test Environment Notes

Server: Compaq Proliant 2500; Dual Pentium Pro 200 MHz; 480 MB RAM; RAID Level 5

DLT: DEC DLT 2000 15/30 GB SCSI

Network: Switched Ethernet 10/100 Mbps

Test Results and Technical Analysis

It takes approximately 20 seconds longer to create a user in a Windows 2000 Server–based environment than a Windows NT Server 4.0–based environment. The time difference is caused by the ability of Windows NT Server 4.0 to use account templates to fill in the necessary user information when creating a new user. Templates can be created and customized to fit a department's specific needs. This functionality is not available in Windows 2000 Server. Users are created in Windows 2000 Server individually, and profile settings are applied individually.

Account policies can be created in Windows 2000 Server on the Organizational Unit level as opposed to Windows NT Server 4.0 where account policies are applied on the Domain level. This improves manageability on Windows 2000 Server, enabling domain administrators to create specific policies based on departmental functions. Windows 2000 Server also features the ability for domain administrators to use a Group Policy Object (GPO) previously created for one OU and apply it to another OU without the task of recreating a similar GPO. The ability to delegate certain administrative tasks to another individual reduces centralized Domain Administration time, allowing those resources to focus on centralized management issues.

Another useful manageability tool that comes with Windows 2000 Server is the Active Directory Connector snap-in. This enables domain administrators to create a connection agreement between a Windows 2000 Server DC and an Exchange Server. The DC and the Exchange Server then synchronize user accounts between themselves, eliminating the time it takes to manually create or change a mailbox in Exchange.
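As a rough cross-check of the per-user figure quoted above, the per-department task times from the two tables can be averaged over the 55 users. This is a back-of-the-envelope sketch only; it excludes the one-time steps (OU/group creation, ADC setup, Exchange mailbox creation) so that only per-user account work remains:

```python
# Rough cross-check of per-user account creation time, using the
# per-department task times (in minutes) from the setup tables above.
w2k_minutes = {"HR": 13, "Finance": 19, "CSR": 21, "CTS": 3}
nt4_minutes = {"HR": 9, "Finance": 13, "CSR": 16, "CTS": 3}
USERS = 55

w2k_per_user = sum(w2k_minutes.values()) * 60 / USERS  # seconds per user
nt4_per_user = sum(nt4_minutes.values()) * 60 / USERS

print(f"Windows 2000:  {w2k_per_user:.0f} s/user")
print(f"NT Server 4.0: {nt4_per_user:.0f} s/user")
print(f"Difference:    {w2k_per_user - nt4_per_user:.0f} s/user")
```

The difference works out to roughly 15–20 seconds per user, in line with the approximately 20 seconds reported above.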

Key IT Area: Manageability
Test Number: 2
Test Name: Group Policy (serv19)

Test Description

Group Policy enables domain administrators to specify settings for users and computers in a domain. This improves an organization's ability to give domain administrators a wider range of options in customizing user and computer configurations.

Our scenario involves a test user with desktop restrictions applied to that user's account by Group Policy settings. The user is allowed only to log on, run Microsoft Office 2000, and save work to the user's home directory. The user may not change any desktop settings or Start Menu items. No local drive access, password change, or shutdown is permitted.

These tests, conducted on both Windows 2000 Server and Windows NT Server 4.0 platforms, utilize Windows 2000 Professional on the Windows 2000 Server side and Windows NT Workstation 4.0 on the Windows NT Server 4.0 side.

Features Having Impact

  • Group Policy Objects

  • Active Directory Users and Groups

Test Environment Notes

Server: Compaq Proliant 2500; Dual Pentium Pro 200 MHz; 480 MB RAM; RAID Level 5

Clients: Dell Precision 610; Pentium III Xeon 550 MHz; 256 MB SDRAM; 17 GB Ultra-Wide SCSI

Network: Switched Ethernet 10/100 Mbps

Test Assumptions and Notes

  1. Windows 2000 Server–based environment:

    Microsoft Windows 2000 Active Directory service enables Group Policy. Group Policy Objects (GPO) store policy information linked to Active Directory sites, domains, and organizational units (OU). For our testing, we create a new GPO linked to an OU container.

We test the user's login using a Windows 2000 Professional–based client and a Windows NT Workstation 4.0–based client. The policy is retained on the Windows 2000 Professional–based client, but ignored on the Windows NT Workstation 4.0–based client. On the Professional–based client, the user is unable to change any system settings and is only allowed to use the specified applications. The user attempts to change the display settings and to access the local drives, but does not succeed. The user then logs off and logs on with another user account that does not have a GPO applied to it; that account retains all default desktop configurations.

    Result: Succeeds on the Windows 2000 Professional–based client.

    Fails on the Windows NT Workstation 4.0–based client.

  2. Windows NT Server 4.0–based Environment:

    System Policy is applied differently on a Windows NT Server 4.0–based environment. Windows NT Server 4.0 policies are created using the System Policy Editor. The policy is then saved in the \\PDCServer\Netlogon directory, and that netlogon path must be specified during user account creation. In contrast, Windows 2000 Server GPO is applied to the OU and can be managed centrally per OU.

Our user 'lockdown' test succeeds using both Windows 2000 Professional– and Windows NT Workstation 4.0–based systems to log on. The user is unable to change any desktop or system settings on either platform. No local drives are accessed and only permitted applications are launched. A major difference in comparison to the Windows 2000 platform is that the applications have to be installed on the machine prior to user logon, since the Windows NT Server 4.0 platform does not have IntelliMirror capabilities.

    Result: Succeeds on the Windows NT Workstation 4.0–based client.

    Succeeds on the Windows 2000 Professional–based client.

Test Results and Technical Analysis

Group Policy, combined with IntelliMirror features, proves to be an advantage in centrally managing user data and applications. By setting up policies on both server platforms, the user is restricted from changing any system and desktop configuration specified during the GPO creation.

The software installation feature in Windows 2000 Server is found to be advantageous: the software application is packaged and then specified in the GPO, and the application becomes available to the users linked to the GPO when they log on.

As the environment grows in complexity and distribution, centralized management in a Windows NT Server 4.0–based environment becomes geometrically more difficult since policy is applied and managed at a server level. In the Windows 2000 Server–based environment, centralized policy management by OU makes large widely distributed organizations controllable.
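The OU-centric model described above can be pictured as policy objects linked to containers, with a client's effective settings resolved by walking down the container chain. This is a simplified sketch: the domain name, container names, and settings are invented for illustration, and real Group Policy processing also covers sites, inheritance blocking, and enforcement:

```python
# Toy model of Group Policy resolution: GPOs are linked to containers
# (domain, then OU), and settings applied later in the chain override
# earlier ones -- a simplification of site -> domain -> OU processing.
gpo_links = {
    "example.com":    {"lock_desktop": False},
    "example.com/HR": {"lock_desktop": True, "allowed_apps": ["Office 2000"]},
}

def effective_policy(container_path):
    """Merge GPO settings from the domain root down to the target OU."""
    settings = {}
    parts = container_path.split("/")
    for i in range(1, len(parts) + 1):
        prefix = "/".join(parts[:i])
        settings.update(gpo_links.get(prefix, {}))
    return settings

# A user in the HR OU inherits the domain policy plus the HR lockdown GPO.
print(effective_policy("example.com/HR"))
```

Because policy lives on the container rather than on each server, adding a department means linking one GPO to one OU instead of editing policy files on every server.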

Key IT Area: Manageability
Test Number: 3
Test Name: Delegation of Control (serv20)

Test Description

This is a test of the Windows 2000 Server administrative delegation function. This is a Windows 2000 Server capability that we simulate as closely as possible in a Windows NT Server 4.0–based environment. The test involves delegating some administrative controls to a user who belongs in the Human Resources OU. We create a Group Policy on that OU to publish the administrator tools for the selected user who installs the appropriate tools to perform the delegated administrative tasks.

To delegate control to an individual user in a Windows NT Server 4.0–based environment, a user must be a member of either the Domain Administrators group or the Account Administrators group for the entire domain. We expect the ability to allow security rights at a granular and flexible level in Windows 2000 Server will reduce the administration requirements of an IT person.

Features Having Impact

  • Control Delegation

Test Environment Notes

Server: Compaq Proliant 1850R; Dual Pentium II 450 MHz; 256 MB SDRAM; RAID Level 0

Client: Dell Precision 610; Pentium III Xeon 550 MHz; 256 MB SDRAM; 17 GB Ultra-Wide SCSI

Network: Switched Ethernet 10/100 Mbps

Test Assumptions and Notes

Windows 2000 Server provides a granular approach to setting up security rights and permissions for an individual user or an OU. This cuts the administrative time of an IT professional by assigning some user account administration tasks to qualified heads of departments or designees. A good scenario would be to delegate administrative tasks to a qualified Human Resources person for creating and deleting user accounts; the typical HR administrator has more of the information needed to create a new user account when admitting new hires into the company. The delegating administrator can select which OU that Human Resources person can manage. This selectivity is not possible in Windows NT Server 4.0, where delegation grants that person the right to manage all user accounts in the domain.
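The contrast in granularity can be sketched as an access check: NT 4.0 delegation is effectively domain-wide, while Windows 2000 delegation scopes specific tasks to a specific OU. This is a toy model; the account name, task names, and group are invented for illustration:

```python
# Toy access check contrasting domain-wide vs. per-OU delegation.
# NT 4.0 model: membership in an operators group grants user-account
# rights across the whole domain. Windows 2000 model: a delegation
# entry grants specific tasks on a specific OU only.
nt4_account_operators = {"hr_admin"}

w2k_delegations = {
    ("hr_admin", "HR"): {"create_user", "delete_user", "reset_password"},
}

def nt4_can(user, task, ou):
    # The OU is irrelevant: rights apply to every account in the domain.
    return user in nt4_account_operators

def w2k_can(user, task, ou):
    return task in w2k_delegations.get((user, ou), set())

assert nt4_can("hr_admin", "delete_user", "Finance")      # domain-wide
assert not w2k_can("hr_admin", "delete_user", "Finance")  # scoped to HR only
assert w2k_can("hr_admin", "reset_password", "HR")
```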

Test Results and Technical Analysis

All predefined tasks are successful on both Windows 2000 Server and Windows NT Server 4.0. In addition to the ability to control which resources can be managed, the tasks are maintained in a more granular and complete form in Windows 2000 Server, allowing domain administrators to be very selective about which rights other user administrators are permitted.

Key IT Area: Manageability
Test Number: 4
Test Name: Domain Trusts (serv21)

Test Description

In our testing, we establish a trust between two Windows 2000 Server domains with separate DNS servers on each domain. We evaluate the setup between the two Windows 2000 Server domains and compare it with the process of creating a trust between two different Windows NT Server 4.0 domains. Next, we test domain trust creation between a Windows 2000 Server domain and a Windows NT Server 4.0 domain. Each domain in our setup consists of at least two Domain Controllers (DCs).

Features Having Impact

  • Active Directory Sites and Domain

Test Environment Notes

Server (Windows NT Server): Compaq Proliant 2500; Dual Pentium Pro 200 MHz; 480 MB RAM; RAID Level 5

Server (Windows 2000 Server): Compaq Proliant 1850R; Dual Pentium II 450 MHz; 256 MB RAM; RAID Level 0

Network: Switched Ethernet 10/100 Mbps

Test Assumptions and Notes

Setting up a two-way trust relationship in Windows 2000 Server is similar to setting up a two-way trust relationship in Windows NT Server 4.0. There are no obvious improvements in trust relationship configuration on Windows 2000 Server. Rather, the biggest improvement is the addition of organizational units in the new domain structure. New environments will require fewer domains. Where there are domain trusts required, the transitive nature of Windows 2000 Server trusts reduces the number and complexity of those trusts.
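The trust-count reduction can be quantified with simple combinatorics: a full mesh of explicit NT 4.0 one-way trusts among n domains needs n(n-1) links, while a Windows 2000 domain tree with transitive two-way trusts needs only n-1 parent-child links. A back-of-the-envelope sketch:

```python
# One-way trust links needed for full any-to-any access among n domains.

def nt4_complete_trust(n):
    # Every domain must explicitly trust every other domain, in both
    # directions, because NT 4.0 trusts are not transitive.
    return n * (n - 1)

def w2k_tree_trust(n):
    # Transitive two-way parent-child trusts: a tree of n domains
    # has n - 1 edges, and transitivity covers the rest.
    return n - 1

for n in (2, 5, 10):
    print(n, nt4_complete_trust(n), w2k_tree_trust(n))
# e.g. 10 domains: 90 explicit one-way trusts vs. 9 transitive links
```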

Test Results and Technical Analysis

Trust is established successfully on both environments with minor difficulty as noted below.

Test Notes:

Windows 2000 Server Environment

Since each of the domains in the Windows 2000 Server–based environment has its own DNS server and we use static IP addresses for the DCs in both environments, we have to manually insert a new host entry through the MMC DNS console on each domain for a successful trust relationship setup.

Windows NT Server 4.0 Environment

We do not have to adjust any DNS entries because the Windows NT Server 4.0 domain is initially set up to use the DNS service on the Windows 2000 server. Had we configured the Windows NT Server 4.0–based server as a DNS server, we would have required the same DNS additions as in the pure Windows 2000 Server environment.

Key IT Area: Manageability
Test Number: 5
Test Name: Disk Quota (serv22)

Test Description

Disk quota allows administrators to manage the amount of storage space users can access on a disk volume. Windows 2000 Server comes with disk quota ready out of the box; Windows NT Server 4.0 does not come with any disk quota management tool out of the box. For testing purposes, we acquired a third-party disk quota application for Windows NT Server 4.0, Quota Advisor.

The primary test objective is to determine whether the Disk Quota utility that comes with Windows 2000 Server executes as expected. Minimal comparison is done between Quota Advisor and Windows 2000 Server Disk Quota utility. We focus primarily on ease of use and manageability.

Features Having Impact

  • Disk Quota

Metric(s)

Windows 2000 Server and Windows NT Server 4.0

Quota Limit: 5 MB Storage space

Test Environment Notes

Servers: Compaq Proliant 2500; Dual Pentium Pro 200 MHz; 480 MB RAM; RAID Level 5; Windows 2000 Advanced Server and Windows NT Server 4.0 SP5

Network: Switched Ethernet 10/100 Mbps

Test Assumptions and Notes

Windows 2000 Server – Disk Quota is included.

Windows NT Server 4.0 – third-party software ("Quota Advisor") is used.

Test Results and Technical Analysis

The Disk Quota utility that comes with Windows 2000 Server helps a network administrator manage server storage in the enterprise. In some large environments, running out of server storage and managing storage data is a significant problem for network administrators. Even though the cost of storage space is decreasing, the downtime cost associated with running out of space can be significant. The ability to limit/manage storage space provides network administrators flexibility in determining the storage needs of users and processes based on the company's business objectives.
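The accounting being tested can be modeled very simply: each user's charged bytes on a volume are compared against a per-user limit before a write is accepted. This is a toy sketch; the 5 MB limit mirrors the test metric above, and the user name is invented:

```python
# Toy per-user disk quota accounting, mirroring the 5 MB limit in the test.
QUOTA_LIMIT = 5 * 1024 * 1024  # 5 MB, as in the test metric

usage = {}  # user -> bytes charged on the volume

def try_write(user, nbytes):
    """Allow the write only if it keeps the user within quota."""
    if usage.get(user, 0) + nbytes > QUOTA_LIMIT:
        return False  # quota exceeded; the OS would fail the write
    usage[user] = usage.get(user, 0) + nbytes
    return True

assert try_write("hr_user", 4 * 1024 * 1024)      # 4 MB: within quota
assert not try_write("hr_user", 2 * 1024 * 1024)  # would reach 6 MB: denied
```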

Key IT Area: Manageability
Test Number: 6
Test Name: Hierarchical Storage Management (HSM) (serv23)

Test Description

This is a test of the Remote Storage and Removable Media components of Windows 2000 Advanced Server combined to support HSM. This test is conducted with a Spectra 10000 as the removable storage device.

The objectives of the scenario are to verify that the conditions entered into the system for archiving data to removable media are executed, and that any archived files can be recovered without additional user tasks.

Similar software is not available in Windows NT Server 4.0; third-party components must be purchased. In addition, the mechanism used in Windows 2000 Server relies on a newer version of NTFS that is not compatible with Windows NT Server 4.0.

Features Having Impact

  • Removable Media

  • Remote Storage Services

Test Environment Notes

Server: Compaq Proliant 2500; Dual Pentium Pro 200 MHz; 480 MB RAM; RAID Level 5

Tape Drive: Spectra 10000 (Spectra Logic)

Network: Switched Ethernet 10/100 Mbps

Test Assumptions and Notes

The process for configuring Windows 2000 Advanced Server for HSM is relatively simple once the supported hardware is correctly configured. The Spectra 10000 is recognized by the Removable Storage snap-in and available media are shown in the Free Media pool. On first installation, this process takes quite a while as the media are cataloged for use.

The parameters applied to a managed disk are simple to understand. Once the parameters are configured, the system automatically archives files to the appropriate media and leaves a directory entry on the system so that the file can be accessed online as required.
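The managed-disk parameters amount to a simple rule: files over a size threshold that have not been accessed within some window are migrated to tape, leaving a stub that keeps the directory entry online. The sketch below is a toy model; the thresholds and file names are invented, not the product's defaults:

```python
import time

# Toy HSM pass: migrate large, stale files to remote storage, leaving a
# stub so the file still appears in the directory. Thresholds are invented.
STALE_DAYS = 90
MIN_SIZE = 1024 * 1024  # only bother migrating files over 1 MB

def hsm_pass(files, now=None):
    """files: {name: (size_bytes, last_access_epoch)} -> names migrated."""
    now = now or time.time()
    migrated = []
    for name, (size, atime) in files.items():
        if size >= MIN_SIZE and (now - atime) > STALE_DAYS * 86400:
            migrated.append(name)  # real system: copy to tape, write stub
    return migrated

now = time.time()
files = {
    "old_big.dat":   (5 * 1024 * 1024, now - 200 * 86400),  # stale and large
    "old_small.txt": (10 * 1024,       now - 200 * 86400),  # stale but small
    "fresh_big.dat": (5 * 1024 * 1024, now - 5 * 86400),    # recently used
}
print(hsm_pass(files, now))  # only the stale, large file is migrated
```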

When the user accesses the file, an animated dialog box appears indicating that the file is being read from tape. Access times are dependent on the device and media where the file is located and the position of the file on the media.

The process for discovering which files are archived is convoluted and requires a number of steps that should not be necessary. Since the solutions included with the Windows 2000 Server family are third party applications, it is fair to expect that functions in the HSM area will be extended by third party vendors shortly after release.

The system as delivered is fully functional. There are some limitations that may impact the utility of these services as delivered. The removable storage device must be attached to the same system as the managed disks. Again, this limitation may be resolved in third party products. The inclusion of these components in the Windows 2000 Server family demonstrates that the function can be implemented, and provides a useful implementation of HSM.

By contrast, HSM solutions for Windows NT Server 4.0 are few and difficult to implement. In addition, the new NTFS includes features that further enhance the ability to build HSM solutions that cannot be taken advantage of in Windows NT Server 4.0.

Test Results and Technical Analysis

The basic tests of functionality execute well. The only issue with the testing process is the method by which backed up files are determined. Although the current mechanism for this is difficult, this will not be a critical issue for most implementations.

Key IT Area: Manageability
Test Number: 7
Test Name: Printing and IPP (serv24)

Test Description

We test printer configuration on both Windows 2000 Server and Windows NT Server 4.0 SP5 platforms. We also explore Windows 2000 Server support for Internet Printing Protocol (IPP). The first phase of the testing involves print server setup on both platforms. We also review manageability of publishing print devices to the Active Directory and using the Internet Explorer web browser on the Windows 2000 Server platform. These tests are conducted within an intranet site and also through a remote access dial-up connection.

The second test phase involves printer configuration on Windows 2000 Professional- and Windows NT Workstation 4.0–based clients. Printer setup and connectivity are conducted on the client side to test new ways for clients to attach to print devices published in the Active Directory on a Windows 2000 Server–based environment.

During the third and final phase of our tests, we apply Group Policy in an OU to test printer availability and customization based on specific OUs.

Features Having Impact

  • Internet Printing Protocol

  • Internet Explorer

Test Environment Notes

Server: Compaq Proliant 1850R; Dual Pentium II 450 MHz; 256 MB RAM; RAID Level 0

Printer: Hewlett-Packard LaserJet 5si

Network: Switched Ethernet 10/100 Mbps

Test Assumptions and Notes

Internet Printing Protocol expands the options for how print resources are made available to end-users. Internet Printing Protocol, combined with Active Directory, increases the options for network administrators to manage print devices across an enterprise network. We see improvements in setting up print devices on the client side and in administering printers. We also explore the ability to manage the printer through a remote access session by using Internet Explorer. The ability to apply Group Policy Objects for printer location, settings, and configurations on a specific OU helps lower the time needed to manage and set up printers for clients, compared to setting up each client individually.

Manageability of print devices is greatly enhanced with the ability to publish print devices in the Active Directory where users can easily locate their assigned printer. Implementing the use of an Internet browser to manage print devices in the Active Directory makes the management and configuration functions more accessible, enhancing manageability.

It is easier to set up a print server in the Windows 2000 Server–based environment than in a Windows NT Server 4.0–based environment because of print driver and port availability. Specifically, it is easier to configure a printer connection from a Windows 2000 Professional–based client connecting to a Windows 2000 Server–based print server because the client has the ability to search the whole directory. Other methods are also added for clients to connect easily to print devices across the enterprise by using an Internet browser. What makes searching for printers even easier is the ability to apply a Group Policy Object to an OU for printer location selection, which narrows the printer selection process to the physical location of the client.
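The location-based narrowing can be sketched as a filter over printers published with a location attribute, where a Group Policy location string scopes the search. The printer names and locations below are invented for illustration:

```python
# Toy directory search: printers published with a Location attribute,
# filtered by the location string a Group Policy object assigns to the OU.
published_printers = [
    {"name": "HP5si-3F",  "location": "HQ/Floor3"},
    {"name": "HP5si-4F",  "location": "HQ/Floor4"},
    {"name": "LJ-Branch", "location": "Branch/Floor1"},
]

def find_printers(location_prefix=None):
    """No prefix: search the whole directory; with one: narrow to a site."""
    if not location_prefix:
        return [p["name"] for p in published_printers]
    return [p["name"] for p in published_printers
            if p["location"].startswith(location_prefix)]

assert len(find_printers()) == 3                   # whole-directory search
assert find_printers("HQ/Floor3") == ["HP5si-3F"]  # GPO-narrowed search
```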

Test Results and Technical Analysis

1A. Setup printer and print device in Windows 2000 Server as a print server:

We test the HP LaserJet 5si printer for the Windows 2000 Server–based environment. We create a new printer connection locally, configure a new TCP/IP printer port, and provide the setup with the printer's IP address and port name. We then select the appropriate printer driver for the printer device, give the print device a printer name, and share it to the "everyone" group. We also provide printer location and comment information during the printer setup.

Time:

1:55 minutes to set up on Windows 2000 Server

1B. Setup printer and print device in Windows NT Server 4.0 SP5 as a print server:

In a Windows NT Server 4.0–based environment, we have to install the DLC network protocol to support our test of the HP LaserJet 5si printer and reboot the server. After the server reboots, we add a new Hewlett-Packard port and set its hardware address that we acquired from printing a test page off the print device. We give the printer a name, select the necessary drivers to install and finish the installation by sharing it.

Time:

2:15 minutes to install the DLC protocol and reboot the server

 

1:35 minutes to set up the printer on Windows NT Server 4.0

2A. Printer connection to a Windows 2000 Server–based print server:

We test connectivity to a Windows 2000 Server–based print server from a Windows 2000 Professional–based client. We add a printer using the "add printer" wizard and choose the network printer option. We specify to look for print devices in the entire directory and choose the printer options appropriate for the client's location.

We test another printer setup method by providing the URL of the print device published in the Active Directory. On the client side, we launch the add-printer setup wizard, supply the given URL address, and are able to connect immediately.

We also test connecting to a print device through Internet Explorer by using the specified URL. The URL address is entered into the browser and the printer properties page comes up. From the web page, we choose the connect option and the print device is set up automatically.

Time:

0:43 seconds to set up by searching the directory

 

0:24 seconds to set up using the supplied URL

 

0:15 seconds to set up using Internet Explorer

Result:

Succeeds

2B. Printer connection to a Windows NT Server 4.0–based print server:

We test connectivity to a Windows NT Server 4.0–based print server from a Windows 2000 Professional–based client. We add a printer using the "add printer" wizard and choose the network printer option. The setup requires us to provide the network printer name or browse the network for the printer. The user then needs to know the network printer name of the print device to choose it in the selection. As soon as the user knows what the printer name is, setup can be completed and the printer can be used.

Time:

1:12 minutes to set up to the Windows NT Server 4.0 print server.

Result:

Succeeds

3. Applying group printer policy to an OU:

We create a new Group Policy Object for an OU and set up printer location and printer search location text on the Group Policy Object. We then test one of the users under that OU by logging on to a Windows 2000 Professional–based client and creating a new printer connection with the printer connection wizard. By entering the location of the test user, the search narrows to the available print devices in that particular location instead of the whole directory.

Result: Succeeds

Key IT Area: Manageability
Test Number: 8
Test Name: Terminal Services (serv25)

Test Description

This is a test of Terminal Services on Windows NT Server 4.0, Terminal Server Edition and Windows 2000 Advanced Server. The Terminal Server feature of Windows provides a thin client solution in which Windows–based applications are executed on the server side and remotely displayed on the client workstation. To test this feature, we choose Office 2000 as our test application. We submit basic requests to the servers, such as opening, editing, and saving new files, from the following platforms:

  • Windows 3.11

  • Windows 95

  • Windows NT Workstation 4.0

  • Windows 2000 Professional

  • Windows CE

In the Windows 2000 Server–based environment, we also test multilingual/multi-language functionality. The multilingual feature enables users to read and write in a user-specified language, and the multi-language feature provides options to display operating system menus and dialogs in the specified language.

Two servers are configured with Terminal Services. Windows NT Server 4.0, Terminal Server Edition is installed on a Dell PowerEdge 4350 with a single 450 MHz Pentium II processor and 250 MB of RAM. A Compaq Proliant 2500 with dual 200 MHz Pentium Pro processors and 450 MB of RAM is also configured as a dual-boot server to test the multi-language/multilingual features.

Features Having Impact

  • Terminal Services

Test Assumptions and Notes

This test does not address performance or scalability issues regarding Terminal Services in Windows NT Server 4.0 or Windows 2000 Server.

Test Procedure/Steps

1. Windows NT Server 4.0, Terminal Server Edition with Service Pack 4

Terminal Services functionality in the Windows NT Server 4.0 family is not offered as an add-on or a native feature. Instead, a separate copy of Windows NT Server 4.0, Terminal Server Edition is required to use terminal services. In essence, Terminal Server Edition is a separate operating system in the Windows NT Server family.

Once installed, the Terminal Server version of Windows NT Server 4.0 behaves very similarly to its other family members. It includes the standard Windows NT Server 4.0 features, in addition to the following:

  • Terminal Server Administration. Manages and monitors users, sessions and processes on any Terminal Server on the network.

  • Terminal Server Client Creator. Creates 16-bit Windows for Workgroups and 32-bit Windows 95–/Windows NT–based clients.

  • Terminal Server Connection Configuration. Manages the terminal server connections that provide the links clients use to log on to the server.

  • Terminal Server License Manager. Administers and manages licenses stored on the Microsoft License Server.

Terminal Server Clients: Windows NT Server 4.0

| Operating System           | Hardware Description            |
|----------------------------|---------------------------------|
| Windows 3.11               | 486DX-100 MHz, 16 MB RAM        |
| Windows 95                 | Pentium 233 MHz, 64 MB RAM      |
| Windows NT Workstation 4.0 | Pentium III 500 MHz, 265 MB RAM |
| Windows 2000 Professional  | Pentium 233 MHz, 64 MB RAM      |
| Windows CE                 | NCD ThinStar                    |

From each terminal client we create and save a new file and then disconnect the terminal session. We reconnect to the server to edit the saved file. The server responds to all 32-bit clients with reasonable performance and supports multiple terminal connections. The Windows CE device and the Windows 3.11–based system support only one connection to the server per client.

In addition, in the Windows 3.11–based environment, the terminal connection to the server periodically drops or freezes.

Terminal Server Client Application Results: Windows NT Server 4.0

| Client Operating System    | Access 2000 | Excel 2000 | PowerPoint 2000 | Word 2000 |
|----------------------------|-------------|------------|-----------------|-----------|
| Windows for Workgroups     | Yes         | Yes        | Yes             | Yes       |
| Windows 95                 | Yes         | Yes        | Yes             | Yes       |
| Windows NT Workstation 4.0 | Yes         | Yes        | Yes             | Yes       |
| Windows CE                 | Yes         | Yes        | Yes             | Yes       |
| Windows 2000 Professional  | Yes         | Yes        | Yes             | Yes       |

2. Windows 2000 Advanced Server

In Windows 2000 Advanced Server, Terminal Services is available as an optional component. Similar to Windows NT Server 4.0 SP4, Terminal Server Edition, Windows 2000 Advanced Server with Terminal Services installed offers remote clients complete access to Windows-based programs running on the server. The Terminal Services option can be enabled during installation of the operating system or installed later as an additional component. To manage Terminal Services, Windows 2000 Server provides the following utilities:

  • Terminal Server Client Creator – Creates 16-bit Windows for Workgroups and 32-bit Windows 95–/Windows NT–/Windows 2000–based clients.

  • Terminal Services Configuration – Creates new connections for Terminal Services, modifies settings of existing connections, and deletes connections.

  • Terminal Server Licensing - Manages licenses stored on the Microsoft License Server.

  • Terminal Server Manager – Views information about and manages Terminal Services, including all sessions and processes, for all Terminal Servers.

Windows 2000 Advanced Server supports the same connection platforms as the Windows NT Server 4.0, Terminal Server Edition environment: Windows 3.11, Windows 95, Windows NT, Windows CE, and Windows 2000 Professional.

Terminal Server Clients: Windows 2000 Advanced Server

| Operating System           | Hardware Description            |
|----------------------------|---------------------------------|
| Windows 3.11               | 486DX-100 MHz, 16 MB RAM        |
| Windows 95                 | Pentium 233 MHz, 64 MB RAM      |
| Windows NT Workstation 4.0 | Pentium III 500 MHz, 265 MB RAM |
| Windows 2000 Professional  | Pentium 233 MHz, 64 MB RAM      |
| Windows CE                 | NCD ThinStar                    |

The connections to the Windows 2000 Advanced Server yield results similar to those in the Windows NT Server 4.0–based environment, and we experience the same issue with the Windows 3.11–based client.

Another phase of this test is validation of the multilingual/multi-language features over a Terminal Services connection. With these features enabled, the Windows 2000 Server operating system can display its GUI in the language the user selects from the installed options.

Similar to the Windows NT Server 4.0–based environment we create and save a new file, and then disconnect the terminal session. We then reconnect to the server to edit the saved file.

Terminal Server Client Application Results: Windows 2000 Advanced Server

| Client Operating System    | Access 2000 | Excel 2000 | PowerPoint 2000 | Word 2000 |
|----------------------------|-------------|------------|-----------------|-----------|
| Windows for Workgroups     | Yes         | Yes        | Yes             | Yes       |
| Windows 95                 | Yes         | Yes        | Yes             | Yes       |
| Windows NT Workstation 4.0 | Yes         | Yes        | Yes             | Yes       |
| Windows CE                 | Yes         | Yes        | Yes             | Yes       |
| Windows 2000 Professional  | Yes         | Yes        | Yes             | Yes       |

Test Results and Technical Analysis

Terminal Services on Windows NT Server 4.0, Terminal Server Edition and Windows 2000 Advanced Server works cleanly and without noticeable errors. Connection instability is limited to the Windows 3.11–based client environment.

We are able to successfully use the Terminal Services clients in both Server environments to perform basic functionality described above. The implementation of Terminal Services allows legacy hardware to take advantage of new Windows-based applications such as Office 2000.

An added feature tested in Windows 2000 Server is multi-lingual/language. By displaying the operating system in the user's native language and allowing the user to input and display information in the required text format, Windows 2000 Server services the needs of many different types of users in today's global business environment. For example, based on the user logging on to the system, the GUI is displayed in the appropriate language and the application input and display may be presented in the user's native character format.

With the exception of multilingual/multi-language functionality in Windows 2000 Server, there is no noticeable difference in the implementation of Terminal Services features in Windows NT Server 4.0 versus Windows 2000 Server. We note that Terminal Services is a built-in component in the Windows 2000 Server family, whereas Windows NT Server 4.0, Terminal Server Edition is a specific operating system in the Windows NT Server family that must be acquired and implemented separately.

Key IT Area: Manageability

Test Number: 9

Test Name: Backup Comparison

Test Description

This is a test of the backup utility that comes with Windows 2000 Server compared to the Windows NT Server 4.0 backup utility. The tests are conducted on both platforms running on the same server configuration. The backup device used is a DEC DLT 2000 15/30 GB SCSI device. The scenario objective is to back up the IIS server to a remote share within the local area network (LAN) and restore it to its original location.

The test highlights the limitations of the Windows NT Server 4.0 backup utility, including its inability to back up files to media other than tape drives.

Features Having Impact

  • Backup Wizard

Metric(s)

Windows 2000 Server

  • 8:56 to back up to DLT

  • 8:24 to back up to remote disk storage

  • 7:57 to restore from DLT

  • 1:38 to restore from remote disk storage

Windows NT Server 4.0

  • 7:12 to back up to DLT

  • 5:55 to restore from DLT

Test Environment Notes

Server: Compaq Proliant 2500; dual Pentium Pro 200 MHz; 480 MB RAM; RAID level 5

DLT: DEC DLT 2000 15/30 GB SCSI

Network: Switched Ethernet 10/100 Mbps

Test Assumptions and Notes

The Windows 2000 Server backup process is more flexible because it allows administrators to back up files to media types other than tape. We are able to run backup from any machine, even client machines, and save the backup file (*.bkf) to a remote server or an external mass storage device.

One major improvement in Windows 2000 Server is the ability to select backup media other than tape devices. A typical scenario is running backup on servers that have no access to a backup device: the backup files are stored locally on those servers, copied to a server that does have access to a tape device, and then backed up to tape. This may not be more efficient than a regular network server backup, but it gives backup administrators more flexibility, depending on their environment configuration.
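The staged-backup workflow described above can be sketched in a few lines. This is illustrative only — the directory names are placeholders and Python stands in for whatever scripting an administrator would use; the only assumption taken from the report is that Windows 2000's backup output is an ordinary *.bkf file that can live on any disk:

```python
import shutil
from pathlib import Path

def stage_backups(local_dir, tape_server_share):
    """Copy locally written .bkf backup files to a share on a server
    that has a tape device attached (hypothetical paths). The point is
    that the backup output is a plain file that can be staged anywhere
    before it reaches tape."""
    dest = Path(tape_server_share)
    dest.mkdir(parents=True, exist_ok=True)
    staged = []
    for bkf in Path(local_dir).glob("*.bkf"):
        shutil.copy2(bkf, dest / bkf.name)   # copy2 preserves timestamps
        staged.append(bkf.name)
    return sorted(staged)
```

On the tape-equipped server, a scheduled backup job would then sweep the staging share to tape.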

Test Results and Technical Analysis

The backup and restore tests are successful; times vary depending on platform and media type. Desktop users are able to perform their own backups and may opt to save them locally, to remote storage such as a home drive, or to external devices such as Zip drives, Jaz drives, LS drives, and so forth.

Key IT Area: Manageability

Test Number: 10

Test Name: Secondary Logon

Test Description

This is a test of the secondary logon function of Windows 2000 Server, in which we perform privileged operations from the same logon session. This function helps protect the operating system from accidental administrative errors and virus attacks. We compare the time required to perform comparable tasks on the Windows NT Server 4.0 platform. There are multiple ways of running administrative tasks in Windows NT Server 4.0, but Windows 2000 Server provides a single-step process.

We are able to run administrative tasks while logged on with a normal user account to a Windows 2000 Server–based server from a Windows 2000 Professional–based client. Being able to move around a server and browse files with a normal user account helps secure files from accidental deletion and from access by viruses. We measure the time taken to log off and log back on to the system to run a specific task or launch a system service, compared with the administrator managing the system locally or remotely with third-party utilities.

Features Having Impact

  • Secondary logon service

Test Environment Notes

Server: Compaq Proliant 2500; dual Pentium Pro 200 MHz; 480 MB RAM; RAID level 5

Network: Switched Ethernet 10/100 Mbps

Test Assumptions and Notes

The threat of a virus attack on a server is a high-risk situation for domain administrators. Some viruses have the ability to reformat a disk, create new user accounts with administrative rights, and delete important files, which can completely stop a production machine and, ultimately, cause company downtime. Windows 2000 Server gives domain administrators the ability to create a non-administrative logon account, with less privilege than their normal logon account, and choose the "run as" option when they need to perform administrative tasks. Although a comparable task can be done in a Windows NT Server 4.0–based environment by logging off and logging back on with an administrative account, the Secondary Logon service makes server administration more efficient.
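Windows 2000 also exposes the Secondary Logon service through a `runas` command, the command-line counterpart of the shift+right-click "run as" option. The helper below is a sketch only — the account name is a placeholder — and by default it merely builds the invocation, since `runas` prompts for the password itself:

```python
import subprocess

def run_as(command, account="DOMAIN\\admin", dry_run=True):
    """Build (and optionally launch) a `runas` invocation that asks the
    Secondary Logon service to start `command` under `account` (a
    placeholder name). `runas` prompts for the password interactively,
    so no credential appears on the command line."""
    argv = ["runas", f"/user:{account}", command]
    if dry_run:          # keeps the sketch runnable off a Windows box
        return argv
    return subprocess.run(argv).returncode
```

On a real Windows 2000 machine, `run_as("mmc.exe", account="CORP\\ops", dry_run=False)` would start the console under the privileged account.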

Test Results and Technical Analysis

1. Test Secondary Logon in Windows 2000 Server:

We start the Secondary Logon service on the Windows 2000 Server and set the service startup type to automatic. We then log on as a user (not an administrator-level logon), browse the files, and attempt to add new hardware in Control Panel. The user receives an error message because he does not have the privileges required to add hardware. We then test the secondary logon feature by highlighting the Add/Remove Hardware icon and pressing Shift + right-click to execute the program with the "run as" option. We provide the proper administrator account and password and are able to add new hardware.

Result: Succeeds in running administrative tasks with Secondary logon.

2. Running Administrative Tasks in Windows NT Server 4.0:

In a Windows NT Server 4.0–based environment, we log on as the same user to browse the server and view the services running on it. On the Windows NT Server 4.0 platform, the non-administrative user must log off and log back on with a user ID that has the proper privileges in order to add any new hardware devices to the server or perform any other administrative tasks.

Result: Regular users must log off and log back on to run administrative tasks.

AVAILABILITY

Summary

Significant improvements in features and functionality incorporated into Windows 2000 Server–based servers offer higher levels of availability than Windows NT Server 4.0–based servers. These improvements include tools and features that help minimize the number of reboots and increase system uptime.

  1. Plug and Play – This functionality has long been missing from Windows NT Server 4.0. With Windows 2000 Server, Plug and Play support is enabled. For example, when a new network interface card is inserted in a Windows 2000 Server–based machine, the operating system automatically recognizes it and installs the necessary drivers without requiring a reboot.

  2. Kill Process – This function addresses the issue of misbehaving applications that spawn multiple processes, taking up valuable system resources and slowing down the system. Windows 2000 Server allows the IT administrator to "end process tree", which not only kills the single process, but also any processes created by that parent application, without rebooting. Windows NT Server 4.0 does not offer this functionality.

  3. Distributed File System (DFS) – A Distributed File System is a network server component that makes it easier for network administrators to find and manage data on their network. The fault tolerance feature in the Windows 2000 Server implementation is a significant advantage over the Windows NT Server 4.0 implementation.

  4. Quality of Service (QoS) – Windows 2000 Server QoS enables control of the allocation of network resources to user and application traffic. Hence, a mission-critical function needed to finish a task can be granted higher bandwidth, on a temporary basis, to ensure availability. The addition of QoS functionality in Windows 2000 provides easier administration of Network Performance Parameters.

Key IT Area: Availability

Test Number: 1

Test Name: Plug and Play

Test Description

This test scenario involves adding an additional network interface card (NIC) to both of our Microsoft SQL Server–based servers. We test on both the Windows 2000 Server and Windows NT Server 4.0 with SP5 platforms and track the number of reboots required to complete the installation process.

We expect Windows 2000 Server to require fewer reboots when changes are made to the system. We also note whether the service pack requires reapplication after a new NIC is installed on the system.

Features Having Impact

  • High availability and ease of use

Test Assumptions and Notes

Plug and Play testing is limited to a network interface card. We are not able to test Windows 2000 Server service pack slipstreaming because no service packs for Windows 2000 Server are available for testing.

Test Procedure/Steps

1. Windows 2000 Server testing:

We bring down the server and install an additional 3Com NIC. When the server is brought back up, the card is automatically recognized and the appropriate driver is loaded. We then configure the network settings and the bindings.

Result: No reboot required after NIC driver install.

2. Windows NT Server 4.0 SP 5 testing:

We run the same test on a Windows NT Server 4.0–based server by installing an additional NIC. After installing the NIC driver and configuring network settings, we are required to reboot the system. When the server comes back up, we must reapply Service Pack 5, and then restart the server again after completing the service pack installation.

Result: Reboot required after driver install.

Reboot required after service pack install.

Test Results and Technical Analysis

The Plug and Play capability of Windows 2000 Server is a significant benefit. With fewer required reboots and Plug and Play support, production servers should be taken offline less often and for shorter periods under Windows 2000 Server than under Windows NT Server 4.0. Although service pack slipstreaming is not tested, based on how service packs are handled on Windows NT Server 4.0, we expect a dramatic reduction in system downtime if service packs do not require reapplication.

Key IT Area: Availability

Test Number: 2

Test Name: Kill Process Tree

Test Description

This test addresses system availability when an application capable of spawning multiple processes misbehaves and renders the server susceptible to crashes. The test uses Internet Information Server (IIS) on the Windows 2000 Server and Windows NT Server 4.0 platforms to run application requests programmed to run in separate processes. Multiple application requests are generated using an ISAPI extension in IIS, and the "end process tree" functionality is tested in Windows 2000 Server.

Features Having Impact

  • Kill Process Tree

Test Environment Notes

Server: Compaq Proliant 2500; dual Pentium Pro 200 MHz; 480 MB RAM; RAID level 5

Network: Switched Ethernet 10/100 Mbps

Test Assumptions and Notes

We use IIS as our test application. Using a sample ISAPI extension provided by Microsoft, we are able to create multiple concurrent connections.

Test Results and Technical Analysis

1. Windows 2000 Server 'kill process tree' testing:

To test the ISAPI extension within IIS, Microsoft supplied us a sample ISAPI extension for testing purposes. We copy the IIS sample files to the IIS server and connect to the server using Internet Explorer from our client test machines. We run five concurrent connections and launch one application for each connected user. We first try to end Inetinfo.exe with just the "end process" option, and the spawned applications are not stopped. We then run the same test with five concurrent users connected to the IIS server, each requesting an application process, launch Task Manager on the IIS server, and end Inetinfo.exe with the "end process tree" option; this time the spawned applications are terminated as well.

Result: Spawned applications ended successfully.

2. Windows NT Server 4.0 end process testing:

We use the same Microsoft-provided ISAPI extension for IIS in a Windows NT Server 4.0–based environment. We copy the IIS sample files to the IIS server and connect to the server using Internet Explorer from our client test machines. We run five concurrent connections and launch one application for each connected user. We then try to end Inetinfo.exe with the "end process" option, but the spawned applications do not end.

Result: Failed due to lack of functionality.

IT administrators can now end an errant application that spawns separate processes on a Windows 2000 Server–based server. An application that spawns multiple processes can degrade system performance once system resources reach their limits and applications are unable to function properly. In Windows 2000 Server, choosing "end process tree" in Task Manager ends such an application along with all of its processes.
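Task Manager's "end process tree" is a GUI action, but the underlying idea — terminate the parent and everything it spawned — can be sketched in code. The POSIX process-group version below is an analogue for illustration only, not the Windows implementation:

```python
import os
import signal
import subprocess

def spawn_tree():
    """Start a parent shell that spawns a child sleeper in its own
    process group -- a stand-in for an app that spawns worker processes."""
    return subprocess.Popen(["sh", "-c", "sleep 60 & wait"],
                            start_new_session=True)

def end_process_tree(proc):
    """Rough analogue of Task Manager's 'end process tree': signal the
    whole process group rather than just the parent PID, so any spawned
    children are terminated too."""
    os.killpg(os.getpgid(proc.pid), signal.SIGTERM)
    proc.wait()
```

Signaling only `proc.pid` would mirror the plain "end process" option, which leaves the spawned children running — the behavior observed on Windows NT Server 4.0.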

Key IT Area: Availability

Test Number: 3

Test Name: Distributed File System

Test Description

We test the Distributed File System (DFS) on both the Windows 2000 Server and Windows NT Server 4.0 platforms. DFS is a network server component that makes it easier for network administrators to find and manage data on their network. The test involves a number of shared volumes that exist on different servers, as well as shares located on a trusted domain.

The Distributed File System is a beneficial tool for managing data shared across an enterprise network. We expect improvements in the Windows 2000 Server version of DFS because of its fault tolerance feature: a DFS root can be replicated to other DFS servers, providing high availability even if one of the DFS servers becomes unavailable.

Features Having Impact

  • Distributed File System

Test Environment Notes

Servers: Compaq Proliant 2500; dual Pentium Pro 200 MHz; 480 MB RAM; RAID level 5; Windows 2000 Advanced Server and Windows NT Server 4.0 with SP5

Network: Switched Ethernet 10/100 Mbps

Test Assumptions and Notes

In a Windows NT Server 4.0–based environment, we are able to connect to all the shared volumes located across the network, while the real locations of the shares remain transparent to users. During the Windows 2000 Server testing of the DFS fault-tolerant root, one of the servers is brought down, and an approximately five-second delay occurs before the system recovers and continues accessing files from the DFS volume.

Test Results and Technical Analysis

Microsoft's inclusion of DFS makes it easier to manage shared data on a network. It is a useful tool for creating different DFS root volumes for specific groups in an enterprise environment. For instance, administrators no longer need to create logon scripts to map users to multiple shared drives on the network; the administrator simply creates a DFS root for those users and maps them to that single DFS root volume. Furthermore, the ability to create a fault-tolerant DFS root, a feature added in Windows 2000 Server, means that data remains available even if one of the DFS root servers fails or is shut down for maintenance. All tests are successful on the specified platforms.
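From a client's point of view, a fault-tolerant DFS root behaves like an ordered list of replicas to fall back through when one server is down. A minimal sketch of that behavior follows; the UNC-style paths and server names are hypothetical, and this models the observable failover, not the DFS protocol itself:

```python
def read_from_dfs(relative_path, replica_roots, open_file=open):
    """Read a file through a list of replicated DFS roots, falling back
    to the next replica when one is unreachable -- modeling the recovery
    seen in the fault-tolerant root test. Paths are placeholders."""
    last_error = None
    for root in replica_roots:
        try:
            with open_file(f"{root}/{relative_path}") as f:
                return f.read()
        except OSError as err:
            last_error = err      # replica down; try the next one
    raise last_error

# e.g. read_from_dfs("reports/q3.doc",
#                    [r"\\server1\dfsroot", r"\\server2\dfsroot"])
```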

Key IT Area: Availability

Test Number: 4

Test Name: Quality of Service (QoS)

Test Description

Our QoS testing includes two client machines running Windows 2000 Professional: one machine is configured as the sender of data and the other as the recipient. The receiving machine is a Dell Precision 610 workstation. The sending machine (a Gateway) is connected directly to our 3Com CoreBuilder 3500 switch.

We install QoS Admission Control on one of our Domain Controllers (DC) and directly connect it to our 3Com switch. The DC we use is a Compaq Proliant 1850R configured with dual Intel 450 MHz Pentium II processors and 256 MB memory. All of our test machines are equipped with 10/100 Ethernet PCI Network Interface Card (NIC) adapters.

Different users and/or applications have different requirements regarding the handling of their data as it is transferred on the network. QoS is able to control the allocation of network resources to user and application traffic in a manner that meets the demand.

Features Having Impact

  • Quality of Service

Test Environment Illustration

[Figure: QoS test environment diagram omitted]

Test Assumptions and Notes

We are able to successfully maintain our NetMeeting session by configuring guaranteed network bandwidth through QoS. The QoS Admission Control snap-in is easy to understand and manage. A timing comparison between the tests is not meaningful because, without dedicated network bandwidth, the NetMeeting session simply freezes and we end the session after it fails.

As the demand for development and deployment of interactive communication, video conferencing, real-time audio, and other bandwidth-hungry applications increases, there is a specific need to control the traffic generated on the network. The addition of QoS functionality in Windows 2000 Server should provide easier administration of network performance parameters to meet this demand.
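The guarantee behind such an admission-control grant is conventionally modeled as a token bucket: a flow may transmit only while it has accumulated enough credit at its reserved rate. The sketch below illustrates that mechanism in general; it is not the Windows 2000 implementation, and the rates in the comments are arbitrary:

```python
class TokenBucket:
    """Minimal token-bucket shaper: tokens accrue at `rate` bytes/sec up
    to a `burst` ceiling, and a packet is admitted only if enough tokens
    are available. This models the per-flow data-rate and peak-rate
    limits an admission-control service enforces; it is an illustrative
    sketch, not the actual Windows QoS code."""
    def __init__(self, rate, burst):
        self.rate, self.burst = rate, burst
        self.tokens, self.last = burst, 0.0

    def allow(self, nbytes, now):
        # Refill for elapsed time, cap at burst, then spend if possible.
        self.tokens = min(self.burst,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False
```

A flow held to, say, 1,000 bytes/sec can burst briefly, is throttled once its credit is spent, and regains capacity as time passes.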

Note: We do not run any comparison tests on the Windows NT Server 4.0 platform because it lacks any similar QoS functionality.

Test Results and Technical Analysis

1. NetMeeting Session without QoS functionality:

We initiate a NetMeeting session between two client systems, with one system designated as the sender and the other as the receiver. We share a Microsoft Word application and collaborate with each other. For our testing, we acquire a network noise generator from Microsoft and run it from another machine on the network. We point the noise generator at the sending machine, disrupting the NetMeeting session. Nothing can be done in the session until we stop the network noise generator.

Result: NetMeeting session failed.

2. NetMeeting Session with QoS functionality:

We configure our QoS server to provide a "guaranteed service level" to the organizational unit (OU) initiating a NetMeeting session. We increase the allowable flow limits and aggregate limits to 10 Mbits/sec on both data rate and peak data rate.

We re-test our NetMeeting session between the sending and receiving machines. We then run the noise generator from another machine in our network to communicate to the sending machine. The NetMeeting session is not disrupted or noticeably delayed.

Result: NetMeeting session carried on without disruption.

SCALABILITY AND PERFORMANCE

Summary

Scalability and performance improvements in Windows 2000 Server allow businesses to support more users and a wider range of applications with fewer machines. Features such as the Enterprise Memory Architecture and extended symmetric multiprocessing support allow a server the flexibility to grow along with the business.

  1. Enterprise Memory Architecture (EMA) – Windows NT Server 4.0, Enterprise Edition has a memory configuration limit of 4 gigabytes of physical memory, compared to Windows 2000 DataCenter Server, which can utilize up to 64 gigabytes of physical memory. The results of the testing show that the additional memory available under Windows 2000 Server is better utilized, thus improving application performance.

  2. Extended Symmetric Multi-Processing (SMP) Support – Windows 2000 DataCenter Server extends the maximum number of licensed processors supported in a server to 32, compared to the maximum of 8 licensed processors supported by Windows NT Server 4.0, Enterprise Edition.

  3. File Server Performance – Our test shows a 6% improvement in average response time under Windows 2000 Server compared to Windows NT Server 4.0. We find that Windows 2000 Server is at least as good a file server as Windows NT Server 4.0 on existing hardware.

  4. Network Load Balancing (NLB) – Load balancing exists in both Windows NT Server 4.0 and Windows 2000 Server. NLB in Windows 2000 Server proves easier to manage and configure, with fewer reboots and a simpler interface, but there is little difference in the underlying technology.

Key IT Area: Scalability and Performance

Test Number: 1

Test Name: Enterprise Memory Architecture (EMA)

Test Description

The objective of this test is to compare the time required to execute a large SQL query against a large Microsoft SQL Server 7.0 database running on Windows 2000 DataCenter Server with the time required to execute the same query on Windows NT Server 4.0, Enterprise Edition. The database used is an extended Northwind database, created by adding a table containing 700,000 8-KB records.

Features Having Impact

  • Enterprise Memory Architecture (EMA) Support

Metric(s)

Time: total time to execute query

Test Environment Notes

Server:

  • Hewlett-Packard NetServer LH 3r (pre-production)

  • 8 GB RAM

  • Four 500 MHz Pentium III Xeon processors

  • Hewlett-Packard NetRAID controller with 16 GB cache

  • One 63 GB RAID0 array using eight 9 GB drives for data files

  • One 63 GB RAID5 array using eight 9 GB drives for transaction log files

Test Assumptions and Notes

The test database is created by inserting records into the Northwind database (the sample database created during the installation of SQL Server). We extend the database to a size of 6 gigabytes so that it can be loaded entirely into memory under Windows 2000 DataCenter Server but is too large to load into memory under Windows NT Server 4.0, Enterprise Edition. We configure the test server for dual boot so that either Windows NT Server 4.0, Enterprise Edition or Windows 2000 DataCenter Server can be booted to access the same database files. We run a query that performs an outer join of two tables, selecting thousands of records.

Test Procedure/Steps

  1. Restore the Northwind database from the backup of the original database created at the installation of SQL Server.

  2. Increase the database file to the size specified in the test.

  3. Increase the log file to the size specified in the test.

  4. Start the performance monitor.

  5. Execute the SQL script to create the new suppliers_detail table.

  6. Create an index on the new suppliers_detail table.

  7. Run the SQL script that performs the test query.
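The shape of steps 5 through 7 can be sketched with a small timing harness. Here sqlite3 stands in for SQL Server 7.0, the table names echo the test's suppliers_detail table, and the schema and row counts are illustrative only — this reproduces the method (build, index, time an outer join), not the test's data or results:

```python
import sqlite3
import time

def build_sample_db(n_detail=1000):
    """Tiny stand-in for the extended Northwind database: a parent table
    plus an indexed detail table (schema and sizes are illustrative)."""
    con = sqlite3.connect(":memory:")   # all-in-memory, as under EMA
    con.execute("CREATE TABLE suppliers (id INTEGER PRIMARY KEY)")
    con.execute("CREATE TABLE suppliers_detail "
                "(supplier_id INTEGER, payload TEXT)")
    con.execute("CREATE INDEX ix_detail ON suppliers_detail (supplier_id)")
    con.executemany("INSERT INTO suppliers VALUES (?)",
                    [(i,) for i in range(10)])
    con.executemany("INSERT INTO suppliers_detail VALUES (?, ?)",
                    [(i % 10, "x" * 8) for i in range(n_detail)])
    return con

def run_ema_style_query(con):
    """Time an outer join between the parent and detail tables, the kind
    of query the EMA test measures."""
    start = time.perf_counter()
    rows = con.execute(
        "SELECT s.id, d.payload FROM suppliers AS s "
        "LEFT OUTER JOIN suppliers_detail AS d ON d.supplier_id = s.id"
    ).fetchall()
    return len(rows), time.perf_counter() - start
```

Repeating the timing on a memory-resident versus disk-resident database is what exposes the EMA difference the test reports.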

Test Results and Technical Analysis

The EMA Test Results table shows the results obtained from the test. These results are represented graphically in the Windows NT Server 4.0, Enterprise Edition EMA Test Performance Monitor Chart and the Windows 2000 DataCenter EMA Test Performance Monitor Chart.

EMA Test Results

| Operating System      | Query Execution Time | Average CPU Utilization | Average Disk Utilization |
|-----------------------|----------------------|-------------------------|--------------------------|
| Windows NT Server 4.0 | 1 min. 30 sec.       | 35%                     | 100%                     |
| Windows 2000 Server   | 30 sec.              | 20%                     | 2%                       |

[Figure omitted] Windows NT Server 4.0, Enterprise Edition EMA Test Performance Monitor Chart

[Figure omitted] Windows 2000 DataCenter EMA Test Performance Monitor Chart

These results show that the additional memory available under Windows 2000 Server allows SQL Server to access data from memory rather than from disk, yielding a three-fold performance improvement in this test.

Key IT Area: Scalability and Performance

Test Number: 2

Test Name: Extended Symmetric Multi-Processing (SMP) Support

Test Description

Windows 2000 DataCenter Server extends the maximum number of processors supported in a server to 32, compared to the maximum of eight supported by Windows NT Server 4.0, Enterprise Edition. In practice, little performance is gained by going beyond four processors due to the SMP affinity mechanism used in the Windows NT Server 4.0 kernel. As a result, the three leading hardware manufacturers for Windows NT Server (Compaq, IBM, and Hewlett-Packard) have not offered servers with more than four processors for that market segment. The SMP affinity mechanism has been rewritten in Windows 2000 DataCenter Server to enable effective use of larger-scale SMP servers and to achieve better performance from smaller-scale SMP servers. Our tests confirm the performance improvements on smaller-scale SMP servers, and we believe the same improvements will enable scalability to larger-scale SMP servers when they become available. The leading hardware manufacturers are now announcing new server models with support for more processors in the Windows NT Server hardware market.

Features Having Impact

  • Extended Symmetric Multi-Processing Support

Metric(s)

Throughput: Transactions per second

Test Environment Notes

Client

  • Four processor Pentium Pro 200 megahertz (generic)

  • 512 megabytes RAM

  • One 22 gigabyte RAID0 array using 12.2 gigabyte drives and one controller

  • 1 switched 100 megabit per second network card in full duplex mode

Application

  • Four processor Pentium III Xeon 500 megahertz (generic)

  • 2 gigabytes RAM

  • One 54 gigabyte RAID0 array using 7-9 gigabyte drives and one controller

  • 1 switched 100 megabit per second network card in full duplex mode

Database

  • Compaq Proliant 6500 four processor Pentium Pro 200 megahertz

  • 3 gigabytes RAM

  • One 315 gigabyte RAID0 array using 35 9-gigabyte disks and five controllers

  • 1 switched 100 megabit per second network card in full duplex mode

Test Assumptions and Notes

We compare the number of transactions per second processed by an Internet Information Server (IIS) based application using Microsoft Transaction Server on each operating system. We use the Ibank Active Server Page (ASP) application from the Microsoft WinDNAperf toolkit. The application performs debits on bank accounts with each transaction requiring three updates and one insert. The environment is a three-tier (client, application server, database server) architecture. The client tier drives the transactions by sending them to the ASP application on the middle tier IIS/MTS Server. The transactions are processed on the middle tier resulting in updates and inserts being sent to the SQL Server 7.0 database tier. Since the application tier is the bottleneck in this environment, we focus analysis on that tier.
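The per-transaction work described above can be sketched in miniature. The following uses Python's sqlite3 as a stand-in for SQL Server 7.0; the table and column names are illustrative assumptions, not taken from the WinDNAperf toolkit.

```python
# Conceptual sketch of an Ibank-style debit transaction: three updates and
# one insert per transaction. sqlite3 stands in for SQL Server 7.0, and all
# table/column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance REAL)")
cur.execute("CREATE TABLE branch (id INTEGER PRIMARY KEY, balance REAL)")
cur.execute("CREATE TABLE teller (id INTEGER PRIMARY KEY, balance REAL)")
cur.execute("CREATE TABLE history (account_id INTEGER, amount REAL)")
cur.execute("INSERT INTO account VALUES (1, 100.0)")
cur.execute("INSERT INTO branch VALUES (1, 0.0)")
cur.execute("INSERT INTO teller VALUES (1, 0.0)")

def debit(account_id, branch_id, teller_id, amount):
    """One transaction: three updates plus one insert, then commit."""
    cur.execute("UPDATE account SET balance = balance - ? WHERE id = ?",
                (amount, account_id))
    cur.execute("UPDATE branch SET balance = balance + ? WHERE id = ?",
                (amount, branch_id))
    cur.execute("UPDATE teller SET balance = balance + ? WHERE id = ?",
                (amount, teller_id))
    cur.execute("INSERT INTO history VALUES (?, ?)", (account_id, amount))
    conn.commit()

debit(1, 1, 1, 25.0)
print(cur.execute("SELECT balance FROM account WHERE id = 1").fetchone()[0])  # 75.0
```

In the actual test, a few hundred such transactions per second flow from the client tier through IIS/MTS to the database tier; the sketch shows only the per-transaction work.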

Test Procedure/Steps

  1. Configure the client, application and database systems.

  2. Start the Performance Monitor.

  3. Run the simulation for five minutes.

  4. Record the results.

Test Results and Technical Analysis

The SMP Test Results table shows the results obtained from the test. The processor utilization during the tests is displayed graphically in the Windows NT 4.0 SMP Test Performance Monitor Chart and the Windows 2000 DataCenter SMP Test Performance Monitor Chart.

SMP Test Results

Operating System        Simulation Run Time   Average Transactions Per Second   Average Processor Utilization
Windows NT Server 4.0   5 min.                210                               77%
Windows 2000            5 min.                311                               65%


Windows NT Server 4.0, Enterprise Edition SMP Test Performance Monitor Chart


Windows 2000 DataCenter SMP Test Performance Monitor Chart

Our tests show a 48% improvement in throughput under Windows 2000 DataCenter compared to Windows NT Server 4.0, Enterprise Edition. The Windows NT Server 4.0–based server processes an average of 210 transactions per second, and the Windows 2000 DataCenter–based server processes an average of 311 transactions per second.

We find a significant improvement in SMP scalability on a system with four processors. We believe this will lead to benefits to customers through SMP scalability beyond four processors. This is suggested further by the hardware manufacturers' decisions to invest in the production of servers with more than four processors for the Windows NT Server marketplace.
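The reported improvement follows directly from the throughput figures in the table; the check below is our own illustration.

```python
# Illustrative check (ours): percentage throughput improvement computed
# from the SMP Test Results table.
nt4_tps = 210  # Windows NT Server 4.0, Enterprise Edition
w2k_tps = 311  # Windows 2000 DataCenter

improvement = (w2k_tps - nt4_tps) / nt4_tps * 100
print(f"Throughput improvement: {improvement:.0f}%")  # ~48%
```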

Key IT Area

Scalability and Performance

Test Number

3

Test Name

File Server Performance


Test Description

This test to determine whether Windows 2000 Server provides file server performance benefits is based upon the White Paper "Enterprise File Server Scalability and Performance" by David B. Cross, Microsoft Consulting Services. The extensive testing used to create the White Paper found improved performance in Windows 2000 Server that would provide benefits over Windows NT Server 4.0. This test seeks to validate those results without performing such extensive testing.

Features Having Impact

  • Enterprise Memory Architecture (EMA) Support

  • Extended Symmetric Multi-Processing (SMP) Support

Metric(s)

Execution Time: Seconds

Test Environment Notes

Client

  • Dell PowerEdge

  • Pentium III 450MHz

  • 256 MB SDRAM

  • Ultra-Wide SCSI

  • 3COM 3c905B TX 100 Mbps

  • RAID 0

Server

  • Compaq Proliant 2500

  • 480 MB RAM

  • (2) Pentium Pro 200 MHz processors with 512 KB cache

  • Compaq Smart-2DH RAID controller with 16 MB cache

  • (1) 36 GB RAID5 array using five drives

Test Assumptions and Notes

We test the time required to transfer a collection of assorted files with a total size of 340 megabytes from the server to a workstation. We perform three tests: (1) reading the files from the server to the workstation, (2) writing the files from the workstation to the server and (3) performing a read/write combination of the same files.

Test Procedure/Steps

  1. Copy files from Server to Client.

  2. Record time.

  3. Copy files from Client to Server.

  4. Record time.

  5. Copy files from Client to Server and Server to Client simultaneously.
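The copy-and-time procedure above can be sketched as follows. The paths and the 1 MB payload are hypothetical stand-ins; the original test moved a 340 MB collection of files over the network.

```python
# Minimal sketch (ours) of the file-transfer timing procedure: copy a
# directory tree and record the elapsed wall-clock time. Local temp paths
# and a 1 MB payload stand in for the 340 MB networked file set.
import os
import shutil
import tempfile
import time

src = tempfile.mkdtemp()
dst = os.path.join(tempfile.mkdtemp(), "copy")
with open(os.path.join(src, "sample.dat"), "wb") as f:
    f.write(os.urandom(1024 * 1024))  # 1 MB stand-in payload

start = time.perf_counter()
shutil.copytree(src, dst)               # "copy files" step
elapsed = time.perf_counter() - start   # "record time" step
print(f"Copied in {elapsed:.3f} seconds")
```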

Test Results and Technical Analysis

Our tests show a 6% improvement in average response time under Windows 2000 Advanced Server compared to Windows NT Server 4.0. The average time to complete for the Windows NT Server 4.0–based server is 100 seconds, and the average time to complete for the Windows 2000 Advanced Server–based server is 94 seconds.

We find that Windows 2000 Advanced Server performs at least as well as Windows NT Server 4.0 as a file server on existing hardware. We believe Windows 2000 Server also enables use of extended architectures that provide the ability to consolidate server farms and meet increasing demand better than Windows NT Server 4.0.

Key IT Area

Scalability and Performance

Test Number

4

Test Name

Network Load Balancing


Test Description

Mission critical Internet applications require high availability and scalability. As more Internet services are brought online, these applications have become essential to conducting daily business worldwide. For example, downtime for a web server that provides financial data to its users translates into dollars that are lost due to lack of information and productivity. Microsoft addresses these issues with Windows clustering technologies that provide high availability and scalability.

Microsoft Clustering Technology consists of three products: Network Load Balancing Service, Server Cluster and Component Load Balancing. The purpose of this study is to review network load balancing implementations in Windows NT Server 4.0, Enterprise Edition and Windows 2000 Advanced Server.

Load balancing technology should provide high availability to mission critical applications. The feature should install as a standard component and require no additional hardware changes. The balancing of the load or the client request should be transparent to the user in the case of a server failure or "load balancing". Additional servers should be easily added to the cluster without interrupting application service.

Features Having Impact

  • Network Load Balance Cluster

Test Assumptions and Notes

The scope of this test does not include scalability testing (up to 32 servers) or stress testing.

Test Procedure/Steps

  1. Windows Load Balancing Service on Windows NT Server 4.0, Enterprise Edition SP 5:

    In Windows NT Server 4.0, Enterprise Edition, the network load balancing feature is referred to as "Windows Load Balancing Service (WLBS)". The first thing we notice is that WLBS does not come with our version of the Enterprise Edition. We download the software from:

    https://www.microsoft.com/ntserver/all/downloads.asp#WindowsNTServerFeatures

    WLBS installs as a separate networking driver in Windows NT Server 4.0 from the networking control panel. For simplicity, we choose a single network adapter in unicast mode (see Section 3). IIS 4.0 is configured and HTTP over TCP/IP on port 80 is selected for load balancing. One of the most challenging aspects of the Windows NT Server 4.0–based environment is that every time a parameter is changed and applied, the server requires rebooting, adding a considerable amount of time to configuring and tuning the servers properly.

    The configuration menu consists of Cluster Parameters, Host Parameters and Port Rules, packed with options and fields. Users should edit attributes in this section carefully to configure WLBS properly. Since performance impact on a cluster host is most easily measured by observing CPU load, we use a Microsoft web server testing tool called Homer to create a load utilizing 50% of CPU capacity on a single server. Once the load is balanced across two servers, Performance Monitor displays an average CPU load of 25% on the servers. We also validate redirection of client requests by physically disconnecting servers from the cluster. Although we notice a slight delay on the client, the WLBS cluster successfully redirects the client to the next available server.

  2. Network Load Balancing Cluster on Windows 2000 Advanced Server

    In Windows 2000 Advanced Server network load-balancing functionality has been renamed from WLBS to Network Load Balancing (NLB) Cluster. Installation is simpler in Windows 2000 Advanced Server since NLB is an installation option and no additional downloads are required. The configuration menu layout is better organized into tabs, but the general information remains the same. Optimizing NLB is much simpler since no reboots are required when settings are adjusted. Using the same configuration option in Homer, we create a 50% CPU load on a single server. The same test is then performed again in an NLB Cluster environment. The results are similar to Windows NT Server 4.0, Enterprise Edition. We see a 25% distribution of CPU capacity over the NLB cluster. The redirection test is also performed in Windows 2000 Advanced Server. The server is physically disconnected from the network and reconnected at various intervals. The client experiences minor delays in response time, but the end result is a highly available system that is relatively transparent to the users.

  3. Configuration

    The test uses two single-processor Pentium III Xeon 500 MHz systems interconnected on a 100 Mbps LAN. The servers are configured to dual boot into Windows NT Server 4.0, Enterprise Edition and Windows 2000 Advanced Server.

    Several load balancing network card configurations are available. For the purpose of testing we use a single network card with multicast enabled. Microsoft recommends a second network card for optimum performance.

Mode Options

Model 1: Single network adapter in unicast mode
Advantages: Only one network adapter required. Most straightforward configuration, because unicast mode is the default. Works properly with routers.
Disadvantages: Ordinary network communication among cluster hosts is not possible. Network performance may suffer with only one network card. May not work properly with switches.

Model 2: Single network adapter in multicast mode
Advantages: Only one network adapter required. Ordinary communication with cluster hosts is allowed. Works properly with switches.
Disadvantages: Network performance may suffer with only one network card. Multicast MAC addresses may not be supported by the routers.

Model 3: Multiple network adapters in unicast mode
Advantages: Performance is enhanced because there are at least two network adapters. Ordinary network communication with other cluster hosts.
Disadvantages: Needs another network card. May not work properly with switches.

Model 4: Multiple network adapters in multicast mode
Advantages: Performance is enhanced because there are at least two network adapters. Ordinary network communication with other cluster hosts. Works properly with switches.
Disadvantages: Needs another network card. Multicast MAC addresses may not be supported by the routers.
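The load distribution observed in these tests (a 50% single-server load averaging 25% across two hosts) is simply the offered load divided by the host count; the helper below is our own illustration.

```python
# Illustrative helper (ours): expected per-host CPU load when a fixed
# offered load is spread evenly across the hosts of an NLB cluster.
def expected_cpu_per_host(single_host_load_pct, host_count):
    return single_host_load_pct / host_count

print(expected_cpu_per_host(50, 2))  # 25.0, matching the observed average
print(expected_cpu_per_host(50, 4))  # 12.5 if two more hosts were added
```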

Test Results and Technical Analysis

Microsoft's Network Load Balancing feature provides higher availability and greater scalability to front-end servers. Whether in Windows NT Server 4.0, Enterprise Edition or Windows 2000 Advanced Server, proper load balancing immediately increases the availability of applications such as web servers. Although both operating systems offer similar functionality, Windows 2000 Server offers a more manageable and usable interface to the network load balancing feature. In Windows NT Server 4.0, almost every configuration change requires a reboot of the server, whereas after the initial configuration in Windows 2000 Server, no reboots are required while making adjustments to the NLB cluster. Usability is improved by organizing the configuration menu into appropriate tabbed sections in Windows 2000 Server, versus the single dialog box of Windows NT Server 4.0. Another time saving is that NLB is included as an installation option in Windows 2000 Server, whereas in Windows NT Server 4.0, Enterprise Edition, we have to download the software from the Microsoft web site.

INTEROPERABILITY AND SECURITY

Summary

Windows 2000 Server includes additional features to address issues of managing interoperability and security.

  1. Directory Synchronization – Through the Active Directory Service Interfaces (ADSI), Windows 2000 Server–based servers provide the ability to synchronize directory data between Active Directory and other directories. The test of the DirSync add-on for NDS demonstrates this ability.

  2. Extranet Testing – Windows 2000 Server provides additional functionality, such as certificate server, that did not exist in Windows NT Server 4.0. In a Windows NT Server 4.0-based environment, the outside vendor must also be responsible for providing security to its users in order to access another company's secured web site. In the Windows 2000 Server-based environment, however, the management and distribution of certificates can be delegated to a third party while maintaining a secure site.

Key IT Area

Interoperability and Security

Test Number

1

Test Name

Directory Synchronization


Test Description

This test confirms the ability of the Windows 2000 Server family to interact with other directories through services running on the server. In this test, the synchronization of Active Directory with a Novell NDS directory is facilitated through the DirSync add-on. This add-on is currently available separately from the Windows 2000 Server family.

The objective of DirSync is to synchronize the directory information between Active Directory and NDS. DirSync is able to replicate from a hierarchical structure in Active Directory to a hierarchical structure in NDS, creating any NDS objects in the appropriate place within the structure. The tests are performed on the first version of the beta and additional functionality is expected in future versions.

Features Having Impact

  • DirSync (beta 1)

  • Active Directory

Test Environment Notes

Server: Compaq 1850R; Dual Pentium III 400 MHz; 256 MB RAM; RAID Level 0

Network: Switched Ethernet 10/100 Mbps

Test Assumptions and Notes

The DirSync testing goes well after all of the software components are in place. This version requires the beta version of the Novell Windows 2000 client software available from Novell. The DirSync add-on must be run on a domain controller (DC) and the Novell client must be installed on that DC.

The configuration of the synchronization is simple except for the notation required for the user on the NDS side that is used for the replication. There are some limits to where the user can reside in the context of the structure on the NDS side. Once the user is created in the appropriate context, the first synchronization works well.

At first, the regular synchronization schedule seems problematic, but after reviewing the known limitations of the beta and making adjustments accordingly, the add-on works within the documented functionality. We do not have an opportunity to test additional versions of the DirSync add-on, but we expect it will eventually work with the same relative ease as other components.

Test Results and Technical Analysis

While the beta version of DirSync works as documented, there are currently known limitations within the synchronization process. These limitations may or may not be addressed in future versions of DirSync.

Ultimately, the results demonstrate that a connection between Active Directory and another directory can be built through use of the Active Directory Service Interfaces (ADSI). Any connection will be limited by the ability to programmatically synchronize what may be very disparate directory structures.
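The hierarchical one-way synchronization that DirSync performs can be illustrated conceptually. The dict-based trees below are our stand-ins for directory containers, not actual ADSI calls.

```python
# Conceptual sketch (ours, not the DirSync add-on): one-way synchronization
# of a hierarchical directory, copying objects from a source tree into a
# target tree while preserving their place in the hierarchy. Nested dicts
# stand in for organizational units; strings stand in for leaf objects.
def sync(source, target):
    for name, value in source.items():
        if isinstance(value, dict):          # an organizational unit
            target.setdefault(name, {})
            sync(value, target[name])
        else:                                # a leaf object, e.g. a user
            target[name] = value

active_directory = {"Sales": {"alice": "user"}, "IT": {"bob": "user"}}
nds = {"Sales": {}}
sync(active_directory, nds)
print(nds)  # {'Sales': {'alice': 'user'}, 'IT': {'bob': 'user'}}
```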

Key IT Area

Interoperability and Security

Test Number

2

Test Name

Extranet


Test Description

With the increased business conducted on the web, companies are constantly looking for ways to expand their "web presence". The Windows 2000 Server family allows companies to share specific information with customers and vendors while maintaining their existing security. This level of secured business practice is made possible by using public key technology. Public key certificates in Windows 2000 Server will allow a company's web site to be more secure in a public environment.

Our objective is to test the improved efficiency with which companies can deploy extranets in Windows 2000 Server. The test is conducted on both Windows NT Server 4.0 and Windows 2000 Server platforms. The extranet scenario involves two hypothetical companies, Inwonderland.com and Bee-Company, where Inwonderland.com uses certificates with its partner, Bee-Company, and Bee-Company mutually authenticates access to a portion of Inwonderland.com's web server. On the Windows 2000 Server platform, Bee-Company issues certificates internally to its users: Inwonderland.com allows Bee-Company to issue and manage certificates internally. Further, Inwonderland.com can map all of Bee-Company's certificates to a specific profile of authorization within the Inwonderland.com domain environment. In the Windows NT Server 4.0–based environment, however, Inwonderland.com is responsible for issuing certificates directly to Bee-Company's users.
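The many-to-one certificate mapping described above can be sketched as a lookup from issuing CA to authorization profile; all names below are illustrative, not taken from the test.

```python
# Hypothetical sketch (ours) of many-to-one certificate mapping: every
# certificate issued by a partner's CA maps to a single authorization
# profile, as described for Inwonderland.com and Bee-Company.
cert_mappings = {"CN=Bee-Company CA": "partner-readonly"}

def profile_for(issuer):
    # Unknown issuers get no profile rather than a default one.
    return cert_mappings.get(issuer)

print(profile_for("CN=Bee-Company CA"))  # partner-readonly
print(profile_for("CN=Unknown CA"))      # None
```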

Features Having Impact

  • Certificate Services

  • Internet Information Server version 5

Test Environment Notes

Server (Windows 2000 Server–based environment): Compaq Proliant 2500; Dual Pentium Pro 200 MHz; 480 MB RAM; RAID Level 5

Server (Windows NT Server 4.0–based environment): Compaq Proliant 1850R; Dual Pentium II 450 MHz; 256 MB RAM; RAID Level 0

Network: Switched Ethernet 10/100 Mbps

Test Assumptions and Notes

In the Windows NT Server 4.0 scenario, setting up access for a vendor to a distributor's or manufacturer's web site takes much more time. An extranet in Windows NT Server 4.0 is also much more difficult to manage, because the burden of authenticating new users from the other vendor falls upon the secured site.

In contrast, Windows 2000 Server, running Certificate Services, allows other vendor sites to issue certificates to their own users. This internal management of certificates makes issuing certificates more efficient and less time consuming. The steps required to set up an extranet are reduced compared to the effort required in Windows NT Server 4.0.

Test Results and Technical Analysis

In order for an outside vendor to access a company's secured site in the Windows NT Server 4.0–based environment test, that company is responsible for granting certificates to the vendor's users. With Windows 2000 Server, vendors can be given the right to issue certificates directly to their own users, thereby reducing overhead on the secured site while maintaining an appropriate level of security.

Contact Information

For further information regarding this report, or about Arthur Andersen, please contact:

Mike Jarrett, Partner

Business Consulting Arthur Andersen One Market Suite 3500 Spear Street Tower San Francisco, California 94105-1019 Tel. 415-546-8831

Cory Elliott, Senior Manager

Business Consulting Arthur Andersen 633 West 5th Street Los Angeles, California 90071 Tel: 213-614-8222

JD Tengberg, Manager

Business Consulting Arthur Andersen 633 West 5th Street Los Angeles, California 90071 Tel: 213-614-8302

Arthur Andersen has granted to Microsoft permission to disclose this report, or the "Executive Summary" portion of the report, provided that the report or the Executive Summary are disclosed in whole, without modification, and include the Arthur Andersen disclaimers.

© Copyright 1999 Arthur Andersen LLP. All rights reserved. All trademarks referred to are the property of their respective owners.

1 At the conclusion of Andersen's analysis, Release Candidate 1 became available.

2 Fail-Over Clustering was not tested in the Windows 2000–based environment due to the lack of approved drivers for the available hardware. We assume it functions similarly to the Windows NT Server 4.0 implementation.

3 Service Pack Slipstreaming was not tested since there were no service packs available from Microsoft.

4 Technically, if the hardware is fully Plug and Play compliant and the hardware manufacturer approves of it, the system may not have to be shut down in order for the additional device to be recognized. We did not test this with any internal devices but did test it with an external modem.

5 Microsoft has indicated this functionality will be available for Exchange 6.0 when it is released.

6 In our tests, and according to the beta documentation, synchronization from an NDS directory to Active Directory does synchronize child Organizational Units.

7 When this function becomes available depends primarily on when other operating systems vendors adhere to the Kerberos 5 standard.

8 Leslie Helm, "World Wide Web Living Up to Its Name," Nando Time News, March 28, 1999.

9 "Making the Corporate Memory Retrievable: Information can be gathered, stored, organized and recovered to result in productivity improvement," Chemical Engineering. January 1999.

10 Groups are also available in Windows 2000 Server, but are not required in this scenario.

11 Service packs for Windows NT Server 4.0, Terminal Server Edition are not compatible with Windows NT Server 4.0 service packs. A separate service pack (SP4) is required for Terminal Server to be able to access NTFS 5 on a Windows NT Server 4.0 SP5 or Windows 2000 Server family system.