Why Is Interoperability Important?

It should come as no surprise that many medium- and large-sized organizations have mixed, or what is sometimes referred to as heterogeneous, computing environments. Research data shows that there are 2 million UNIX servers, 300,000 AS/400 systems, and more than 50,000 IBM-compatible mainframes. The Microsoft® Windows® operating system is also broadly deployed in these organizations, but with so many disparate systems, the issue of interoperability becomes increasingly important. As an example, consider that many organizations are deploying distributed n-tier client/server applications, many of which require access to data or transactions on existing systems. In general, Microsoft has seen at least three main drivers for interoperability:

  1. Reduces operational cost and complexity – Customers will continue to have mixed environments for the foreseeable future. The ability for these systems to work together reduces the cost of building and supporting a heterogeneous infrastructure. That said, homogeneity may in fact offer substantial benefits in reducing the operational cost and complexity of an organization's infrastructure. However, few organizations are in a position to create a totally homogeneous environment.

  2. Enables “best-of-breed” deployments – Customers may have business requirements that can only be delivered with specific applications or platforms. For example, Windows NT® provides a rich platform on which to either build solutions or buy commercial “off-the-shelf” application packages. This best-of-breed environment meets the requirement for rapidly deploying solutions. However, Windows NT clearly needs to work with the other environments in use in the organization; otherwise, the potential benefits of the new solution would be reduced. Interoperability is therefore a key requirement that helps ensure customers can meet their demanding business needs.

  3. Leverages existing investments – Customers have a large and diverse range of systems installed in their environments. The move to new platforms needs to be gradual and evolutionary. Interoperability between new environments such as applications based on Windows DNA and existing systems is critical to the success of the Windows platform in the enterprise. Another key trend is the requirement to “Web enable” existing applications, allowing access to the key systems on host environments, such as the IBM mainframe, from the intranet or Internet. This Web enabling effectively extends the functionality of existing applications and protects the investments that organizations have made.

What Is Microsoft's Interoperability Strategy?

Microsoft is committed to ensuring that the Windows platform works with other key platforms and systems in the heterogeneous computing environment of our customers.

To achieve this goal, Microsoft will broadly adopt two tactics: it will build bridges and/or gateways to the key platforms in use by our customers, and it will support key standards that can provide interoperability with systems that also support those standards.

In summary, rather than advocate replacing equipment in a piecemeal fashion, the Microsoft goal is to help customers evolve their information technology infrastructures in ways that capitalize on new technologies and products. This commitment to interoperability improves information sharing, reduces computing costs, and capitalizes on past investments.

Microsoft Interoperability Framework

To communicate and flesh out its interoperability strategy, Microsoft has defined a framework for the technologies that enable Microsoft products to work in a multivendor environment. This framework is divided into four layers: Network, Data, Applications, and Management—or NDAM for short. Standards that play a key role in enabling interoperability between systems span all four layers of the framework.

[Figure: The NDAM framework, showing the Network, Data, Applications, and Management layers]

For a more detailed look at the Microsoft interoperability strategy and the NDAM framework, you can read the Microsoft Interoperability Strategy white paper. If you're interested in interoperability with a specific platform or at a specific layer of the NDAM framework, resources are available for each platform and layer.

Network Interoperability

Network interoperability provides the core foundation for interoperability between systems. We can think of network interoperability as the ability of multivendor systems to communicate with each other using common protocols. Network interoperability includes the following:

Protocols – Protocols are the base technology for all interoperability. Microsoft has invested heavily in a wide range of protocols, such as TCP/IP, IPX/SPX, and SNA, that provide access to a wide range of platforms other than Windows, including UNIX, Apple Macintosh, Novell NetWare, and IBM hosts. Additional base interoperability is delivered through support for protocols such as the Dynamic Host Configuration Protocol (DHCP), the Domain Name System (DNS), and internetworking protocols such as the Routing Information Protocol (RIP).
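
As a small illustration, the Python sketch below resolves a host name through DNS and opens a TCP/IP connection using only the standard library; the host name and port are placeholders rather than real systems.

```python
# Minimal sketch: DNS name resolution and a TCP/IP connection using only
# Python's standard library. The host name and port are placeholders.
import socket

HOST = "host.example.com"   # hypothetical remote host (UNIX, mainframe, ...)
PORT = 23                   # hypothetical service port

# getaddrinfo performs the DNS lookup and returns candidate addresses
# (IPv4 and/or IPv6) for the requested host and port.
for family, socktype, proto, _canonname, sockaddr in socket.getaddrinfo(
        HOST, PORT, type=socket.SOCK_STREAM):
    print("Resolved", HOST, "to", sockaddr[0])

# Open a TCP/IP connection; the same call works regardless of the
# operating system at the other end of the connection.
with socket.create_connection((HOST, PORT), timeout=5) as conn:
    print("Connected to", conn.getpeername())
```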

Terminal Access – Support for terminal protocols is clearly a very important first step in providing access to existing systems. In the past, terminal support has often been the major access and interoperability requirement for many customers. Microsoft has developed support through Microsoft SNA Server for 3270 and 5250 as well as the TN3270 and TN5250 protocols. However, it is clear that many customers require more sophisticated interoperability than simple terminal access, and those requirements are better categorized as data or application interoperability.

Print Services – The ability for applications or users to take advantage of print resources on multiple systems can be a very important requirement. Microsoft continues to meet this requirement through support for a variety of protocols, including emerging ones such as the Internet Printing Protocol (IPP).

Data Interoperability

Data interoperability delivers the ability for users and applications to access and query information stored in both structured and unstructured storage engines. As an example, users very often need access to data from multiple sources as well as the ability to search across those sources.

File System – Network file servers such as Novell NetWare or UNIX NFS hosts contain large amounts of information that often needs to be shared among users, and often these users are on different systems. Microsoft supports access to these multiple file systems in numerous ways, including through native protocols such as IPX/SPX, NCP, and NFS.

Database – Clearly a great deal of corporate data resides in databases such as Oracle and IBM DB2. Database access to these systems is critical for both users and applications. Additionally, the ability to perform heterogeneous queries against multiple databases provides extremely powerful capabilities. Another requirement is the ability to provide replication between multiple database systems, including bidirectional support. This is important in applications such as data warehousing, where data is extracted from transactional systems on a scheduled basis and replicated to a data warehouse so that users can perform complex analysis on it. Through Data Transformation Services, OLE DB, and other technologies, Microsoft is enabling comprehensive database interoperability capabilities.
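
As a rough sketch of what this looks like in practice, the example below queries two different engines through a common ODBC interface (a widely used counterpart to the OLE DB interfaces mentioned above) and joins the results on the client. It assumes the third-party pyodbc package and two preconfigured ODBC data sources; every DSN, credential, and table name here is hypothetical.

```python
# Hypothetical sketch: heterogeneous database access through ODBC with the
# third-party pyodbc package. DSNs, credentials, and tables are made up.
import pyodbc

# Connect to each database through its ODBC driver.
oracle = pyodbc.connect("DSN=OracleHR;UID=report;PWD=secret")
db2 = pyodbc.connect("DSN=DB2Sales;UID=report;PWD=secret")

# The same cursor/execute/fetch pattern works against both engines,
# which is the essence of data-level interoperability.
employees = oracle.cursor().execute(
    "SELECT emp_id, name FROM employees").fetchall()
orders = db2.cursor().execute(
    "SELECT emp_id, total FROM orders").fetchall()

# A simple "heterogeneous join" performed on the client side.
totals = {emp_id: total for emp_id, total in orders}
for emp_id, name in employees:
    print(name, totals.get(emp_id, 0))
```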

E-mail Access and Storage – While a great deal of corporate data resides in either databases or flat files, an increasing amount resides in the semi-structured storage engines of messaging systems. Not only is access to e-mail data important, but the ability to send and receive e-mail and scheduling information is essential for most companies. Microsoft supports key messaging standards such as SMTP, IMAP4, and POP3 to enable interoperability, and also builds connectors to foreign mail systems such as OfficeVision and cc:Mail.
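
The sketch below illustrates this kind of standards-based mail interoperability using Python's standard smtplib and imaplib modules; the server names, addresses, and credentials are placeholders, and any SMTP- or IMAP4-compliant mail system on any platform could sit at the other end.

```python
# Minimal sketch: sending and retrieving mail over the open SMTP and IMAP4
# standards using Python's standard library. Servers, addresses, and
# credentials below are placeholders.
import imaplib
import smtplib
from email.message import EmailMessage

# Send a message via SMTP; the receiving system can be any
# SMTP-compliant server on any platform.
msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "Interoperability test"
msg.set_content("Delivered over standard protocols.")

with smtplib.SMTP("mail.example.com") as smtp:
    smtp.send_message(msg)

# Retrieve mail via IMAP4 from any IMAP-compliant message store.
with imaplib.IMAP4("mail.example.com") as imap:
    imap.login("bob", "password")
    imap.select("INBOX")
    status, message_ids = imap.search(None, "ALL")
    print("Messages in INBOX:", len(message_ids[0].split()))
```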

Applications Interoperability

Application interoperability refers to the key infrastructure required to ensure that new applications built on the n-tier client/server model can interoperate with existing applications, business logic, and data.

Presentation Services – The demand for organizations to build applications that reach out to customers and partners is increasing as organizations look to the Internet as a new channel for working with customers and partners. This creates the challenge of having to build applications that run on clients that you have little control over. The use of the browser and support for HTML provides the broadest reach, but for a richer experience, the use of DHTML offers a combination of good reach and rich interactive functionality.

Transactions – The ability for applications to participate in transactions that span multiple systems is a key requirement that follows on from simple access to data across systems. The ability to update data across multiple databases and systems can be achieved in a number of ways, from proprietary mechanisms such as IBM CICS and APPC protocols through to standards-based protocols such as XA and TIP.
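
The common thread in protocols such as XA and TIP is a two-phase commit between a coordinator and the enlisted resource managers. The sketch below is a deliberately simplified, self-contained illustration of that flow, not an implementation of either protocol; real systems add durable logging, recovery, and timeouts.

```python
# Conceptual sketch of the two-phase commit flow underlying distributed
# transaction protocols such as XA and TIP. Illustration only.

class Participant:
    """A resource manager (for example, a database) enlisted in a transaction."""

    def __init__(self, name):
        self.name = name

    def prepare(self):
        # Phase 1: do the work, durably record it, and vote on whether
        # this participant is able to commit.
        print(f"{self.name}: prepared, voting yes")
        return True

    def commit(self):
        # Phase 2 (success): make the prepared work permanent.
        print(f"{self.name}: committed")

    def rollback(self):
        # Phase 2 (failure): undo the prepared work.
        print(f"{self.name}: rolled back")


def two_phase_commit(participants):
    # Phase 1: collect a vote from every participant.
    votes = [p.prepare() for p in participants]
    if all(votes):
        # Phase 2: everyone voted yes, so tell everyone to commit.
        for p in participants:
            p.commit()
        return True
    # Any "no" vote aborts the whole distributed transaction.
    for p in participants:
        p.rollback()
    return False


two_phase_commit([Participant("relational database"), Participant("mainframe system")])
```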

Components – As organizations rearchitect and develop applications in the n-tier client/server model, the ability to create reusable components that can be invoked by many applications becomes increasingly important. This componentization occurs across all levels of the application, and having the same programming model throughout brings many benefits. The COM programming model provides the ability to wrap existing business logic on systems not running Windows into a COM object that can be invoked by a Windows-based application. Additionally, third parties provide interoperability between different component models such as COM and CORBA.
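
As a hypothetical illustration of component-level interoperability, the snippet below invokes a COM object from Python on Windows via the third-party pywin32 package. The ProgID "Acme.OrderLookup" and its GetOrderStatus method are invented stand-ins for a COM wrapper around existing business logic on another system.

```python
# Hypothetical sketch: calling a COM component from Python on Windows,
# using the third-party pywin32 package. The ProgID and method below are
# made-up examples of a COM wrapper around existing business logic.
import win32com.client

# Late-bound dispatch: any language or tool that speaks COM can create
# and call the same component in the same way.
order_service = win32com.client.Dispatch("Acme.OrderLookup")
status = order_service.GetOrderStatus(42)
print("Order 42 status:", status)
```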

Management Interoperability

Management interoperability focuses on reducing the burden of administering multiple systems, in particular the challenges of user account management. It should be noted, however, that management interoperability encompasses substantially more than this; while the current focus does not address this broader set of requirements, future developments from Microsoft and other vendors are likely to do so.

Security – The administrative overhead of managing users and passwords on multiple systems is a very substantial burden. Additionally, having to enter, and remember, multiple user names and passwords is clearly an undesirable situation for users. Microsoft has been developing technology to help with single sign-on and password synchronization to platforms including UNIX, Novell NetWare, IBM S/390, and IBM AS/400. In addition, Microsoft is supporting standards such as Kerberos that provide the foundation for more comprehensive single sign-on solutions to platforms that support the Kerberos v5 protocol.

Directory – Directory services provide a consistent way to name, locate, access, manage, and secure network resources. On the network, the directory and security infrastructure is typically very tightly integrated with the operating system to ensure maximum security as well as ease of deployment and administration. Having to manage multiple, incompatible directories can significantly increase the total cost of ownership, so directory interoperability is a priority for many customers. Microsoft's Active Directory supports standards such as LDAP, which provides a level of interoperability, as well as specific directory service synchronization with NDS.
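
The sketch below shows what standards-based directory access can look like from a client's point of view, using the third-party ldap3 package for Python. The server name, credentials, and naming context are placeholders; the same code could be pointed at Active Directory, NDS, or any other LDAP-compliant directory.

```python
# Minimal sketch: querying a directory over the LDAP standard with the
# third-party ldap3 package. Server, credentials, and base DN are placeholders.
from ldap3 import Server, Connection, ALL

server = Server("directory.example.com", get_info=ALL)
conn = Connection(server,
                  user="cn=reader,dc=example,dc=com",
                  password="secret",
                  auto_bind=True)

# A simple client query: enumerate people and read their common names.
conn.search(search_base="dc=example,dc=com",
            search_filter="(objectClass=person)",
            attributes=["cn", "mail"])

for entry in conn.entries:
    print(entry.cn, entry.mail)

conn.unbind()
```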

Systems Management – The ability to manage events, alerts, performance characteristics, and so on across platforms is an important requirement, especially as n-tier applications may pass through multiple layers on different systems, where the ability to manage performance and events is critical. Again, Microsoft is working on technologies that will provide the basis for management across multiple environments. For example, Microsoft has supported SNMP for many years, and with the Windows Management Architecture, a range of additional protocols and functionality is being developed in Windows to provide the basis for a richer management infrastructure for the heterogeneous environment. Additionally, third-party vendors have a key role to play in facilitating an enterprise management architecture that includes the Windows Management Architecture.
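
As one small example of reading instrumented management data, the snippet below queries disk information through Windows Management Instrumentation (WMI), the instrumentation technology generally associated with the Windows Management Architecture, using the third-party wmi package for Python on a Windows machine.

```python
# Hypothetical sketch: reading management data through WMI on a Windows
# machine with the third-party wmi package.
import wmi

c = wmi.WMI()  # connect to the local WMI service

# Query the Win32_LogicalDisk class for local fixed disks (DriveType=3);
# the values come back as CIM-style properties.
for disk in c.Win32_LogicalDisk(DriveType=3):
    free_pct = 100 * int(disk.FreeSpace) / int(disk.Size)
    print(f"{disk.DeviceID} {free_pct:.1f}% free")
```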

Standards

Standards play a key role in enabling interoperability between systems. There are many examples of how standards are in use today as the basis of interoperability: industry standards defined by groups of companies cooperating in consortia (such as SMTP or HTML), and technologies defined by individual companies that are so widely adopted as to become "de facto standards" (such as IPX/SPX or SNA).

There is not a simple one-to-one relationship, however, between industry standards and interoperability. The relationship is far more complex. Some industry standards are initially narrow or incomplete, and a variety of extensions and common practices grow up around them, driven by market forces. For example, the early leading Web browser vendors created a number of non-standard extensions and practices (for example, ignoring the absence of many HTML end-tags when the end could be inferred from context) that are now required to render a large amount of HTML content. Other standards solve a certain key issue in a general problem space and leave other aspects of the problem space undefined. For example, the LDAP standard for directory services focuses on client enumeration and simple query of a directory server but does not specify optimized server-to-server communication or replication, does not define how secure directory access should occur, and does not define a number of the important data structures or "schemas" needed in a fully functional directory service. (These limitations may be addressed in future versions of LDAP.)

Finally, even in its functional area, the best and most precisely specified industry standard is never able to capture on paper the full range of expected functionality and semantics of a complex system. Implementation and cross-testing of implementations is always necessary to work out all the details necessary for true interoperability. That is why the IETF (Internet Engineering Task Force), for example, wisely refuses to create final standards unless multiple implementations of a given specification have actually been built and tested against each other. That is also why Microsoft and other companies seeking real interoperability aggressively test their implementations against each other from an early beta phase, either in informal "bake-offs" or through industry bodies that provide interoperability and testing services.

Clearly, industry standards have a key role to play in establishing and enhancing interoperability. But "standards" need to be conceived of more broadly than simply as paper specifications produced by independent bodies. A true standard includes both a specification and a set of industry practices that emerge from that specification. Moreover, specifications that are never broadly adopted in the marketplace can hardly be called "standards" in any meaningful sense; at best they can be thought of as potential standards that might eventually be actualized through industry adoption.

Microsoft is committed to interoperability and therefore committed to standards. Microsoft will continue to work hard within the standards process to help develop and adopt key standards that solve interoperability problems in a market-relevant way.