
Enabling Data as a Service for Self-Service Business Intelligence

Using Microsoft PowerPivot for Excel 2010 and SharePoint 2010 with Windows Azure and SQL Azure to Deliver Data as a Service for Centralized Reports, Dashboards, and BI Applications

Technical White Paper

Published: October 2011


Download Technical White Paper, 17 MB, Microsoft Word file

Download Video

Situation

While operating a production environment for centralized, managed self-service BI, Microsoft IT saw an opportunity to further empower business users by enabling them to consume data as a service.

Solution

Microsoft IT implemented data as a service through a data services layer (DSL), accessible as a Web site, where users explore, publish, and consume EDW and other external data from on-premises sources and SQL Azure, making it available for applications, devices, and users to build BI solutions without extensive programming or IT intervention.

Benefits

  • Decrease lead time for intelligence gathering and solution development.
  • Leverage the scalability, flexibility, and expandability of SQL Azure.
  • Expose new sources of data for users.
  • Standardize available data sets and make new data available for distribution.
  • Lower IT overhead associated with permissions management and data provisioning.
  • Excite users about having control over obtaining data to gain new insights.
  • Take advantage of the pay-as-you-go cost model.
  • Increase user adoption while decreasing onboarding time.
  • Provide straightforward means to access data through multiple options such as PowerPivot and SQL Server Analysis Services.
  • Use existing security capabilities of Windows Server, SharePoint, and SQL Server.

Products & Technologies

  • SQL Server 2008 R2
  • SQL Server Analysis Services
  • SQL Azure
  • Windows Azure
  • AppFabric
  • SQL Server Reporting Services
  • SharePoint 2010
  • PowerPivot
  • Excel Services
  • Microsoft Office

Executive Summary

Hosted Self-Service BI Story

Benefits of Hosting Data as a Service

Hosted Self-Service Architecture and Technical Details

Onboarding and Evangelism

Collaborative BI Example

Best Practices

For More Information

Executive Summary

On a daily basis, Microsoft departments generate and consume many terabytes of data relevant to making decisions. The data encompasses common business categories related to financial operations, such as processing orders, managing inventory, and handling payments, as well as more specialized data for human resources and for tracking general business performance.

To make the data actionable, Microsoft workers rely on a SharePoint-based, centralized Microsoft Consolidated Business Intelligence (CBI) portal that houses reports, scorecards, and other BI solutions. This portal provides a catalog of BI solutions and enables the sharing of intelligence across the company. Users create BI applications and work with data using familiar tools such as Excel, PowerPivot, Report Builder, and SQL Server Analysis Services. The CBI portal consumes data from an Enterprise Data Warehouse (EDW) that Microsoft IT created as a centralized data repository for the company.

Even when designing and deploying the CBI portal in 2006, the Business Intelligence Engineering (BIE) team at Microsoft IT made it a priority to give users as much control as possible to access data, analyze it, and make the data meaningful. This goal has guided subsequent changes and releases since then. Part of following this objective has meant giving users the capability to create their own solutions through PowerPivot and a service infrastructure based on SharePoint 2010 and SQL Server 2008 R2. PowerPivot enables Microsoft workers to analyze data by using the familiar Office interface without needing extensive programming experience.

However, PowerPivot is a tool for manipulating data; it does not contain data until workers add data to a solution. For high adoption rates, business users need to be able to access and consume enterprise data in a convenient way. To address this challenge, Microsoft IT provides enterprise data packages through a data services layer (DSL). Acting as a bridge between front-end applications and both on-premises and SQL Azure cloud data, DSL provides an abstraction of the underlying data without replicating it, and exposes it to consumers.

Prior to implementing DSL, Microsoft IT provisioned, granted permissions, and managed requests for enterprise data packages individually based on each request. An established onboarding process helped to reduce overhead, yet it was not as simple as going to a Web site and creating a request to access packaged data sets. Deploying DSL increased ease of use and gave users a single location to explore, publish, and consume EDW, SQL Azure, and external data from multiple data sources.

This technical white paper features the DSL solution as it exists within the infrastructure that supports self-service BI. Although DSL is a customized, internal Microsoft solution, it uses publicly available software from Microsoft. This paper contains information for technical decision makers who seek more information about creating cloud-based or hybrid approaches to hosted data and business intelligence. A high-level understanding of SQL Server, Analysis Services, PowerPivot, and SharePoint is helpful background for understanding this paper.

Note: For security reasons, the sample names of organizations and other internal resources mentioned in this paper do not represent real resource names used within Microsoft and are for illustration purposes only.

Hosted Self-Service BI Story

Although the full story of self-service BI at Microsoft goes back many years, 2006 marked a year of major change. During this time, Microsoft IT deployed the CBI portal to centralize and share business intelligence tools and information across the entire organization. The CBI portal is a key component of the self-service BI strategy in that it helped Microsoft IT to create a multi-year strategy for the overall direction of BI, and plan for the desired changes in phases to provide improved BI service offerings over time.

To create a strategy for BI, Microsoft IT considered the best fit of people, processes, and technology and their interactions during the growth phases. It was vital to consider the fit because each improvement cycle brought about changes and required different levels of engagement from the participants. For example, the initial phase required bringing about standardization, whereas later phases emphasize adoption and usability improvements. Figure 1 shows the phases in Microsoft's BI strategy.

Figure 1. Major phases of BI implementation at Microsoft

The idea of empowering people to turn information into insight for making decisions forms the foundation of Microsoft's vision for BI. This vision is relatively straightforward as a concept and entails the following three dependencies: give workers clean and useful data, provide robust and intuitive tools, and reduce IT involvement in the process. These three dependencies pose the following challenges in implementing any BI service offering.

  • Providing reliable source data. When transactions take place, the resulting data at times has elements in non-standard formats, or legacy data remains from transactions recorded before standardization was enforced. Microsoft, like other companies, ensures that the data available in cubes and data marts is usable and correct before disseminating it to users. Microsoft IT enforced data validation early on in creating a data warehouse.

  • Avoiding duplicate data housed in silos. When many business applications use disparate data repositories, it is possible for data duplication to take place. Microsoft IT responds to this challenge by having master data in a central EDW.

  • Meeting developer and business user needs in a unified way. To ensure high adoption rate and provide the best user experience, Microsoft IT uses the familiar functionality and interfaces of Visual Studio and Office Excel to deliver data modeling and analysis tools for all types of users through Analysis Services and PowerPivot.

  • Putting control in the hands of the business users. Business users who manipulate and analyze data best understand their own needs. Naturally, Microsoft IT wants to give workers this control not only to use familiar tools, but also to centralize reports and dashboards, collaborate, and reduce the dependence on IT to create and share BI solutions.

Centrally hosting, managing, and sharing underlying data, as well as resulting reports, metrics, scorecards, and KPIs is one part of the BI story. It takes ETL processes and security and infrastructure dependencies to acquire data, a massive data warehouse in the EDW separated into marts and cubes to integrate the data, and the CBI portal with SharePoint, SQL Server, and PowerPivot to disseminate and use the data. The entire solution creates an end-to-end scenario of data aggregation, processing, distribution, and usage, as shown in Figure 2.

Figure 2. Microsoft BI landscape

Step 1: Creating a Single Catalog in a Centralized BI Portal

The Microsoft self-service BI story begins in 2006 with the effort to create a single, centrally managed, and highly available report catalog hosted on a SharePoint CBI portal. This report catalog consolidates many repositories of reports, dashboards, and other BI solutions into a single user interface portal that provides seamless access to reporting tools and data across the company. Figure 3 shows the CBI portal design and interface.

Figure 3. Microsoft CBI portal and architecture

Prior to deploying the enterprise report catalog in the CBI portal, Microsoft IT faced an environment where each department maintained individual reporting solutions with their own data, processes, and reports. Effectively, each department maintained control over its own infrastructure, duplicating efforts in a non-standardized way. To ensure the applicability and reliability of data, Microsoft IT validated and sanitized data during the onboarding process of moving all the reporting solutions to the CBI portal. Microsoft IT carried out this initiative to ensure clean data while preserving individual department needs. A lab environment and testing ensured that the transition happened smoothly in a phased approach with integration and acceptance testing. Each department retained the ability to designate contact people who could assign rights and permissions for accessing and working with reports to help groups maintain control over their BI needs.

You can read the full story of how Microsoft IT deployed CBI at http://technet.microsoft.com/en-us/library/bb735122.aspx and for more information about the CBI portal, see http://technet.microsoft.com/video/consolidated-business-intelligence-portal-lowers-costs.aspx.

Step 2: Establishing a Single Version of the Truth through EDW

As more and more departments adopted the CBI portal over their individual solutions, Microsoft IT pursued the next stage of its BI vision: to create a centralized EDW store that houses master data with the goal of transitioning BI solutions from using individual data housed in department silos to authoritative data that exists in the EDW.

The EDW made it possible to have reliable master data through acquisition and integration procedures that pull source data from multiple systems and use ETL processes to integrate it with the EDW. Microsoft IT worked with groups company-wide to identify enterprise data, acquire the data, integrate it, and make it available for reports in the CBI portal, as shown in Figure 4. The overall approach requires acquiring data from enterprise systems, creating ETL processes to integrate it with the EDW, and enabling access to it from the CBI portal.

Figure 4. EDW Architecture

The deployment of EDW enabled Microsoft IT to establish an authoritative source for enterprise master data by using previous and latest versions of Master Data Services (MDS). MDS helps Microsoft IT to standardize data across diverse systems, manage master data directly, and ensure the integrity of this information over time. In this way, an authoritative EDW data source provides a single version of the truth for various dimensions. Microsoft IT continues to expand and develop EDW to acquire, integrate and distribute data for many LOB applications.

For more information about how Microsoft IT designed and deployed EDW, see http://technet.microsoft.com/en-us/library/gg456498.aspx and http://technet.microsoft.com/en-us/library/dd794467.aspx.

Step 3: Using Clean Data

Once EDW made it possible to use clean data in BI reports and applications, Microsoft IT focused its attention on transitioning reports in the CBI portal to use the EDW data. Some reports incorporate data that does not exist in the EDW; therefore, it is not possible to switch all reports to use EDW data.

Using EDW data is a beneficial value proposition for teams. Within EDW, the clean data from the enterprise resource planning (ERP) and customer relationship management (CRM) systems, as well as departmental data warehouses is separated into data marts and cubes. The cube and mart design is optimized for performance and reliability based on the storage capacity and traffic patterns for each type of housed data. In this way, switching to EDW results in reliability and performance improvements. To help teams transition, BIE dedicated technical and business resources that onboarded BI solutions to work with EDW data on the CBI portal.

Step 4: Enabling Self-Service BI with PowerPivot and SharePoint

The next milestone in Microsoft IT�s vision for BI was to empower people to achieve pervasive insight through PowerPivot and other Microsoft BI tools, including SQL Server Reporting Services, and the dashboard, scorecard, and analytics components of PerformancePoint that are part of SharePoint 2010. Having created a centralized EDW, the next challenge entailed enabling business users to create reports, scorecards, and dashboards without involving IT for personal BI, and then sharing those BI applications with other team members, and even the entire company.

By the time that self-service BI became a reality with the release of PowerPivot, SQL Server 2008 R2, and SharePoint 2010, the EDW contained many terabytes of data, and the CBI portal contained over 18,000 reports serving more than 30,000 users. Even with these accomplishments, self-service BI offered business users the ability to use their own data sets in combination with EDW data, and to be more agile in developing BI applications on their own terms without involving IT.

The BIE team enabled business users for self-service BI by updating the infrastructure and adding new services and capabilities. Among other improvements, Microsoft IT added support for PowerPivot and SQL Server 2008 R2 Reporting Services, and upgraded servers to SharePoint Server 2010.

Business users are the ones who best understand their own needs for the type of intelligence they want to gather and create in order to make decisions. Recognizing this, Microsoft IT wanted to leverage the strengths that IT has of securing, managing, and hosting infrastructure as well as the strengths that business users have of creating BI solutions relevant to them. Microsoft IT extended self-service BI capabilities to the CBI portal and enabled designated power users to create reports against EDW data. Figure 5 shows the self-service BI solution.

Figure 5. Self-Service BI Architecture

To encourage people to develop BI solutions, and to support teams during the onboarding process, BIE relies on a sandbox environment. BIE uses this environment to test updates before implementing them, and makes it available to teams for general BI application development.

Step 5: Providing Data as a Service using Windows Azure, SQL Azure, and On-Premises Data

Having enabled the BI infrastructure for self-service BI, BIE supported more and more users of the CBI portal, expanding to more than 22,000 reports, and over 39,600 users. However, even with so many reports, and terabytes of data in the EDW, users still wanted to bring in their own data, integrate it with EDW and create reports, dashboards, and other BI applications on their own using PowerPivot.

Even though there was a central catalog of reports in the CBI portal, there was no single catalog of all databases at Microsoft. Business users, developers, and Web services pulled data from multiple locations. No single approach existed to manage and grant permissions to consumers of data, and the process of provisioning data for users needed additional streamlining. The solution to these challenges is a Windows Azure-based application called DSL, an abstraction layer that seamlessly exposes data from SQL Azure and from on-premises data sources in a single unified catalog of databases. Through DSL, data publishers expose data in a secure-by-design way without making copies of the data. Users browse a common URL to locate the data packages they need and consume these centrally available data packages without replication, maintaining a single authoritative source of data. Figure 6 shows how DSL integrates with the existing infrastructure.

Figure 6. DSL Integration with EDW

Anticipating that, with self-service BI capabilities, reports on the CBI portal would consume cloud data from SQL Azure sources, the BIE team built DSL with the capability to expose data hosted on SQL Azure as well as data contained on premises in the EDW. At Microsoft, previous efforts in improving EDW have resulted in the majority of critical data being stored in EDW. Therefore, although DSL exposes both EDW and SQL Azure data, most of the initial BI solutions that take advantage of DSL consume only EDW data.

Benefits of Hosting Data as a Service

By creating DSL, Microsoft IT gives workers the capacity to use more data to gain insight on their own terms, without involving the IT department, and without requiring extensive training in programming or data modeling. DSL not only makes use of existing BI investments that Microsoft IT made in developing EDW and the CBI portal, but also provides flexibility to meet future needs. With more and more data moving to SQL Azure and the increased development of cloud-based applications, DSL helps Microsoft IT ensure that BI solutions continue to meet the needs of decision makers.

Give Workers Power to Innovate

Providing data as a service through DSL is yet another way for Microsoft IT to empower workers to benefit from self-service BI and provides additional benefits:

  • Get more data when needed. Data as a service increases the types of connections and relationships that it is possible to make between data sets because more data packages are available by default, and it is easy to add new data sources.

  • Make data more transparent. Before creating DSL and disseminating a data catalog, Microsoft IT responded manually to user requests for data. The IT overhead aside, the request process itself was challenging because users could not request categories of data that they did not know existed. Greater transparency and availability of data sources improves the ability to discover new insights.

  • Look at data in new ways. With new ways of looking at business information, there is the possibility to realize better team and organizational productivity through dashboards and improved visibility into key team and organizational metrics.

Improve Organizational Effectiveness

Microsoft does business in more than 100 countries across products and services ranging from consulting to computer and gaming hardware, to personal and business software.

  • Increase efficiencies. DSL enables Microsoft's many business units to capture and use their business information more efficiently by packaging data sets in a standardized way and distributing them through a central Web site.

  • Keep teams accountable. Microsoft IT helps business units to achieve greater transparency and accountability. When workers can access data packages of enterprise data from the EDW, it opens up a world of possibilities to revisit existing BI solutions.

Facilitate IT Efficiency to Reduce Costs

For Microsoft IT, DSL offers the following benefits as part of the BI solution offering:

  • Use familiar tools. As Microsoft IT increasingly adopts cloud-based solutions to meet business needs, this trend takes advantage of the inherent benefits of cloud computing, such as pay-as-you-go pricing and rapid scale-out, while using familiar and intuitive development and management tools found in Visual Studio, SQL Server Analysis Services, and PowerPivot.

  • Leverage existing investments. DSL builds upon the previous investments Microsoft IT made to develop the CBI portal infrastructure and EDW, and exposes data stored in SQL Azure, yet also enables BI solutions to leverage data in the EDW.

Hosted Self-Service Architecture and Technical Details

The CBI portal serves BI solutions to nearly 40,000 users, out of a potential user pool of more than 90,000 employees and contingent staff, across more than 60 business units and 100 applications. The datacenters that handle traffic are located in Redmond, Dublin, and Singapore to support users worldwide. Although Microsoft has more than 70 major sites all over the world, the majority of its users are concentrated in several sites, such as its headquarters in Redmond. As a result, the server distribution design supports more users accessing Redmond than other locations. Figure 7 shows the server distribution for CBI in the production environment.

Figure 7. Network Architecture

In addition to the production environment, Microsoft IT runs a user acceptance test (UAT) environment that is used to verify scenarios, onboard new applications, and perform general acceptance testing. The test environment is an exact replica of the production environment for both DSL and CBI.

DSL Service Architecture

The essential purpose of DSL is to create an abstract schema representing underlying databases, and to enable SQL APIs, Web services, and UI controls to explore and consume the data. DSL as a platform centralizes access to EDW and other data, and makes it available for use.

DSL uses concepts similar to those of relational databases, such as tables and fields. It also supports version control to enable schema changes. Table 1 compares relational database terminology with the terminology that DSL uses.

Table 1. DSL Terminology

Relational Database Term    DSL Term
------------------------    ------------
Database                    Data Package
Table                       Data Group
Field                       Data Element
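To make the mapping in Table 1 concrete, the following minimal sketch models a data package, its data groups, and its data elements as plain Python structures. The class names, attributes, and sample values are illustrative assumptions only and do not reflect the actual DSL metadata schema.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DataElement:            # corresponds to a field (column)
    name: str
    data_type: str
    is_pii: bool = False      # PII/HBI columns are masked in sample data

@dataclass
class DataGroup:              # corresponds to a table
    name: str
    elements: List[DataElement] = field(default_factory=list)

@dataclass
class DataPackage:            # corresponds to a database
    name: str
    version: str
    owner: str
    groups: List[DataGroup] = field(default_factory=list)

# A hypothetical data package with one data group and three data elements.
package = DataPackage(
    name="WorldwideSales",
    version="1.0",
    owner="Finance BI Team",
    groups=[DataGroup(
        name="OrderHeader",
        elements=[
            DataElement("OrderId", "int"),
            DataElement("OrderDate", "datetime"),
            DataElement("CustomerEmail", "nvarchar", is_pii=True),
        ],
    )],
)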

Logical Architecture

The DSL architecture uses a classic three-tier approach consisting of load-balanced front-end servers that handle requests, a middle-tier server that handles business logic, and a SQL Server cluster that houses the back-end databases.

Clients access DSL through one of two production URLs: one for the corporate production environment, and one accessible on the Internet. The test environment also has two URLs to mirror this configuration. The front-end servers run a custom Silverlight application for browsing and consuming data packages, as shown in Figure 8.

Figure 8. DSL site

The DSL client runs as a service on the front-end servers. Upon receiving a request, and relying on IIS to handle authentication, the client communicates with the SQL Server back end that houses metadata, and it may also communicate with the underlying databases specified in the metadata. As a user browses the DSL Web site and performs typical tasks, such as viewing a data package, the DSL client calls the metadata database to obtain the metadata, and then the built-in business logic may retrieve the underlying data that the metadata describes; a conceptual sketch of this request flow follows the component list below. Figure 9 shows the DSL architecture.

Figure 9. DSL Logical Architecture

The architecture relies on the following components:

  • DSL Client Workhorse of DSL that both processes requests and talks to databases, exposing a logic layer for consumers.

  • Metadata on SQL Server This is stored in databases on the server, and is set up during the onboarding process. All the details of the data package, such as the schema maps of the underlying database are included in the metadata.

  • Supporting services for client functionality Two services provide supporting functionality to the DSL client service: an admin service and a data handler service.

  • SQL Azure Because DSL was deployed both on premises and in the cloud, Microsoft IT wanted to use the same metadata and support a unified approach through a bridge service.
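The following sketch illustrates, under the assumptions of the three-tier design described above, how a DSL-style client might resolve a request: look up the package metadata, map the exposed data elements, and build a query to run against the source recorded in the metadata. All names, the in-memory metadata store, and the query shape are hypothetical; the actual DSL client is an internal .NET solution whose implementation is not published.

from dataclasses import dataclass

@dataclass
class PackageMetadata:
    source_server: str        # source server captured during onboarding
    source_database: str
    schema_map: dict          # DSL data element -> underlying column

# Stand-in for the SQL Server metadata database that DSL maintains.
METADATA_STORE = {
    ("WorldwideSales", "1.0"): PackageMetadata(
        source_server="sqlsrv-edw-01",
        source_database="EDW_Sales",
        schema_map={"OrderId": "order_id", "OrderDate": "order_dt"},
    ),
}

def build_sample_call(package: str, version: str, max_rows: int = 10) -> str:
    """Resolve a data package request into a query against the underlying source."""
    meta = METADATA_STORE[(package, version)]         # 1. metadata lookup
    columns = ", ".join(meta.schema_map.values())     # 2. map exposed data elements
    # 3. The query runs on the source recorded in the metadata (on-premises
    #    SQL Server or SQL Azure through the bridge service); DSL itself
    #    never stores a copy of the underlying data.
    return (f"SELECT TOP {max_rows} {columns} "
            f"FROM {meta.source_database}.dbo.OrderHeader "   # hypothetical table
            f"-- executed on {meta.source_server}")

print(build_sample_call("WorldwideSales", "1.0"))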

Exploring and Browsing Data Packages

Upon accessing the DSL site, the initial display enables people to make choices about the data package they want to consume. The packages are organized according to permissions, with some data available to anyone in the company. Most data is specific to the team or group that requested the data package. Upon selecting a data package version, the display shows various metadata associated with the package, such as the description, owner name, contacts, security model and permissions, and start and end dates for availability, as shown in Figure 10. If a user does not know what data to use, a search feature is also available.

Figure 10. Data Packages in DSL

Consuming Data

Once a user selects a specific package, DSL provides multiple ways to consume the data through code that displays sample calls, as shown in Figure 11. These sample calls rely on a special DSL function that is part of the API and that exposes DSL functionality. As Figure 11 shows, the sample calls include the possibility to use tools such as Excel and PowerPivot by simply copying the code from DSL and pasting it into the target application.

Sample calls contain all data elements available in a data package version and limit the number of rows returned to 10. Users wishing to consume DSL data can modify the sample calls to limit the data elements returned and to add WHERE, JOIN, and filter conditions.

DSL offers the following types of sample calls:

  • Excel For use with Excel Services and PowerPivot.

  • SQL Full Pull SQL Query for pulling a full set of data from the underlying database.

  • SQL Delta Pull SQL Query for pulling the changes of data based on snapshot dates.

  • Web Service Sample Web services calls to use in a custom Web service consumer.

  • Sample Data Displays 100 rows of sample data for the data group, not including Personally Identifiable Information (PII) or High Business Impact (HBI) data. Any columns designated as PII or HBI display PII or HBI in the column instead of the data.

Figure 11. Sample Calls and Data in DSL

A useful capability that sample calls introduce is using DSL queries as a data source within PowerPivot. All that is required is to copy and paste the sample call into PowerPivot, validate it, adjust the code as necessary to filter results, and use it to create a BI solution.
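As an illustration of how a consumer might work with these sample calls outside of Excel, the sketch below runs a full pull, a user-modified filtered variant, and a delta pull against a hypothetical source. The server, database, table, and column names are placeholders, and the queries address a table directly only to keep the sketch self-contained; the sample calls that DSL actually generates invoke the internal DSL API function described above.

import pyodbc  # assumed ODBC-based consumer; a PowerPivot user would instead
               # paste the same SQL text into the PowerPivot table import wizard

# Hypothetical connection to the source server named in the DSL sample call,
# using integrated Windows authentication as described in the Security section.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=sqlsrv-edw-01;DATABASE=EDW_Sales;"
    "Trusted_Connection=yes"
)
cursor = conn.cursor()

# Full pull, roughly as DSL generates it: every data element, capped at 10 rows.
full_pull = "SELECT TOP 10 OrderId, OrderDate, Region, Amount FROM dbo.OrderHeader"

# User-modified variant: fewer data elements, a WHERE filter, and no row cap.
filtered = ("SELECT OrderId, OrderDate, Amount FROM dbo.OrderHeader "
            "WHERE Region = 'EMEA' AND OrderDate >= '2011-07-01'")

# Delta pull: only rows changed since the last snapshot date.
delta_pull = ("SELECT OrderId, OrderDate, Amount FROM dbo.OrderHeader "
              "WHERE SnapshotDate > '2011-09-30'")

for row in cursor.execute(filtered).fetchall():
    print(row)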

Server Building Blocks for Scalability and Capacity

To improve manageability and scalability, and to facilitate capacity management, the BIE team uses a componentized approach in designing the server infrastructure. As shown in Figure 7, servers are clustered for high availability. Each server cluster serves a specific function for either front-end or back-end functionality for all associated services, such as Excel Services and SQL Server Reporting Services. Each server cluster constitutes a building block that can be added as necessary to provide capacity for rapid scale-out. Back-end servers support over 10,000 users, depending on the nature of the reports, concurrency, and data volumes. Front-end servers support more than 5,000 users, with similar considerations regarding concurrency and data volumes. Both DSL and the CBI portal use this approach.

Because DSL is a lightweight solution that stores only metadata, not the underlying data, the servers have ample capacity to handle concurrent users and data packages. In fact, the original design established a basic high-availability infrastructure using clusters, and with fewer than 2,000 users, server load on disks, CPU, and memory has routinely remained below 40 percent utilization.

Security

DSL relies on the integrated security model of SharePoint, SQL Server, PowerPivot, Windows Server, and the underlying Active Directory infrastructure for user authentication and authorization. This includes common aspects such as the security accounts that services use, delegation, and access. At the SharePoint level, BIE grants site-level access to a single person or a group of power users within each team, who then manage permissions for other users on that team. For more information about the PowerPivot security model, see http://technet.microsoft.com/en-us/library/ff955761.aspx.

DSL includes its own security considerations for ensuring the appropriate access and use of data packages. Additionally, DSL provides an option to enable column-level security to specify access levels for sensitive data.

Onboarding and Evangelism

The success of the CBI portal and DSL depends on a customer-focused onboarding process and tireless evangelism. In its infancy, the CBI portal encountered gradual adoption rates common to new service offerings. The portal offered a hosted, managed solution with the typical benefits of centralized access, such as reduced overhead and reduced need to tie up group IT resources to support team BI solutions. Yet, even with these and other benefits, groups at Microsoft had to first find out about the service offering, then evaluate it, allocate resources to transition BI solutions, and then go through the onboarding process. The entire process begins by finding out about the BIE platforms and services, making evangelism vital to the process.

Running BI and Data as a Service

With self-service BI capabilities, Microsoft IT has transitioned away from the previous model of hosting servers and creating reports to a model of providing services that enable business users to create their own reports. Teams use the CBI portal not only to consume the reports contained in the report catalog, but also to create their own reports and add them to the catalog.

One challenge for Microsoft IT in operating BI as a service is that the dependencies of BI, such as the data, tools, and collaboration environment, also need to run as services. For example, in offering DSL to enable data as a service, BIE created an abstraction layer that envisions the sources of data and the consumers of data as separate parts of the BI service offering. This decoupling makes it possible to consider data sources as a service, such as by providing data via SQL Azure. Similarly, reporting, analytics, collaboration, and solution development may also be considered as individual hosted services for users to consume.

Evangelism

At the core of the success of the CBI portal and DSL is the enthusiastic evangelism that introduces the BI service offering to Microsoft teams. Sanjay Soni led the effort to win the hearts and minds of customers by tirelessly promoting BI capabilities, and how they may benefit not only existing BI solutions, but also facilitate the development of new ones.

The evangelism approach incorporates many forms of media and outreach to introduce the possibilities of CBI and other platforms like DSL. Several ways in which BIE promotes and educates include the following:

  • Dedicated video channel For education and promotion, BIE has recorded videos available for on-demand viewing that help answer common questions about the CBI portal, DSL, SQL Server Reporting Services, and other BI tools and features.

  • Conferences Sanjay presents at various conferences, such as TechEd, on the capabilities of the Microsoft BI solution, and how it makes the CBI portal and DSL possible.

  • Lync sessions For individualized support, BIE conducts demos and presentations that demonstrate the portal and DSL benefits.

  • Onboarding site To support teams during onboarding, BIE maintains a dedicated site that guides teams through the process. BIE also hosts an informational site that answers FAQs and provides on-demand content.

  • Personal outreach BIE also reaches out to stakeholders within Microsoft who are responsible for team BI solutions to understand and meet team needs.

Onboarding

The onboarding process that the BIE team uses fosters a close collaboration between BIE and business groups. During onboarding, BIE dedicates SMEs for both DSL and the CBI portal who work with business users to provide ongoing support. BIE also holds weekly calls to check on the status and progress of onboarding. The actual workflow on the DSL site to request a package is relatively straightforward. It consists of clicking a request button to open a window with a form, completing the form with data package and underlying server information, editing the schema to include only relevant information, tagging any PII columns, and then submitting the data package for approval.

By the time a person can request a data package, the associated team has already been onboarded and trained on the specific BI application that the team uses and that is onboarded to the CBI portal.

Project Teams

Onboarding a team and moving its internal BI solutions and processes to centralized, hosted, self-service BI requires a delicate hand, a great deal of communication, and the engagement of both technical and business experts in the group and on the BIE team.

Several teams engage at various points in the onboarding process, as follows:

  • Business Within the business unit being onboarded, an IT or technical business team who handles BI applications and report creation continues to work with the applications to migrate them to the CBI portal, and to test the functionality. This team serves as the liaison between technicians at BIE and the business unit.

  • Technical BIE includes a production support technical team that supports the production environment. This team typically focuses on servers, operations, and management to implement designs and change requests.

  • Combined business/technical The platform onboarding team (part of BIE) manages all aspects of the onboarding process, evangelizing the process, training, and prototyping for customers. It works very closely with the business IT team and reports the progress to a product manager within the business unit, who is responsible for defining business requirements, and provides feedback during onboarding.

Process

The onboarding process that is used for both the CBI portal and DSL makes space for a great deal of training, education, evaluation, and prototyping to uncover any challenges and sources of frustration before rolling out to the production environment. Figure 12 shows the onboarding process for CBI.

Figure 12. Onboarding Process

The process includes the following four high-level steps that are broken out per Figure 12.

  1. Initiate After initial evangelism, to start the onboarding process, the business unit fills out an onboarding questionnaire with estimated number of reports, size of the reports, power users, end users, primary contacts who will work with the onboarding team during the process, and the desired onboarding timeline. Often, they complete this questionnaire as part of or after attending a presentation on the CBI portal and DSL.

  2. Educate The next step is to learn more about the BI platform. The onboarding team provides custom learning opportunities in the forms of pre-recorded training videos, classes, and live conference calls. In addition, a training kit and online documentation is available to help people learn.

  3. Prototype and onboard to UAT After learning more about the BI platform and receiving training, the business unit prototypes its solution and then moves to the sandbox and UAT environments. During this time, the team is onboarded and given an opportunity to evaluate the service, performance, and experience. This is also the point at which the teams agree on the technical specifications and security needs.

  4. Go Live Once prototyping and testing are complete, the BIE team onboards the customer.

DSL Onboarding Workflow

The onboarding process begins when a data publisher from business or IT chooses to create a new data package and enters the details about it, as shown in Figure 13.

Figure 13. Request form to add new data package

The detail entry screen is relatively straightforward, with form entries for name, details, package expiration, and so on. Some of the entries are optional. Figure 14 shows the next tab, where the server information for the data package is required because the servers entered in this screen are the sources of the data package and the locations where the DSL queries are executed. It is possible to add multiple servers for load balancing. It is important to note that Microsoft configures the access account with read permissions.

Figure 14. Server details about new data package

The test connection button is available to ensure that all servers and databases are reachable prior to submitting the request. If the data package is sourced from a SQL Azure database, select the check box at the top of the form and supply the required data fields as described above. The Version Info tab captures information about the data package version to be published, as shown in Figure 15.
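As a point of reference, a SQL Azure source is identified by a fully qualified server name with SQL authentication and encryption, rather than the integrated Windows authentication typically used for on-premises servers. The connection string below is an illustrative sketch with placeholder names; the exact fields that the DSL request form captures are internal to Microsoft IT.

# Illustrative SQL Azure connection string (placeholder values); an
# on-premises source would instead use Trusted_Connection=yes.
AZURE_CONN = (
    "DRIVER={SQL Server Native Client 10.0};"
    "SERVER=tcp:contoso-dsl.database.windows.net,1433;"
    "DATABASE=EDW_Sales;"
    "UID=dsl_reader@contoso-dsl;PWD=<password>;"
    "Encrypt=yes"
)
# e.g., pyodbc.connect(AZURE_CONN) once real credentials are supplied.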

Figure 15. Versioning details for data package

After entering the general information, the next step in the workflow is to submit the request to DSL SMEs for approval. After an SME approves the package, the requester has an opportunity to edit the schema and add table filters. Once the schema updates are complete, DSL generates a notification, as shown in Figure 16.

Figure 16. Notification of data package request

Support and Incident Management

To support users who rely on DSL, the BIE team provides multiple options and escalation paths that create many avenues to handle any issue, from simple questions on usability and training, to reporting performance and functionality issues.

In order to ensure proper support for a team that has been onboarded, the first line of contact for these users is an alias that contains all SMEs local to the onboarded team, so they are notified of any support issues immediately. One benefit of this approach is to categorize support issues and answer training-related questions or handle use errors instead of sending all questions directly to the main support team. The approach of e-mailing local SMEs also personalizes the experience and enables the BIE team to evaluate ticket categories and implement improvements based on support trends.

Once the SME team receives an e-mail requesting support, they determine whether it is a training-related issue or a technical issue involving the servers. If the issue is technical and related to infrastructure, the e-mail is routed to the 24/7 support team. If the support request is a training-related issue, the SME team contacts the user with training guides or sets up a meeting, either in person or through Microsoft Office Live Meeting, to resolve the issue immediately.

Issues related to the test environment are escalated to the support desk, which obtains more information and works with the user to categorize, prioritize, and assign the issue to support staff for resolution.

Collaborative BI Example

From the beginning, BIE sought to serve users by enabling them to access the information they need and make decisions based on clean data. As the service offering matured, BIE added more and more capabilities for self-service BI. DSL is the latest addition in that effort, and it works with the CBI platform solution to support SQL Server Reporting Services and other SharePoint-enabled development tools for a convenient means of collaboration.

To collaborate and share BI solutions using common tools such as PowerPivot and SQL Server Reporting Services, users can publish them to the relevant SharePoint document library directly from Excel 2010 using the Publish option, or upload directly to the document library through the SharePoint site, as shown in Figure 17. Microsoft IT configures SQL Server Reporting Services to enable users to upload reports.

Figure 17. Report Gallery

Training Reporting Portal

The Microsoft Sales, Marketing and Service (SMSG) Readiness group is one important customer of BIE that takes advantage of the CBI infrastructure to generate reports and analyze data. The many products and services that Microsoft offers require a sizeable sales and marketing organization that interacts with potential and existing customers to meet their needs. Over 45,000 people do work related to sales and marketing, most of them interfacing directly with customers in the field.

SMSG Readiness customized the CBI portal platform by creating a custom URL for access and developing a SharePoint template that works with a Silverlight wrapper for displaying content. Prior to developing this solution, SMSG Readiness used multiple reports spread across various repositories with no unified taxonomy or access point. With CBI, the SMSG Readiness group enjoys a centralized report catalog that houses 78 group-specific reports and makes it convenient to access favorites, search for pre-defined reports, and create custom ones. Figure 18 shows the SMSG Readiness report catalog.

Figure 18. Report catalog

To best equip workers with product and solution knowledge to meet customer needs, the SMSG Readiness group conducts training and education courses. For example, it is possible to enroll in courses that train on the capabilities and functionality of new product releases, or acquire in-depth knowledge regarding a specific technology, such as messaging and collaboration.

It is crucial for SMSG Readiness to have insight into various aspects regarding training and courses, such as feedback, types of sessions, enrollment numbers, and so on. These insights enable management to make general decisions about which courses to develop in the future, course mix per region, and instructor performance, as well as more specific decisions such as auditing for compliance for individual workers or teams. Figure 19 shows an example of a report showing evaluation for a specific course.

Figure 19. Report example

Many types of users consume reports on the SMSG Readiness portal, from executives evaluating strategic direction, to managers, to instructors. The report diversity meets these varied needs. For example, one key report that applies to many types of users is the evaluation report, which displays all feedback for a course and enables business users to drill down and evaluate trends to spot improvement opportunities, or perform other types of psychometric analysis. Managers may use this report on an ad hoc basis to verify training quality, or to plot general trends and create long-term baseline and improvement goals.

The report types available to users span the continuum from completely pre-defined reports refreshed on demand to provide up-to-date information, to completely self-defined Excel views that enable users to browse cube data and create custom PowerPivot solutions using any data that the cube contains. In this way, SMSG Readiness enables a broad range of self-service BI capabilities for users of all levels of expertise.

Best Practices

Microsoft IT uses the following best practices in implementing and operating the DSL solution and the CBI portal.

  • Build your consolidated BI infrastructure in parallel with (or even before) the enterprise data warehouse Microsoft IT learned that it is best to be realistic about EDW adoption rates. In the case of balancing back-end development and BI applications, this lesson means business users must have access to their BI solutions at all times, even if the reports might contain data that has not been audited. The value proposition that EDW brings to existing solutions is clean, reliable, centralized data. Microsoft IT follows the best practice of only allowing clean data in the EDW by working with departments to onboard their BI applications.

  • Architect with the enterprise vision in mind Any BI solution that a single department uses may potentially become a company-wide application or gain wider adoption after it is made available for collaboration and use. The data-intensive nature of BI applications requires planning for enterprise-level availability, scalability, and reliability.

  • Evangelize to achieve enterprise adoption For many creators and consumers of BI applications, those solutions are developed in house for individual or team needs. When a change opportunity arrives to improve BI capabilities through centralization, it often requires direct outreach to ensure adoption. Microsoft IT uses demonstrations, direct outreach, training, conference calls, and other means to onboard and support teams.

  • Support users to find relevant data sources In order to achieve new insights, users need to be able to create relationships and have access to many types of data that is easy to find and incorporate. With the DSL approach, Microsoft IT makes it possible to use data packages conveniently in a way consistent with the idea of self-service BI.

  • Empower users to create their own business intelligence Historically, creating BI solutions was slow and expensive, making dashboards and reports available for key personnel, such as executives. This is no longer the case with the availability of tools such as PowerPivot and Report Builder. Yet, power users need training and support to make the most use of the new possibilities. Microsoft IT follows the best practice of making data available, supporting users, and ensuring they are not alone in creating and consuming BI applications.

  • PowerPivot is not a replacement for EDW Microsoft IT centralizes enterprise data to achieve a single version of the truth in the EDW. PowerPivot is a tool for rapidly creating BI solutions and helps with designing better requirements for EDW. Yet, a secured and high-quality data store for the essential business data is still required to make data meaningful and consistent among reports.

  • PowerPivot uses Access or No Access for permissions In PowerPivot 2010, once you create a solution and store data in a PowerPivot file, data-level security is not an option because access uses the same credentials that were used when the report was created. Microsoft IT uses SQL Server Analysis Services for more granular security. Many teams have already started using a beta version of the next PowerPivot release, which provides users with more granular security for PowerPivot solutions.

  • Use the 64-bit version of PowerPivot Because PowerPivot relies on an in-memory engine, large BI solutions require more RAM. For best performance, Microsoft power users depend on the 64-bit version of Windows 7 and Excel.

For More Information

For more information about Microsoft products or services, call the Microsoft Sales Information Center at (800) 426-9400. In Canada, call the Microsoft Canada Information Centre at (800) 563-9048. Outside the 50 United States and Canada, please contact your local Microsoft subsidiary. To access information through the World Wide Web, go to:

http://www.microsoft.com

http://www.microsoft.com/technet/itshowcase

The information contained in this document represents the current view of Microsoft Corporation on the issues discussed as of the date of publication. Because Microsoft must respond to changing market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information presented after the date of publication.

This White Paper is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED, OR STATUTORY, AS TO THE INFORMATION IN THIS DOCUMENT.

Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation.

Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

Unless otherwise noted, the example companies, organizations, products, domain names, e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with any real company, organization, product, domain name, e-mail address, logo, person, place, or event is intended or should be inferred.

© 2011 Microsoft Corporation. All rights reserved.

Microsoft, Active Directory, Excel, Lync, SharePoint, SQL Azure, SQL Server, Windows, Windows Azure, and Windows Server are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries.

All other trademarks are property of their respective owners.
