Important implementation considerations

Updated: 2009-04-30

There are important decisions to make as you plan the implementation of PerformancePoint Planning. Before you start building the application, analyze and make decisions about the following areas.

How many applications do you need: one or many?

Many planning considerations will affect your decision to create one or multiple Microsoft Office PerformancePoint Server 2007 applications. The primary consideration is how much of your metadata and business object definitions will be shared. While multiple applications are more scalable and flexible, each application can be thought of as an independent container for your data, and data cannot be shared across applications.

PerformancePoint Server allows one calendar per application, so if your company has multiple businesses on different calendars, you must build a separate application for each calendar. Use care to ensure that your application calendar accurately matches your business processes: after the application is created, the calendar cannot be modified, and a new application has to be created.

When planning your application's model sites, determine whether your organization has centralized processes and data or if you have several divisions with very different business types within your organization.

If your organization is centralized, multiple model sites may not be necessary. If your organization is decentralized but processes are consistent and the data reported to corporate headquarters is normalized across the businesses, you could use multiple model sites that summarize data at the corporate level.

However, if your organization is not centralized and data is not consistent across divisions, you may need different model structures for the dissimilar parts of your organization. To accomplish this, you may need to create multiple applications.

Separating the corporate root model site from divisional or business unit model sites also enables user- and role-based access to data. This is important in maintaining corporate control over assumption models used in the planning process.

Security considerations can also affect site-planning decisions. If you want to specify varying permissions for members of the Modeler role, you can do so by creating model sub-sites and setting permissions at the model-site level.

Some other things to take into account are:

  • Different project schedules: The timetables used by those responsible for planning, budgeting, forecasting, or actuals could differ, making it difficult to initially implement the project in a single application.

  • Consider starting small: Using several applications to start, with the long-term goal of migrating to a single large application, might be the best way to begin your rollout. This approach involves less coordination between business teams and allows for separate schedules for each team.

  • Different local requirements: If a remote site or subsidiary wishes to track site-specific or work-specific items, you might wish to use a sub-site for the remote location. This allows the subsidiary to use shared data while still allowing customization.

  • Handling multiple geographies: While you can install Planning Server in a central location (such as headquarters) and have remote users connect remotely, you might want to create a sub-site for remote locations. The remote sites could then use the Data Integration feature of PerformancePoint Server to move their data to headquarters on a periodic basis. It is the responsibility of the sub-site to ensure that it still conforms to corporate site requirements.

  • Scalability: If all users share the same application, scalability should be an important consideration in the project planning cycle.

Choosing a model type

When creating models in your PerformancePoint Server application, you can choose from five model types: the Global Assumption model, the Exchange Rate model, the Financial model with shares calculations, the Financial model without shares calculations, and the Generic model.

The primary consideration when determining which model types to include in your application is how you will use the models and their data, such as whether you are creating a business model or an assumptions model. You may also find that different end users will want differing levels of data, so it is important to include a broad spectrum of users in initial planning.

For example, the data that you want to see in reports may need to be modeled differently than data used solely for financial reporting. A model used primarily for financial reporting may not include salesperson data or be at the correct level for salesperson-level reporting; however, this data would be vital for sales scorecards. In this scenario, you may wish to design your models to provide detailed sales data that can be summarized for financial reporting.

Additionally, you can create one model of a given type or multiple models of the same type for different purposes, so multiple models may be an alternative solution to the scenario above.

Model types

  • Generic: The most basic model type. It contains no predefined rules for accounting logic and can be used in place of any other model type needed.

  • Global Assumption: Used for baseline data that applies throughout a business, such as headcount or price-list information, or for business drivers in a financial model, such as revenue and expenditure assumptions.

  • Exchange Rate: A special-purpose assumption model that tracks foreign exchange values by period and exchange type for all currencies in your system. One important consideration for Exchange Rate models is whether you will want to track exchange rates at multiple frequencies, such as daily, monthly, and annually. Because the Exchange Rate model does not provide aggregation, you will need a separate Exchange Rate model for each frequency in your application that uses exchange rate assumptions.

  • Financial without shares calculations: Includes built-in logic to perform consolidations, but does not include shares calculations. If you have one entity, or multiple entities that are analyzed individually, use this model type. For example, this model could be used to create a corporate costs model or a departmental model, such as a model specifically for use by the Human Resources department.

  • Financial with shares calculations: Includes built-in logic to perform consolidations that include shares calculations, and also to calculate inter-company eliminations. For example, if you have multiple entities and you will provide consolidated reporting at the corporate level, or if you have multiple entities that are not fully owned, use this model type. It could also be used to create a strategic-planning model.

Dimension-planning considerations

PerformancePoint Planning provides two categories of dimensions: predefined and user-defined. Predefined dimensions can be used as generated, but typically some modifications are necessary to accommodate your current data structure and naming conventions. The scope of customization for predefined dimensions is more restricted than for user-defined dimensions.

When you are planning your application, it is also important to consider the number of dimension members you will include in each dimension. For dimensions that will contain many dimension members, you can create additional dimension member sets to limit the number of accounts per model, therefore aiding application performance.

Calculations

Business calculations in PerformancePoint Planning can be achieved either by using a server-side calculation or a client-side calculation. Server-side calculations in Planning Server are presented as business rules. These business rules use PerformancePoint Expression Language (PEL) to express the calculation that the user wants to perform. Client-side calculations are achieved by using calculation capabilities embedded in PerformancePoint Add-in for Excel.

Server-side calculations

Calculations in a server business rule expressed in PEL are actually carried out by either the Microsoft SQL Server engine or the SQL Server 2005 Analysis Services (SSAS) engine. To carry out a calculation in the SQL Server engine, the rule needs to be compiled into a SQL stored procedure, which is then executed to carry out the calculation. To carry out a calculation in the Analysis Services engine, the rule needs to be compiled into either MDX query statements or MDX calculation scripts, which then get sent to the SSAS engine for execution.

In general, a rule can be carried out on any of these three platforms: the SQL Server engine, the SSAS engine by using MDX queries, or the SSAS engine by using MDX scripts. Each calculation platform has different performance and behavior characteristics, which are summarized below.

SQL Server engine

  Behavior: The calculation is invoked either explicitly by the end user or by a re-processing event. Calculated data is materialized and written back to the fact table. Certain complex expressions are not supported.

  Performance: The calculation handles data sparsity well; execution time is proportional to the fact data size rather than to the calculation scope.

  Recommendation: In most cases, this is the desired choice if the calculation can be invoked on demand or triggered by a re-processing event. Things to look out for are data explosion and complex expressions beyond the capability of SQL Server processing.

SQL Server Analysis Services engine by using MDX query

  Behavior: The calculation is invoked either explicitly by the end user or by a re-processing event. Calculated data is materialized and written back to the fact table. Full PEL expressions are supported.

  Performance: The calculation does not handle certain kinds of sparsity well. If a calculation covers a large region, performance can be slow even if the actual (non-null) calculated results are relatively small.

  Recommendation: Suited for calculations that cover a small region, and the easiest way to test and debug a calculation. Its expression support is more powerful than that of the SQL Server engine. Things to look out for are large calculated regions, data explosion, and slower performance.

SQL Server Analysis Services engine by using MDX script

  Behavior: The calculation is automatically invoked and maintained by the SSAS engine. No user invocation or Microsoft Office PerformancePoint Server 2007 triggering event is necessary for the calculation to be carried out. The calculation result is not materialized, so there is no data explosion issue at the back end.

  Performance: The calculation has poor sparsity characteristics. Performance is especially slow if you have multiple calculations and one of them triggers large calculation regions.

  Recommendation: Use this platform if you want the calculation to be real-time and automatic. Look out for slow query performance when you have this type of calculation.

A simple ratio calculation can easily be carried out either by the SQL Server engine or by the SSAS engine by using an MDX script. The difference between the two is a trade-off: data expansion at processing time versus slower run-time query performance.

It is not efficient to carry out a large-scope, leaf-level-only calculation in the SSAS engine by using an MDX script. This is because a simple query at a higher aggregated level will trigger a large-scope automatic calculation every time such a query is issued. This can potentially have a big impact on overall query performance.
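The trade-off can be illustrated in miniature. The data and the "margin" ratio below are invented for the example; the point is where the calculation runs. Materializing the result at processing time, as the SQL Server engine does, stores extra data but makes queries cheap; computing it on every query, as an MDX script does, stores nothing but repeats the work each time.

```python
# Illustrative sketch of the two evaluation strategies described above.
# The fact data and the "margin" ratio are hypothetical.

facts = {
    ("2008", "Q1"): {"revenue": 100.0, "cogs": 60.0},
    ("2008", "Q2"): {"revenue": 120.0, "cogs": 66.0},
}

# Materialized (SQL Server engine style): compute once when the data is
# processed and write the result back alongside the facts.
for cell in facts.values():
    cell["margin"] = (cell["revenue"] - cell["cogs"]) / cell["revenue"]

def query_margin_materialized(key):
    # Query time is a plain lookup, at the cost of storing more data.
    return facts[key]["margin"]

def query_margin_on_the_fly(key):
    # MDX-script style: nothing is stored, but the expression is
    # re-evaluated over its scope on every query.
    cell = facts[key]
    return (cell["revenue"] - cell["cogs"]) / cell["revenue"]
```

Both strategies return the same values; they differ only in when the work is done and whether the result occupies storage.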

Client-side calculations

Planning Server users can input formulas into the cells of matrices in the workbook during authoring. Planning uses the term "design-time formula" to refer to these formulas. Design-time formulas enable client-side calculation by using the rich calculation features of Microsoft Office Excel. A design-time formula can be defined at the root level and automatically inherited at the lower levels. Design-time formulas are a convenient way to implement business rules at form-authoring time.

Note

Overuse of design-time formulas may slow down form rendering and data submission.

Using Excel formulas allows companies to take advantage of business rules that they have already defined in existing spreadsheets. A company can take an incremental approach to migrating its important formulas to Planning Server, allowing them to be centrally managed.

During data entry, users can also input formulas in the writeable region. Planning Server uses the term "run-time formulas" to refer to these formulas. A run-time formula can be used to enforce business rules and help ensure data integrity.

Analysis Services write-back

Planning uses the Analysis Services write-back feature to save forecast results. If write-back performance is not good enough, we recommend using local assignments to reduce the load on the SSAS server. See the following section, "Local cube (offline assignments)," for more information.

Local cube (offline assignments)

PerformancePoint Planning can be configured to allow users to work in online, offline, or mixed mode when they are performing data-entry tasks. The configuration can be set at a per-model level or per-user level.

Planning modelers can turn offline caching on or off for a model, for example to disable it for a large model or a model that contains sensitive data, by setting the "AllowOffline" model flag in Planning Business Modeler.

Planning contributors can turn offline caching on or off in their run-time environment by setting the "Automatically cache assignments" option in PerformancePoint Add-in for Excel. By default, both options are turned on.

Online mode: If either option is set to "Off," the Planning user works in online mode. The data is not downloaded to the user's computer.

Local mode: If both options are set to "On," the Planning user is automatically switched to the Local mode (mixed mode) after the data is successfully downloaded to the user's computer. Users might achieve faster calculations than in online mode because the calculations are based on information cached on their local computer.

Offline mode: After the data has been successfully downloaded to the user's computer, the user may choose to work completely offline (that is, without an active connection to Planning Server). Use this mode in the following circumstances:

  • When you are not in the office and are unable to connect to the computer that is running Planning Server.

  • When the computer that is running Planning Server is unavailable because of maintenance.

In either case, changes are not saved to the Planning Server database until the user goes back online and submits the form.
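The way the two caching options combine, as described above, can be sketched as a small decision function. The function and its parameter names are illustrative only, not part of any Planning API.

```python
def effective_mode(allow_offline, cache_assignments, download_succeeded):
    """Illustrative decision logic for the data-entry modes above.

    allow_offline      -- the model's "AllowOffline" flag
    cache_assignments  -- the user's "Automatically cache assignments" option
    download_succeeded -- whether the data reached the user's computer

    Returns "online" or "local"; from local (mixed) mode the user may
    then choose to work completely offline.
    """
    # If either option is off, no data is downloaded: online mode.
    if not (allow_offline and cache_assignments):
        return "online"
    # Both options are on: local (mixed) mode once the download succeeds.
    return "local" if download_succeeded else "online"
```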

Sub-cube

Report authors should consider using sub-cubes when creating reports and forms. Because the whole set of data could be large, downloading offline data can be time-consuming and can add load to the SSAS server. Report authors can define sub-cubes for offline assignments when they design their reports. That way, when users download assignments for working offline, only the relevant portions of the database are downloaded to their computers.

Data-loading considerations

Planning Server does not provide a complete extract, transform, and load (ETL) solution and cannot be used to construct jobs for extracting, transforming, and loading data from multiple data sources.

You need to define your data sources and the mapping of source data to the Planning models. Dimension and member types will need to be matched to Planning types (for example, the account type on account members must be mapped). You also need to determine whether any transformations are necessary, define who is responsible for the ETL process, and define the schedule for moving data.

There are several steps involved in data integration:

  • The Planning Staging Database needs to be created. (This step is performed once.)

  • The Planning Staging Database needs to be synchronized with PerformancePoint Planning. This must be done every time there is a structure change.

  • Data must be loaded from the data sources to the Planning Staging Database using an external ETL tool.

  • Reference (dimension and hierarchy) data and fact data have to be validated and errors corrected.

  • Data is loaded from the Planning Staging Database to the Planning Application Database.

This process can be automated by using PPSCmd and scripts.
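As an illustration of the validation step, the following sketch checks that every dimension key on a staged fact row resolves to a known member before the row is promoted to the application database. The row and table structures here are invented for the example; the real staging database has its own schema and error-flagging columns.

```python
def validate_fact_rows(fact_rows, dimension_members):
    """Return (row_index, dimension, member) tuples for unresolvable keys.

    fact_rows         -- list of dicts mapping dimension name -> member id
    dimension_members -- dict mapping dimension name -> set of valid ids
    Both structures are hypothetical stand-ins for the staging tables.
    """
    errors = []
    for i, row in enumerate(fact_rows):
        for dim, member in row.items():
            # A fact row is valid only if every key resolves to a
            # known member of the corresponding dimension.
            if member not in dimension_members.get(dim, set()):
                errors.append((i, dim, member))
    return errors
```

Rows reported by a check like this would be corrected in the staging database and re-validated before the load to the application database.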

Planning stores data for accounts natively as debits and credits. For example, a credit account with a credit balance is stored as a positive value; a credit account with a debit balance is stored as a negative value. A debit account with a debit balance is stored as a positive value; a debit account with a credit balance is stored as a negative value.

This means that if your source system stores data with a sign according to account type, the signs of these accounts must be changed before the data is loaded. Planning Data Integration provides a feature to handle this on data load, but it only handles source credit-type accounts stored as negative numbers. If your system uses a different convention, the ETL process used to transfer data from your source system to the Planning Staging Database needs to take care of changing the sign.
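The sign handling can be sketched as a small conversion step in the ETL process. The function name and parameter are hypothetical; only the sign rules come from the text above, namely that an account's normal balance (a credit balance in a credit account, a debit balance in a debit account) is stored as a positive value.

```python
def to_planning_sign(amount, account_type, source_stores_credits_negative=True):
    """Convert a source-system amount to Planning's natural-sign storage.

    Planning stores an account's normal balance as positive: a credit
    account with a credit balance and a debit account with a debit
    balance are both positive; the opposite balances are negative.
    This sketch handles only the convention that the built-in data-load
    feature also handles: source systems that store credit-type amounts
    as negative numbers. Any other source convention needs its own ETL
    logic.
    """
    if account_type == "credit" and source_stores_credits_negative:
        # Flip credit-type amounts: a source credit balance of -500
        # becomes +500 in Planning's natural-sign storage.
        return -amount
    return amount
```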

Careful planning of data-loading processes is important, as failure to properly plan can result in data that does not reconcile between models, dimensions, and related member sets.

Key data-loading considerations include how often you will load data, whether you do a full versus incremental data load, the amount of data involved, and performance.

You will want to consider timing and frequency of data loading and what users will be responsible for in these processes. This can become significant with multi-site, multi-model implementations.

Business-process considerations

To coordinate collaboration between data contributors, reviewers, and approvers during business processes, Planning Server uses process cycles and data-entry forms. Process management is performed through Planning Business Modeler, whereas form authoring and data submission is performed through PerformancePoint Add-in for Excel.

Planning for data collaboration includes system and workflow planning, in addition to business and process-planning considerations.

System considerations and workflow

When you plan your PerformancePoint Server application, you should consider the logistics involved with the geographies where your users reside. For example, will you have globally dispersed users who will contribute, review, and approve data?

To handle this scenario, you will have to consider such things as time zone differences when you set assignment deadlines, system bandwidth during times when users from multiple geographies will use the system at the same time, and multi-currency planning issues.

Business and process planning considerations

When you begin to plan your business processes implementation, it is important to consider data flow. A useful planning step is documentation of your overall process, users and their roles in the process, and the flow of data.

Another important consideration is process type. Some typical business process types that are used in Planning Server include:

  • Rolling or recurring processes, such as quarterly forecasts

  • Top-down or bottom-up target setting, such as a budget process

  • A step in a larger financial process, such as consolidations

  • General data submission to support other processes

Data security is also an important consideration when you implement business processes. Planning Server secures data on a per-user basis and as part of the process cycle, so identifying which users will need to view, edit, review, and approve data is an important part of business processes planning.

Locale-specific accounting rules that may affect business processes, such as currency conversions or financial consolidations, should also be considered during planning.

When you plan what data-entry forms you will use in your business processes, several things should be considered:

  • The use of calculations in forms

  • Creation of dynamic forms that automatically update each month on a per-user basis

  • The skill set of your users

  • Form complexity

Additionally, any process that has complex calculations or many users can affect performance. Careful planning during form configuration, calculation definition, model design, and data-flow planning can improve performance. Utilization of PerformancePoint Add-in for Excel offline functionality can also help in improving performance and scalability.

When you plan the number of forms to create in your application, you should also consider the life cycle of your data. Data from more than one model can be viewed in a data-entry form, but the data-entry part of the form can be linked to only one model. If you need data entry against multiple models, you will have to create either multiple model sites or multiple forms.

It is important to note that Planning Server provides a review and approval process for data, and this functionality meets basic requirements for review and approval workflow. For more complex collaborations, we recommend that you integrate Windows SharePoint Services Web Parts with your deployment.

Security considerations

The security model for Planning Server is based on roles. Users are assigned to roles, and their permission levels in the Planning Server system are dictated by the roles to which they belong. The two types of roles are administrative roles and business roles.

Business roles

Business roles are defined for users who work with actual business data. Note that members of the Data Administrator and Modeler roles have unrestricted access to all business data in the model site, even if they belong to a business role that has restricted settings.

Default permissions apply for all member sets in the model site and for all users in the business role unless explicit permissions have been specified.

Explicit permissions override default permissions. You can explicitly specify Read or Write access to specific member sets or members.

By default, all users who belong to a business role inherit the same permissions. Permissions can be restricted if the custom user permissions feature has been enabled for the member set. When this option is used, permissions will have to be individually configured for each user who requires Read or Write permissions to the member set.

Design business roles for users who require similar access to business data. Before you create a role, identify the shared set of permissions for role members and select the default permission level as well as explicit permissions that best match the common access requirements. If you use this setting as the starting point for defining explicit permissions for data, you can minimize the modifications that you will have to make to the role definition. As a security best practice, use the most restrictive setting that applies.

For best performance, try to minimize the need for custom user permissions when you define business roles. When you customize permissions for a user, you create a new Analysis Services role. The number of roles affects the time that is required to perform some tasks such as deploying a model site.

Using business roles makes process management more dynamic and easier to set up, as roles can be used as user groups for assignment definitions. When a user is added to a role that is used in an assignment definition, the user is also added to applicable process-management tasks.

Administrative roles

The administrative roles are Global Administrator, User Administrator, Data Administrator, and Modeler. Global Administrators cannot connect to a server in Planning Business Modeler unless they also belong to another administrative role. User Administrators do not get read or write access to business data. Modelers and Data Administrators have unrestricted Read and Write access to all business data in the model site, even if they belong to a business role that has restricted settings.

Deployment accounts

There are two accounts that need to be considered when deploying PerformancePoint Server: the Planning Server Service Identity (SI) account and the account used by the database administrator. The SI account is used to communicate to the system databases and source data. The database administrator is the person who creates and configures Planning Server databases. For more information, see the PerformancePoint Server 2007 Deployment Guide.

Note

For many proof-of-concept systems, it is convenient to implement everything using a single server installation and to add user accounts to the computer. However, if you add users of Planning to the Administrators role of the computer where Analysis Services is installed, then by default these users will be granted full access to all data in your Analysis Services databases, including the databases created by Planning Server. For this reason you should follow the principle of least privilege when adding user accounts to your computer.

Additional information on setting up administrative roles and business roles can be found in Planning Business Modeler online help.

Other considerations

  • Performance implications of member sets versus member views: Member sets are translated into parent-child hierarchies in Analysis Services; member views are translated into leveled hierarchies. Analysis Services handles leveled hierarchies more efficiently than parent-child hierarchies, as it pre-aggregates the members. If your member sets are very large or very deep, you may experience performance problems. If this is the case, you should consider using member views when designing reports.

  • Analysis Services: If you determine that Analysis Services is a performance bottleneck, see the SQL Server 2005 Analysis Services Performance Guide (https://go.microsoft.com/fwlink/?LinkId=103090&clcid=0x409) for guidance on how to cluster SSAS servers.

  • Hierarchies: If you have hierarchies that change over time or that are reworked, you can create multiple member sets and use them in different models.

Custom extensions

PPSCMD

The Planning Command Utility (PPSCmd.exe) is a tool that assists in the administration and limited modification of Planning Server. The tool consists of twelve commands that allow for scriptable actions with Planning Server. For example, PPSCMD allows you to automate the data loading process.

For more information on PPSCMD, see the PerformancePoint Server 2007 Operations Guide.

Client-side macros

In PerformancePoint Add-in for Excel, you can use the Excel macro feature to create and run macros. When you create a macro, you can use Visual Basic for Applications (VBA) to specify commands and functions that are stored in a module.

Note

The module must be titled "PerformancePoint" if you want to call it from events within PerformancePoint Add-in for Excel.

You can use the following PerformancePoint Add-in for Excel events to call the PerformancePoint module:

  • AfterRefresh: This macro is run after any worksheet, workbook, or matrix is refreshed, or when a page filter is changed.

  • BeforeAssignmentAction: This macro is run when the Go button (the arrow button next to the Actions drop-down list) is clicked, but before the specified assignment action is run.

By default, the ability to run macros on the AfterRefresh and BeforeAssignmentAction events is disabled. The system administrator can enable or disable this option in Planning Administration Console; contact your system administrator for help. In addition, for these events, PerformancePoint Add-in for Excel follows the Excel macro security model and allows only signed macros to run.

Customization in Analysis Services

You can add custom MDX or calculated measures directly to the Analysis Services cube. You will need to reapply these custom changes each time that you deploy the model site.

Custom SQL stored procedures and MDX scripts (native rules)

If the PerformancePoint Expression Language (PEL) is too restrictive for advanced users, Planning allows you to write custom native SQL or MDX scripts. Because these implementations can pose a security risk, Planning requires that the rules be approved. Microsoft Office PerformancePoint Server 2007 runs these rules as a low-privileged user.

See the PerformancePoint Server 2007 Operations Guide for information on how to enable these types of rules.

Sample application

Another resource available when you plan your PerformancePoint Server application is the Alpine Ski House (ASH) Sample Application Guide. The ASH Sample Application Guide provides a profile of the fictitious Alpine Ski House Corporation. It also discusses the drivers behind the ASH decision to move to a PerformancePoint Server solution and the structure ASH decided to use when it created the sample application. The guide also provides an overview of the application implementation process used by ASH. A sample budget process that uses the ASH sample application is included. See the Documentation map for Microsoft Office PerformancePoint Server 2007 (https://go.microsoft.com/fwlink/?LinkId=103091&clcid=0x409) for the link to the ASH Sample Application Guide.

Download this book

This topic is included in the following downloadable book for easier reading and printing:

See the full list of available books at Downloadable content for PerformancePoint Planning Server.