
Chapter 9 — Developing Phase: Database

Published: June 21, 2004 | Updated: October 29, 2004
On This Page

Introduction and Goals
Team Focus during the Developing Phase
Starting the Development Cycle
Developing the Solution Components
Performing an Assessment
Extracting Sybase Schema Information
Creating the Database
Transforming Transact-SQL
Transferring Data
Testing the Database
Closing the Developing Phase

Introduction and Goals

The purpose of this chapter is to describe how to migrate a Sybase database to Microsoft® SQL Server™. During the Planning Phase, you will have identified which database or databases to migrate, as well as defined the target hardware and operating system.

In the Developing Phase, you will complete all of the transformations of database objects that are necessary to migrate the databases, as well as the scripts used in their administration and their associated data sources. If your environment requires heterogeneous communications, the Developing Phase also addresses these requirements.

It is recommended that the development team automate as much of the migration work as possible in this phase. By automating tasks, you can reduce manual effort and the potential errors that manual code changes introduce.

Team Focus during the Developing Phase

The following table is repeated from the UMPG to help project team members quickly identify which material in the Developing Phase chapters they need to read. The primary team roles driving this phase are Development and User Experience.

Table 9.1: Role Cluster Focuses and Responsibilities in Developing Phase

Role Cluster

Focus and Responsibility

Product Management

Customer expectations

Program Management

Functional specification management; project tracking; updating plans

Development

Code acquisition and development (including unit testing); integration, infrastructure development; configuration documentation, issues identification

User Experience

Training; updated training plan; usability testing; graphic and information design

Test

Functional testing; performance evaluation, issues identification; documentation testing; updated test plan, evaluation reporting; test and test infrastructure development

Release Management

Rollout checklists, updated rollout and pilot plans; site preparation checklists

Starting the Development Cycle

Migrating a database from one platform to another is not a trivial task. Before starting the development effort, you should review the project risk management strategy to confirm that a solid plan is in place to minimize development risks.

You must have installed SQL Server and have access to it from the development computer that is being used to perform the migration. Using the Server Network Utility supplied with SQL Server, you must also ensure that you have enabled all the network protocols that will be required by client applications.

You will require system administrator access to SQL Server.

You will also need to make certain that you can connect to the Sybase database from the computer that is being used to perform the migration tasks. Ideally, the Sybase client tools should be installed, and the Sybase interfaces file (sql.ini) should be updated to reference the Sybase server. For more information about configuring the sql.ini file, see Appendix I, "Sybase Migration Toolkit Installation Guide." During this phase, you will also need the name and password of a Sybase account that you can use to access the database.

Although Sybase and SQL Server databases have many similarities, there are also a number of fundamental differences between the two technologies. The Planning Phase will have drawn attention to the areas of concern that are particularly relevant to the database that you are migrating. You should familiarize yourself with these items and be prepared to handle any unforeseen issues that arise as a result of them.

Before beginning development, you should read Appendix B, "Sybase and SQL Server Architectural Differences," for information about the key differences between Sybase and SQL Server, and Appendix D, "Transact-SQL Behavioral Differences," for additional information on Transact-SQL syntax.

Change Management

The focus of the Developing Phase is baselining and managing changes to the build environment and the code, and controlling changes to the production environment into which the solution is eventually deployed.

Creating repeatable, reusable methods maximizes development effort while constraining risk. During the Developing Phase, you can mitigate considerable risk by using the Sybase Migration Toolkit (SMT) and custom scripting to standardize the implementation of changes to database objects.

Developing the Solution Components

During the Developing Phase, the database migration effort focuses on the following tasks that you must complete in sequence:

  1. In each target database, identify Sybase Transact-SQL syntax and dependent database objects that require code changes for the SQL Server environment. In the remainder of this chapter, these changes are referred to as transformations.

  2. Extract the schema information from the Sybase database and generate scripts that can be used to create the database objects in SQL Server.

  3. Create the SQL Server database.

  4. Edit the scripts generated in step 2 and perform all Transact-SQL transformations and other code changes. Run the scripts to create the objects in the SQL Server database.

  5. Transfer data from the Sybase database to the SQL Server database.

  6. Test the database.

To support the development effort, Microsoft created the Sybase Migration Toolkit (SMT), which automates many of these tasks. The tool comprises Perl scripts behind a Microsoft Windows-style tabbed graphical user interface, as shown in Figure 9.1. The script behind each tab performs a specific migration function, as indicated by the tab names, and these functions can be used to perform many of the sequential tasks itemized in the preceding list. The following sections describe the use of each tab at the appropriate point in completing each of the six steps.

Figure 9.1 The Sybase Migration Toolkit user interface


The following list describes the purpose of each tab:

  • Assessment. This tab identifies some of the specific instances of Sybase Transact-SQL syntax that you must convert. This step was covered in Chapter 7.

  • Extract Schema. This tab extracts the Sybase data definition language (DDL) schema.

  • TSQL Conversions. This tab converts the Transact-SQL syntax using automated scripts. You can customize the scripts to extend the range of code changes that can be automated.

  • SQL Comparison. This tab thoroughly tests Transact-SQL behavioral and syntax changes.

  • Object Comparison. This tab tests the behavior of migrated schema objects to ensure they operate in the same manner as the original items in the Sybase database.

For more information about using the SMT, see Appendix I, "Sybase Migration Toolkit User Guide." The sections under the following headings describe each of the SMT tabs in detail.

Performing an Assessment

An assessment was already completed during the Planning Phase. However, you must run the assessment again for every database before using any other SMT functions. Although this may seem repetitive, it provides a second check that the correct database connection has been established, and it allows you to review the list of objects to be migrated and compare it against the earlier assessment reports.

The first task in developing the solution is to analyze the database for the various database objects that require code changes. The Assessment tab of the Sybase Migration Toolkit (SMT) automates this task. It runs the Sybase to Microsoft SQL Server Analysis Wizard, which prompts you for connection information for accessing the Sybase database (you must have installed an appropriate Sybase Open Database Connectivity [ODBC] driver) and generates a report that identifies the specific issues that your migration must address. The Wizard allows you to examine every item in the database, or just selected types of objects (for example, logins, tables, views, stored procedures, and so on).

The Assessment report is created in Hypertext Markup Language (HTML) format. Figure 9.2 shows an example:

Figure 9.2 Example assessment report


The report provides the following three types of information:

  • Summary Information. This section of the report lists the number of each type of database object found in the target database, the number that pose migration issues, and the percentage of the total that each number represents. These database objects will require the majority of conversion work. The Analyzed Objects section of the report provides details about each object type analyzed.

  • Source Configuration Information. This section displays the configuration options in effect in the Sybase database. This information can help you configure the relevant server and database options in the new SQL Server database.

  • Analyzed Objects Information. This section lists each object analyzed (grouped by type) and, for each migration issue identified, suggests appropriate actions.

Using the Assessment report, you can identify which issues fall into each of the following task categories so that you can estimate time frames for the actual code-change database migration tasks:

  • Transact-SQL conversion issues that can be performed automatically by using the Transact-SQL Transformation tab of the SMT.

  • Transact-SQL conversion issues that you can automate by customizing the script provided with the SMT.

  • Transact-SQL conversion issues that require manual coding.

  • Database objects that require code changes.  

Extracting Sybase Schema Information

The second task in performing a migration is to generate scripts that can be used to build the objects found in the Sybase database. The Extract Schema tab of the SMT runs a script that extracts the Sybase database schema into a series of SQL scripts and creates a directory that corresponds to the structure of the database. This directory tree provides the specific code elements on which your development team will be working.

Like the Assessment Wizard, the Extract Schema Wizard (shown in Figure 9.3) requires connection information for accessing the Sybase database. However, the Extract Schema Wizard connects to Sybase using Sybase client connectivity tools instead of ODBC, so the server entry specified should correspond to an entry in the sql.ini file and not the name of an ODBC data source:

Figure 9.3 The Extract Schema Wizard


The Extract Schema script performs several functions:

  • It creates several files that contain DDL scripts for building the objects in the Sybase database, including:

    • The DDL for all of the database objects (except for indexes) extracted from the target database. This file is called sybase_database_name.sql, where sybase_database_name is the name of the original Sybase database; for example, pubs3.sql.

    • The DDL that creates all of the indexes for the targeted database. This file is called sybase_database_name_indexes.sql; for example pubs3_indexes.sql.

    • The DDL to create the user logins. This file is called sybase_database_name_serverDDL.sql; for example pubs3_serverDDL.sql.  

  • It creates a tree structure, including a directory named for the database, with subdirectories for each type of database object in the schema, as shown in Figure 9.4. You can specify the location for this tree structure. These subdirectories are named after the object types: Defaults, Groups, Logins, Rules, Sprocs (stored procedures), Tables, Triggers, Types, Users, and Views.

    Figure 9.4 The Extract Schema Wizard directory structure

  • The DDL statements from the sybase_database_name.sql file are extracted and parsed into individual files that are placed in the folder for the appropriate data type. For example, tables are parsed into individual files, and the files are placed in the C:\SMT\Database\sybase_database_name\tables\ folder, where sybase_database_name is the name of the original Sybase database. In the task described in the "Transforming Transact-SQL" section of this chapter, you will convert the Sybase DDL in this directory tree structure to use SQL Server Transact-SQL syntax.

    Note  If the Extract Schema Wizard does not generate any files, check that you have specified a valid server name, login, and password; the Wizard fails silently if any of these items is incorrect.

The Perl script that performs the transformation, syb2ms.pl, is intended to be extensible. See Appendix I for further details on running this script outside the SMT front end, and on how to extend the tool.

Creating the Database

The third step of the migration process is to create the SQL Server database. The Extract Schema Wizard generates a script that you can use to create the physical SQL Server database, called sybase_database_name_create.sql, in the sybase_database_name folder, where sybase_database_name is the name of the original Sybase database. However, this script creates a default database, and it is unlikely that it will match the requirements of a production system. You should edit this script to create the database following the physical design established during the Planning Phase. Place data and transaction log files on the appropriate devices, and ensure that SQL Server itself is configured appropriately. For more information, see "Physical Database Design" at http://msdn.microsoft.com/library/default.asp?url=/library/en-us/optimsql/odp_tun_1_8mni.asp.

Transforming Transact-SQL

The fourth task to complete is to perform the Transact-SQL transformations and migrate database objects. There are four major steps in this task, and you must perform them in sequence:

  1. Manually migrate dependent database objects.

  2. Manually migrate tables.

  3. Migrate stored procedures, triggers, and views.

  4. Manually resolve additional Transact-SQL syntax issues.  

    Note  The sequence of operations is important. Users and logins must be added before creating the tables that they own and access, and these tables must exist before you create the stored procedures and triggers that reference them.

Each of these steps is discussed under the following subheadings. The "Transforming Transact-SQL" section concludes with a checklist that serves as an overview of the processes that are described.

Manually Migrating Dependent Database Objects

The first step to complete the transforming Transact-SQL task is to perform the manual migration of dependent database objects. You should manually migrate the following dependent database objects:

  • Logins

  • Groups (SQL Server Roles)

  • Users

  • User-defined data types

  • Rules

  • Defaults  

The Sybase Migration Toolkit Extract Schema script will have generated SQL scripts and placed them in the appropriate folders under the tree structure it created earlier. You can use these scripts as a starting point, editing them as appropriate for your target SQL Server database (in many cases, they may not require further changes).

You should also review the assessment report generated by the Assessment Wizard because it will tell you of any issues that must be resolved before you can migrate these objects.

Each of the dependent database objects is discussed in detail under the following subheadings.

Migrating Logins

Logins apply at the SQL Server instance level instead of an individual database. How you define the SQL Server login accounts depends on the security model that you implemented when you installed SQL Server. The different security models available to SQL Server are described in Chapter 4, "Planning Phase: Database."

Both SQL Server and Sybase provide a number of built-in fixed server roles with their own inherent privileges, but these roles are different in Sybase and SQL Server. You should review these roles carefully and map any logins granted a specific Sybase role to the appropriate SQL Server role. For more information on Sybase roles, see Roles in SQL Server in the Sybase ASE products manuals, available online at http://manuals.sybase.com/onlinebooks/group-asarc/srv10024/sag/@Generic__BookView/2689.
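As an illustrative sketch of this mapping, a Sybase login that held sa_role might be added to the SQL Server sysadmin fixed server role, while an operator-type login might map to serveradmin or processadmin depending on its duties. The login names below are hypothetical, and the role choices are judgment calls you must make per login:

```sql
-- Sybase granted: sp_role 'grant', sa_role, ops_admin
-- SQL Server equivalent (ops_admin is a hypothetical login name):
EXEC sp_addsrvrolemember 'ops_admin', 'sysadmin'

-- A Sybase oper_role login might instead map to serveradmin:
EXEC sp_addsrvrolemember 'ops_operator', 'serveradmin'
```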

The following two headings offer prescriptive guidance for migrating Sybase logins for Windows authentication mode and mixed authentication mode.

Windows Authentication Mode

If you are using Windows Authentication mode, follow these guidelines to migrate the Sybase logins:

  1. Review the addlogins.sql file, which is located in the Logins folder. This script contains the login names that were extracted from the Sybase database.

  2. Work with a Windows administrator to associate each Sybase login ID to its corresponding Windows user, if one exists. If there is no association between a Sybase login user and a Windows user, you must create a new Windows user account for the Sybase login, placing that Windows user account in the proper Windows domain.

  3. Based on the security model that you are going to use — either Windows users or groups — review the guidelines in the "SQL Server 2000 SP3 Security Features and Best Practices" white paper available at http://www.microsoft.com/sql/techinfo/administration/2000/security/securityWP.asp to help implement your security model.

  4. Assign the appropriate server roles to each login.  
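As a minimal sketch of steps 2 through 4, a Sybase login that has been matched to an existing Windows account could be granted access and a server role as follows (the domain, account, and role names are hypothetical):

```sql
-- Grant an existing Windows domain user access to SQL Server
EXEC sp_grantlogin 'CONTOSO\jsmith'

-- Assign a fixed server role to the new login, if required
EXEC sp_addsrvrolemember 'CONTOSO\jsmith', 'dbcreator'
```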

Mixed Authentication Mode

If you are using Mixed Authentication mode, follow these guidelines to migrate the Sybase logins:

  1. Review the contents in the addlogins.sql file with a Windows administrator to determine which Sybase logins will authenticate using Windows Authentication and which will authenticate using SQL Server authentication.

  2. For those Sybase logins that are using Windows Authentication, work with a Windows administrator to associate the Sybase login ID to its corresponding Windows user, if one exists. If there is no association between a Sybase login user and a Windows user, you must create a new Windows user account for the Sybase login, placing that Windows user account in the appropriate Windows domain.

  3. For those Sybase logins that are authenticating using Windows Authentication, implement your security model using the guidelines outlined in the "SQL Server 2000 SP3 Security Features and Best Practices" white paper available at http://www.microsoft.com/sql/techinfo/administration/2000/security/securityWP.asp.

    For those Sybase logins that are authenticating using SQL Server authentication, you must:

    1. Edit the addlogins.sql file, removing the Sybase login IDs that were defined to use Windows Authentication; keep the login IDs that will be authenticated by SQL Server authentication.

    2. Using Microsoft Query Analyzer, log in to Microsoft SQL Server as an administrator, load the addlogins.sql script, and execute it to add the logins to SQL Server.

    3. Verify that the logins were created.  

  4. Assign the appropriate server roles to each login.  

Migrating Sybase Groups

Sybase uses groups as a convenience for assigning privileges to sets of users in an individual database. Sybase uses the sp_addgroup stored procedure to create a new group, and users can be assigned to a group when they are added to a database, or they can be assigned later using the sp_changegroup stored procedure.

SQL Server 2000 supports roles instead of groups. Like Sybase groups, roles apply at the individual database level, and privileges can be assigned to roles in much the same way as Sybase groups. Roles are created using the sp_addrole stored procedure. However, the sp_addgroup stored procedure can still be used for backward compatibility — groups are mapped to roles with the same name. Despite this, it is recommended that you review the addgroups.sql script generated by the SMT and modify it to create roles instead of groups.
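The recommended edit to addgroups.sql is a one-line substitution. The group name below is hypothetical:

```sql
-- Original line generated from Sybase in addgroups.sql:
-- EXEC sp_addgroup 'accounting'

-- Recommended replacement, creating a role instead of
-- relying on the backward-compatibility mapping:
EXEC sp_addrole 'accounting'
```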

Execute the following procedures to migrate the Sybase groups to SQL Server roles:

  1. Review the contents in the addgroups.sql file, which is located in the Groups folder, to determine what Sybase groups you should keep and map to SQL Server roles.

  2. Using Microsoft Query Analyzer, load the addgroups.sql script and execute it to add the new roles to SQL Server.

  3. Verify that the roles were created.  

Migrating Database Users

The SMT generates the addusers.sql file that can be used to create the same set of users that existed in the original Sybase database. This file assumes users will belong to the same groups that were defined in the Sybase database. If you have migrated Sybase groups to SQL Server roles, you must modify this script accordingly.

A Sybase user can belong to no more than one group (apart from the public group that all users must belong to). However, a SQL Server database user can have multiple roles. You should review the addusers.sql file carefully and modify it to ensure that users are given the correct roles when they are created. Users can be assigned roles using the sp_addrolemember stored procedure.
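A sketch of such an edit follows; the user, login, and role names are hypothetical. Note that the SQL Server user can be made a member of several roles, which has no single-group equivalent in Sybase:

```sql
-- Sybase-style line generated in addusers.sql:
-- EXEC sp_adduser 'jsmith', 'jsmith', 'accounting'

-- SQL Server 2000 equivalent: add the user, then assign roles.
EXEC sp_grantdbaccess 'CONTOSO\jsmith', 'jsmith'
EXEC sp_addrolemember 'accounting', 'jsmith'
EXEC sp_addrolemember 'reporting', 'jsmith'
```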

You should perform the following steps to migrate the Sybase database users:

  1. Review the contents in the addusers.sql file, which is located in the Users folder, to determine the Sybase users that you should keep. Add any new users if necessary, and assign users to their appropriate roles.

  2. Using Microsoft Query Analyzer, load the addusers.sql script and execute it to add the new users to SQL Server.

  3. Verify that the users were created and assigned the correct roles.  

Migrating User-Defined Data Types

Perform the following steps to migrate the Sybase user-defined data types:

  1. Review the contents in the addtypes.sql file, which is located in the Types folder, to determine which Sybase user-defined data types you should migrate. Add any new user-defined data types, if necessary.

  2. Using Microsoft Query Analyzer, load the addtypes.sql script and execute it to add the new types to SQL Server.

  3. Verify that the user-defined data types were created.  
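The sp_addtype syntax is largely the same in Sybase and SQL Server 2000, so many entries in addtypes.sql may need no changes. A representative line (the type name is illustrative):

```sql
-- Create a user-defined data type based on a system type
EXEC sp_addtype 'ssn_type', 'char(11)', 'NOT NULL'
```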

Migrating Rules

Perform the following steps to migrate the Sybase rules:

  1. Review the contents in the addrules.sql file, which is located in the Rules folder, to determine what Sybase rules you should migrate. Add any new rules if necessary.

  2. Using Microsoft Query Analyzer, load the addrules.sql script and execute it to add the new rules to SQL Server.

  3. Verify that the rules were created.

    Note  The rules will be bound to columns belonging to tables when the various scripts creating those tables are executed later.
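A typical entry in addrules.sql resembles the following; the rule name and condition are illustrative, and the binding shown in the comment is performed later by the table scripts:

```sql
-- A rule migrated from the Sybase database
CREATE RULE price_rule
AS @price > 0
GO

-- Binding happens later, when the table scripts run; for example:
-- EXEC sp_bindrule 'price_rule', 'titles.price'
```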

Migrating Default Objects

Perform the following steps to migrate the Sybase default objects:

  1. Review the contents in the addefaults.sql file, which is located in the Defaults folder, to determine what Sybase defaults you should migrate. Add any new defaults, if necessary.

  2. Using Microsoft Query Analyzer, load the addefaults.sql script and execute it to add the new default objects to SQL Server.

  3. Verify that the default objects were created.

    Note  The default values will be bound to columns belonging to tables when the various scripts creating those tables are executed later.
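A typical entry in addefaults.sql resembles the following; the object name and value are illustrative, and the binding shown in the comment is performed later by the table scripts:

```sql
-- A default object migrated from the Sybase database
CREATE DEFAULT country_default AS 'USA'
GO

-- Binding happens later, when the table scripts run; for example:
-- EXEC sp_bindefault 'country_default', 'authors.country'
```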

Manually Migrating Tables and Indexes

The second step to complete the transforming Transact-SQL task is the manual migration of database tables. You should review the assessment report generated by the Assessment Wizard because it will tell you about any issues that must be resolved before you can migrate the tables.

When you performed the steps associated with the Extract Schema tab in the SMT as part of the assessment task, a Transact-SQL script file was created in the Tables folder that contains all the table definitions for insertion into SQL Server. This script is called MS_database_name_tables.sql, where database_name is the name of the Sybase database. If there are any problems reported by the Assessment Wizard, you will need to edit this file and make the appropriate changes.

Perform the following steps to migrate the Sybase table definitions:

  1. Using the assessment report, identify any invalid column or table names. Where necessary, change the names to valid SQL Server identifiers in the MS_database_name_tables.sql script and document the changes.

  2. Using Microsoft Query Analyzer, load the MS_database_name_tables.sql script and execute it to create the tables in SQL Server.

  3. Use the Microsoft Query Analyzer to identify any additional syntax violations in the table definitions. Usually, these are data type violations that you will have to resolve. This is an iterative process, and you should use the compiler to identify the syntax violations for the entire file and fix them accordingly. At the end of this step, you should have a Transact-SQL script that contains clean table definitions.

  4. Save the updated MS_database_name_tables.sql file.

  5. Depending on the nature of the keyword or data type conflict, you must propagate any of those fixes to the appropriate database objects, such as stored procedures, triggers, or views. The definitions of these items are held in scripts that are generated by the Extract Schema Wizard.

The Extract Schema Wizard also generates a script you can use to build the indexes on all tables in the database. This script is called database_indexes.sql in the root folder (not the Tables folder) of the directory tree created by the SMT. Although you can execute this script from Microsoft Query Analyzer after the tables have been built, it is recommended that you do not create the indexes until the tables have been populated with data. For more information, see the "Transferring Data" section later in this chapter.

Migrating Stored Procedures, Triggers, and Views

The third step to complete the transforming Transact-SQL task is to migrate stored procedures, triggers, and views, which are the objects that will require the most attention. Some of these objects can be transformed automatically into their SQL Server equivalents, but others may need manual intervention. The report generated by the Assessment Wizard will help you to identify which objects are likely to require the most attention. Also remember that Sybase holds system-wide stored procedures in the sybsystemprocs database — you must transfer any of these system-wide procedures to the new SQL Server database.

Performing Automatic Transformations

Only straightforward syntax conversions are good candidates for automated transformation. The SMT provides the TSQL Transformation Wizard (shown in Figure 9.5) to generate scripts that will convert stored procedures, triggers, and views.

Figure 9.5 The TSQL Transformation Wizard


You can convert the Sybase syntax of the following Transact-SQL statements to SQL Server Transact-SQL using the TSQL Transformation Wizard:

  • CHAR_LENGTH

  • DBO prefix

  • DEALLOCATE

  • INDEX HINTS

  • RPC syntax

  • PRINT

  • RAISERROR statements

  • @@SQLSTATUS

  • SET TRANSACTION ISOLATION

  • UPDATE alias

    Note  A limitation in the TSQL Transformation Wizard requires every source statement to be on one line. You must edit the extracted files for each object and ensure that statements do not span multiple lines.
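To illustrate the kind of rewrites these conversions perform, the following is a hedged before-and-after sketch for two items in the list above (the table name is hypothetical, and the exact output of the Wizard's scripts may differ):

```sql
-- Sybase:
-- SELECT title FROM titles WHERE CHAR_LENGTH(title) > 40

-- SQL Server equivalent:
SELECT title FROM titles WHERE LEN(title) > 40

-- Sybase cursor status check:      IF @@sqlstatus != 0 BREAK
-- SQL Server equivalent (note the status values also differ):
-- IF @@FETCH_STATUS <> 0 BREAK
```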

When you run the TSQL Transformation Wizard, it generates scripts in the folder called MS in the Sprocs, Triggers, and Views directories (these folders were created by the Extract Schema Wizard). These scripts contain the SQL statements for creating the database objects whose Sybase syntax was converted automatically. There are scripts for each object, but the primary script of interest is called MS_database_object_type.sql, where database is the name of the database and object_type is the type of the database object being scripted. For example, the script that holds transformed stored procedure code for the pubs3 database is called MS_pubs3_sprocs.sql. This script is an amalgamation of all the other scripts in the folder.

Note  If the MS folders do not exist or have been deleted, you should recreate them before running the TSQL Transformation Wizard. If the MS folders are missing, the Wizard will not recreate them automatically.

The exceptions.txt file in the Sprocs, Triggers, and Views folders contains information about which objects were successfully converted and which failed. The scripts in the MS folders for the failed objects can be edited by hand.

If no issues are reported, you can use Microsoft Query Analyzer to load and execute the MS_database_object_type.sql scripts generated by the TSQL Transformation Wizard. However, there is no guarantee that these scripts will run successfully because there may well be other issues that the SMT is not capable of resolving. In these cases, you will need to take further steps to complete the transformation manually.

Adding Automated Transact-SQL Transformations

The TSQL Transformations Wizard invokes a Perl script called syb2ms.pl, which is located in the Scripts folder under the SMT installation directory. This script is editable, affording you the option of creating a Perl subroutine to handle the conversion of any additional Transact-SQL statements that can be handled by automated conversion.

Transact-SQL statements pose varying degrees of difficulty for automation. Choosing which Transact-SQL statements to convert using automated scripts depends on how difficult it is to code the conversion of the Transact-SQL syntax.

For more information about how to create Perl conversion modules, see the "TSQL Transformation Tab" section of Appendix J, "Sybase Migration Toolkit User Guide."

Note  Extending the syb2ms.pl script requires that you have some proficiency with Perl programming.

Manually Resolving Additional Transact-SQL Syntax Issues

The fourth and final step as part of transforming Transact-SQL is to resolve any outstanding Transact-SQL syntax issues.

There are a number of differences between the dialects of SQL and Transact-SQL used by Sybase and SQL Server that are difficult for an automated tool such as the SMT to identify and transform correctly. For example, Sybase does not strictly conform to the ANSI SQL standard: it does not require the columns in the SELECT clause to appear in the GROUP BY column list, whereas SQL Server requires that the nonaggregate columns in the SELECT clause also appear in the GROUP BY clause. Hence, this is an area of intensive manual intervention, because you must determine which columns belong in the GROUP BY clause without altering the expected results.
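A sketch of the GROUP BY difference follows (the table and columns are hypothetical). Whether to add the missing column to GROUP BY or remove it from the SELECT list is exactly the judgment call that must preserve the query's expected results:

```sql
-- Accepted by Sybase, rejected by SQL Server:
-- title_id is in SELECT but not in GROUP BY.
-- SELECT title_id, type, SUM(price) FROM titles GROUP BY type

-- One possible SQL Server rewrite; verify it returns the
-- results the application expects before adopting it:
SELECT title_id, type, SUM(price)
FROM titles
GROUP BY title_id, type
```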

Errors such as these will be flagged when you use Microsoft Query Analyzer to attempt to run the scripts that create the various objects — the SQL Server parser and compiler will report error messages that you must correct. You should also save the updated scripts in case you need to recreate the objects in the future. For more information, see the earlier discussion in this chapter in the section "Manually Migrating Tables and Indexes."

There also may be dependencies between objects, and the scripts will fail on the first execution (even though the Query Analyzer has shown the SQL to be valid) because dependent objects have not been created. Normally, running the script twice will resolve these issues, but it is possible to have circular dependencies (where object A depends on object B, while object B depends on object A) and the script will never successfully complete. In such cases, you may need to create a dummy object to satisfy one of the dependencies.
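The dummy-object workaround can be sketched as follows, using two hypothetical procedures that each call the other:

```sql
-- proc_a and proc_b reference each other (hypothetical names).
-- 1. Create a stub to satisfy the dependency:
CREATE PROCEDURE proc_b AS RETURN
GO
-- 2. Create proc_a, which now compiles because proc_b exists:
CREATE PROCEDURE proc_a AS EXEC proc_b
GO
-- 3. Replace the stub with the real definition:
DROP PROCEDURE proc_b
GO
CREATE PROCEDURE proc_b AS EXEC proc_a
GO
```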

Handling RAISERROR Statements

Deciding how to convert RAISERROR statements has a significant impact on stored procedures, triggers, and client applications. During the automated Transact-SQL conversion process, the SMT script comments out those Sybase RAISERROR statements that raise user-defined error messages. This change allows the database objects to compile successfully. The automatic Transact-SQL conversion process should have migrated a majority of the RAISERROR statements that do not use user-defined error messages.

The key to migrating user-defined messages is to understand how the Sybase and Microsoft RAISERROR statements use the error number parameter. In addition, the Sybase construct "rollback trigger with raiserror" is not supported by SQL Server.

The following is the syntax for the Sybase RAISERROR statement:

raiserror error_number
    [{format_string | @local_variable}] [, arg_list]
    [extended_value = extended_value
        [{, extended_value = extended_value}...]]

error_number is a local variable or an integer with a value greater than 17,000.

  • If error_number is between 17,000 and 19,999, and format_string is missing or empty (""), Sybase Adaptive Server retrieves the error message text from the sysmessages table in the master database. These error messages are used chiefly by system procedures.

  • If error_number is 20,000 or greater and format_string is missing or empty, raiserror retrieves the message text from the sysusermessages table in the database from which the query or stored procedure originates. Adaptive Server attempts to retrieve messages from either sysmessages or sysusermessages in the language defined by the current setting of @@LANGID.

The following is the syntax for the SQL Server RAISERROR statement:

RAISERROR ( { msg_id | msg_str } { , severity , state }
    [ , argument [ ,...n ] ] )
    [ WITH option [ ,...n ] ]

msg_id is a user-defined error message stored in the master..sysmessages table. Error numbers for user-defined error messages must be greater than 50,000; ad hoc messages (those supplied as msg_str) are raised with an error number of 50,000.
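As a sketch of the difference, a Sybase user-defined message and its SQL Server equivalent might look as follows; the message numbers, text, and variable are illustrative:

```sql
-- Sybase (illustrative): add and raise a user-defined message.
--   sp_addmessage 25001, "Customer %1! not found."
--   raiserror 25001, @cust_name

-- SQL Server: the message number must be greater than 50,000, a
-- severity and state are required, and substitution is positional.
DECLARE @cust_name varchar(40)
SET @cust_name = 'ACME'
EXEC sp_addmessage @msgnum = 75001, @severity = 16,
     @msgtext = 'Customer %s not found.'
RAISERROR (75001, 16, 1, @cust_name)
```

Note how the Sybase positional placeholder (%1!) becomes a printf-style placeholder (%s) in the SQL Server format string.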

Table 9.2 identifies the issues between the two implementations of the RAISERROR statements and suggests actions to resolve each issue.

Table 9.2: Resolving RAISERROR issues

Issue 1: Sybase sysusermessages system table

There is no sysusermessages table in SQL Server. In SQL Server, the corresponding table is the master..sysmessages table.

Action: Migrate the Sybase user-defined messages in the sysusermessages table to a staging table in SQL Server. The staging table, call it usermessages, should be an exact duplicate of the SQL Server master..sysmessages table, with one additional column, call it sybase_error, that records the original Sybase error number. Never alter this value.

Use Data Transformation Services (DTS) or the bulk copy program to extract the error and description columns from the Sybase sysusermessages table. Map the Sybase error column to the sybase_error column in the staging table and the Sybase description column to the staging table description column.

You must supply the severity and with_log information for the migrated Sybase error messages, and you must change the Sybase description field from Sybase RAISERROR syntax to SQL Server syntax. Load the staging table accordingly.

Decide whether you are going to fix the format strings in the staging table or in the actual database objects.

Issue 2: The Sybase error number is between 17,000 and 19,999

Action: First, determine whether you have any error numbers that fall into this range. If so, identify a corresponding message in SQL Server, change the error number reference to reflect the error number in the master..sysmessages table, and modify the affected database objects accordingly.

Issue 3: The Sybase error number is 20,000 or greater

Action: Change the Sybase error numbers to values greater than 50,000; SQL Server reserves 50,000 for ad hoc RAISERROR statements. Choose a seed value that will not collide with legitimate Sybase error numbers that are already greater than 50,000; a good seed value may be 1,000,000. Update the staging usermessages table, populating the error column with the new Microsoft error number, calculated as follows:

seed value + sybase_error

Issue 4: Argument substitution in format strings

In the Sybase format string, RAISERROR allows argument substitution in any order, but the arguments must be of data type varchar or char.

Action: The SQL Server RAISERROR statement requires positional argument substitution, like printf in the C language, and supports integer and string substitution using the %d, %i, and %s placeholders. Rewrite the format strings accordingly.

Issue 5: Client applications

Action: In the client applications to which it applies, change the old Sybase error numbers and format strings (if they apply) to the new SQL Server error numbers.

Issue 6: SQL Server RAISERROR requires a severity and a state

The severity is the user-defined severity level associated with the message. Severity levels from 0 through 18 can be used by any user. Severity levels from 19 through 25 are used only by members of the sysadmin fixed server role, and the WITH LOG option is required for them.

Severity levels from 20 through 25 are considered fatal. If a fatal severity level is encountered, the client connection is terminated after receiving the message, and the error is logged in the error log and the application log.

State is an arbitrary integer from 1 through 127 that represents information about the invocation state of the error. A negative value for state defaults to 1.
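A minimal sketch of the staging approach described in issues 1 and 3, assuming a staging table named usermessages and a seed value of 1,000,000 (the table and column names other than master..sysmessages are assumptions):

```sql
-- Staging table: mirrors master..sysmessages (remaining columns
-- omitted for brevity) plus the original Sybase error number.
CREATE TABLE usermessages (
    error        int           NOT NULL,
    severity     smallint      NOT NULL,
    description  nvarchar(255) NOT NULL,
    sybase_error int           NOT NULL  -- original value; never alter
)

-- After loading the Sybase rows, assign the new SQL Server error
-- numbers: seed value + sybase_error.
UPDATE usermessages
SET    error = 1000000 + sybase_error
```

Once the format strings are corrected, each row can be registered with SQL Server by calling sp_addmessage.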

The key outcomes from the RAISERROR conversion are as follows:

  • A user-defined staging table is created and populated with error messages from the Sybase sysusermessages table.

  • For the Sybase user-defined error messages, a seed value is determined and applied to create new SQL Server user-defined error numbers greater than 50,000.

  • Sybase user-defined error numbers are cross-referenced against their new SQL Server user-defined error numbers. This simplifies the migration process.

  • A decision is made to either fix the user-defined format string in the usermessages staging table or to fix the user-defined error messages in the database objects:

    • If you choose to fix the user-defined format string in the usermessages staging table, at the end of that process you must place the new SQL Server user-defined messages into the master..sysmessages table.

    • If you choose to fix the user-defined error messages in the database objects, you must update the usermessages staging table accordingly, and then place the new SQL Server user-defined messages into the master..sysmessages table.

Microsoft SQL Server does not support rollback trigger with raiserror. Such constructs can usually be handled by separating out the raiserror statement and replacing the rollback trigger with a rollback transaction followed by a return statement.

Note  You may need to use transaction savepoints to fully emulate the operation of rollback trigger.
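As a sketch of this conversion, a Sybase trigger that uses rollback trigger with raiserror might be rewritten along these lines (the table, message number, and text are illustrative, and the user-defined message is assumed to already exist in master..sysmessages):

```sql
-- Sybase original (illustrative):
--   create trigger trg_orders_ins on orders for insert as
--     if not exists (select 1 from customers c, inserted i
--                    where c.cust_id = i.cust_id)
--       rollback trigger with raiserror 25002 "Unknown customer."

-- SQL Server conversion: separate the RAISERROR, then roll back
-- the transaction and return.
CREATE TRIGGER trg_orders_ins ON orders FOR INSERT AS
IF NOT EXISTS (SELECT 1 FROM customers c JOIN inserted i
               ON c.cust_id = i.cust_id)
BEGIN
    RAISERROR (75002, 16, 1)
    ROLLBACK TRANSACTION
    RETURN
END
```

Remember that ROLLBACK TRANSACTION in a SQL Server trigger rolls back the entire transaction, not just the triggering statement, which is why savepoints may be needed to emulate the original behavior exactly.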

Transact-SQL Transformation Checklist

The following list summarizes the changes you should make to the objects in the SQL Server database. Many of these changes are described in detail in Appendix C, "Sybase Stored Procedure and Trigger Migration to Microsoft SQL Server."

  1. Change references to chained transaction mode to either unchained transactions or SQL Server 2000 implicit transactions. Change @@TRANCHAINED references to @@OPTIONS. Change @@TRANSTATE references to @@ERROR handling logic.

  2. Convert ROLLBACK TRIGGER statements to SQL Server savepoints.

  3. Change transaction isolation levels from Sybase numeric-level identifiers to SQL Server 2000 string-based identifiers.

  4. Move user-supplied stored procedures from the sybsystemprocs database to the master database.

  5. Delete (preferably) or change optimizer hints to use SQL Server 2000 format.

  6. Change permanent temporary tables to global temporary tables.

  7. Change the error number range of user-defined messages and add severity levels.

  8. Remove or change the following items where they occur in the Sybase database:

    • Arguments and argument substitution from the PRINT statement

    • Sybase server roles

    • User-defined data cache references

    • Thresholds

    • Table partitioning.  

  9. Make the following required syntax changes:

    • Change DUMP and LOAD statements to BACKUP and RESTORE statements.

    • Change @@SQLSTATUS to @@FETCH_STATUS when processing cursors.

    • Change the default name of identity columns in a table to IDENTITYCOL.

    • Rename columns and tables that use reserved SQL Server 2000 keywords.

    • Modify or delete Database Consistency Checker (DBCC) statements.

    • Update any RAISERROR statements to use SQL Server syntax and change the range value specified by messages to values that are greater than 50,000.  

  10. Ensure that the tempdb database is of sufficient size to hold the temporary objects created by stored procedures and queries.  
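To illustrate one of the required syntax changes above, a Sybase cursor loop driven by @@sqlstatus is typically rewritten to test @@FETCH_STATUS; the cursor, table, and column names here are illustrative:

```sql
DECLARE @cust_id int, @cust_name varchar(40)

DECLARE cust_cursor CURSOR FOR
    SELECT cust_id, cust_name FROM customers

OPEN cust_cursor
FETCH NEXT FROM cust_cursor INTO @cust_id, @cust_name

-- Sybase loops while @@sqlstatus = 0; SQL Server tests
-- @@FETCH_STATUS instead (0 = row fetched successfully).
WHILE @@FETCH_STATUS = 0
BEGIN
    -- process the current row here
    FETCH NEXT FROM cust_cursor INTO @cust_id, @cust_name
END

CLOSE cust_cursor
DEALLOCATE cust_cursor
```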

Optionally, you can make the following changes to the database and applications after migrating to SQL Server 2000:

  • Change the tuning options for read-ahead cursors to values suitable for SQL Server.

  • Change scrollable server cursors.

  • Encrypt stored procedures in syscomments.

  • Replace nested IF statements with the CASE statement.

  • Use remote procedure calls (RPCs) with result sets in INSERT statements.

  • Use SQL Server Agent to schedule automatic maintenance tasks, alerts, and replication.

  • Use extended stored procedures for e-mail notification, paging, and scheduling.

  • Record user messages in the Windows application log and the SQL Server error log by using xp_logevent.

  • Make use of Windows-integrated security and trusted connections.

  • Use stored procedures that can run automatically when SQL Server starts up to perform initialization tasks.

  • Change the strings used by EXECUTE statements.

  • Change the SELECT statements that are encoded in stored procedures and triggers to use SQL-92-style joins (inner join, cross join, left outer join, right outer join, full outer join).  

Scripting Migrated Schema Objects

When you have completed the development process and the code is ready for unit testing, create the implementation scripts and store them in a secure area. These scripts will allow you to recreate the objects in a reliable manner when you move the database to the staging and production environments.

There are some basic rules that you should follow when scripting the database for promotion to a staging environment for unit testing:

  • Script everything

  • Provide support documentation

  • Protect the scripts

Each of these basic rules is discussed under the following subheadings.

Script Everything

Script everything associated with the database implementation. Use the Generate SQL Script feature of SQL Server Enterprise Manager to script each database object.

After you have created the scripts, conduct a validation process to verify that the number of objects scripted is the same as the number of objects that exist in the development database. This process is very important because it helps to ensure that the database being promoted to the staging environment is identical to the database in the development environment.

Provide Support Documentation

Provide support documentation for each script. Comment the code to describe what is happening in the script and include any observations that you have from when it was last executed, such as how long execution took, how many rows were affected, and so on.

Protect the Scripts

Protecting the scripts is very important for the database implementation. Treat the scripts like production objects at this point. To ensure that the appropriate scripts are used when the database is promoted between development, staging, and production environments, you should keep the scripts in a secured directory or under version control. Only the development and deployment teams should have access to them.

Transferring Data

The fifth task to be performed in the Developing Phase is to transfer the data. After the Transact-SQL transformations are complete and the database and database objects are created, your next task is to extract the data from the Sybase database and use it to populate the SQL Server database.

The initial load process consists of filling the tables in the SQL Server database and verifying that the data is acceptable for use. Several methods of loading data into a database are available; the method that you choose should focus on efficiency and performance to maximize the availability of the database for unit testing.

SQL Server provides the following three tools for loading a database:

  • The bulk copy program (bcp utility). The bcp utility is an external application provided with both Microsoft SQL Server and Sybase that facilitates extracting data from and loading data into a database. You can use the bcp utility to copy the data from a Sybase table into a flat file that can be transferred to the computer hosting SQL Server. You can then use the bcp utility on the SQL Server computer to fill the corresponding table.

    For information about using the bcp utility with Microsoft SQL Server, see "bcp Utility" at http://msdn.microsoft.com/library/default.asp?url=/library/en-us/coprompt/cp_bcp_61et.asp.

    The bcp utility is primarily useful if Sybase and SQL Server use the same collation sequence. If data must be transformed between character sets and sort orders, you will need to specify which character set and sort order is required as part of the load operation. Transformations can slow down the load process.

  • The BULK INSERT statement. The BULK INSERT Transact-SQL statement lets you copy large quantities of data from a data file into a SQL Server table from within SQL Server. You cannot use this command to copy data from SQL Server into a data file.

  • Data Transformation Services (DTS). DTS is a set of tools that is provided with SQL Server that can automate complex import and export operations of data that require additional transformations. DTS uses scripts to perform the operations, and it provides wizards that allow you to define and edit these scripts.

Each of these methods has different capabilities and characteristics; you should evaluate each one for the performance that best suits the data loading process for your database. For more information, see Appendix F, "Data Migration."
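For example, a single table might be moved with a pair of bcp invocations similar to the following. The server names, login, password, and file name are placeholders; the -c switch selects character-mode data files, which both versions of the utility understand:

```
rem Export from Sybase (run with the Sybase bcp utility):
bcp pubs..authors out authors.dat -c -Ssybserver -Usa -Ppassword

rem Import into SQL Server (run with the Microsoft bcp utility):
bcp pubs..authors in authors.dat -c -Ssqlserver -Usa -Ppassword
```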

Load Operation Performance

In environments where free access to databases and servers is not a problem, random testing and data reloading can be accomplished easily. In environments where access to databases and servers is limited, you may need to carefully select the methods used to accomplish loading and transfer within minimized time frames. In either case, it is important to minimize the time that it takes to load data into a SQL Server database.

In addition, tests should be repeatable and will usually require that the tables being scrutinized are reloaded with known data before each run. Several options exist for specifying how data should be bulk copied into SQL Server using the bcp utility or the BULK INSERT statement, including:

  • Using logged and nonlogged bulk copies

  • Loading data in parallel

  • Controlling the locking behavior

  • Using batches

  • Ordering data files

Two factors determine which of these options you should use to increase performance of the load process:

  • The volume of existing data in the table compared to the amount of data to be copied into the table. Ideally, you should transfer data into tables that are initially empty, but this may not always be possible. In situations in which the target table contains data already, the data being imported will have a bearing on the distribution of existing data in the database. For large tables, you should load data in parallel using a nonlogged bulk copy operation.

  • The number and type of indexes on the table. It is usually recommended that you remove all indexes on a table before importing data, and then rebuild the indexes after completing the import. Although this can significantly improve the performance of the load operation, if the table already contains data, this may not be possible. If it is not possible to drop indexes, then it will be beneficial to sort the data based on the primary key of the table before loading it into the database. It is also advisable to ensure that other users are not attempting to access tables being loaded as this can disrupt the process.

Loading Data in Parallel

Parallel loading is useful for tables that contain a large number of rows. If SQL Server is running on a server that has more than one processor, and data that will be bulk copied into the table can be partitioned into separate data files, it is recommended that you load the data into the same table in parallel. This method improves the performance of the bulk-copy operation. When copying data into a table in this manner, it is recommended that you follow the advice in the following list:

  • Drop all indexes on the table before loading data, and then recreate the indexes afterward. Create clustered primary indexes first, and then build any secondary indexes. You should consider recreating the secondary indexes in parallel.

  • If Sybase and SQL Server use different character sets or sort sequences, transform the data after exporting it from Sybase but before importing it into SQL Server. The BULK INSERT and bulk copy utility commands are capable of performing these transformations, but at the considerable cost of the additional processing required. Preprocessing this data will help to ensure maximum performance when loading the data.

  • Use ordered data. Using ordered data can help to minimize the time that is required to build a clustered index on the table. The clustered index can even be created before the table is populated, but you should first drop all secondary indexes. The ORDER hint of the BULK INSERT command will not affect performance if the clustered index is not present during loading.

  • Use the TABLOCK hint of the BULK INSERT command. This hint causes a table-level lock to be taken for the duration of the bulk-copy operation and can reduce the degree of contention for the data in the table that might otherwise occur.

  • Use a large batch size by specifying the ROWS_PER_BATCH hint of the BULK INSERT command. A single batch that spans the entire contents of the file being loaded is recommended.

  • Set the select into/bulkcopy database option to TRUE to enable nonlogged operations. This ensures that the data inserted is not recorded in the transaction log.

For more information about improving performance of the bulk copy operation, see "Bulk Copy Performance Considerations" at http://msdn.microsoft.com/library/en-us/adminsql/ad_impt_bcp_5zqd.asp.
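Putting several of these recommendations together, a nonlogged, single-batch load might be sketched as follows; the database, table, file path, and row count are placeholders:

```sql
-- Enable nonlogged bulk copy operations for the target database.
EXEC sp_dboption 'pubs', 'select into/bulkcopy', 'true'

-- Load with a table-level lock and one batch spanning the file.
BULK INSERT pubs..authors
FROM 'C:\load\authors.dat'
WITH (TABLOCK, ROWS_PER_BATCH = 500000, DATAFILETYPE = 'char')
```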

Additional Considerations for Transferring Data

When loading a large amount of data — for example, 100,000,000 or more rows — the bulk copy utility or the BULK INSERT statement provides an efficient mechanism for loading large data sets. In situations in which small to moderate amounts of data are involved, consider using the DTS Wizard. This wizard is supplied with the SQL Server client tools.

DTS is the best choice if the data must undergo significant transformations. DTS allows you to script how the transformations should occur, which provides a high degree of flexibility. You can also use DTS to "clean" data (that is, to discard rows that do not match specified, often complicated, criteria). One further advantage of DTS is that the wizard allows you to schedule when the import operation should occur — long-running tasks can be scheduled as overnight tasks, initiated by SQL Server Agent, which is another component of SQL Server.

Rebuilding Indexes

When the tables have been filled, you can rebuild the indexes. The Extract Schema Wizard generates the file database_indexes.sql, where database is the name of the migrated database, in the root of the directory tree created by the SMT. You can edit and run this script using Microsoft Query Analyzer.

Testing the Database

The sixth and final task to be performed in the Developing Phase is to test the database. A database migrated from Sybase to SQL Server requires thorough testing before release to the production environment. Problems can arise in a number of places, ranging from misconfigured hardware to errors in migrated database objects.

Objectives of Testing

Generally speaking, the tests designed to ascertain whether the migration has been completed successfully should assess the following items, which were described in detail in Chapter 4, "Planning Phase: Database":

  • Physical architecture of the database. Have the data files, transaction log, and other items that comprise the database been created correctly?

  • Security. Have the logins and users been correctly created? Have the appropriate permissions been assigned? Is the database physically secure?

  • Logical architecture of the database. Have the tables, views, stored procedures, triggers, and other database objects been created successfully?

  • Data. Have the contents of the tables been transferred correctly? Do tables contain the correct numbers of rows? Is the data valid?

  • Functionality. Do the stored procedures, triggers, views, and other items comprising Transact-SQL code operate in the same manner as the original Sybase objects?

  • Performance. Do response time and throughput meet requirements and match user expectations?

The Testing Process

Lab testing can be used to check that the database is functionally sound, but only benchmarking, piloting, and user acceptance testing can establish whether performance is adequate and user expectations have been met. It will often be necessary to restructure a database to improve performance — to add additional disks, use additional networks, create or drop indexes, fine-tune Transact-SQL objects, and so on. All changes made will need to be documented and thoroughly retested before release.

Unit Testing Database Objects

The SMT provides the SQL Comparison Wizard and Object Comparison Wizard to help you determine whether the contents of the original Sybase database have been successfully migrated to SQL Server. You can use these wizards to help unit-test stored procedures, triggers, and views.

Checking Object Creation

You must verify that all objects and data in the original Sybase database have been successfully migrated to SQL Server. The Object Comparison Wizard of the SMT generates a count of the tables, stored procedures, triggers, and views in the specified database, and for each table it also counts the number of rows found. This information is stored in the PERFDB database. You should make sure that the values generated correspond to the number of tables, stored procedures, triggers, views, and rows in the original Sybase database. For more information, see Appendix I, "Sybase Migration Toolkit User Guide."
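You can cross-check the wizard's counts with a simple query against the system catalog in each migrated database; for example:

```sql
-- U = user table, P = stored procedure, TR = trigger, V = view
SELECT type, COUNT(*) AS object_count
FROM   sysobjects
WHERE  type IN ('U', 'P', 'TR', 'V')
GROUP BY type
```

Compare the counts against an equivalent query run against the Sybase sysobjects table in the source database.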

Testing Stored Procedures and Views

You can invoke the SQL Comparison Wizard in the SMT to test stored procedures and views in the migrated database and to compare the results against tests run against the corresponding objects in the original Sybase database. This wizard identifies the behavioral differences between Sybase Transact-SQL and SQL Server Transact-SQL by applying a checksum to the entire result set. In this way, you can easily identify varying result sets caused by inconsistencies in the two SQL dialects.
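On the SQL Server side, the same checksum idea can be applied manually with the Transact-SQL checksum aggregate functions; the view name here is illustrative:

```sql
-- Reduce an entire result set to one comparable value.
SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) AS result_checksum
FROM   vw_customer_orders
```

Differing checksums indicate a result-set mismatch worth investigating, although identical checksums are strong but not absolute evidence of identical data.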

The SQL Comparison Wizard expects you to provide a script that contains Transact-SQL commands that invoke stored procedures and query tables. The format of this file is documented in Appendix I, "Sybase Migration Toolkit User Guide." The output is recorded in the msft_batch table in the PERFDB database, a SQL Server database created with the SMT.

For more information about how to use the SQL Comparison Wizard, see the "SQL Comparison" section of Appendix I, "Sybase Migration Toolkit User Guide."

Testing Triggers

Testing triggers requires that you write Transact-SQL batches that insert, update, and delete data from the relevant tables. You can submit these statements using the SQL Comparison Wizard and include queries that examine the contents of the affected tables. To thoroughly test triggers, you should also create Transact-SQL batches that attempt to violate the integrity of the database and ensure that any trigger operations are correctly rolled back.
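A trigger test batch of the kind described above might look like the following sketch, which deliberately attempts to violate integrity and then verifies that the trigger rolled the change back (the table and column names are illustrative):

```sql
-- Batch 1: attempt an insert that the trigger should reject.
-- (A trigger that rolls back also cancels the rest of its batch.)
INSERT INTO orders (order_id, cust_id, order_total)
VALUES (9999, -1, 100.00)   -- no such customer
GO

-- Batch 2: verify that the offending row is not present.
SELECT COUNT(*) AS should_be_zero
FROM   orders
WHERE  order_id = 9999
GO
```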

Performance Testing

Performance testing of a database is a complex task that should be performed by a specialist team that is familiar with SQL Server and capable of interpreting results and identifying bottlenecks.

Although standard benchmarks are available, it is preferable to do performance testing using a workload that is more representative of the operations typically executed by users of the system. The Planning Phase will have identified a number of test cases and transactions.

You must ensure that adequate time and resources are available; a single performance test run will require that the database is populated with known data if the results are to be meaningful and repeatable.

Security Testing

The security mechanisms available with SQL Server and Windows are significantly different from those used by UNIX and Sybase. You must verify that only authenticated, authorized users have access to the objects in the database. You must also verify that Sybase users and groups have been correctly mapped to SQL Server 2000 users and roles, and that all objects in the database have the appropriate access rights granted to them.

Code Review

All stored procedures, triggers, and other Transact-SQL code should undergo formal code review by a team of experienced developers who are familiar with SQL Server. This code review will ensure that coding standards are adhered to, and it also provides an opportunity to spot weaknesses in the code or opportunities for optimization. All recommendations should be documented and, if implemented, subjected to rigorous testing.

Migration Tasks Summary

The following list can be used as a checklist of the various tasks that are required to migrate a Sybase database:

  • Review the Sybase migration issues discussed throughout this document.

  • Identify objects that require development effort in each targeted Sybase database.

  • Create scripts that can be used to build the corresponding SQL Server objects.

  • Manually migrate the non-Transact-SQL database objects, such as user logins, passwords, user IDs, user-defined data types, rules, and groups (SQL Server uses roles). You can build scripts that perform these tasks so they become a repeatable process.

  • Execute scripts to build the database objects in SQL Server.

  • Transform any Transact-SQL statements from Sybase syntax to Microsoft syntax.

  • Transfer data from the Sybase database to SQL Server.

  • Unit test queries and stored procedures to confirm that results are the same as in the premigrated state.

Additionally, the following SQL Server instance-level tasks should be addressed:

  • An authentication mechanism must be configured that provides the required level of security.

  • Logins that make use of Sybase roles must be mapped to the appropriate SQL Server fixed-server roles.

  • Automated tasks, such as daily maintenance routines scheduled by SQL Server Agent, must be implemented.

  • If appropriate, any database replication schedules and strategies performed by the original Sybase database must be configured with SQL Server.

Closing the Developing Phase

The Developing Phase ends when the solution scope is complete, as specified in the functional specification. At this point, you have migrated the database from Sybase running on UNIX to Microsoft SQL Server running on Windows. If the solution involves migrating client applications, they must also be migrated before the team, stakeholders, and customer can hold the Scope Complete Milestone meeting, evaluate the state of the solution, and agree that the project is ready to move to the Stabilizing Phase, which will focus on meeting the established quality bar for the solution.

Key Milestone: Scope Complete

Meeting this milestone for the database migration aspect of the project means that the database contains the correct objects and data. Unit tests have been performed.

Download

Get the Solution Guide for Sybase/UNIX to SQL Server 2000: Database Engine Migration and Application Interoperation Guide
