# Integration Services Samples

To help you learn how to work with SQL Server Integration Services, you can download the following sample applications, package samples, and coding samples from CodePlex.

> **Important**
>
> These samples are provided for educational purposes only. They are not intended to be used in a production environment and have not been tested in a production environment. Microsoft does not provide technical support for these samples.

## Package Samples

The following table describes the Integration Services package samples that are available.

| Name | Description |
|------|-------------|
| Calculated Columns Package Sample | Demonstrates how to use the Aggregate and Derived Column transformations to compute values and write the results to a file. |
| Capture Data Lineage Package Sample | Demonstrates how to get lineage information about data, add it to the data flow, and write the source data and the lineage information to a table. |
| Process XML Data Package Sample | Demonstrates how to extract data from an XML data file by using an XPath operation, convert the data to a flat-file format, and insert the data into a text file. |
| Data Cleaning Package Sample | Demonstrates how to clean data by applying exact and fuzzy matching of new customers to existing customers, and how to identify duplicate customers by using fuzzy grouping. |
| Execute SQL Statements in a Loop Package Sample | Demonstrates how to run SQL CREATE TABLE statements in a loop, evaluate and clean data values, and insert data into tables. |
| Execute Process Package Sample | Demonstrates how to use the Execute Process task to run an executable that expands a CAB file, read the expanded data, and insert the data into a table. |
| Lookup Transformation Sample | Demonstrates how to use the Cache connection manager to implement the Lookup transformation in full cache mode. The transformation performs lookups on a reference dataset that is stored in a text file. |
| SMO Tables DBCC | Demonstrates how to enumerate the user tables in the AdventureWorks2008R2 database by using SMO, use a Script task to retrieve the schema and table names, put the names into two variables, and use an Execute SQL task that executes a DBCC CHECKCONSTRAINTS command using the two variables. |
| AWDataWarehouseRefresh Package Sample | Demonstrates how to update the AdventureWorksDW data warehouse from the AdventureWorks2008R2 OLTP database. |
| Synchronizing Adventure Works DW Partitions | Demonstrates how to automate the synchronization of partitions between relational and Analysis Services databases by using an Integration Services package. The sample package uses the Script task and the Analysis Services Execute DDL task, and illustrates the use of the ASSL and AMO technologies. |
| Execute SQL Parameters and Result Sets Sample | Demonstrates how to use a parameterized stored procedure and an SQL statement to query the AdventureWorks2008R2 database. The package stores the query results in package variables by mapping parameters and result sets to the variables. |
| Creating a Basic Package Tutorial: Lesson Packages | Demonstrates how to use the wizard to import data from Excel and load the data into a table. |
| Creating a Simple ETL Package Tutorial: Lesson Packages | Demonstrates how to use the Foreach Loop to read multiple flat files, and then transform the data. Also introduces logging and configurations. |
| Deploying Packages Tutorial: Lesson Packages | Demonstrates how to deploy packages and install them on the destination computer by using a deployment utility you create. |
| Exporting from ODBC to a Flat File Using a Wizard Tutorial: Lesson Packages | Demonstrates how to use the SQL Server Import and Export Wizard to export data from a SQL Server ODBC data source to an existing text file. |
| Readme_Change Data Capture for Specified Interval Package Sample | Demonstrates how to use change data capture functionality in Integration Services packages to perform an incremental load of all changes that occurred during a specified time interval. |
| Readme_Change Data Capture since Last Request Package Sample | Demonstrates how to use change data capture functionality in Integration Services packages to perform an incremental load of all changes that occurred since the previous request. |
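The two change data capture samples build on SQL Server's CDC table-valued functions, which return the changes recorded for a table within a log sequence number (LSN) range. As a minimal T-SQL sketch of an interval-based incremental load — the capture instance name `dbo_Customer` and the interval endpoints are illustrative assumptions, not values from the samples:

```sql
-- Map the interval endpoints to log sequence numbers (LSNs).
DECLARE @from_lsn binary(10), @to_lsn binary(10);
SET @from_lsn = sys.fn_cdc_map_time_to_lsn('smallest greater than or equal', '2010-01-01 00:00');
SET @to_lsn   = sys.fn_cdc_map_time_to_lsn('largest less than or equal', '2010-01-02 00:00');

-- Return all inserts, updates, and deletes recorded for the capture
-- instance in that LSN range; __$operation identifies the change type.
SELECT __$operation, __$start_lsn, *
FROM cdc.fn_cdc_get_all_changes_dbo_Customer(@from_lsn, @to_lsn, N'all');
```

In the sample packages, a query along these lines is issued from an Execute SQL or source component, and the interval endpoints are supplied through package variables.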

## Control Flow Programming Samples

The following table describes the available Integration Services programming samples that demonstrate how to program the control flow.

| Sample | Description |
|--------|-------------|
| Create DataFlow Package Sample | Demonstrates how to programmatically create a package and add a Data Flow. |
| Creating A Custom Increment Task Sample | Demonstrates how to create a custom task and its associated user interface. |
| ForEachDirectory Sample | Demonstrates how to create a custom enumerator. This sample enumerates folders in the file system. |
| Sql Server Custom Connection Manager Sample | Demonstrates how to create a custom connection manager with a custom user interface. This sample connects to an instance of SQL Server. |
| Excel2 Custom Connection Manager Sample | Demonstrates how to create a custom connection manager with a custom user interface. This sample connects to an Excel workbook. |
| HtmlLogProvider Sample | Demonstrates how to create a custom log provider. This sample saves logging output as a table in an HTML document. |
| EmailLogProvider Sample | Demonstrates how to create a custom log provider. This sample sends logging output as an SMTP mail message in either plain text or HTML format. |
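The first sample in the table works against the SSIS runtime object model. A minimal C# sketch of creating a package and adding a Data Flow task programmatically, assuming a project reference to the Microsoft.SqlServer.ManagedDTS assembly (the output path is illustrative):

```csharp
using Microsoft.SqlServer.Dts.Runtime;

class CreatePackage
{
    static void Main()
    {
        Package package = new Package();

        // "STOCK:PipelineTask" is the moniker for the Data Flow task.
        Executable exec = package.Executables.Add("STOCK:PipelineTask");
        TaskHost taskHost = (TaskHost)exec;
        taskHost.Name = "Data Flow Task";

        // Persist the package as a .dtsx file; the path is illustrative.
        new Application().SaveToXml(@"C:\Samples\CreatedPackage.dtsx", package, null);
    }
}
```

The sample itself goes further, configuring the data flow's sources and destinations; this sketch shows only the runtime-API pattern of creating a package, adding an executable by moniker, and saving it.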

## Data Flow Programming Samples

The following table describes the available Integration Services programming samples that demonstrate how to program the data flow.

| Sample | Description |
|--------|-------------|
| Readme_AdoSource | Demonstrates how to implement a source component that can be used as the data source in the Data Flow task. |
| Readme_ChangeCase | Demonstrates how to implement a transformation component with a synchronous output. |
| Readme_RemoveDuplicates | Demonstrates how to implement a data flow transformation component with asynchronous outputs. |
| Readme_Remove Duplicates with UI Component Sample | Demonstrates how to implement a custom user interface for a data flow component. |
| Readme_DatasetDestination | Demonstrates how to implement a custom destination component that is integrated into Business Intelligence Development Studio and can be added to a Data Flow task. |
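Custom data flow components such as the one in Readme_ChangeCase derive from the `PipelineComponent` base class. A skeletal C# sketch of a synchronous transformation, assuming a reference to the SSIS pipeline assemblies and omitting the design-time overrides (such as `ProvideComponentProperties` and metadata validation) that a complete component needs:

```csharp
using Microsoft.SqlServer.Dts.Pipeline;

[DtsPipelineComponent(DisplayName = "ChangeCase",
                      ComponentType = ComponentType.Transform)]
public class ChangeCase : PipelineComponent
{
    // Called once per buffer of rows arriving at the input. Because the
    // output is synchronous, rows are modified in place in the same buffer.
    public override void ProcessInput(int inputID, PipelineBuffer buffer)
    {
        while (buffer.NextRow())
        {
            // Uppercase the string in buffer column 0; a real component
            // resolves column indexes from the input's metadata instead.
            buffer.SetString(0, buffer.GetString(0).ToUpperInvariant());
        }
    }
}
```

The asynchronous-output variant (Readme_RemoveDuplicates) differs in that `ProcessInput` writes rows to a separate output buffer rather than editing the input buffer in place.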