Basic Guidelines for Bulk Importing Data

You can use the bcp command, the BULK INSERT statement, or the OPENROWSET(BULK) function to bulk import data, but only from a data file.
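
A minimal sketch of the three methods follows. The database, table, file path, and format-file names (AdventureWorks, dbo.SalesStage, C:\data\sales.dat, C:\data\sales.fmt) are hypothetical placeholders, and the options shown are only one possible configuration.

```sql
-- bcp is a command-prompt utility, not Transact-SQL; a typical import looks like:
--   bcp AdventureWorks.dbo.SalesStage in "C:\data\sales.dat" -T -c

-- BULK INSERT runs on the server, so the file path is resolved on the server:
BULK INSERT dbo.SalesStage
FROM 'C:\data\sales.dat'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- OPENROWSET(BULK) is used as the source of an INSERT ... SELECT and requires a format file:
INSERT INTO dbo.SalesStage
SELECT *
FROM OPENROWSET(
    BULK 'C:\data\sales.dat',
    FORMATFILE = 'C:\data\sales.fmt') AS src;
```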

Note

It is possible to write a custom application that bulk imports data from sources other than a data file. To bulk import data from memory buffers, use either the bcp extensions to the SQL Server Native Client (ODBC) application programming interface (API) or the OLE DB IRowsetFastLoad interface. To bulk import data from a .NET DataTable object, use the ADO.NET bulk-copy API, SqlBulkCopy.

Note

Bulk importing data into a remote table is not supported.

Use the following guidelines when you bulk import data from a data file to an instance of Microsoft SQL Server:

  • Obtain required permissions for your user account.

    The user account that you use to run the bcp utility, the BULK INSERT statement, or the INSERT ... SELECT * FROM OPENROWSET(BULK...) statement must have the required permissions on the target table, which are granted by the table owner. For more information about the permissions that each method requires, see bcp Utility, OPENROWSET (Transact-SQL), and BULK INSERT (Transact-SQL). For a sketch of granting these permissions, see the permissions example after this list.

  • Use the bulk-logged recovery model.

    This guideline applies to a database that uses the full recovery model. The bulk-logged recovery model is useful when you perform bulk operations on an unindexed table (a heap). Using bulk-logged recovery helps prevent the transaction log from running out of space because bulk-logged recovery does not fully log row inserts. For more information about the bulk-logged recovery model, see Backup Under the Bulk-Logged Recovery Model.

    We recommend that you switch the database to the bulk-logged recovery model immediately before the bulk-import operation and reset it to the full recovery model immediately afterward. For more information, see Considerations for Switching from the Full or Bulk-Logged Recovery Model. For a sketch of this switch, see the recovery-model example after this list.

    Note

    For more information about how to minimize logging during bulk import operations, see Prerequisites for Minimal Logging in Bulk Import.

  • Back up after bulk importing data.

    For a database that uses the simple recovery model, we recommend that you take a full or differential backup after the bulk-import operation finishes. For more information, see Creating Full and Differential Backups of a SQL Server Database.

    For the bulk-logged recovery model or the full recovery model, a log backup is sufficient. For more information, see Creating Transaction Log Backups. For a sketch of both backup options, see the backup example after this list.

  • Drop table indexes to improve performance for large bulk imports.

    This guideline applies when you are importing a large amount of data compared to the amount of data that is already in the table. In this case, dropping the indexes from the table before you perform the bulk-import operation can significantly improve performance. For a sketch of dropping and re-creating an index around a bulk import, see the index example after this list.

    Note

    If you are loading a small amount of data compared to the amount of data already in the table, dropping the indexes is counterproductive. The time required to rebuild the indexes might be longer than the time saved during the bulk-import operation.

    For more information, see Optimizing Bulk Import Performance.

  • Find and remove hidden characters in the data file.

    Many utilities and text editors display hidden characters, which are usually located at the end of the data file. During a bulk-import operation, hidden characters in an ASCII data file can cause problems, such as an "unexpected null found" error. Finding and removing all of the hidden characters should prevent this problem.
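
The following is a hedged sketch of granting the permissions mentioned in the first guideline. The table (dbo.SalesStage), database user (ETLUser), and login (ETLLogin) are hypothetical; the exact permissions you need depend on the method and options you use.

```sql
-- Table-level permission, granted by the table owner (or another authorized principal).
-- ALTER permission on the table may also be needed in some cases, for example
-- when constraints or triggers are not checked during the load.
GRANT INSERT ON dbo.SalesStage TO ETLUser;

-- BULK INSERT and OPENROWSET(BULK) additionally require the server-level
-- ADMINISTER BULK OPERATIONS permission, granted to a login in master.
USE master;
GRANT ADMINISTER BULK OPERATIONS TO ETLLogin;
```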
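
The following is a minimal sketch of the recovery-model switch described in the second guideline, assuming a hypothetical database named AdventureWorks.

```sql
-- Switch to bulk-logged recovery immediately before the bulk import.
ALTER DATABASE AdventureWorks SET RECOVERY BULK_LOGGED;

-- ... perform the bulk-import operation here ...

-- Reset to full recovery immediately afterward.
ALTER DATABASE AdventureWorks SET RECOVERY FULL;
```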
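
The following is a hedged sketch of the backups recommended in the third guideline. The database name and backup paths are placeholders; choose full, differential, or log backups according to the recovery model in use.

```sql
-- Simple recovery model: take a full or differential database backup after the import
-- (omit WITH DIFFERENTIAL to take a full backup).
BACKUP DATABASE AdventureWorks
    TO DISK = 'D:\Backups\AdventureWorks_diff.bak'
    WITH DIFFERENTIAL;

-- Full or bulk-logged recovery model: a log backup is sufficient.
BACKUP LOG AdventureWorks
    TO DISK = 'D:\Backups\AdventureWorks_log.trn';
```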
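
The following is a hedged sketch of the index handling described in the fourth guideline. The table, index, and column names are hypothetical, and re-creating the index assumes you have its definition scripted.

```sql
-- Drop the index before a large bulk import ...
DROP INDEX IX_SalesStage_SaleDate ON dbo.SalesStage;

-- ... perform the bulk-import operation here ...

-- ... and re-create it afterward.
CREATE NONCLUSTERED INDEX IX_SalesStage_SaleDate
    ON dbo.SalesStage (SaleDate);

-- Alternative: disable and rebuild instead of drop and re-create, which avoids
-- re-scripting the index definition (do not disable a clustered index this way,
-- because that makes the table inaccessible):
--   ALTER INDEX IX_SalesStage_SaleDate ON dbo.SalesStage DISABLE;
--   ALTER INDEX IX_SalesStage_SaleDate ON dbo.SalesStage REBUILD;
```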