Import & Export Data Part 2

The BULK INSERT statement imports formatted data directly into a table or view of your choosing. The main advantage of this statement is that it is minimally logged if the correct recovery model is chosen. Performing a transaction log backup after each bulk insert reclaims the log space that was used. This statement has many parameters that can alter how it executes. Today, I am going to demonstrate the parameters that I think are most useful.

We will be working again with the Boy Scouts of America (BSA) hypothetical database. The first step when creating a nightly load process is to create a full database backup and change the recovery model to BULK_LOGGED.

The code snippet below performs the backup using a 7 day file name rotation. This is just a preview of things to come; I will fully explore database maintenance and backups in the future.
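A minimal sketch of such a backup, assuming the database is named BSA and a C:\BACKUPS directory (both the path and the backup name are illustrative); the file name rotates on the day of the week:

```sql
-- Build a backup file name that rotates over a 7 day cycle,
-- e.g. BSA-Monday.bak.  Path and names are illustrative.
DECLARE @file NVARCHAR(256) =
    N'C:\BACKUPS\BSA-' + DATENAME(WEEKDAY, GETDATE()) + N'.bak';

-- Full database backup, overwriting last week's file of the same name.
BACKUP DATABASE [BSA]
    TO DISK = @file
    WITH INIT, NAME = N'BSA full backup';
```

Because WITH INIT overwrites the existing file, each weekday name is reused once per week, giving seven rolling backups.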

The code snippet changes the recovery model from FULL to BULK LOGGED by using the ALTER DATABASE statement.
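A one-line sketch of that change, again assuming the database is named BSA:

```sql
-- Switch BSA from the FULL to the BULK_LOGGED recovery model
-- so that BULK INSERT can be minimally logged.
ALTER DATABASE [BSA] SET RECOVERY BULK_LOGGED;
```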

The BSA database has a STAGING schema for loading external data. In its simplest form, the BULK INSERT statement requires the source data file to match the number of columns in the target table. We need to recreate the table to have just two fields.
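A sketch of the recreated staging table; the table and column names below are illustrative guesses, not the article's exact schema:

```sql
-- Recreate the staging table with only the two fields
-- that appear in the source data file.
IF OBJECT_ID('STAGING.SCOUT_RANKS') IS NOT NULL
    DROP TABLE STAGING.SCOUT_RANKS;

CREATE TABLE STAGING.SCOUT_RANKS
(
    RANK_ID   INT IDENTITY(1, 1) NOT NULL,
    RANK_NAME VARCHAR(40)        NOT NULL
);
```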

There are many arguments that can be used to change how the statement executes. The FIRSTROW argument allows us to skip the header row. The FIELDTERMINATOR and ROWTERMINATOR arguments are used to define a Comma Separated Values format. The KEEPIDENTITY argument allows us to use the ID values in the data file instead of the automatic numbers generated by the IDENTITY column. Last but not least, we do not want to skip any errors.
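Putting those arguments together might look like the sketch below; the staging table name and file path are illustrative, and MAXERRORS = 0 is one way to abort the load on the first bad row:

```sql
-- Load the CSV file into the staging table.  FIRSTROW = 2 skips the
-- header row; KEEPIDENTITY preserves the ID values from the file;
-- MAXERRORS = 0 fails the load on the first error.
BULK INSERT STAGING.SCOUT_RANKS
    FROM 'C:\DATA\rank1.csv'
    WITH
    (
        FIRSTROW = 2,
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n',
        KEEPIDENTITY,
        MAXERRORS = 0
    );
```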

I am going to make the problem a little more difficult by adding three fields that are populated by defaults. How do we now import a file that has fewer columns than the table?
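One way to add such fields is sketched below; the column names and default expressions are illustrative guesses at typical audit columns, not the article's exact definitions:

```sql
-- Add three audit columns whose values come from DEFAULT constraints,
-- so the source file does not need to supply them.
ALTER TABLE STAGING.SCOUT_RANKS ADD
    LOAD_DATE DATETIME NOT NULL DEFAULT (GETDATE()),
    LOAD_USER SYSNAME  NOT NULL DEFAULT (SUSER_SNAME()),
    LOAD_FLAG CHAR(1)  NOT NULL DEFAULT ('N');
```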

That is where format files come in handy. We are going to use the BCP utility to create a format file from the table definition. From there, I am going to modify it by removing unwanted source field rows and defining the target field mappings. Run the following from a command prompt. All files used in the examples can be found at the end of the article.
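The command might look like the sketch below; the server name and trusted-connection flag are assumptions, and the file path is illustrative. With the -c (character mode) switch, bcp generates a non-XML format file whose default field terminator is a tab, which matches the rank2.tab file:

```
REM Generate a non-XML format file from the table definition.
REM Server name and paths are illustrative.
bcp BSA.STAGING.SCOUT_RANKS format nul -c -f C:\DATA\rank2-all.fmt -T -S localhost
```

Editing a copy of rank2-all.fmt down to the fields that actually appear in the source file would then produce the rank2.fmt file listed at the end of the article.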

Some arguments of interest are used in this example. FIRSTROW and LASTROW are used to select a subset of the source data file. BATCHSIZE makes the database engine commit each batch as a separate transaction, which is really important for freeing up resources when the number of records increases. Last but not least, FORMATFILE allows our custom file definition to be used.
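A sketch combining those arguments; the row numbers, batch size, and paths are illustrative:

```sql
-- Load a subset of the tab-delimited file in small batches, using the
-- edited format file to map the file's fewer fields onto the wider table.
BULK INSERT STAGING.SCOUT_RANKS
    FROM 'C:\DATA\rank2.tab'
    WITH
    (
        FIRSTROW = 2,
        LASTROW = 11,
        BATCHSIZE = 5,
        FORMATFILE = 'C:\DATA\rank2.fmt',
        KEEPIDENTITY
    );
```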

The last argument that I want to introduce today, FIRE_TRIGGERS, allows triggers to execute when BULK INSERT runs. This is an easy way to move data from one table to another, or from the STAGING to the RECENT schema. The snippet below adds the trigger to the staging table and imports the data.
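A sketch of that trigger and load; the trigger name, the RECENT.SCOUT_RANKS table, and its columns are illustrative assumptions:

```sql
-- AFTER INSERT trigger that copies newly staged rows into the
-- RECENT schema.
CREATE TRIGGER STAGING.TRG_SCOUT_RANKS_MOVE
ON STAGING.SCOUT_RANKS
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO RECENT.SCOUT_RANKS (RANK_ID, RANK_NAME)
    SELECT RANK_ID, RANK_NAME FROM inserted;
END;
GO

-- By default, bulk loads do NOT fire insert triggers;
-- FIRE_TRIGGERS turns that behavior back on.
BULK INSERT STAGING.SCOUT_RANKS
    FROM 'C:\DATA\rank1.csv'
    WITH
    (
        FIRSTROW = 2,
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n',
        FIRE_TRIGGERS
    );
```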

Last but not least, we should back up the transaction log to reclaim the space and change the recovery model back to FULL.

The code snippet changes the recovery model from BULK LOGGED to FULL.
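A sketch of this final step, assuming the same BSA database and an illustrative backup path:

```sql
-- Back up the transaction log to reclaim the space used by the
-- minimally logged bulk operations, then return to FULL recovery.
BACKUP LOG [BSA]
    TO DISK = N'C:\BACKUPS\BSA-LOG.trn'
    WITH INIT, NAME = N'BSA log backup';

ALTER DATABASE [BSA] SET RECOVERY FULL;
```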

Importing data by using the BULK INSERT statement is a better choice for moving large amounts of data. There are many options that can be specified with the statement. A format file can be used to skip columns or define additional mappings. In summary, BULK INSERT is like a one-way ticket to Bangor, ME: you can get to the destination but cannot get back to the original starting point.
The BCP utility fixes this shortcoming by allowing data to be both exported and imported. I will explore that utility next time.

Files Used In Examples

rank1.csv Scout Rank Data (CSV)
rank2.tab Scout Rank Data (TAB)
rank2.fmt Modified format file
rank2-all.fmt BCP format file based on table
