
Import Data from Excel to an Azure SQL Database using SSIS (Part 2)

Welcome to my post!

In the previous post I showed how to import a CSV data file into Azure Storage as a blob.

In this second part I will show how to define a package to transfer the Azure Blob Storage container file into an Azure SQL database.

Let's start by defining that package.

Select the second package, which will open in the designer.

Copy the SSIS Connection Manager from the first package into this package.

Drag a Data Flow Task into the designer.

Click into the Data Flow designer, then drag an Azure Blob Source onto it as shown.

Define the Azure Blob Source as shown:

Ensure the column delimiter matches the delimiter used in the source file.
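For example, a comma-delimited source file might look like the following (a hypothetical books file used purely for illustration, not the file from the original post), in which case the column delimiter in the Azure Blob Source should be set to a comma:

    Id,Title,Author,PublishedDate
    1,The Pragmatic Programmer,Andrew Hunt,1999-10-20
    2,Clean Code,Robert C. Martin,2008-08-01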

Next, drag an OLE DB Destination task onto the Data Flow designer as shown:

Before you edit the OLE DB Destination task, ensure you drag a Data Flow Path line to connect the Azure Blob Source and OLE DB Destination as shown:

This will make the data fields from the blob storage file available as mapping input fields to the OLE DB Destination task.

Right-click and Edit to define the OLE DB Destination as shown:

On the next dialog, specify the OLE DB connection manager and the destination table to import into.
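Note that the destination table must already exist in the Azure SQL database with columns that line up with the source file. As a rough sketch only (the table name and column types here are assumptions based on the hypothetical books file above, not taken from the original post), it could be created with something like:

    CREATE TABLE dbo.Books
    (
        Id            INT NOT NULL PRIMARY KEY,  -- matches the Id column in the source file
        Title         NVARCHAR(200) NOT NULL,
        Author        NVARCHAR(100) NOT NULL,
        PublishedDate DATE NULL                   -- date values must parse in the expected format
    );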

On the mapping dialog, check that the input mappings are available from the Azure Blob Source:

Your server name will be of the form:

<your server name>.database.windows.net

Click OK.

Save the package.

Before running the package, here is an additional tip: to view the data before it is processed by the destination task, you can enable a data viewer to display selected columns during the import. This is configured within the Data Flow Path as shown:

The magnifying glass icon will show on the Data Flow Path line.

Now run the package.

After a few seconds you will see the data viewer appear on the right of the display:

Now select the Progress tab and the following will display during processing:

If you notice errors (marked with a red circle), expand the window to view the full message.

When you see errors like the ones below:

[Books Table [47]] Error: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.An OLE DB record is available.  Source: “Microsoft SQL Server Native Client 11.0”  Hresult: 0x80004005  Description: “Invalid character value for cast specification”.

[Books Table [47]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The “Books Table.Inputs[OLE DB Destination Input]” failed because error code 0xC0209077 occurred, and the error row disposition on “Books Table.Inputs[OLE DB Destination Input]”

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component “Books Table” (47) failed with error code 0xC0209029 while processing input “OLE DB Destination Input” (60). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.

Open the source data file and check the following:

  1. Dates are in the correct format. The most common conversion errors are DD-MM and MM-DD values in the wrong order.
  2. The column delimiter is not used within a string value (see the example rows below).
  3. Each row of data has the correct number of delimiters.
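As a hypothetical illustration based on the comma-delimited books file above, either of the following rows would cause the import to fail:

    3,C# in Depth, Fourth Edition,Jon Skeet,2019-03-23
    4,Refactoring,1999-07-08

The first row has an unquoted comma inside the title, so it parses as five columns instead of four; the second row is missing the Author value, so it parses as only three. Quoting the title ("C# in Depth, Fourth Edition") and restoring the missing field fixes both rows.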

If there are input issues, correct them in the source file, save it and re-upload. The existing blob file will be replaced in Azure.

Once the input issues have been rectified, the import should complete with no errors, as shown:

When the run succeeds, the execution progress will show that all data has been written and committed:

Check that the data has been imported by opening the destination database in Azure using SSMS:
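A quick query in SSMS (again assuming the hypothetical dbo.Books table sketched above) is enough to confirm the rows arrived:

    SELECT COUNT(*) AS ImportedRows FROM dbo.Books;   -- total number of rows loaded by the package
    SELECT TOP (10) * FROM dbo.Books ORDER BY Id;     -- spot-check the first few imported rows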

That’s all!

I hope you found this post useful in your attempts to start using SSIS and Azure data.

Stay tuned for the next post.
