SSIS: How to loop through multiple flat files as data sources

What?
This article demonstrates a quick step-by-step for getting an SSIS package to loop through a directory/folder of files in order to populate a database table. We could add each file as a separate connection manager, but that is inefficient and not versatile enough to accommodate files that are added later.

Why?
I am creating an SSIS extract package that takes a text file as its source and populates a database table with that data. Note that this approach only works if all the text files to be used as source data have the same number of columns and matching column widths.
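As a quick sanity check on that requirement, here is a minimal Python sketch (entirely outside of SSIS) that confirms every file in the source folder has the same number of columns before the package runs. The folder path, file pattern and delimiter are assumptions based on the example used later in this article.

  import csv
  import glob
  import os

  SOURCE_FOLDER = r"C:\Temp\SourceFiles"   # assumed folder from the example below
  FILE_PATTERN = "File*.txt"               # assumed naming convention from the example below
  DELIMITER = ","                          # assumption: comma-delimited files

  # Count the columns in the first row of each matching file.
  column_counts = {}
  for path in glob.glob(os.path.join(SOURCE_FOLDER, FILE_PATTERN)):
      with open(path, newline="") as handle:
          first_row = next(csv.reader(handle, delimiter=DELIMITER), [])
          column_counts[path] = len(first_row)

  # Flag any file whose column count differs from the rest.
  if len(set(column_counts.values())) > 1:
      print("Column counts differ between files:", column_counts)
  else:
      print("All source files share the same column count.")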

How?
I've adapted my real working solution into an example. As my work was for a Personnel/HR project, and data confidentiality is important, some of the images below are censored or have their data changed.
  1. Add a variable (preferably scoped to the package) called "SourceExtractFile" and give it the data type "String". Set the Value to the full path of the first file, including its extension (e.g. "C:\Temp\SourceFiles\File00001.txt" - although I used a network share without any issues)
  2. Add a Connection Manager
    1. Right-click in "Connection Managers"
    2. Select "New Flat File Connection..."
    3. Browse to the first file in the folder to loop through and select it.
    4. Set the connection manager name and specify the columns as usual.
    5. OK the connection manager and display its "Properties"
    6. Under Expressions, click the ellipsis
      1. Under Property, select "ConnectionString"
      2. In Expression, type @[User::SourceExtractFile]
      3. OK to save it
  3. Add the ForEach Loop Container to your "Control Flow" (NB: This did not exist in my SSIS Toolbox - if you do not see it when you are under the "Control Flow" design tab, then select Tools > Choose Toolbox Items > SSIS Control Flow Items > Tick the "For Each Loop Container")
  4. Add a Data Flow Task to the container (or drag it into the container if it already exists)
  5. Edit the ForEach Loop Container (or double-click on it)
    1. Under General, give it a reasonable name of your choice
    2. Under Collection, select as Enumerator the "Foreach File Enumerator"
    3. Select the source folder which contains all the files to loop through
    4. Specify the file name convention with a wildcard to indicate which files to include (in this example "File*.txt", i.e. all files starting with "File" - e.g. "File0001.txt", "File0002.txt")
    5. For "Retrieve File Name" I put "Fully Qualified" (hoping this means the full path is used as just putting the file name did not work).
    6. Under "Variable Mappings" specify the Variable "User::SourceExtractFile" (based on this example) and keep the Index at 0 (zero).
    7. OK to close the dialog
  6. Edit your Data Flow as per usual, selecting your dynamic connection manager as the Flat File Source.
  7. Done.
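To make the run-time behaviour concrete, here is a minimal conceptual sketch in Python of what the ForEach Loop Container and the expression-based connection string achieve together. It is an analogue rather than SSIS code: load_into_table() is a hypothetical stand-in for the Data Flow, and the folder and wildcard are taken from the example above.

  import glob
  import os

  SOURCE_FOLDER = r"C:\Temp\SourceFiles"   # folder selected in the enumerator (assumed from the example)
  FILE_PATTERN = "File*.txt"               # wildcard specified on the Collection page

  def load_into_table(source_extract_file):
      # Hypothetical stand-in for the Data Flow: read the flat file at
      # source_extract_file and insert its rows into the destination table.
      print("Loading", source_extract_file)

  for path in glob.glob(os.path.join(SOURCE_FOLDER, FILE_PATTERN)):
      # Equivalent to the enumerator writing the fully qualified file name
      # into User::SourceExtractFile (Index 0) before each iteration; the
      # ConnectionString expression then picks up this value for the flat
      # file connection manager.
      source_extract_file = path
      load_into_table(source_extract_file)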

Feel free to copy, redistribute and share this information. All that we ask is that you attribute credit and possibly even a link back to this website as it really helps in our search engine rankings.

Disclaimer: The information on this website is provided without warranty and any content is merely the opinion of the author. Please test in a development environment before adapting these steps to your production environment. The articles are written in good faith and, at the time of writing, are working examples used in a commercial setting.

Thank you for visiting and, as always, we hope this website was of some use to you!

Kind Regards,

Joel Lipman
www.joellipman.com
