Sunday, March 11, 2012

Dynamic Flat File Destination Name!

Hi,

I am reading data from an OLE DB source, and based on one of the columns I need to write the data to a Flat File Destination.

For Example,

CustID, ProductID, Product Name, Product Description

Say I am going to write to a different Flat File for every product, so if there are 10 products in the data, there should be 10 Flat Files. Also, the file name should include the Product Name and Product ID.

It is being done in a single Data Flow Task.

Right now the Property Expression for the File Name is the following (which is not working):

Code Snippet

@DestFolder + [Data Conversion].ProductID + @TodaysDate + ".txt"

The ProductIDs are in ascending order. Any help or guidance?

Thanks

-Leo

You can't do that in one data flow unless you want to split it out to a separate destination (and connection manager) for each product. That might be okay for three or four products, but not if you have a bunch.

What you should do instead is set up a ForEach loop based on the results of a SELECT DISTINCT ProductId query. Then execute your data flow once for each ProductId, customizing both the Source query and the destination connection manager with your ProductId.
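
For illustration, the loop could be driven by an Execute SQL Task that returns the distinct products into an object variable (the table and variable names here are only placeholders - adjust to your schema). Something like:

Code Snippet

SELECT DISTINCT ProductID, [Product Name]
FROM dbo.Products
ORDER BY ProductID

The Foreach Loop (ADO enumerator) then iterates over that result set, mapping each row into variables such as User::ProductID and User::ProductName, which the data flow uses on each pass.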
|||

Thanks for your thoughts. OK, can we achieve this: write the first 1000 records from the query to file1 and the next 1000 to file2, and so on? Or alternatively, write all the records to a single file and then read 1000 records in the first iteration, the next 1000 in the next iteration, and so on, without multiple reads of the whole destination file?

Thanks

--Leo

|||You might be able to use the Export Column transform. That transform writes out a file per row based on one column containing the filename and another column containing the data.|||

Correct me if I am wrong, Ted, but the Export Column transform is used when you have a binary field in your data flow that you want to persist to a file. That doesn't seem to be the case for Leo's data.

Leo, you can partition the file by rows, but what Jay is suggesting would be simpler and perform better. If you use a For Each Loop, you would get a list of all product IDs, then execute a data flow inside the loop for each product ID. Using expressions, you can alter both the OLE DB Source query and the destination file connection string for each iteration of the loop. You wouldn't ever read the destination files - you would only be writing rows to them. And you would only process each row once.
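
As a rough sketch of those two expressions (assuming string variables named User::ProductID, User::ProductName, User::DestFolder and User::TodaysDate, and a placeholder table name - none of these come from Leo's post), the source query could be built in an expression on a string variable and used via "SQL command from variable":

Code Snippet

"SELECT CustID, ProductID, [Product Name], [Product Description] FROM dbo.Products WHERE ProductID = " + @[User::ProductID]

and the Flat File connection manager's ConnectionString property could use:

Code Snippet

@[User::DestFolder] + @[User::ProductName] + "_" + @[User::ProductID] + "_" + @[User::TodaysDate] + ".txt"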

Here are a few examples of using ForEach loops:

http://sqljunkies.com/WebLog/knight_reign/archive/2005/03/25/9588.aspx

http://agilebi.com/cs/blogs/jwelch/archive/2007/03/21/using-for-each-to-iterate-a-resultset.aspx

http://www.sqlis.com/55.aspx

http://blogs.conchango.com/jamiethomson/archive/2005/05/30/1489.aspx

http://rafael-salas.blogspot.com/2007/02/ssis-loop-through-files-in-date-range.html

http://blogs.conchango.com/jamiethomson/archive/2005/06/15/SSIS_3A00_-Getting-a-value-out-of-a-file-to-use-it-in-our-package.aspx

and a search that might have some more:

http://search.live.com/results.aspx?q=foreach+&form=QBRE&q1=macro%3Ajamiet.ssis

|||The Export Column should work with text and ntext in addition to binary -- the column containing the file data just needs to be either a character or binary LOB column. This is just another option for Leo with a different approach, requiring the data and file columns to be derived, which he might be able to do since he has already derived the filename column in his original post.
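
If Leo takes that route, a Derived Column transform could build both columns; a rough sketch (column and variable names are assumptions, with the data column cast to DT_NTEXT so the Export Column transform will accept it):

Code Snippet

FileName:  @[User::DestFolder] + [Product Name] + "_" + (DT_WSTR, 10)ProductID + ".txt"

FileData:  (DT_NTEXT)((DT_WSTR, 12)CustID + "," + (DT_WSTR, 10)ProductID + "," + [Product Name] + "," + [Product Description])

With append allowed on the Export Column transform, every row that produces the same FileName value gets written to the same file.|||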

Leo wrote:

Thanks for your thoughts. OK, can we achieve this: write the first 1000 records from the query to file1 and the next 1000 to file2, and so on? Or alternatively, write all the records to a single file and then read 1000 records in the first iteration, the next 1000 in the next iteration, and so on, without multiple reads of the whole destination file?

Thanks

--Leo

Leo,

Yes you can do this. I've covered this exact problem here:

Splitting a file into multiple files

(http://blogs.conchango.com/jamiethomson/archive/2005/12/04/SSIS-Nugget_3A00_-Splitting-a-file-into-multiple-files.aspx)

You won't be able to do it in a single data flow, though.
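
One way to get the fixed-size batches without re-reading anything (just a sketch, and not necessarily the approach described in the post above - the table and column names are placeholders) is to have the source query assign each row a file number, then drive a Foreach loop off the distinct file numbers:

Code Snippet

SELECT CustID, ProductID, [Product Name], [Product Description],
       (ROW_NUMBER() OVER (ORDER BY ProductID) - 1) / 1000 AS FileNumber
FROM dbo.Products

Each iteration then selects one FileNumber and writes those rows to its own flat file, so every row is read from the source and written exactly once.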

-Jamie
