
Tuesday, March 27, 2012

Dynamic queue receive SQL?

Hi There

My activation stored procedure must be able to read from various queues.

I load a variable with the name of the queue that activated the procedure, but I cannot get the syntax working to RECEIVE (or GET CONVERSATION GROUP) from a queue whose name is held in a variable.

I have tried:

WAITFOR
(
    RECEIVE message_body, conversation_handle, message_type_name, message_sequence_number, conversation_group_id
    FROM @callingQueue INTO @msgTable
    WHERE conversation_group_id = @conversationGroup
), TIMEOUT 2000;

But i get this error:

Incorrect syntax near '@callingQueue'.

Looks like you cannot use a variable.

So I tried the following:

SELECT @SQL = N'WAITFOR
(
    RECEIVE message_body, conversation_handle, message_type_name, message_sequence_number, conversation_group_id
    FROM @callingQueue INTO @msgTable
    WHERE conversation_group_id = @conversationGroup
), TIMEOUT 2000'

EXEC sp_executesql @SQL, N'@msgTable table output'

But I get the same error.

How do I receive from a queue using a variable holding the queue name?

Thanx

Hi,

Just have a look at the following code sample. This sample implements an activated stored procedure that gets the queue name from the sys.dm_broker_activated_tasks DMV. You may need to modify the code a little to fit your scenario, but the basic structure should meet your requirements.

HTH

Klaus Aschenbrenner
www.csharp.at
http://www.sqljunkies.com/weblog/klaus.aschenbrenner

CREATE PROCEDURE ProcessRequestMessages
AS
DECLARE @ch UNIQUEIDENTIFIER
DECLARE @messagetypename NVARCHAR(256)
DECLARE @messagebody XML
DECLARE @responsemessage XML
DECLARE @queue_id INT
DECLARE @queue_name NVARCHAR(MAX)
DECLARE @sql NVARCHAR(MAX)
DECLARE @param_def NVARCHAR(MAX);

-- Determine the queue for which the stored procedure was activated
SELECT @queue_id = queue_id FROM sys.dm_broker_activated_tasks
WHERE spid = @@SPID

SELECT @queue_name = [name] FROM sys.service_queues
WHERE object_id = @queue_id

-- Create the parameter substitution
SET @param_def = '
@ch UNIQUEIDENTIFIER OUTPUT,
@messagetypename NVARCHAR(MAX) OUTPUT,
@messagebody XML OUTPUT'

-- Create the dynamic T-SQL statement that queries the actual queue
-- (QUOTENAME added for safety, as Remus suggests below)
SET @sql = '
WAITFOR (
    RECEIVE TOP(1)
        @ch = conversation_handle,
        @messagetypename = message_type_name,
        @messagebody = CAST(message_body AS XML)
    FROM ' + QUOTENAME(@queue_name) + '
), TIMEOUT 1000'

WHILE (1=1)
BEGIN
    BEGIN TRANSACTION

    -- Execute the dynamic T-SQL statement that targets the actual queue
    EXEC sp_executesql
        @sql,
        @param_def,
        @ch = @ch OUTPUT,
        @messagetypename = @messagetypename OUTPUT,
        @messagebody = @messagebody OUTPUT

    IF (@@ROWCOUNT = 0)
    BEGIN
        ROLLBACK TRANSACTION
        BREAK
    END

    -- Process the received message here, then commit so the
    -- message is removed from the queue
    COMMIT TRANSACTION
END
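For reference, an activated procedure like this gets attached to each queue along these lines (a sketch only; the queue name and activation settings here are assumptions, not part of the sample):

ALTER QUEUE dbo.MyRequestQueue
WITH ACTIVATION
(
    STATUS = ON,                                  -- start the proc when messages arrive
    PROCEDURE_NAME = dbo.ProcessRequestMessages,  -- the proc above
    MAX_QUEUE_READERS = 1,
    EXECUTE AS OWNER
);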

|||

Hi Klaus

I use exactly the same technique; I can see straight away where I went wrong. Thank you!

|||

Hi Klaus

OK, next problem.

This does not work:

SELECT @SQL = N'WAITFOR
(RECEIVE message_body, conversation_handle, message_type_name, message_sequence_number, conversation_group_id FROM ' + @callingQueue + ' INTO @msgTable WHERE conversation_group_id = '
+ CAST(@conversationGroup AS char) + '), TIMEOUT 2000'

EXEC sp_executesql @SQL, N'@msgTable TABLE output', @msgTable out

The reason is that a table variable cannot be passed as an output parameter.

Now I have to read into a table variable because I am expecting many messages. Originally I kept looping to get the next message, but I kept getting the first message of the queue. I then saw that it is good practice to read all messages into a table variable and then process them.

So I am a bit stuck: I have to use sp_executesql to have a dynamic queue, and I need a table variable because I want to process multiple messages, but I cannot use a table variable with sp_executesql?

Please help

|||

Hi Klaus

Just so you can get a better idea, this is what I am trying to do. I am not sure how to make it work without a table variable. As mentioned, I used to simply do a RECEIVE TOP (1); I would then process the message, commit, and do another read of the queue, but I would keep getting the first message. This was resolved using a table variable, which is now a problem for sp_executesql.

BEGIN
    SELECT @SQL = N'WAITFOR
    (RECEIVE message_body, conversation_handle, message_type_name, message_sequence_number, conversation_group_id FROM ' + @callingQueue + ' INTO @msgTable WHERE conversation_group_id = '
    + CAST(@conversationGroup AS char) + '), TIMEOUT 2000'

    EXEC sp_executesql @SQL, N'@msgTable TABLE output', @msgTable out

    IF @@ROWCOUNT = 0
    BEGIN
        ROLLBACK TRAN
        BREAK
    END

    DECLARE message_cursor CURSOR FOR
    SELECT message_body, conversation_handle, message_type_name, message_seq_number, conversation_group_id
    FROM @msgTable ORDER BY message_seq_number ASC

    OPEN message_cursor

    FETCH NEXT FROM message_cursor INTO @msgBody, @ConvHandle, @msgType, @message_seq_number, @conversationGroup

    WHILE @@FETCH_STATUS = 0
    BEGIN
        IF @msgType = 'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog'

        FETCH NEXT FROM message_cursor INTO @msgBody, @ConvHandle, @msgType, @message_seq_number, @conversationGroup
    END

    END CONVERSATION @ConvHandle;

    CLOSE message_cursor
    DEALLOCATE message_cursor

    RETURN
END

|||

Hi Klaus, please note I don't actually do an END CONVERSATION in the IF; I have cut that logic out. I just wanted you to get an idea of the RECEIVE and how I use the table variable. Thanx

|||

Hi Dietz!

Yes, that's a restriction of sp_executesql: you can't use table variables.
Another option would be to use a temp table instead of a table variable - would this make sense to you?
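For example, a minimal sketch of the temp table variant, combined with the dynamic RECEIVE from earlier in the thread (it assumes @callingQueue and @conversationGroup are declared as in your procedure; the temp table definition is my assumption):

-- A temp table created here is also visible inside the dynamic batch,
-- unlike a table variable.
CREATE TABLE #msgTable
(
    message_body            VARBINARY(MAX),
    conversation_handle     UNIQUEIDENTIFIER,
    message_type_name       NVARCHAR(256),
    message_sequence_number BIGINT,
    conversation_group_id   UNIQUEIDENTIFIER
);

DECLARE @SQL NVARCHAR(MAX);
SELECT @SQL = N'WAITFOR
(RECEIVE message_body, conversation_handle, message_type_name, message_sequence_number, conversation_group_id
FROM ' + QUOTENAME(@callingQueue) + '
WHERE conversation_group_id = ''' + CAST(@conversationGroup AS NVARCHAR(36)) + '''), TIMEOUT 2000';

-- The RECEIVE (without INTO) returns a result set, which we capture here
INSERT INTO #msgTable
EXEC (@SQL);

SELECT * FROM #msgTable ORDER BY message_sequence_number;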

Thanks

Klaus Aschenbrenner
www.csharp.at
www.sqljunkies.com/weblog/klaus.aschenbrenner

|||

Dietz wrote:

SELECT @SQL = N'WAITFOR
(RECEIVE message_body, conversation_handle, message_type_name, message_sequence_number, conversation_group_id FROM ' + @callingQueue + ' INTO @msgTable WHERE conversation_group_id = '
+ CAST(@conversationGroup AS char) + '), TIMEOUT 2000'

EXEC sp_executesql @SQL, N'@msgTable TABLE output', @msgTable out

Use QUOTENAME around the @callingQueue. Use INSERT ... EXEC instead of INTO @msgTable:

SELECT @SQL = N'WAITFOR (RECEIVE message_body, conversation_handle, message_type_name, message_sequence_number, conversation_group_id FROM ' + QUOTENAME(@callingQueue)
+ ' WHERE conversation_group_id = '
+ CAST(@conversationGroup AS char) + '), TIMEOUT 2000'

INSERT INTO @msgTable EXEC (@SQL);

Dietz wrote:

DECLARE message_cursor CURSOR FOR
SELECT message_body, conversation_handle, message_type_name, message_seq_number, conversation_group_id
FROM @msgTable ORDER BY message_seq_number ASC

The correct ORDER BY is conversation_handle, message_seq_number, to process one conversation at a time:

DECLARE message_cursor CURSOR FOR
SELECT message_body, conversation_handle, message_type_name, message_seq_number, conversation_group_id
FROM @msgTable ORDER BY conversation_handle, message_seq_number ASC

HTH,

~ Remus

|||

Hi Remus

That looks great, thank you very much. I will try it today.

There is only ever one conversation handle; the conversation continues for years, so I don't think it is required in the ORDER BY. Also, is it necessary to order by message_seq_number to ensure messages are processed in the right order? Is this not "built into" Service Broker? I added it just in case.

|||

The RECEIVE returns the messages in order, but a SELECT ... FROM @table does not offer any guarantee unless an explicit ORDER BY is provided.

HTH,
~ Remus

|||Yes, of course, sorry, stupid question. Don't know why I thought you were referring to the RECEIVE :)

Wednesday, March 21, 2012

Dynamic loading of a file

I've been told by Kirk that it is possible to load a file whose metadata is
unknown at design-time by using bulk insert along with a file (not a flat
file) connection.

He didn't elaborate though. Can anyone explain?

-Jamie

|||

Jamie,
Bulk loading to several different tables with known metadata is simple. I don't know how one could load to a table with completely unknown metadata. The file connection works fine with the bulk insert task. If I remember this discussion, you were asking how to put a load into a foreach loop. If you had flat files with several different schemas, you could name them accordingly and load into the table with the appropriate schema for each flat file, but without knowing the schema beforehand, there would be no way of generating the table on the destination.
SMO Transfer does something similar, but it's a SQL-to-SQL solution, i.e. one can move a table without knowing its schema. The reason we can do that is because we can get the schema from the existing table and use it to generate the destination table.
If someone knows how to do this, I'd be interested as well. Sounds like a super task to me.
The question I have is, why would you want to do this? When, I mean, in what circumstance would you do it?
Thanks,
K|||"I don't know how one could load to a table with completely unknown metadata"

Well that's exactly what I'm talking about. I was surprised when we were talking about this before and you suggested it could be done but we obviously got our wires crossed. At least I didn't spend hours trying to find a solution that didn't exist.

Donald actually emailed me about this (i.e. loading a file when the metadata is unknown) a long, long time ago, because customers were disgruntled to find that it couldn't be done when it CAN be done in DTS. [He wanted to know if Informatica can do it. Answer: No!]

The reason to do it is simple. We already have a DTS solution that can point at a collection of files. All but one of those files will be data files; the other one will be a metadata file containing the metadata of the data files. We can read that metadata file, change our DTS pump accordingly, and grab each data file in turn. The only thing that is expected/known beforehand is the structure of the metadata file.
I'm pretty sure a lot of other people do something similar as well.

As far as achieving this in SSIS goes...I think the only solution is a custom component that builds a package on the fly in memory based on the metadata, executes it, and then returns control to the regular package. I remember that Darren was thinking of putting something together around this.
Make sense?

-Jamie

|||

Another situation...

A guy on the beta NG wanted to build a process that could import data from multiple Access databases. The trouble is, the metadata of the MDBs (including the tables) is not known beforehand, and each MDB is (potentially) different. He just wants to point a package at a folder containing a load of MDBs and say "Import all that stuff, I don't care what's in them".

Disappointingly, he is now building a .Net solution to do this.

-Jamie

|||

If you have a metadata file, something that describes the metadata for a set of flat files, this would work. Is this a SQL script file? We don't have a way to discover the schema of a flat file ad hoc. We do have some heuristics in the flat file source that try to guess, but that's just what it is: a guess, and it's wrong sometimes.
The transfer tasks use a script file, the foundation of which is the transfer provider task. But those are SMO solutions, and they use SMO to build the metadata file on the fly (which is a SQL script) that the SQL task uses to build the target table; then the pipeline pulls the data from the source table into the target/destination table.
This is the scenario I thought you were talking about. So long as there is a file somewhere that describes the schema of the flat files, you can build a package that will do this with a file connection, a bulk insert task, and a SQL task in a foreach loop, as sketched below.
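In T-SQL terms, each iteration of that foreach loop effectively runs something like the following (a sketch only; the table definition, file path, and delimiters are hypothetical stand-ins for whatever the metadata script file describes):

-- What the Execute SQL task would run, generated from the metadata script file
CREATE TABLE dbo.ImportTarget
(
    Col1 INT,
    Col2 NVARCHAR(50),
    Col3 DATETIME
);

-- What the Bulk Insert task issues against the current file in the loop
BULK INSERT dbo.ImportTarget
FROM 'C:\data\current_file.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');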
Send me a metadata file. If it's a SQL script file, it's a cinch. If not, what is it?
Thanks,
K

|||

He should write a custom connection/adapter pair and then market it! :)

K

|||

Here is the situation that I am dealing with.

It's a simple problem, but SSIS cannot handle it.

I want to create an SSIS package, that can dump data from my tables into a file.

The data to dump and the tables to dump are unknown until runtime (I want to be able to set which tables and data to dump using other tables as configuration).

I can quickly build a script file - a metadata file that describes the schema of all the tables that could be exported.

How do I get SSIS to retrieve the correct table schema for the table that I want to export into the file?

-rob.cruz

|||

What are you going to do with the files once you've extracted the table data into them?

K

|||Just providing the data to our client who will use the data in their own way. From what I know, they are using it for reports and validation.|||

How many different tables do you have?

Are you doing any transformation on the tables as you export them?

K

|||

No transformations on the tables -

just a straight export.

Number of tables - that's part of the problem I am trying to solve. I want the number of tables we can export to be arbitrary; we have probably around 30+ tables that we would want to export, and that can grow or shrink as requested by the client.

I would rather not create a transformation and destination connection for each table - but it may come down to that.

-rob

|||

I was able to do this via the script task!

Basically, I created an SSIS package; in it is a script task that dynamically builds a new package. The new package dynamically pulls the source and destinations, dynamically creates the columns and the destination connection, and voila, it works!

Took some work, but the script task was able to do it all!

I used the CreatePackage Sample as a guide! If you install the Microsoft SQL Server Samples, it will be found in the following directory after installation:

C:\Program Files\Microsoft SQL Server\90\Samples\Integration Services\Programming Samples\Control Flow\CreatePackage Sample\VB

|||

Very nice. This is always the fallback for everything, but not everyone wants to do it. Now you've gone and done it.

Cool!

|||

I am trying to do the same thing; the difference is that I have a flat file source and an OLE DB destination. I am pretty new to SSIS and have never done any DTS work. Can you please send me the code for how to create a flat file source and insert into an OLE DB destination table?

So far I have been able to create an SSIS package, add a script task, add a data flow task, and add a flat file source; this is where I am stuck.

Any help on this is appreciated. Thanks.

|||

Hi Kirk.

Any guidance, then, on how to use SSIS to export a dynamic collection of tables as flat files? I used a foreach loop and a user variable to set the file name; the first table works fine, but it fails on the second because the column mappings are not being dynamically set in the data flow task.

Alex

Friday, March 9, 2012

Dynamic field based on parameter

Hi,

I have a matrix based on a cube. I would like to load up a field based on the selection of a parameter.

The parameter's values are Actual, Budget, and Target.

Based on the above parameter, the one field should select:

if the value for the parameter is Actual, then =Fields!Actual.Value

if it's Budget, then =Fields!Budget.Value

if it's Target, then =Fields!Target.Value

Is this possible? And If so, how should I do it?

Thanks a lot.

You can try

=IIF(Parameters!<ParamName>.Value="Actual", Fields!Actual.Value, IIF(Parameters!<ParamName>.Value="Budget", Fields!Budget.Value, Fields!Target.Value))

|||

Hi,

Thanks a lot for your reply.

But what if I load the parameter list from the database? That means there may be more options later on - what can I do in that case?

|||

You can try using this syntax: =Fields(Parameters!<paramName>.Value).Value

|||

Thanks a lot for your great help... that works!

Dynamic Destination Flat File Loading

Here's what I want to do -

Dynamically load a flat file from a dynamic source table.

The source table metadata is known via the SYSOBJECTS and SYSCOLUMNS tables - I can pull the column names, types, and lengths from these tables based on the table name, for example with the query below. (My goal is pretty simple: pull data from a table in our database and save it down to a flat file.)
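A sketch of such a metadata query against those legacy catalog tables (@TableName is a hypothetical placeholder for the table being exported):

DECLARE @TableName sysname;
SET @TableName = N'MyTable';  -- hypothetical table to export

-- Pull column names, types, and lengths for the given table
SELECT c.name   AS column_name,
       t.name   AS type_name,
       c.length AS column_length
FROM sysobjects o
JOIN syscolumns c ON c.id = o.id
JOIN systypes   t ON t.xusertype = c.xusertype
WHERE o.name = @TableName
  AND o.xtype = 'U'  -- user tables only
ORDER BY c.colid;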

Would this be enough to dynamically create the destination flat file? If so, how do I do it?

Thanks

-rob

The only way I can imagine is to build a piece of software that dynamically creates an SSIS package... Since the metadata of the pipeline can't be dynamic, this might be the only way to do it...|||

This is actually the answer!

Basically, I created an SSIS package; in it is a script task that dynamically builds a new package. The new package dynamically pulls the source and destinations, dynamically creates the columns and the destination connection, and voila, it works!

Took some work, but the script task was able to do it all!

I used the CreatePackage Sample as a guide! If you install the Microsoft SQL Server Samples, it will be found in the following directory after installation:

C:\Program Files\Microsoft SQL Server\90\Samples\Integration Services\Programming Samples\Control Flow\CreatePackage Sample\VB
