
Wednesday, March 21, 2012

Dynamic modification of data flow objects

My idea is to read source and destination values from a table and, using these values within a ForEach loop, dynamically alter the source, destination and mapping of the data flow within the package. My reading on SSIS leads me to believe that these properties are not available for modification at run-time. Does anyone have any ideas on how to accomplish this task? I have data in over 200 tables to import every 4 hours, so I'd rather maintain one package than 200.

You cannot alter the metadata of the data-flow pipeline. In English, that means you cannot change the names and data types of the columns, nor can you add or remove them.

However, you CAN dynamically set the external sources and destinations. Would this be sufficient for you?

-Jamie

|||Hi Jamie - thanks for the quick reply. I don't think this will be sufficient. The 200 tables are all different - we are replicating tables from an Oracle 8i ERP database to SQL Server for reporting and analysis purposes. The metadata on each source-destination combination will be different from the next, so this will be a problem. As I see it, the only way to accomplish this concept is to dynamically create a new package for each table, i.e. each iteration of the ForEach loop. Do you agree?|||

OK, you have to create 200 packages. But you only have to create them once.

You are correct that the only other option is to dynamically build the package. That's not much fun, believe me!

-Jamie

|||

Thanks for that. That is disappointing as I was hoping for a more elegant solution than creating 200 separate packages.

If I were so hardheaded as to try dynamically building the package, any ideas on the system overhead of dynamically building a package 200 times versus running 200 pre-built packages?

Also, could you suggest any on-line examples of dynamically building the data flow package using VB script?

|||

Peter G D wrote:

Thanks for that. That is disappointing as I was hoping for a more elegant solution than creating 200 separate packages.

200 different requirements means 200 things to build. The complexity is in your requirement. I'm slightly confused how it could be made more elegant. I'd welcome your ideas though.

Peter G D wrote:

If I were so hardheaded as to try dynamically building the package, any ideas on the system overhead of dynamically building a package 200 times versus running 200 pre-built packages?

Interesting one. I don't know is the honest answer, but I'd love to know. It depends on a lot of things, mainly on the amount of data you're moving. The larger the dataset, the smaller the proportion of time spent building the package.

Peter G D wrote:

Also, could you suggest any on-line examples of dynamically building the data flow package using VB script?

No way. You won't be able to do this using VBScript. I don't even think you can do it in the Script Task. You are in custom task territory.

-Jamie

|||

I think you need to use ADO.NET to iterate over a lookup table that has the table name, source info, and destination info. For each table, you read the data into a recordset, then insert that data into the destination table. You should also probably use a transaction to roll back everything in the event of an error. All of this can be accomplished in a Script Task.

Hope this helps.
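The same lookup-table-driven idea can also be sketched entirely in T-SQL, provided the source is reachable as a linked server. This is only an illustration: the lookup table `dbo.TableMap`, its columns, and the linked server name `ORA_SRC` are all assumptions, and it relies on source and destination tables having identical column layouts.

```sql
-- Hypothetical lookup table: one row per table to copy.
-- CREATE TABLE dbo.TableMap (SourceName sysname, DestName sysname);

DECLARE @src sysname, @dest sysname, @sql nvarchar(max);

DECLARE map_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT SourceName, DestName FROM dbo.TableMap;

OPEN map_cur;
FETCH NEXT FROM map_cur INTO @src, @dest;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- ORA_SRC is an assumed linked server pointing at the Oracle source.
    SET @sql = N'TRUNCATE TABLE ' + QUOTENAME(@dest) + N'; '
             + N'INSERT INTO ' + QUOTENAME(@dest)
             + N' SELECT * FROM OPENQUERY(ORA_SRC, ''SELECT * FROM ' + @src + N''');';
    EXEC (@sql);

    FETCH NEXT FROM map_cur INTO @src, @dest;
END

CLOSE map_cur;
DEALLOCATE map_cur;
```

This trades the data-flow pipeline (and its buffering and error handling) for plain dynamic SQL, so it sidesteps the metadata problem at the cost of SSIS's transformation features.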

|||

Thanks Duane. I've approached the solution much as you prescribe. I've got a table which has the source info and destination info; I read this into an object variable in the package, then use the object recordset as the basis for the ForEach loop. I thought that I'd be able to dynamically change the source and destination information on the data flow task via a script task, and then rebuild the metadata on the data flow task, also using a script task (the tables contain exactly the same column names, so I naively thought the metadata could be rebuilt using column-name matching). However, I'm now pessimistic that this approach is possible.

I'm a little unclear on your solution. When you say "you read the data into a recordset", do you mean read it into an object variable? (I don't have a development background, so I'm a little slow on these concepts!) Can you point me to any examples using a similar approach?

|||

Peter G D wrote:

Thanks Duane. I've approached the solution much as you prescribe. I've got a table which has the source info and destination info; I read this into an object variable in the package, then use the object recordset as the basis for the ForEach loop. I thought that I'd be able to dynamically change the source and destination information on the data flow task via a script task, and then rebuild the metadata on the data flow task, also using a script task (the tables contain exactly the same column names, so I naively thought the metadata could be rebuilt using column-name matching). However, I'm now pessimistic that this approach is possible.

Correct. You cannot do that.

-Jamie

|||

Peter G D wrote:

I'm a little unclear on your solution. When you say "you read the data into a recordset", do you mean read it into an object variable? (I don't have a development background, so I'm a little slow on these concepts!) Can you point me to any examples using a similar approach?

Actually, I'd rather back away from the recordset solution. A better method would be to use raw files instead (for performance reasons). Perhaps you could stage the data as raw XML when pulling it out of the source - I'm not sure if this is the best way. Then you could load that staged data into the destination.

Unfortunately, I don't know of any examples to point you towards. All I can tell you is that this solution requires knowledge of ADO.NET.

Monday, March 19, 2012

dynamic Instead of update trigger problem

Hello,

I am trying to use an INSTEAD OF UPDATE trigger to alter a lastupdated field in the table when any of the other fields is modified by an update. I want to do this because I can't rely on the applications to update the date when they alter other data. The trigger looks something like this.

CREATE TRIGGER [noidentitytableinsteadupdate] ON [dbo].[noidentitytable]
INSTEAD OF UPDATE
AS
if (not update(updatedat))
begin
    update noidentitytable
    set [text] = inserted.[text],
        state = 'updated',
        updatedat = GetDate()
    from noidentitytable
    INNER JOIN inserted ON (noidentitytable.id = inserted.id)
end
else
begin
    update noidentitytable
    set [text] = inserted.[text],
        state = 'updated',
        updatedat = inserted.updatedat
    from noidentitytable
    INNER JOIN inserted ON (noidentitytable.id = inserted.id)
end
The problem I am facing is when, say, another 10 fields are added to the table, and I want to update ONLY the fields that were updated by the original statement. In effect I want to check each column using update(column) and include it in the new update statement along with the lastupdated field. Dynamic SQL does not have access to the inserted table, so how would you do this?

The reason I want to exclude all the unmodified fields is because of existing AFTER triggers that fire based on changes to individual columns. I tried this in an AFTER trigger originally, but as that is done as an additional update, it causes the other triggers to fire multiple times.
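One possible workaround, sketched below on the table from this thread: a local temp table created inside the trigger is visible to dynamic SQL executed from it, so a snapshot of inserted can be handed to an EXEC. The column list here is illustrative; a real trigger would need one IF per column.

```sql
-- Inside the INSTEAD OF UPDATE trigger body (sketch only):
SELECT * INTO #ins FROM inserted;   -- snapshot visible to child scopes

DECLARE @sql nvarchar(4000);
SET @sql = N'UPDATE t SET updatedat = GETDATE()';

-- Append one assignment per column the original statement actually touched.
IF UPDATE([text]) SET @sql = @sql + N', [text] = i.[text]';
IF UPDATE(state)  SET @sql = @sql + N', state = i.state';
-- ...repeat for the remaining columns...

SET @sql = @sql + N' FROM noidentitytable t INNER JOIN #ins i ON t.id = i.id;';
EXEC (@sql);
```

The maintenance cost is real, though: every schema change to the table means another IF branch in the trigger.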

If the original update was

update noidentitytable set [text] = 'hello', state = 'updated'

but there were actually 10 columns in the table then the actual update I want to do is.

update noidentitytable set [text] = inserted.[text], state = 'updated', updatedat = GetDate() from noidentitytable INNER JOIN inserted ON (noidentitytable.id = inserted.id)

Or perhaps what I really want to do is get hold of the original SQL text and change the date field in it.

Can someone enlighten me please?

It would definitely be better to do this in an AFTER trigger. Can't you just add a condition to all the other AFTER triggers to do nothing if the updatedat column is the one being updated?|||
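That guard condition might look like the following sketch, placed at the top of each existing AFTER UPDATE trigger (the column names are taken from this thread; a real trigger would list all of its data columns):

```sql
-- Bail out when the statement touched nothing but the audit column.
IF UPDATE(updatedat)
   AND NOT (UPDATE([text]) OR UPDATE(state))   -- list every real data column here
    RETURN;
```

Note that UPDATE(column) reports which columns appeared in the SET clause, not whether their values actually changed.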

Why would you want to reissue the UPDATE statement again? It takes a lot of work to perform an update, and you will just make the entire UPDATE statement slower. It seems like you should just create an SP that performs the necessary UPDATE statement, add a default on the updatedat column, and use the DEFAULT keyword in the UPDATE statement. This will be more scalable, run more efficiently and be easier to manage.
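A minimal sketch of that suggestion, using the table from this thread (the procedure name, parameters, and column types are assumptions for illustration):

```sql
-- A default keeps updatedat current on any update routed through the proc.
ALTER TABLE noidentitytable
    ADD CONSTRAINT DF_noidentitytable_updatedat DEFAULT (GETDATE()) FOR updatedat;
GO
CREATE PROCEDURE dbo.UpdateNoIdentityTable   -- hypothetical name
    @id int, @text varchar(100), @state varchar(20)
AS
    UPDATE noidentitytable
    SET [text] = @text,
        state = @state,
        updatedat = DEFAULT   -- reapplies the column default, i.e. GETDATE()
    WHERE id = @id;
GO
```

This only works, of course, if the applications can be made to call the procedure rather than issuing ad-hoc UPDATEs, which is exactly what the original poster says he cannot rely on.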

|||

I understand that I have to issue the update statement myself in an INSTEAD OF trigger - or am I missing a way of committing the original statement first?

I do use the default column value for the date; however, setting the date to the default on the update is not much different from setting it to GetDate().

What I really wanted was all the values that have changed from the inserted table, plus some other default information, i.e. the date and the user who modified the row.

|||

Perhaps I should give a different scenario that would require the same solution.

If I were using an INSTEAD OF UPDATE trigger on a view, and the underlying table structure changed but, to remain backward compatible, I left the view definition the same, how would you go about ensuring that only the fields the client was updating were updated correctly in the underlying tables?

I would assume this would require some dynamic SQL to check the updated columns and then update the underlying tables. Surely you would not update every column even if its value had not changed?

Wednesday, February 15, 2012

Dynamic ALTER TABLE and CURSOR

I'd like to create a dynamic ALTER TABLE and CURSOR based on a variable. Is it possible?

|||If you really want to do that dynamically with a statement, you can go for the following example, which can easily be implemented in a trigger.

CREATE TABLE VarTestTable
(
    Column1 INT
)
GO
DECLARE @TableName varchar(50)
SET @TableName = 'VarTestTable'
EXEC ('ALTER TABLE ' + @TableName + ' ADD Column2 INT')
GO
SELECT * FROM VarTestTable

But beware of the curse of dynamic SQL:
http://www.sommarskog.se/dynamic_sql.html

HTH, Jens Süßmeyer.
http://www.sqlserver2005.de

|||Thank you. I didn't know what I did wrong, because I tried this before asking the question and it didn't work. But your very clear example is working very well, even with cursors.

Thanks again.
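For the CURSOR half of the question, one possible sketch uses a cursor variable populated by dynamic SQL via sp_executesql's cursor OUTPUT parameter; this builds on the VarTestTable example above and is an illustration rather than the only way to do it:

```sql
DECLARE @cur CURSOR;
DECLARE @TableName sysname;
DECLARE @val INT;
DECLARE @sql nvarchar(200);

SET @TableName = 'VarTestTable';
SET @sql = N'SET @c = CURSOR FOR SELECT Column1 FROM '
         + QUOTENAME(@TableName) + N'; OPEN @c;';

-- Declare and open the cursor inside dynamic SQL, hand it back via OUTPUT.
EXEC sp_executesql @sql, N'@c CURSOR OUTPUT', @cur OUTPUT;

FETCH NEXT FROM @cur INTO @val;
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT CONVERT(varchar(10), @val);
    FETCH NEXT FROM @cur INTO @val;
END

CLOSE @cur;
DEALLOCATE @cur;
```

QUOTENAME guards the table name against injection, which matters whenever the name comes from user input rather than a constant.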
"Jens Sü?meyer" a écrit :

> If you really wanna do that dynamically by a Statement you should go for
> that example which can be easily implemented in a trigger.
> Create Table VarTestTable
> (
> COlumn1 INT
> )
> GO
> DECLARE @.TableName varchar(50)
> SET @.TableName = 'VarTestTable'
> EXEC ('ALTER TABLE ' + @.TableName + ' ADD Column2 INT')
> GO
> Select * from VarTestTable
> But beware of the curse of dynamic SQL :
> http://www.sommarskog.se/dynamic_sql.html
> HTH, Jens Suessmeyer.
> --
> http://www.sqlserver2005.de
> --
> "Jean Dupont" <Jean Dupont@.discussions.microsoft.com> schrieb im Newsbeitr
ag
> news:F79E4618-EA42-44AC-9550-383E44EC0E67@.microsoft.com...
>
>