I am involved in a process that includes nightly batch feeds of 40 flat
files from an AS400 system. I've inherited a rather complex but impressive
set of stored procedures that manage the load process by first performing a
bulk copy of the feed files into corresponding feed tables; subsequent
procedures process the data into production tables. The entire process
includes data validation, error reporting, translations, lookups, and
replication. Natural keys are concatenated in the feed files to establish
"original key" values that are used to populate FK references to identity
fields in parent tables.
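For context, the "original key" resolution works roughly like this. The table and column names below are my own illustration, not the actual schema; it's the set-based pattern I mean, not a literal excerpt:

```sql
-- Illustrative sketch only; names are invented for this example.
-- Resolve a concatenated "original key" in a feed table to the
-- identity-based FK in the parent table, in one set-based statement.
UPDATE f
SET    f.CustomerID = p.CustomerID       -- identity PK in the parent table
FROM   FeedOrders AS f
JOIN   Customers  AS p
  ON   p.OriginalKey = f.OriginalKey     -- concatenated natural key
WHERE  f.CustomerID IS NULL;
```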
I have just started investigating DTS and my source thus far is Wrox's
"Professional SQL Server 2000 DTS". The book leans heavily on the use of
ActiveX scripts and the multiphase data pump. With that approach it appears
the DTS package would insert only one row at a time, which strikes me as
very slow.
I have been giving thought to blending the two approaches: reusing much of
the code in the existing load procedures but rolling it all into a DTS
package. I realize I haven't given you much to go on, making these rather
loaded questions, but...
Is there any real advantage to migrating to DTS? The existing load
procedures perform most of the "transformations" in bulk insert/update
statements.
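To make the comparison concrete, my understanding is that a DTS Bulk Insert or Execute SQL task could drive the same set-based pattern the current procedures use, rather than an ActiveX transform firing per row. A hedged sketch (the file path, delimiters, and table names are hypothetical):

```sql
-- Illustrative only: load the feed file in bulk into a staging table.
BULK INSERT FeedOrders
FROM '\\as400share\nightly\orders.txt'   -- hypothetical UNC path
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');

-- Then transform with one set-based statement instead of per-row script:
INSERT INTO Orders (CustomerID, OrderDate, Amount)
SELECT p.CustomerID, f.OrderDate, f.Amount
FROM   FeedOrders AS f
JOIN   Customers  AS p
  ON   p.OriginalKey = f.OriginalKey;
```

If that's a reasonable way to structure a package, it seems the scripting would be optional rather than central.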
Where might I find some other real world approaches that don't rely so
heavily on ActiveX scripting?
Are my thoughts even correct? Is the row-at-a-time ActiveX approach
generally considered sluggish?
I would appreciate any insight regarding common practices that utilize DTS.
Thank you very much,
Ken