Probably a stupid and regularly asked question, but I can't seem to find an answer, so here goes:
We have 16 .txt files, some with over 350 columns.
The info from each individual file needs importing into multiple SQL tables.
I need to look at SQL table1 and check whether the record exists. If not, create a new one and add in the data once it's been transformed, e.g. dates from yyyymmdd into datetime values [managed to get this using a derived column] for the first 20 columns; otherwise do an update for those 20 columns.
Then look at SQL table2 and repeat for the next n columns.
So I was wondering: is it going to be better to write this as a dtsx package? If so, can you point me to an example?
Or should I just write the code as part of a code-behind page that scrapes the info and does a standard update/insert procedure?
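For the code-behind route, something along these lines is roughly what I had in mind, called once per row (the table, column and parameter names below are just placeholders, not our real schema):

    -- Hypothetical upsert procedure the page could call for each parsed row
    CREATE PROCEDURE dbo.UpsertTable1
        @BusinessKey   int,
        @Col1          varchar(50),
        @Col2          varchar(50),
        @StartDateRaw  char(8)        -- raw yyyymmdd text from the file
    AS
    BEGIN
        -- style 112 converts a yyyymmdd string into a datetime
        DECLARE @StartDate datetime;
        SET @StartDate = CONVERT(datetime, @StartDateRaw, 112);

        -- standard insert-or-update against the first target table
        IF EXISTS (SELECT 1 FROM dbo.Table1 WHERE BusinessKey = @BusinessKey)
            UPDATE dbo.Table1
            SET    Col1 = @Col1,
                   Col2 = @Col2,
                   StartDate = @StartDate
            WHERE  BusinessKey = @BusinessKey;
        ELSE
            INSERT INTO dbo.Table1 (BusinessKey, Col1, Col2, StartDate)
            VALUES (@BusinessKey, @Col1, @Col2, @StartDate);
    END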
Any help would be welcome.
thanks
Sounds easy with a Flat File source, a Lookup transformation (to see if you need to update or insert), an OLE DB Destination and an OLE DB Command transformation...|||Cheers Phil,
Do you know of any page that has an example to follow?
|||Jamie has an example of how to use the Lookup transform to check if the row exists:
http://blogs.conchango.com/jamiethomson/archive/2006/09/12/SSIS_3A00_-Checking-if-a-row-exists-and-if-it-does_2C00_-has-it-changed.aspx
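Roughly speaking (the table and column names here are only placeholders), the Lookup transformation gets a query against the destination table, rows that miss the lookup go to the OLE DB Destination for insert, and rows that match go to the OLE DB Command with a parameterised UPDATE:

    -- Query behind the Lookup transformation: pull back the key (and any
    -- columns you want to compare) from the destination table
    SELECT BusinessKey, Col1, Col2, StartDate
    FROM   dbo.Table1;

    -- Statement behind the OLE DB Command for rows that matched the lookup;
    -- the ? markers are mapped to input columns in the transformation editor
    UPDATE dbo.Table1
    SET    Col1 = ?,
           Col2 = ?,
           StartDate = ?
    WHERE  BusinessKey = ?;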