Does SQL Server Integration Services (SSIS) re-compile C# code every time it's run?

We have a process that is getting data in real time and adding records to a database. We're using SQL Server 2008 Integration Services to run our Extract Transform Load (ETL) process. We download about 50 files from an FTP site, process them and then archive the files.

The problem is that the processing is taking about 17 seconds per file, even though the files are really small (about 10 lines) and the processing code is fairly simple. Looking at the load on the machine, it is CPU bound, and there is not a lot of traffic on the network, disk, or memory.

I suspect that SSIS might be re-compiling the C# code every time it is run. Has anyone run into similar problems? Or have you used a similar process without problems?

Are there any tools that can allow us to profile a dtsx package?


Since you're using SSIS 2008, your Script Tasks are always precompiled.


Are you sure it's the script task in the first place?

I had some extensive script tasks which built many dictionaries, checked whether an incoming value was in various dictionaries according to some crazily complex business logic, and then did a translation or other work. By building the dictionaries once in the task's initialization instead of in the per-row method, the processing improved vastly, as you might expect. But this was a very special case. A sketch of that pattern follows.
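For illustration, here is a minimal sketch of that pattern in an SSIS script component; the input column Code, the output column Description, and the hard-coded reference data are all hypothetical placeholders for whatever your package actually uses:

```csharp
using System.Collections.Generic;

// ScriptMain, UserComponent and Input0Buffer are the classes SSIS generates
// for a script component (transformation); only the bodies below are hand-written.
public class ScriptMain : UserComponent
{
    // Built once per data flow execution, not once per row.
    private Dictionary<string, string> lookup;

    public override void PreExecute()
    {
        base.PreExecute();

        // Load the reference data a single time here (from a table, file, etc.).
        // Hard-coded values stand in for that load in this sketch.
        lookup = new Dictionary<string, string>
        {
            { "A", "Alpha" },
            { "B", "Beta" }
        };
    }

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // Per-row work is reduced to a cheap dictionary lookup.
        string description;
        Row.Description = lookup.TryGetValue(Row.Code, out description)
            ? description
            : "Unknown";
    }
}
```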

The package components will be validated (either at the beginning of execution or right before each control flow component runs); that's some overhead you can't get away from.

Are you processing all the files in a single loop within SSIS? In that case, the data flow validation shouldn't be repeated.
