
Scripting a large table to a .sql file

Don't ask me why, but I have to migrate a database from SQL Server 2008 to 2005. That is not a problem in itself, but I have a very large table.

When I script the table's contents (using Generate Scripts), the resulting .sql file is over 4 GB, which is more than the server has available in RAM.

Is there any way to generate the INSERT commands split across multiple files?

Or is there a way to split one file into multiple files when that file is larger than the available RAM?
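
For the second option, a streaming split sidesteps the RAM limit entirely: read the script line by line and start a new output file once a size threshold is crossed, breaking only at a statement boundary so no INSERT is cut in half. Below is a minimal Python sketch; the file name, chunk size, and boundary heuristic are assumptions, and note that SSMS often writes scripts as UTF-16, so the encoding may need adjusting.

    # Minimal streaming splitter: memory use stays tiny no matter how big
    # the input file is, because we never hold more than one line at once.
    # File name, chunk size, and encoding are assumptions for illustration.
    def split_sql(path="big_table.sql", chunk_bytes=500 * 1024 * 1024):
        part, written = 0, 0
        out = open(f"{path}.part{part:03d}", "w", encoding="utf-8")
        with open(path, "r", encoding="utf-8") as src:
            for line in src:          # iterating streams the file line by line
                out.write(line)
                written += len(line)
                # Only break between statements: after a batch separator
                # ("GO") or a line that ends a complete statement.
                at_boundary = (line.strip().upper() == "GO"
                               or line.rstrip().endswith(";"))
                if written >= chunk_bytes and at_boundary:
                    out.close()
                    part += 1
                    written = 0
                    out = open(f"{path}.part{part:03d}", "w", encoding="utf-8")
        out.close()

    if __name__ == "__main__":
        split_sql()

Each resulting part can then be run against the 2005 server in order.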


Why script the data out?

I'd use SSIS or some other programmatic method after scripting/generating the schema.

Or use something like the Red Gate compare tools.

I've almost never generated DML scripts this way.

However, the SSMS Tools Pack does offer batched INSERT generation, and it's free.
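
If the Tools Pack isn't available, rolling your own batched generation is also an option: stream rows off the source table and start a new script file every N rows, so neither the client nor the generated files ever hold the whole table at once. Here is a hedged Python sketch using pyodbc; the connection string, table, column names, and rows-per-file figure are all illustrative assumptions.

    import pyodbc

    ROWS_PER_FILE = 100_000  # assumption: tune to taste

    # Hypothetical source server and table for illustration.
    conn = pyodbc.connect(
        "DRIVER={SQL Server};SERVER=source2008;DATABASE=MyDb;"
        "Trusted_Connection=yes"
    )
    cur = conn.cursor()
    cur.execute("SELECT Id, Name, CreatedAt FROM dbo.BigTable")

    def sql_literal(v):
        """Render a Python value as a T-SQL literal (simplified)."""
        if v is None:
            return "NULL"
        if isinstance(v, (int, float)):
            return str(v)
        return "'" + str(v).replace("'", "''") + "'"

    part, n, out = 0, 0, None
    for row in cur:  # the cursor streams rows; the table never sits in memory
        if out is None:
            out = open(f"BigTable_{part:03d}.sql", "w", encoding="utf-8")
        values = ", ".join(sql_literal(v) for v in row)
        out.write(
            f"INSERT INTO dbo.BigTable (Id, Name, CreatedAt) "
            f"VALUES ({values});\n"
        )
        n += 1
        if n >= ROWS_PER_FILE:  # roll over to the next file
            out.close()
            out, n, part = None, 0, part + 1
    if out:
        out.close()
    conn.close()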

