
Reading from DB and Writing to File

Scenario: there is a huge amount of data in a database. A utility uses a DataReader to read the records one by one and appends them to a txt/xml file. Originally the utility read one record and wrote it to the file immediately. I then changed it so that around 10k records are read into memory (a StringBuilder) and flushed to the file in one go, and so on. The time reduction was superb.

So I guess file I/O was the bottleneck. I want to improve it further. I'm thinking of using some kind of buffer, with one thread reading from the DB into the buffer and another thread fetching from the buffer and writing to the file.

Is this possible? Where do I start? Are there better alternatives?


A starting point would be using two buffers, and asynchronously writing the buffer content to the file. Something like:

bufferA
bufferB
currentBuffer = bufferA

fill currentBuffer with data
wait for the previous write job to finish, if any
kick off an async job to write currentBuffer to the file
currentBuffer = the other buffer

Repeat until complete.
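The loop above is language-agnostic; here is a minimal runnable sketch of the pattern in Python (the question is about .NET, so treat this purely as illustration). `produce_chunks` and `write_chunk` are hypothetical stand-ins for the database reader and the file flush:

```python
import threading

def produce_chunks():
    # Stand-in for the database reader: yields the rows in chunks.
    for c in range(6):
        yield [f"row-{c}-{i}" for i in range(3)]

def write_chunk(buffer, out):
    # Stand-in for flushing a buffer to the file; 'out' simulates the file.
    out.extend(buffer)

out = []
buffers = ([], [])
current = 0
pending = None  # the in-flight write job, if any

for chunk in produce_chunks():
    buffers[current].clear()
    buffers[current].extend(chunk)          # fill the current buffer
    if pending is not None:
        pending.join()                      # wait for the previous write
    pending = threading.Thread(target=write_chunk,
                               args=(buffers[current], out))
    pending.start()                         # write current buffer asynchronously
    current = 1 - current                   # swap buffers

if pending is not None:
    pending.join()                          # flush the last buffer
```

The key point is that a buffer is never refilled until the write job that was using it has been joined, so the reader and the writer never touch the same buffer at the same time.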


If you really need better performance, you could read the data in chunks of 10k as you do today, passing start/end key values to the database, and have multiple threads write the chunks simultaneously to separate files. On completion, you concatenate/merge the files in order. Writing to a single file can only be improved so much, because writes to one file have to be sequential.
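As a sketch of the chunked, multi-file approach (again in Python purely for illustration; `fetch_rows` is a hypothetical stand-in for a ranged database query):

```python
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor

def fetch_rows(start, end):
    # Stand-in for a ranged query, e.g. WHERE id >= start AND id < end.
    return [f"row-{i}\n" for i in range(start, end)]

def dump_chunk(bounds, path):
    start, end = bounds
    with open(path, "w") as f:        # each worker owns its own file
        f.writelines(fetch_rows(start, end))

tmpdir = tempfile.mkdtemp()
ranges = [(0, 10000), (10000, 20000), (20000, 30000)]
paths = [os.path.join(tmpdir, f"chunk{i}.txt") for i in range(len(ranges))]

with ThreadPoolExecutor(max_workers=len(ranges)) as pool:
    futures = [pool.submit(dump_chunk, r, p) for r, p in zip(ranges, paths)]
    for f in futures:
        f.result()                    # propagate any worker exceptions

# Concatenate the chunk files in key order so the output stays sequential.
merged = os.path.join(tmpdir, "output.txt")
with open(merged, "w") as out:
    for p in paths:
        with open(p) as part:
            out.write(part.read())
```

Because each worker writes to its own file, there is no contention on a shared stream; ordering is restored by merging the chunk files in key order at the end.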

The merge step could be something like:

// Append each chunk file to the single output file, in order.
using (var output = File.Create("output"))
{
    foreach (var file in new[] { "file1", "file2" })
    {
        using (var input = File.OpenRead(file))
        {
            input.CopyTo(output);   // streams the whole file; no manual buffer loop
        }
    }
}

Not sure if it really improves performance by a big margin, but worth a shot.
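The producer/consumer idea from the question (one thread reading the DB, one thread writing the file, a bounded buffer in between) can also be sketched directly. In Python that is a `queue.Queue`; in .NET the analogous building block would be a blocking/bounded collection. `reader` and `writer` here are hypothetical stand-ins:

```python
import queue
import threading

ROWS = 1000
SENTINEL = None
buf = queue.Queue(maxsize=100)   # bounded: the reader blocks if the writer lags

def reader():
    # Stand-in for the DataReader loop: one thread reads rows from the DB.
    for i in range(ROWS):
        buf.put(f"row-{i}")
    buf.put(SENTINEL)            # signal that there is nothing more to write

written = []

def writer():
    # Stand-in for the file writer: drains the buffer until the sentinel.
    while True:
        row = buf.get()
        if row is SENTINEL:
            break
        written.append(row)

threads = [threading.Thread(target=reader), threading.Thread(target=writer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The bounded queue gives you back-pressure for free: if file I/O really is the bottleneck, the reader simply blocks when the buffer is full instead of piling up unbounded memory.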
