
std::istream::get efficiency

C++ question.

for (int i = 0; i < 10000; i++) {
    cout << myfile.get();
}

Will the program make 10000 I/O operations on the file on the HDD? (given that the file is larger) If so, maybe it is better to read, say, 512 bytes into some buffer, take it char by char from there, then copy the next 512 bytes, and so on?


As others have said - try it. Tests I've done show that reading a large block in one go (using streams) can be up to twice as fast as relying solely on the stream's own buffering. However, this depends on things like buffer size and, I would expect, the stream library implementation - I use g++.


Your OS will cache the file, so you shouldn't need to optimize this for common use.


ifstream is buffered, so no.


Try it.

However, in many cases the fastest option is to read the whole file at once and then work on the in-memory data.

But really, try out each strategy, and see what works best.

Keep in mind, though, that regardless of the underlying file-buffering mechanism, reading one byte at a time is slow. If nothing else, it calls into the fairly slow IOStreams library 10000 times, when a couple of calls would have done.

