Fastest way to read a file line by line with an arbitrary number of characters in each
OK, I'm trying to figure out which way would be faster to read a text file that I'm working with. The contents of the file look like this:
1982 3923 3542 4343 2344 3453 2 334 423423 32432 23423
They're basically just an arbitrary number of ints, and I need to read them line by line. Would it be better to use getline or the extraction (>>) operator? Personally, I think it would be a lot easier to implement with the extraction operator, but I don't know how to make the program read all of the ints on the same line until it reaches the end of that line. I was thinking of setting it up like the following:
ifstream input;
input.open("someFile.txt");
if (input) {
    string buffer;
    // Read space-separated tokens; this by itself does not stop at the end of a line.
    while (getline(input, buffer, ' ')) {
        // convert buffer to an int and store it
    }
}
The only problem is that I have to convert each token to an int and then put each int in an array. My desired end goal is to produce a two-dimensional array where each line of ints becomes its own array of ints. Any suggestions as to the best implementation are appreciated!
I would keep it real simple:
ifstream in(...);
string line;
while (getline(in, line)) {
    istringstream line_in(line);
    int val = 0;
    while (line_in >> val) {
        // Do something with val
    }
    // end of line reached
}
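Since the end goal is a two-dimensional structure, a minimal sketch along the same lines could collect each line's values into a vector<int> and append that to a vector<vector<int>>. The function name read_rows and the choice of std::vector here are illustrative, not something from the question:

#include <fstream>
#include <sstream>
#include <string>
#include <vector>

// Minimal sketch: read every line of the file into its own row of ints.
std::vector<std::vector<int>> read_rows(const std::string& path) {
    std::vector<std::vector<int>> rows;
    std::ifstream in(path);
    std::string line;
    while (std::getline(in, line)) {
        std::istringstream line_in(line);
        std::vector<int> row;
        int val = 0;
        while (line_in >> val) {
            row.push_back(val);
        }
        rows.push_back(row);
    }
    return rows;
}

Called as auto rows = read_rows("someFile.txt");, rows[i][j] would then be the j-th number on line i.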
- You'd have to benchmark to get a correct answer.
- The speed of the two approaches is implementation-defined. You might get different results on different compilers and standard libraries.
- The fastest way would probably be a custom-made parser (essentially a small finite state machine), but those are about as unreadable as code gets; a rough sketch follows after this answer.
Produce correct code first. Then fine tune it if you need to later.
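For what it's worth, a hand-rolled scanner along the lines of that last bullet might look roughly like the sketch below. It assumes the whole file fits in memory and that the values are non-negative decimal integers separated by spaces and newlines, exactly as in the sample input above; treat it as an illustration under those assumptions rather than a drop-in replacement:

#include <cctype>
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

// Sketch of a single-pass, character-level parser: accumulate digits into the
// current value and start a new row whenever a '\n' is seen.
std::vector<std::vector<int>> parse_rows_fast(const std::string& path) {
    std::ifstream in(path, std::ios::binary);
    std::string data((std::istreambuf_iterator<char>(in)),
                     std::istreambuf_iterator<char>());

    std::vector<std::vector<int>> rows(1);
    int val = 0;
    bool in_number = false;
    for (char c : data) {
        if (std::isdigit(static_cast<unsigned char>(c))) {
            val = val * 10 + (c - '0');
            in_number = true;
        } else {
            if (in_number) {                      // a number just ended
                rows.back().push_back(val);
                val = 0;
                in_number = false;
            }
            if (c == '\n') rows.emplace_back();   // start a new row
        }
    }
    if (in_number) rows.back().push_back(val);    // last number, if no trailing newline
    if (rows.back().empty()) rows.pop_back();     // drop a trailing empty row
    return rows;
}

Whether this actually beats the simple istringstream version is exactly the kind of thing you'd have to benchmark on your own data and compiler.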