
PHP file random access and object-to-file saving

I have a CSV file with records sorted on the first field. I managed to write a function that does a binary search through that file, using fseek for random access.

However, this is still a pretty slow process, since when I seek to some file position, I actually need to scan left, looking for a \n character, so I can make sure I'm reading a whole line (once a whole line is read, I can check its first field's value, mentioned above).

Here is the function that returns the line containing the character at position $x:


function fgetLineContaining( $fh, $x ) {
        // determine the file size once instead of hard-coding the last position
        $stat = fstat( $fh );
        if( $x < 0 || $x >= $stat['size'] )
            return "";
        // now go as far left as possible, until a newline is found
        // or the beginning of the file is reached
        $c = "";
        while( $x > 0 && $c != "\n" ) {
            $x--; // go left in the file
            fseek( $fh, $x );
            $c = fgetc( $fh );
        }
        if( $x > 0 )
            $x++; // skip the newline char we just read
        fseek( $fh, $x );
        return fgets( $fh, 1024 ); // return the line from here until \n
    }

While this is working as expected, I have to say that my CSV file has ~1.5 million lines, and these left-seeks are slowing things down pretty badly.

Is there a better way to seek a line containing position x inside a file?
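For reference, the per-character fseek/fgetc loop in the function above can be replaced with block reads: step back a chunk at a time and scan the chunk in memory for the last newline. This is a minimal sketch, and the function name and the 1 KB chunk size are illustrative assumptions:

```php
<?php
// Sketch: instead of one fseek + fgetc per byte, step back in chunks
// and use strrpos to find the last newline inside each chunk.
function fgetLineContainingFast( $fh, $x, $blockSize = 1024 ) {
    $start = 0;
    $pos = $x;
    while ( $pos > 0 ) {
        $from = max( 0, $pos - $blockSize );
        fseek( $fh, $from );
        $chunk = fread( $fh, $pos - $from );
        $nl = strrpos( $chunk, "\n" );
        if ( $nl !== false ) {
            $start = $from + $nl + 1; // the line begins right after the newline
            break;
        }
        $pos = $from; // no newline in this chunk, step further back
    }
    fseek( $fh, $start );
    return fgets( $fh, 1024 );
}
```

With a typical line length this turns dozens of seek-and-read-one-byte round trips into a single read per step back.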

Also, it would be much better if an object of a class could be saved to a file without serializing it, thus enabling reading of a file object by object. Does PHP support that?

Thanks


I think you really should consider using SQLite or MySQL again (like others have suggested in the comments). Most of the suggestions about pre-calculating indexes are already implemented "properly" in these SQL engines.

You said the speed wasn't good enough in SQL. Did you have the fields indexed properly? How were you querying the data? Were you using bulk queries? Were you using prepared statements? Did the SQL process have enough RAM to keep its indexes in memory?
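As a minimal sketch of what that would look like with SQLite (the table and column names, and the idea of storing the whole line alongside its key, are illustrative assumptions), the one-time import plus an indexed lookup could be:

```php
<?php
// Sketch: import the CSV into SQLite once, then let the engine's
// B-tree index do the binary search for you.
function importCsvToSqlite( $csvPath, $dbPath ) {
    $db = new SQLite3( $dbPath );
    $db->exec( 'CREATE TABLE IF NOT EXISTS records (key TEXT, line TEXT)' );
    $db->exec( 'CREATE INDEX IF NOT EXISTS idx_key ON records (key)' );
    $db->exec( 'BEGIN' ); // one transaction keeps 1.5M inserts fast
    $stmt = $db->prepare( 'INSERT INTO records (key, line) VALUES (:k, :l)' );
    $fh = fopen( $csvPath, 'r' );
    while ( ($line = fgets( $fh )) !== false ) {
        $stmt->bindValue( ':k', strtok( $line, ',' ), SQLITE3_TEXT ); // first field
        $stmt->bindValue( ':l', rtrim( $line, "\r\n" ), SQLITE3_TEXT );
        $stmt->execute();
        $stmt->reset();
    }
    fclose( $fh );
    $db->exec( 'COMMIT' );
    return $db;
}

function lookupByKey( $db, $key ) {
    $q = $db->prepare( 'SELECT line FROM records WHERE key = :k' );
    $q->bindValue( ':k', $key, SQLITE3_TEXT );
    $row = $q->execute()->fetchArray( SQLITE3_ASSOC );
    return $row ? $row['line'] : null;
}
```

The prepared INSERT inside a single transaction is what keeps the import fast; the index on `key` is what makes each lookup an O(log n) B-tree search instead of a file scan.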

One thing you could try to speed things up under the current algorithm is to load the (~100 MB?) file onto a RAM disk. No matter which you choose, CSV or SQLite, this WILL help, especially if hard drive seek time is your bottleneck.

You could possibly even read the whole file into PHP arrays (assuming your computer has enough RAM for that). That would allow you to do your search via index ($big_array[$offset]) lookups.
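A minimal sketch of that idea, keyed on the first CSV field rather than a numeric offset (the function name and comma delimiter are assumptions):

```php
<?php
// Sketch: trade RAM (roughly the file size plus PHP array overhead)
// for O(1) hash lookups on the sort key.
function loadCsvIndex( $csvPath ) {
    $index = array();
    $fh = fopen( $csvPath, 'r' );
    while ( ($line = fgets( $fh )) !== false ) {
        $key = strtok( $line, ',' ); // first field becomes the array key
        $index[$key] = rtrim( $line, "\r\n" );
    }
    fclose( $fh );
    return $index;
}

// Lookup is then a single hash access:
// $record = isset( $index[$key] ) ? $index[$key] : null;
```

For 1.5M short lines the array may take several times the raw file size in memory, so check `memory_get_usage()` against your limits first.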

Also, one thing to keep in mind: PHP isn't exactly fast at low-level work. You might want to consider moving away from PHP in favor of C or C++.

