
Which method is better? Hashing each line in a file with PHP

This question was asked on a message board, and I want to get a definitive answer and intelligent debate about which method is more semantically correct and less resource intensive.

Say I have a file with each line in that file containing a string. I want to generate an MD5 hash for each line and write it to the same file, overwriting the previous data. My first thought was to do this:

$file = 'strings.txt';

$lines = file($file);
$handle = fopen($file, 'w+');

foreach ($lines as $line)
{
    fwrite($handle, md5(trim($line))."\n");
}

fclose($handle);

Another user pointed out that file_get_contents() and file_put_contents() were better than using fwrite() in a loop. Their solution:

$thefile = 'strings.txt';
$newfile = 'newstrings.txt';

$current = file_get_contents($thefile);

$explodedcurrent = explode("\n", $current);

$temp = '';
foreach ($explodedcurrent as $string)
      $temp .= md5(trim($string)) . "\n";

file_put_contents($newfile, $temp);

My argument is that since the main goal here is to get the file into an array, and file_get_contents() is the preferred way when you want the contents as a single string, file() is more appropriate here and lets us cut out an unnecessary extra call to explode().

Furthermore, by directly manipulating the file with fopen(), fwrite(), and fclose() (which is effectively what a single call to file_put_contents() does), there is no need for an extra variable to accumulate the converted strings; you write them straight to the file.

My method does the exact same work as the alternative, with the same number of opens and closes on the file, except that mine is shorter and more semantically correct.
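For what it's worth, the two ideas can also be combined into a single read and a single write; this is just a sketch, not part of either original snippet:

$file = 'strings.txt';

// file() already splits the contents into an array of lines,
// so no explode() is needed.
$hashes = array_map(function ($line) {
    return md5(trim($line));
}, file($file));

// One call writes everything back, replacing the original contents.
file_put_contents($file, implode("\n", $hashes) . "\n");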

What do you have to say, and which one would you choose?


This should be more efficient and less resource-intensive than the previous two methods:

$file = 'passwords.txt';

$passwords = file($file);
$converted = fopen($file, 'w+');

$i = 0;
while (count($passwords) > 0)
{
    fwrite($converted, md5(trim($passwords[$i])) . "\n");
    unset($passwords[$i]);
    $i++;
}

fclose($converted);

echo 'Done.';


As one of the comments suggests, do what makes more sense to you, since you might come back to this code in a few months and you want to spend the least amount of time possible trying to understand it.

However, if speed is your concern, I would create two test cases (you pretty much already have them) and time them: store a timestamp in a variable at the beginning of the script, then at the end subtract it from a fresh timestamp to work out how long the script took to run. Prepare a few test files; I would go for about three: two extremes and one normal file. Then see which version runs faster.

http://php.net/manual/en/function.time.php

I would think that differences would be marginal, but it also depends on your file sizes.
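A minimal timing sketch along those lines, using microtime(true) instead of time() so you get sub-second resolution (the placeholder comment stands in for whichever of the two snippets you are measuring):

$start = microtime(true);

// ... run one of the two candidate snippets here ...

$elapsed = microtime(true) - $start;
printf("Elapsed: %.4f seconds\n", $elapsed);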


I'd propose to write a new temporary file, while you process the input one. Once done, overwrite the input file with the temporary one.
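A sketch of that idea, assuming the input file is strings.txt and using tempnam() and rename() (the file name and prefix are just examples):

$file = 'strings.txt';

// Write results to a temporary file in the same directory,
// so the final rename() stays on the same filesystem.
$tmp = tempnam(dirname($file), 'md5');

$in  = fopen($file, 'r');
$out = fopen($tmp, 'w');

while (($line = fgets($in)) !== false)
{
    fwrite($out, md5(trim($line)) . "\n");
}

fclose($in);
fclose($out);

// Replace the original file only after the new one is complete.
rename($tmp, $file);

The advantage is that the original file is never truncated before the replacement is fully written, so an error halfway through the loop doesn't destroy the input data.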
