
PHP memory usage for fopen in append mode

I have a custom CakePHP shopping cart application where I’m trying to create a CSV file that contains a row of data for each transaction. I’m running into memory problems when PHP builds the CSV file all at once by compiling the relevant data from the MySQL database. Currently the CSV file contains about 200 rows of data.

Alternatively, I’ve considered creating the CSV in a piecemeal process by appending a row of data to the file each time a transaction is made, using: fopen($mFile . '.csv', 'a');
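
Something like this is what I have in mind for the per-transaction append (the filename and row data below are just placeholders for the real transaction fields):

$mFile = 'transactions';                       // hypothetical base filename
$transaction = array(123, 'Jane Doe', 49.99);  // placeholder row data

$fp = fopen($mFile . '.csv', 'a');             // append mode only ever writes to the end of the file
fputcsv($fp, $transaction);
fclose($fp);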

My developers are saying that I will still run into memory issues with this approach when the CSV file gets too large as PHP will read the whole file into memory. Is this the case? When using the append mode will PHP attempt to read the whole file into memory? If so, can you recommend a better approach?

Thanks in advance, Ben


I ran the following script for a few minutes and generated a 1.4 GB file, well over my PHP memory limit. I also read from the file without issue. If you are running into memory issues, something else is probably causing the problem.

$fp = fopen("big_file.csv","a");

for($i = 0; $i < 100000000; $i++)
{
    fputcsv($fp , array("val1","val2","val3","val4","val5","val6","val7","val8","val9"));
}
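
Reading it back is the same story as long as you stream it one row at a time instead of loading the whole file; roughly something like this keeps memory flat regardless of file size:

$fp = fopen("big_file.csv", "r");
$rows = 0;

while (($row = fgetcsv($fp)) !== false)
{
    $rows++;    // process $row here; only one row is ever held in memory
}

fclose($fp);
echo "$rows rows read, peak memory: " . memory_get_peak_usage() . " bytes\n";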


Can't you just export directly from the database, like so:

SELECT list_fields INTO OUTFILE '/tmp/result.text'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table; 
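
If you need to trigger that export from PHP rather than the mysql client, something along these lines should work. This is just a sketch: the connection details are placeholders, INTO OUTFILE writes the file on the MySQL server host, and the account needs the FILE privilege.

$pdo = new PDO('mysql:host=localhost;dbname=shop', 'db_user', 'db_pass');   // placeholder credentials

// Nowdoc keeps the quotes and \n escape literal so MySQL receives them unchanged
$sql = <<<'SQL'
SELECT list_fields INTO OUTFILE '/tmp/result.text'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table
SQL;

$pdo->exec($sql);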