Export a large number of rows to an Excel document with a small memory footprint
I am using PHPExcel to create an Excel document from data in a MySQL database. My script must run within 512MB of RAM, and I am running into trouble once my export reaches 200k records:
PHP Fatal error: Allowed memory size of...
How can I use PHPExcel to create large documents using as little RAM as possible?
My current code:
// Autoload classes
ProjectConfiguration::registerPHPExcel();
$xls = new PHPExcel();
$xls->setActiveSheetIndex(0);
$i = 0;
$j = 2;
// Write the col names
foreach ($columnas_excel as $columna) {
    $xls->getActiveSheet()->setCellValueByColumnAndRow($i, 1, $columna);
    $xls->getActiveSheet()->getColumnDimensionByColumn($i)->setAutoSize(true);
    $i++;
}
// paginate the results from the database
$pager = new sfPropelPager('Antecedentes',50);
$pager->setCriteria($query_personas);
$pager->init();
$last_page = $pager->getLastPage();
// write the data to the Excel object
for ($pagina = 1; $pagina <= $last_page; $pagina++) {
    $pager->setPage($pagina);
    $pager->init();
    foreach ($pager->getResults() as $persona) {
        $i = 0;
        foreach ($columnas_excel as $key_col => $columnas) {
            $xls->getActiveSheet()->setCellValueByColumnAndRow($i, $j, $persona->getByName($key_col, BasePeer::TYPE_PHPNAME));
            $i++;
        }
        $j++;
    }
}
// write the file to the disk
$writer = new PHPExcel_Writer_Excel2007($xls);
$filename = sfConfig::get('sf_upload_dir') . DIRECTORY_SEPARATOR . "$cache.listado_personas.xlsx";
if (file_exists($filename)) {
    unlink($filename);
}
$writer->save($filename);
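For reference, PHPExcel also supports cell caching, which keeps cell data out of PHP's memory; a minimal sketch, assuming a PHPExcel release that provides PHPExcel_Settings::setCacheStorageMethod(), which must be called before the PHPExcel object is instantiated (the cache directory here is only an example):
// Sketch: store cell data in a disk-backed cache instead of PHP memory
$cacheMethod   = PHPExcel_CachedObjectStorageFactory::cache_to_discISAM;
$cacheSettings = array('dir' => sfConfig::get('sf_cache_dir'));
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);
$xls = new PHPExcel();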
CSV version:
// Write the col names to the file
$columnas_key = array_keys($columnas_excel);
file_put_contents($filename, implode(",", $columnas_excel) . "\n");
// write the data to the file
for ($pagina = 1; $pagina <= $last_page; $pagina++) {
    $pager->setPage($pagina);
    $pager->init();
    foreach ($pager->getResults() as $persona) {
        $persona_arr = array();
        // build an array of column values
        foreach ($columnas_excel as $key_col => $columnas) {
            $persona_arr[] = $persona->getByName($key_col, BasePeer::TYPE_PHPNAME);
        }
        // append the row to the file
        file_put_contents($filename, implode(",", $persona_arr) . "\n", FILE_APPEND | LOCK_EX);
    }
}
I still have the RAM problem when Propel queries the database; it is as if Propel never releases the memory from one request to the next. I even tried creating and deleting the pager object on each iteration, roughly as in the sketch below.
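For illustration, a rough sketch of that attempt combined with disabling Propel's instance pooling (Propel::disableInstancePooling() in Propel 1.x stops hydrated objects from being held in the static instance pool); the combination is only a guess at a workaround, not tested code:
// Sketch: recreate the pager per page and turn off the instance pool,
// which otherwise keeps a reference to every hydrated Antecedentes object
Propel::disableInstancePooling();
for ($pagina = 1; $pagina <= $last_page; $pagina++) {
    $pager = new sfPropelPager('Antecedentes', 50);
    $pager->setCriteria($query_personas);
    $pager->setPage($pagina);
    $pager->init();
    foreach ($pager->getResults() as $persona) {
        // ... write the row as above ...
    }
    unset($pager);
}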
Propel has formatters in the Query API, so you'll be able to write this kind of code:
<?php
$query = AntecedentesQuery::create()
    // Some ->filter()
;
$csv = $query->toCSV();
$csv contains CSV content that you'll be able to render by setting the correct mime-type, for example as sketched below.
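A minimal sketch of that last step with plain PHP headers (the filename is just an example):
// Send the generated CSV as a download with the correct mime-type
header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename="listado_personas.csv"');
echo $csv;
exit;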
Since it appears you can use a CSV, try pulling 1 record at a time and appending it to your CSV. Don't try to get all 200k records at the same time.
$cursor = mysql_query($sqlToFetchData); // get a MySQL result resource for your query
$fileHandle = fopen('data.csv', 'a');   // use 'a' for append mode
while ($row = mysql_fetch_row($cursor)) { // pull your data one record at a time
    fputcsv($fileHandle, $row); // append the record to the CSV file
}
fclose($fileHandle); // clean up the file handle
mysql_free_result($cursor); // free the result set (mysql_close() expects a connection link, not a result)
I'm not sure how to transform the CSV into an XLS file, but hopefully this will get you on your way.
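If an .xlsx really is required, one option is to feed the finished CSV back through PHPExcel's CSV reader; a sketch only, and note that this reloads every row into a worksheet, so for 200k records it will hit the same memory ceiling unless cell caching is enabled:
// Convert the CSV produced above into an .xlsx (assumes PHPExcel is available)
$reader = new PHPExcel_Reader_CSV();
$reader->setDelimiter(',');
$xls = $reader->load('data.csv');
$writer = new PHPExcel_Writer_Excel2007($xls);
$writer->save('data.xlsx');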