
CakePHP: how to saveAll() 1,200,000 rows from SQL Server to MySQL?

Allowed memory size of 134217728 bytes exhausted (tried to allocate 43 bytes)
(repeated many times)


A CakePHP implementation of josh.trow's solution:

class DetailsController extends AppController
{
    var $uses = array('Detail', 'DataDetail');

    function import()
    {
        $max = $this->Detail->find('count');

        $offset = 0;
        $limit = 10000;

        do {
            $importableData = array();

            // Fetch one batch at a time; 'recursive' => -1 skips any
            // associated models and keeps memory usage down.
            $exportedData = $this->Detail->find('all', array(
                'recursive' => -1,
                'order' => 'Detail.id',
                'limit' => $limit,
                'offset' => $offset
            ));

            foreach ($exportedData as $row) {
                $importableData[] = array('DataDetail' => array(
                    'id' => $row['Detail']['ID'],
                    'name' => $row['Detail']['name']
                ));
            }

            $this->DataDetail->saveAll($importableData, array('validate' => false));

            $offset += $limit;

        } while ($offset < $max); // strictly less than, or the final pass fetches an empty batch

        $this->flash(__('The details have been imported.', true));
    }
}

Reduce the size of $limit as needed to accommodate your memory constraints. You'll probably also need to increase the max_execution_time setting in php.ini, as cycling through 1.2M records will take a while.
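You may also be able to raise the limits at runtime for just this action rather than editing php.ini; whether these calls take effect depends on your host configuration (some shared hosts ignore them), so treat this as a sketch:

```php
<?php
// Raise limits for the current request only. The 128M default is the
// 134217728 bytes from the error message above; pick a value your
// server can actually afford.
ini_set('memory_limit', '256M');

// 0 means no execution time limit for this request.
set_time_limit(0);
```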

NB: I definitely don't recommend this solution unless you need to perform this operation frequently and without human intervention.

NB2: I should note as well that Cake lacks the capability (natively) to aggregate many rows in a single INSERT, so this solution will perform as many queries against your MySQL DB as there are rows in your MSSQL DB. Using Cake for this is entirely unnecessary, and my solution should be considered nothing more than a proof of concept.
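If you did want to batch the rows into a single INSERT per pass, you'd have to build the statement yourself and run it through Model::query(). Here is a minimal sketch; buildMultiInsert() is a hypothetical helper, not part of CakePHP, and it escapes with addslashes() purely for brevity (real code should use the driver's own escaping):

```php
<?php
// Build one multi-row INSERT statement from an array of rows, where
// each row is an associative array of column => value. Hypothetical
// helper for illustration; not a CakePHP API.
function buildMultiInsert($table, array $rows) {
    if (empty($rows)) {
        return '';
    }
    $columns = array_keys($rows[0]);
    $tuples = array();
    foreach ($rows as $row) {
        // addslashes() used for brevity only -- prefer the driver's
        // escaping (e.g. mysqli_real_escape_string) in real code.
        $escaped = array_map(function ($v) {
            return "'" . addslashes($v) . "'";
        }, array_values($row));
        $tuples[] = '(' . implode(', ', $escaped) . ')';
    }
    return sprintf(
        'INSERT INTO %s (%s) VALUES %s',
        $table,
        implode(', ', $columns),
        implode(', ', $tuples)
    );
}
```

Inside the import loop you could then issue one query per batch instead of one per row, e.g. `$this->DataDetail->query(buildMultiInsert('data_details', $batch));` (table name assumed).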


I'll preface this by saying I don't know PHP, so this is DEFINITELY pseudocode:

max = countTotalRows(sourceDB)  // COUNT ONLY, DO NOT GET ACTUAL DATA
base = 0;
step = 10000;
do {
  getRows();                    // now you get data - (step) rows of it per loop
  storeRows();                  // and store it
  base += step;
} while ( base < max )   // strictly less than, or the last pass fetches nothing

EDIT: Shamelessly borrowed from http://www.devdaily.com/php/php-cakephp-database-sql-query-select

$results = $this->Order->query("select * from (your table name) LIMIT (the base), (the step)");

Again, I don't know Cake or PHP so you will have to do the syntax, etc but this may help you start.
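The pseudocode loop above translates to PHP roughly as follows. batchRanges() is a hypothetical helper that just computes the (offset, step) pairs for each pass; the query call commented underneath shows how each pair would map onto a LIMIT clause:

```php
<?php
// Compute the (offset, step) pairs the loop would iterate over.
// Hypothetical helper for illustration only.
function batchRanges($max, $step) {
    $ranges = array();
    for ($base = 0; $base < $max; $base += $step) {
        $ranges[] = array($base, $step);
    }
    return $ranges;
}

// Each pair maps onto one query, e.g. with CakePHP's Model::query()
// (table name assumed):
//
//   foreach (batchRanges($max, 10000) as $r) {
//       $rows = $this->Order->query(
//           "SELECT * FROM details LIMIT {$r[0]}, {$r[1]}"
//       );
//       // ... store $rows in the target database ...
//   }
```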


By the looks of things you are just importing one table into another with no change to the data or structure. Just take an SQL/CSV dump of the Detail table, import it into the other database, and then rename it DataDetail.

If DataDetail already exists, just dump the data with INSERT IGNORE; you can do this in phpMyAdmin and save it as a file.
