
Updating a MySQL table with a large CSV file in PHP

I have a CSV file with approximately 13,000 lines that need to be updated in a MySQL database. My PHP function stops after about 5,000 lines with a timeout or a memory overflow.

It loops over every line with fgetcsv and checks whether the line already exists, which I guess is what causes the timeout (SELECT/INSERT/UPDATE queries).

I can't drop the table and re-insert everything, because the table still has relations to other tables. Otherwise I could write a single INSERT statement with multiple value groups in blocks of e.g. 5000 lines.
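Something like this is what I mean by that; just a rough sketch, where $pdo, the table name my_table and its columns are placeholders rather than my actual schema:

// Sketch only: flush a buffer of CSV rows as one multi-row INSERT.
// $pdo, my_table and the column names are illustrative placeholders.
function flushBatch(PDO $pdo, array $rows)
{
    if (empty($rows)) {
        return;
    }
    // one "(?, ?, ?)" group per buffered row
    $groups = implode(', ', array_fill(0, count($rows), '(?, ?, ?)'));
    $sql    = "INSERT INTO my_table (col_a, col_b, col_c) VALUES $groups";

    // flatten the buffered rows into a single flat parameter list
    $params = array();
    foreach ($rows as $row) {
        foreach ($row as $value) {
            $params[] = $value;
        }
    }
    $pdo->prepare($sql)->execute($params);
}

A few hundred rows per statement is usually enough; very large batches can run into MySQL's max_allowed_packet limit.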

I need to find a way to read the file in chunks to prevent a timeout.
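What I have in mind is roughly this (only a sketch; the chunk size, the $offset parameter and where the offset gets stored between requests are assumptions; the uploaded temp file would also first have to be moved somewhere persistent with move_uploaded_file(), since PHP deletes it after the request):

// Rough sketch: process at most $chunkSize CSV lines starting at byte offset $offset,
// then hand back the new offset so a follow-up request can resume there.
function processChunk($filename, $offset, $chunkSize = 1000)
{
    $fp = fopen($filename, 'r');
    fseek($fp, $offset);

    $processed = 0;
    while ($processed < $chunkSize && ($line = fgetcsv($fp, 1000, ';')) !== false) {
        // ... existing SELECT/INSERT/UPDATE logic for $line goes here ...
        $processed++;
    }

    $result = array('offset' => ftell($fp), 'done' => feof($fp));
    fclose($fp);
    return $result;
}

The caller (e.g. a small AJAX poller) would store the returned offset and keep calling until done is true.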

Thanks in advance!

Some code:

private function readFile()
    {
        $this->profiler = NULL;
        $this->auto_render = FALSE;
        $this->request->headers['Content-Type'] = 'application/json';

        if (array_key_exists('uploadedfile', $_FILES))
        {
            $filename = $_FILES['uploadedfile']['tmp_name'];
            if ($_FILES['uploadedfile']['type'] == 'application/vnd.ms-excel') // csv file
            {
                if (is_uploaded_file($filename)) // check the file was uploaded via HTTP POST
                {
                    if ($_FILES['uploadedfile']['error'] == UPLOAD_ERR_OK) //check no errors
                    {                   
                        // time limit : unlimited
                        set_time_limit(0);

                        // file handler
                        $filepointer = fopen($filename,'r');
                        return $filepointer;
                    }
                }
            }
        }   
    }

Within another function I call readFile() and loop over the lines like this:

while (($line = fgetcsv($filepointer, 1000, ";")) !== false)


Run the PHP file from the command line; the CLI has no execution time limit by default.
Alternatively, increase the timeout in php.ini.
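The runtime equivalents look roughly like this (the values are only examples; the memory limit is worth raising too, given the reported memory overflow):

// Illustrative values only; adjust to the size of the import.
ini_set('max_execution_time', '300');  // seconds; 0 means no limit
ini_set('memory_limit', '256M');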
