
Query within a loop getting slower and slower

I have a process that reads a data feed line by line, parses each line, and inserts the data into a MyISAM table. When it first starts, it goes really quickly, probably around 1000 records a second. As time progresses it gets slower and slower; right now we're down to about 1 row every 180 seconds.

The general syntax of the function is:

function parse($file) {
  $handle = fopen($file, 'r');
  while (!feof($handle)) {
    $line = fgets($handle, 1000); // read from the handle opened above
    switch (substr($line, 0, 2)) { // gets record type
      case '01':
        //parse the record
        //escapes some strings with mysql_real_escape_string()
        mysql_query("INSERT INTO table VALUES ($a, $b, $c...");
        break;
      case '02':
        ...
    }
  }
  fclose($handle);
}

The file currently being parsed has a few million records, and the server doesn't appear to be running out of memory. Does anybody know what could be causing the process to slow down?


It probably has to do with having to write so frequently to your indexes. You are already using MyISAM, which would have been my first suggestion.

Some suggestions:

  1. Batch the inserts, e.g. one multi-row INSERT every 100 rows (see the sketch after this list).
  2. Use INSERT DELAYED so each insert returns quickly after queuing, and MySQL writes the record when resources permit.
  3. Have your script create an SQL dump or delimited file that you can import with LOAD DATA INFILE (see http://dev.mysql.com/doc/refman/5.1/en/load-data.html), which will be MUCH FASTER than batched inserts.
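A minimal sketch of point 1, using the same legacy mysql_* API as the question. The table name, column names, and the batch size of 100 are placeholders, not from the original post:

function flushBatch(&$batch) {
    if (empty($batch)) {
        return;
    }
    // One multi-row INSERT instead of up to 100 single-row ones.
    mysql_query('INSERT INTO table (a, b, c) VALUES ' . implode(',', $batch));
    $batch = array();
}

function parseBatched($file) {
    $handle = fopen($file, 'r');
    $batch = array();
    while (!feof($handle)) {
        $line = fgets($handle, 1000);
        // ... parse $line into $a, $b, $c (escaped with
        // mysql_real_escape_string) exactly as in the question ...
        $batch[] = "('$a', '$b', '$c')";
        if (count($batch) >= 100) {
            flushBatch($batch);
        }
    }
    flushBatch($batch); // write any leftover rows
    fclose($handle);
}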


You probably have at least one index on this table, so each row is getting inserted into an increasingly large index. Before you start your load, do:

ALTER TABLE table DISABLE KEYS

After you're done, do:

ALTER TABLE table ENABLE KEYS

Re-enabling the keys may take a while.

Don't try to use the table while you're loading it. Also, INSERT DELAYED may help, but probably not if you're doing this load on an otherwise quiet database server.
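A minimal sketch of wrapping the load with these statements, again using the question's mysql_* API; "table" and parse() are the placeholders from the question:

mysql_query('ALTER TABLE table DISABLE KEYS');
parse($file);                                 // the bulk load from the question
mysql_query('ALTER TABLE table ENABLE KEYS'); // rebuilds the non-unique indexes; may take a while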


I think you should be running the queries inside a transaction (that alone gives a big speedup), preferably using PDO because it is much safer: prepared statements in PDO are probably going to be faster, and at the very least they are invulnerable to SQL injection. I have an example ready for you, using a list of tags as sample data:

<?php

$array = array(
    "ActionScript",
    "AppleScript",
    "Asp",
    "BASIC",
    "C",
    "C++",
    "Clojure",
    "COBOL",
    "ColdFusion",
    "Erlang",
    "Fortran",
    "Groovy",
    "Haskell",
    "Java",
    "JavaScript",
    "Lisp",
    "Perl",
    "PHP",
    "Python",
    "Ruby",
    "Scala",
    "Scheme"
);

function createTable($db) {
    $db->exec("CREATE TABLE IF NOT EXISTS tags (id INTEGER PRIMARY KEY, tag TEXT NOT NULL UNIQUE)");
}

function insertData($db, $array) {
    // Prepare once, outside the loop, and reuse the statement.
    $stmt = $db->prepare("INSERT INTO tags (tag) VALUES (?)");

    $db->beginTransaction();
    try {
        foreach ($array as $elm) {
            $stmt->execute(array($elm));
        }
        $db->commit();
    } catch (PDOException $e) {
        /*** roll back the whole transaction if any insert fails ***/
        $db->rollBack();
        /*** echo the error message ***/
        echo $e->getMessage();
    }
}

$db = new PDO('sqlite:database/tags.sqlite3');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->setAttribute(PDO::ATTR_EMULATE_PREPARES, false); // use real prepared statements

createTable($db);
insertData($db, $array);

P.S.: I would also like to point out that InnoDB is probably going to be a better fit than MyISAM for this.
