Recursive MySQL Query
How can I implement recursive MySQL queries? I have been looking for resources on this, but they are not very helpful.
I am trying to implement logic similar to the following.
public function initiateInserts()
{
    //Open large CSV file (min 100K rows) for parsing.
    $this->fin = fopen($file,'r') or die('Cannot open file');
    //Parse the large CSV file to get data and initiate insertion into the schema.
    $query = "";
    while (($data = fgetcsv($this->fin, 5000, ";")) !== FALSE)
    {
        $query = $query + "INSERT INTO dt_table (id, code, connectid, connectcode)
                           VALUES (" + $data[0] + ", " + $data[1] + ", " + $data[2] + ", " + $data[3] + ")";
    }
    $stmt = $this->prepare($query);
    // Execute the statement
    $stmt->execute();
    $this->checkForErrors($stmt);
}
@Author: Numenor
Error Message: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '0' at line 1
This approach is what inspired me to look for a recursive MySQL query approach.
Here is the approach I was using earlier:
Current Code:
public function initiateInserts()
{
    //Open large CSV file (min 100K rows) for parsing.
    $this->fin = fopen($file,'r') or die('Cannot open file');
    //Parse the large CSV file to get data and initiate insertion into the schema.
    while (($data = fgetcsv($this->fin, 5000, ";")) !== FALSE)
    {
        $query = "INSERT INTO dt_table (id, code, connectid, connectcode)
                  VALUES (:id, :code, :connectid, :connectcode)";
        $stmt = $this->prepare($query);
        // Then, for each line: bind the parameters
        $stmt->bindValue(':id', $data[0], PDO::PARAM_INT);
        $stmt->bindValue(':code', $data[1], PDO::PARAM_INT);
        $stmt->bindValue(':connectid', $data[2], PDO::PARAM_INT);
        $stmt->bindValue(':connectcode', $data[3], PDO::PARAM_INT);
        // Execute the statement
        $stmt->execute();
        $this->checkForErrors($stmt);
    }
}
Updated Code:
public function initiateInserts()
{
    //Open large CSV file (min 100K rows) for parsing.
    $this->fin = fopen($file,'r') or die('Cannot open file');
    //Prepare the insertion query to insert data into the schema.
    $query = "INSERT INTO dt_table (id, code, connectid, connectcode)
              VALUES (:id, :code, :connectid, :connectcode)";
    $stmt = $this->prepare($query);
    // Then, for each line: bind the parameters
    $stmt->bindValue(':id', $data[0], PDO::PARAM_INT);
    $stmt->bindValue(':code', $data[1], PDO::PARAM_INT);
    $stmt->bindValue(':connectid', $data[2], PDO::PARAM_INT);
    $stmt->bindValue(':connectcode', $data[3], PDO::PARAM_INT);
    //Loop through the CSV file and execute the prepared insert, but this is not working
    //and no data is being populated into the database.
    while (($data = fgetcsv($this->fin, 5000, ";")) !== FALSE)
    {
        // Execute the statement
        list($id, $code, $connid, $conncode) = $data;
        $stmt->execute();
        $this->checkForErrors($stmt);
    }
}
This is my main question, for which I am looking for suggestions.
- There's nothing recursive in that code snippet.
- The wrong operator is used to concatenate the strings, it's . (dot) not +
- You'd have to use something like mysqli::multi_query() to execute more than one statement with a single function call, and the statements would have to be separated by a delimiter character (by default a semicolon); a minimal sketch follows after this list.
- Since you're already using prepare() and execute() why not simply make it a parametrized prepared statement and then assign the values in each iteration of the loop and execute the statement? (Exactly what is $this and what type of object does $this->prepare() return?)
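Not part of the original answer, but as an illustration of the multi_query() point above: a minimal sketch of sending several semicolon-separated statements in one call. The connection credentials and the inserted values are placeholders, and in real code any non-constant values would still need proper escaping.

$mysqli = new mysqli('localhost', 'user', 'password', 'test');

// Two semicolon-separated statements sent in a single call.
$sql = "INSERT INTO dt_table (id, code, connectid, connectcode) VALUES (1, 2, 3, 4);"
     . "INSERT INTO dt_table (id, code, connectid, connectcode) VALUES (5, 6, 7, 8)";

if ($mysqli->multi_query($sql)) {
    // Each statement produces a result that has to be fetched/flushed,
    // otherwise the connection stays blocked for further queries.
    do {
        if ($result = $mysqli->store_result()) {
            $result->free();
        }
    } while ($mysqli->more_results() && $mysqli->next_result());
}

if ($mysqli->errno) {
    die('multi_query failed: ' . $mysqli->error);
}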
- edit, and btw: $this->prepare() indicates that your class extends a database class, and it also holds a file descriptor $this->fin. This has a certain code smell. My guess is that your class uses/has a database/datasink object and a file/datasource, but is not itself a database+readfile class. Only extend a class if your derived class really is that something (an is-a relationship).
edit: a simple example
class Foo {
    protected $pdo;

    public function __construct(PDO $pdo) {
        $this->pdo = $pdo;
    }

    public function initiateInserts($file)
    {
        $query = '
            INSERT INTO
                dt_table_tmp
                (id, code, connectid, connectcode)
            VALUES
                (:id, :code, :connid, :conncode)
        ';
        $stmt = $this->pdo->prepare($query);
        $stmt->bindParam(':id', $id);
        $stmt->bindParam(':code', $code);
        $stmt->bindParam(':connid', $connid);
        $stmt->bindParam(':conncode', $conncode);

        $fin = fopen($file, 'r') or die('Cannot open file');
        while ( false !== ($data = fgetcsv($fin, 5000, ";")) ) {
            list($id, $code, $connid, $conncode) = $data;
            $stmt->execute();
        }
    }
}
$pdo = new PDO("mysql:host=localhost;dbname=test", 'localonly', 'localonly');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// set up a demo table and some test data
$pdo->exec('CREATE TEMPORARY TABLE dt_table_tmp (id int, code int, connectid int, connectcode int)');
$sourcepath = 'sample.data.tmp';
$fh = fopen($sourcepath, 'wb') or die('!fopen(w)');
for ($i = 0; $i < 10000; $i++) {
    fputcsv($fh, array($i, $i%4, $i%100, $i%3), ';');
}
fclose($fh); unset($fh);
// test script
$foo = new Foo($pdo);
$foo->initiateInserts($sourcepath);
a few tips about speeding up mysql data import
- check if your data really needs to be parsed; sometimes LOAD DATA works just fine for CSV
- if possible, create an SQL file first via PHP and then execute it with the mysql command line client (see the sketch after the examples below)
- use multivalue inserts
- disable keys before inserting
A multivalue insert statement is something like:
INSERT INTO users(name, age) VALUES
("Sam", 13),
("Joe", 14),
("Bill", 33);
this is much faster than three distinct insert statements.
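As a rough illustration (not from the original answer), building such multivalue statements from the CSV in PHP could look like the sketch below. It assumes an existing PDO connection, purely numeric CSV fields matching the question's table, and a made-up function name and batch size.

// Sketch: batch CSV rows into multivalue INSERTs, e.g. 500 rows per statement.
function bulkInsertCsv(PDO $pdo, $file, $batchSize = 500)
{
    $fin = fopen($file, 'r') or die('Cannot open file');
    $rows = array();
    while (false !== ($data = fgetcsv($fin, 5000, ';'))) {
        // Cast to int because the values are concatenated into the SQL string.
        $rows[] = '(' . (int)$data[0] . ',' . (int)$data[1] . ','
                      . (int)$data[2] . ',' . (int)$data[3] . ')';
        if (count($rows) >= $batchSize) {
            $pdo->exec('INSERT INTO dt_table (id, code, connectid, connectcode) VALUES '
                       . implode(',', $rows));
            $rows = array();
        }
    }
    // Insert any remaining rows.
    if ($rows) {
        $pdo->exec('INSERT INTO dt_table (id, code, connectid, connectcode) VALUES '
                   . implode(',', $rows));
    }
    fclose($fin);
}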
Disabling keys is important to prevent indexing each time you're executing an INSERT:
ALTER TABLE whatever DISABLE KEYS;
INSERT INTO whatever .....
INSERT INTO whatever .....
INSERT INTO whatever .....
ALTER TABLE whatever ENABLE KEYS;
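For the second tip above (writing an SQL file via PHP and feeding it to the mysql command line client), a minimal sketch under the same assumptions as the question's code (numeric fields, semicolon-separated CSV; file names and credentials are placeholders) might be:

// Sketch: convert the CSV into a file of INSERT statements,
// then load it with the mysql command line client instead of executing from PHP.
$fin  = fopen('sample.data.tmp', 'r') or die('Cannot open csv');
$fout = fopen('import.sql', 'w') or die('Cannot open sql file');
while (false !== ($data = fgetcsv($fin, 5000, ';'))) {
    fwrite($fout, sprintf(
        "INSERT INTO dt_table (id, code, connectid, connectcode) VALUES (%d, %d, %d, %d);\n",
        $data[0], $data[1], $data[2], $data[3]
    ));
}
fclose($fin);
fclose($fout);
// Then, from the shell:
//   mysql -u user -p database_name < import.sql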
Further reading: http://dev.mysql.com/doc/refman/5.1/en/insert-speed.html
Inspired by this question, I would say you should do something similar. If you really have this much data, then a bulk import is the most appropriate approach, and you already have the data in a file.
Have a look at the LOAD DATA INFILE command.
The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed. The file name must be given as a literal string.
If you are interested in the speed differences then read Speed of INSERT Statements.
E.g. you can do this:
$query = "LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
"
This will also ignore the first line, assuming that it only contains the column names.
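If you want to run this from PHP through PDO, a rough sketch could look like the following. Note that the LOCAL variant and the PDO::MYSQL_ATTR_LOCAL_INFILE option are assumptions on my part: they only work if local infile is enabled on both the server and the client, and the file path, table name, and credentials are placeholders.

$pdo = new PDO(
    'mysql:host=localhost;dbname=test',
    'user',
    'password',
    array(PDO::MYSQL_ATTR_LOCAL_INFILE => true) // allow LOAD DATA LOCAL INFILE
);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// exec() returns the number of rows imported by the statement
$affected = $pdo->exec("LOAD DATA LOCAL INFILE 'data.txt' INTO TABLE tbl_name
    FIELDS TERMINATED BY ';'
    LINES TERMINATED BY '\r\n'
    IGNORE 1 LINES");
echo $affected . " rows imported\n";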