Automatically import CSV file and upload to database

One of my clients has all of his product information handled by an outside source. They have provided this to me in a CSV file which they will regularly update and upload to an FTP folder of my specification, say every week.

Within this CSV file is all of the product information; product name, spec, image location etc.

The site which I have built for my client is running a MySQL database, which I thought would be holding all of the product information, and thus has been built to handle all of the product data.

My question is this: How would I go about creating and running a script that would find a newly added CSV file from the specified FTP folder, extract the data, and replace all of the data within the relevant MySQL table, all done automatically?

Is this even possible?

Any help would be greatly appreciated, as I don't want to use the IFrame option. S.


Should be pretty straightforward, depending on the CSV file.

Some CSV files have quotes around text fields ("..."), some don't, and some have commas inside the quoted fields, etc.

Depending on your level of PHP skill, this should be reasonably easy.

You can get a modified timestamp from the file to see if it is new:

http://nz.php.net/manual/en/function.lstat.php

Open the file and import the data:

http://php.net/manual/en/function.fgetcsv.php

Insert into the database:

http://nz.php.net/manual/en/function.mysql-query.php

If the CSV is difficult to parse with fgetcsv, then you could try something like the PHPExcel project, which has CSV reading capabilities:

http://phpexcel.codeplex.com
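The quoting wrinkles mentioned above are exactly what fgetcsv deals with for you; a minimal sketch (the file name products.csv is an assumption):

```php
<?php
// Sketch only: products.csv is a hypothetical file name.
// fgetcsv handles quoted fields, including commas inside the quotes.
$fp = fopen('products.csv', 'r');
while (($row = fgetcsv($fp)) !== FALSE) {
    // A line like:  Widget,"Steel, 10x20cm",/img/widget.png
    // arrives as:   array('Widget', 'Steel, 10x20cm', '/img/widget.png')
    print_r($row);
}
fclose($fp);
```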


You can just make a script which reads the CSV file using PHP's fgetcsv function, extracts each row, and formats it into an array to insert into the database.

$fileTemp = "path-of-the-file.csv";
$fp = fopen($fileTemp, 'r');
$datas = array();
while (($data = fgetcsv($fp)) !== FALSE)
{
     $row = array();
     $row['productName']   = trim($data[0]);
     $row['spec']          = trim($data[1]);
     $row['imageLocation'] = trim($data[2]);
     $datas[] = $row;
}
fclose($fp);

Now you have a prepared array $datas which you can insert into the database with iterations.
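For the iteration itself, one option is mysqli prepared statements rather than the older mysql_query; a sketch where the connection details, table and column names are all assumptions:

```php
<?php
// Sketch: host/user/password/database and the products table are assumptions.
// $datas is the array built by the fgetcsv loop above.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$stmt = $db->prepare(
    'INSERT INTO products (product_name, spec, image_location) VALUES (?, ?, ?)'
);
foreach ($datas as $row) {
    // 'sss' = three string parameters, bound per row
    $stmt->bind_param('sss', $row['productName'], $row['spec'], $row['imageLocation']);
    $stmt->execute();
}
$stmt->close();
$db->close();
```

Prepared statements also spare you from escaping the CSV values by hand.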


All you need is:

  • Store the last file's mtime somewhere (let's say, for simplicity, in another file)
  • A script that runs every X minutes via cron

In this script you simply compare the mtime of the CSV file with the stored value. If the mtime differs, you run an SQL query that looks like this:

LOAD DATA LOCAL INFILE '/var/www/tmp/file.csv' REPLACE INTO TABLE mytable COLUMNS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r\n'

Optionally, you can just touch a helper file to record when you performed the last data load. If the CSV file's mtime is greater than your "helper" file's, you should touch it and perform the query.

Documentation on the LOAD DATA INFILE SQL statement is in the MySQL manual.

Of course there is room for query errors, but I hope you will handle them (you just need to be sure the data loaded properly, and only in that case touch the file or write the new mtime).
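Putting those pieces together, a cron-run PHP sketch might look like this (every path, credential and table name is an assumption, and LOAD DATA LOCAL must be enabled on your server):

```php
<?php
// Sketch: paths, credentials and the table name are assumptions.
$csv   = '/var/www/tmp/file.csv';
$stamp = '/var/www/tmp/last-import';  // helper file; only its mtime matters

if (!is_file($stamp) || filemtime($csv) > filemtime($stamp)) {
    $db = new mysqli('localhost', 'user', 'pass', 'mydb');
    $ok = $db->query(
        "LOAD DATA LOCAL INFILE '/var/www/tmp/file.csv' REPLACE INTO TABLE mytable " .
        "COLUMNS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' LINES TERMINATED BY '\\r\\n'"
    );
    if ($ok) {
        touch($stamp);  // record a successful load only, as suggested above
    }
    $db->close();
}
```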


Have you had a look at fgetcsv? You will probably have to set up a cron job to check for a new file at regular intervals.
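The cron entry itself could be as simple as this (the interval and script path are hypothetical):

```
# Run the import check every 15 minutes
*/15 * * * * php /path/to/import_products.php
```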
