
Running a Python script to read info from a new .txt file generated at a known location every 1 sec

My scenario: I have a known location (directory/path) where a .txt file is going to be generated every 1 sec. I just need to copy its contents (which are in a format that can be used directly in a MySQL query) and put them into a MySQL query in my Python script. I need to keep doing this continuously, i.e., non-stop, always.

The script is something like:

import MySQLdb

mydb = MySQLdb.connect(host="IP_add", user="uname", passwd="pass", db="DB_name")
cursor = mydb.cursor()

# Need to add things below -

    sql = """INSERT INTO table_name VALUES('%d', 'dummy%d')""" % (i, i)  # add what here?

    cursor.execute(sql)
    mydb.commit()


mydb.close()

Problem: I don't know how to make such a script run forever, keeping the MySQL connection open just once, while constantly scanning my known folder/directory/path for the new text file, reading whatever it contains, and putting that into a MySQL INSERT query. [A good thing is that I don't need to format the text file's contents; I just need to read whatever it contains.]

Please help!

Regards, Chirayu


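One way is to keep a single connection open and poll the file's modification time once a second, executing the file's contents whenever it changes. In the sketch below, FILENAME is a placeholder for your known path (the name is mine, not from the question):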
import MySQLdb
import os
import time

TIME_TO_SLEEP = 1                       # poll interval, to avoid burning CPU
FILENAME = "/path/to/known/file.txt"    # set this to your known file location

mydb = MySQLdb.connect(host="IP_add", user="uname", passwd="pass", db="DB_name")
cursor = mydb.cursor()
last_date = 0
try:
    while True:                         # or whatever stop condition you want
        time.sleep(TIME_TO_SLEEP)
        stat = os.stat(FILENAME)
        if stat.st_mtime > last_date:   # check the last modification date
            last_date = stat.st_mtime   # if more recent, remember it
            with open(FILENAME) as f:   # open the file
                sql = f.read()          # the content is already a complete SQL statement
            if sql:
                cursor.execute(sql)     # run it against the database
                mydb.commit()
finally:
    mydb.close()                        # reached only if the loop is interrupted
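
If a brand-new .txt file appears in the directory each second (rather than the same file being rewritten), you can instead remember which filenames have already been processed. A minimal sketch along the same lines; the WATCH_DIR path and the .txt filter are assumptions, not from the question:

import MySQLdb
import os
import time

WATCH_DIR = "/path/to/known/dir"   # assumed: the known directory being filled
seen = set()                       # filenames already executed

mydb = MySQLdb.connect(host="IP_add", user="uname", passwd="pass", db="DB_name")
cursor = mydb.cursor()
try:
    while True:
        time.sleep(1)
        for name in os.listdir(WATCH_DIR):
            if name.endswith(".txt") and name not in seen:
                seen.add(name)
                with open(os.path.join(WATCH_DIR, name)) as f:
                    sql = f.read()      # file content is already a complete SQL statement
                if sql:
                    cursor.execute(sql)
                    mydb.commit()
finally:
    mydb.close()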


You can also use the MySQL statement LOAD DATA INFILE.

This will do what you want much, much faster than individual inserts, and you don't need any Python code to read the lines yourself.

Example

sql = """LOAD DATA INFILE '/var/test/test1.txt' INTO TABLE table1"""
cursor.execute(sql)

LOAD DATA INFILE has lots of options for field and line separators; check them out at:
http://dev.mysql.com/doc/refman/5.1/en/load-data.html

To load a file that contains

6666, 'test'

use

LOAD DATA INFILE 'c:/test.txt' INTO TABLE `test` 
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\'' 
LINES TERMINATED BY '\r\n';

Note the use of forward slashes in file paths (even on Windows) and the '\'' sequence, which puts a single quote inside a single-quoted string. The line terminator shown is correct for Windows; Linux would need LINES TERMINATED BY '\n'.
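
If you'd rather drive this from the Python script than from the MySQL client, the same statement works through MySQLdb. A sketch, assuming LOCAL INFILE is permitted on the server (plain LOAD DATA INFILE reads the file on the server machine; LOAD DATA LOCAL INFILE sends it from the client and needs the local_infile=1 connect flag):

import MySQLdb

# local_infile=1 lets the client send the file; the server must allow it too
mydb = MySQLdb.connect(host="IP_add", user="uname", passwd="pass",
                       db="DB_name", local_infile=1)
cursor = mydb.cursor()
cursor.execute(
    "LOAD DATA LOCAL INFILE 'c:/test.txt' INTO TABLE `test` "
    "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\\'' "
    "LINES TERMINATED BY '\\r\\n'"
)
mydb.commit()
mydb.close()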

If you're really lazy
Make a MySQL event.

DELIMITER $$

CREATE EVENT import_file       
ON SCHEDULE
EVERY 1 MINUTE                 
DO BEGIN
  LOAD DATA INFILE 'c:/test.txt' IGNORE INTO TABLE `test`
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\''
    LINES TERMINATED BY '\r\n';
END$$

DELIMITER ;

The IGNORE keyword skips rows that trigger a primary or unique key violation, so the event will not try to import those entries again.
Define the proper column as unique to prevent MySQL from importing duplicate entries. (Note that the event scheduler must be enabled, e.g. SET GLOBAL event_scheduler = ON, or the event will never fire.)

For more info on events, see: http://dev.mysql.com/doc/refman/5.1/en/create-event.html
