
Advice on writing to a log file with Python

I have some code that will need to write about 20 bytes of data every 10 seconds. I'm on Windows 7 using Python 2.7.

Do you guys recommend any "least strain on the OS/hard drive" way to do this?

I was thinking about opening and closing the same file every 10 seconds:

f = open('log_file.txt', 'a')  # append mode; 'w' would truncate the file each time
f.write(information)
f.close()

Or should I keep it open and just flush() the data and not close it as often?
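To make that concrete, here's a rough sketch of the keep-it-open approach I mean (the payload string is just a placeholder for my 20 bytes):

import time

# open once in append mode, write every 10 seconds, and flush() so the
# data reaches the OS without paying open/close costs on every cycle
f = open('log_file.txt', 'a')
try:
    while True:
        f.write('twenty bytes of data\n')  # placeholder for the real record
        f.flush()                          # push Python's buffer to the OS
        time.sleep(10)
finally:
    f.close()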

What about SQLite? Will it improve performance and be less intensive than the open-and-close file operations? (Isn't it just a flat-file database, so == a text file anyway...?)
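For reference, this is roughly what I imagine the sqlite3 version would look like (sqlite3 ships with Python 2.7; the table name and payload here are just placeholders):

import sqlite3
import time

# a minimal sketch of logging through SQLite; note that SQLite journals
# and syncs on every commit by default, so this likely does more disk
# work per record than a plain file append, not less
information = 'twenty bytes of data'  # placeholder payload
conn = sqlite3.connect('log.db')
conn.execute('CREATE TABLE IF NOT EXISTS log (ts REAL, data TEXT)')
conn.execute('INSERT INTO log VALUES (?, ?)', (time.time(), information))
conn.commit()
conn.close()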

What about MySQL? (It uses a local server/process... I'm not sure of the specifics on when/how it saves data to the HDD.)

I'm just worried about not frying my hard drive and improving the performance of this logging procedure. I will be receiving new log information about every 10 seconds, and this will be going on 24/7. Your advice?

i.e.: Think about programs like uTorrent that have to save large amounts of data on a constant basis for long periods of time (my log file is significantly less data than what such "downloader-type programs" like uTorrent write).

import random
import time


def get_data():
    # build a random 20-byte string to stand in for a real log record
    letters = "isn't the code obvious"
    return ''.join(random.choice(letters) for _ in xrange(20))

while True:
    f = open('log_file.txt', 'a')  # append; opening with 'w' would wipe the file each pass
    f.write(get_data() + '\n')
    f.close()
    time.sleep(10)

My CPU starts whining after about 15 seconds... (or is that my HDD?)


As expected, Python comes with a great tool for this: have a look at the logging module.
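A rough sketch of what that could look like, with rotation via the standard logging.handlers.RotatingFileHandler so a 24/7 log can't grow without bound (the filename and sizes here are just placeholders):

import logging
import logging.handlers
import time

# log to log_file.txt, rolling over at 1 MB and keeping 5 old files,
# so a process that logs around the clock never fills the disk
logger = logging.getLogger('example')
handler = logging.handlers.RotatingFileHandler(
    'log_file.txt', maxBytes=1024 * 1024, backupCount=5)
handler.setFormatter(logging.Formatter('%(asctime)s %(message)s'))
logger.addHandler(handler)
logger.setLevel(logging.INFO)

while True:
    logger.info('twenty bytes of data')  # your ~20-byte record goes here
    time.sleep(10)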


Use the logging framework. This is exactly what it is designed to do.

Edit: Balls, beaten to it :).


Don't worry about "frying" your hard drive - 20 bytes every 10 seconds works out to under 200 KB per day, a small fraction of the data written to the disk in the normal operation of the OS.
