Redirecting stdout from multiple processes to python Logging module
I have a Python script that launches a number of user processes using subprocess.Popen. Each process's stdout is redirected to a unique file. For example, I launch each process as follows:
proc = my_proc
for p in range(1, max_p):
    log_file = proc + "_" + str(p) + ".log"
    log = open(log_file, "w+")
    subprocess.Popen([my_proc, str(p)], shell=False, stdout=log)
I would like to rotate these files when they become too big. What would be the best way of doing this? I would like to use the logging module, but I don't think this is its intended use.
Thanks
Not a Pythonic solution, but on Linux systems I prefer using logrotate to rotate my logs automatically. Check whether it's installed on your system (on Ubuntu, for example, there is a directory called /etc/logrotate.d/ whose files are run automatically via cron). This may or may not be preferable to running log rotation from within the application.
It's very configurable: for example, it can compress older files, keep N rotated files via the rotate N directive, and rotate once a file exceeds a given size via size 100k. Looking at man logrotate, it's very straightforward to set up.
From the man page, here's a sample file:
# sample logrotate configuration file
compress

/var/log/messages {
    rotate 5
    weekly
    postrotate
        /usr/bin/killall -HUP syslogd
    endscript
}

"/var/log/httpd/access.log" /var/log/httpd/error.log {
    rotate 5
    mail www@my.org
    size 100k
    sharedscripts
    postrotate
        /usr/bin/killall -HUP httpd
    endscript
}
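One caveat for the setup in the question: each child process holds its log file open, so after a plain rename-style rotation the child would keep writing to the renamed file. logrotate's copytruncate directive handles this by copying the log and then truncating the original in place, so the child's file descriptor stays valid. A sketch of such an entry (the path pattern is hypothetical, matching the my_proc_N.log naming from the question):

    # hypothetical entry for the per-process logs from the question
    /path/to/my_proc_*.log {
        size 100k
        rotate 5
        compress
        copytruncate
    }

Note that with copytruncate a few log lines written between the copy and the truncate can be lost, which is usually acceptable for process output logs.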
How about logging.handlers.RotatingFileHandler?
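To use it here, the parent would have to sit between the child's stdout and the log file, since RotatingFileHandler rotates only files it writes itself. A minimal sketch (the helper name run_with_rotation and its parameters are my own, not from the question): capture the child's stdout via a pipe and feed each line through a size-rotated handler.

```python
import logging
import logging.handlers
import subprocess

def run_with_rotation(cmd, log_file, max_bytes=1_000_000, backups=5):
    """Launch cmd, read its stdout line by line, and write each line
    to log_file through a size-based rotating handler."""
    logger = logging.getLogger(log_file)   # one logger per log file
    logger.setLevel(logging.INFO)
    logger.propagate = False               # keep output out of the root logger
    handler = logging.handlers.RotatingFileHandler(
        log_file, maxBytes=max_bytes, backupCount=backups)
    logger.addHandler(handler)

    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
    for line in proc.stdout:               # blocks until the child exits
        logger.info(line.rstrip("\n"))
    proc.wait()
    handler.close()
    logger.removeHandler(handler)
    return proc.returncode
```

The trade-off is that the parent must keep reading each pipe; with several children as in the question, you would run one reader per process (e.g. one thread each), otherwise a child that fills its pipe buffer will block.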