Python log manager
I have several Python programs that run in parallel. I want to write a Python program that manages the other programs' logs: the other programs will send log messages to this program, and it will write them to the log file. Another important feature is that if one of the programs crashes, the log manager program should find out about it and be able to write that to the log file. I tried to use this sample http://docs.python.org/library/logging.html#sending-and-receiving-logging-events-across-a-network but I failed.
Can anyone please help me?
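For reference, the sending side of that cookbook sample boils down to roughly this (a simplified sketch of the linked docs; it assumes a TCP listener like the one in the same cookbook section is running on localhost):

import logging
import logging.handlers

# Simplified from the cookbook sample: attach a SocketHandler to the root
# logger so every record is pickled and sent to the listening log server.
rootLogger = logging.getLogger('')
rootLogger.setLevel(logging.DEBUG)
socketHandler = logging.handlers.SocketHandler(
    'localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT)
rootLogger.addHandler(socketHandler)

rootLogger.info('this record is pickled and sent to the log server')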
I wrote a Python logger that does just this (even with MPI support). It is available at https://github.com/JensTimmerman/VSC-tools/blob/master/vsc/fancylogger.py
This logger can log to a UDP port on a remote machine.
On that machine I run a daemon that collects the logs and writes them to a file: https://github.com/JensTimmerman/VSC-tools/blob/master/bin/logdaemon.py
This script will start the daemon for you: https://github.com/JensTimmerman/VSC-tools/blob/master/bin/startlogdaemon.sh
If you then start your Python processes and run them in parallel (with MPI, for example), you only need to call fancylogger.getLogger() and use it as a normal Python logger. It will pick up the environment variables set by the script, log to that server, and include some extra MPI information in the log records (like the MPI thread number).
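The daemon itself is essentially a small UDP server that turns incoming datagrams back into log records and writes them out. This is not the actual logdaemon.py, just a rough standard-library-only sketch of that collect-and-write idea; it assumes the senders use logging.handlers.DatagramHandler (whose datagrams carry a length-prefixed pickled record), and the file name and port are placeholders:

import logging
import pickle
import socketserver

LOG_FILE = 'central.log'   # placeholder path for the collected log
LISTEN_PORT = 5005         # placeholder port; must match the senders

class UDPLogHandler(socketserver.BaseRequestHandler):
    # One datagram == one log record: strip the 4-byte length prefix that
    # DatagramHandler adds, unpickle the record dict, and re-emit it locally.
    def handle(self):
        data = self.request[0]
        record = logging.makeLogRecord(pickle.loads(data[4:]))
        logging.getLogger(record.name).handle(record)

if __name__ == '__main__':
    logging.basicConfig(filename=LOG_FILE, level=logging.DEBUG,
                        format='%(asctime)s %(name)s %(levelname)s %(message)s')
    socketserver.UDPServer(('0.0.0.0', LISTEN_PORT), UDPLogHandler).serve_forever()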
If you do not use MPI, you have two options:
set the 'FANCYLOG_SERVER' and 'FANCYLOG_SERVER_PORT' variables manually in each shell where you start a remote Python process (a launcher sketch for this is shown after the code below),
or just start the daemon and get your logger in the Python scripts like this:
import fancylogger
fancylogger.logToUDP(hostname, port=5005)
logger = fancylogger.getLogger()
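For the first option, a hypothetical launcher could look like the following (the variable names come from above; the hostname, port and script name are placeholders, and exporting the variables in the shell before starting the process works just as well):

import os
import subprocess

# Set the variables fancylogger looks for in the child's environment,
# then start the worker process with that environment.
env = dict(os.environ,
           FANCYLOG_SERVER='loghost.example.com',  # placeholder log server
           FANCYLOG_SERVER_PORT='5005')            # placeholder port
subprocess.Popen(['python', 'worker.py'], env=env)  # 'worker.py' is a placeholder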