
python, signal blocking, databases

I have a question about how signals are handled while a SQL command is running. Specifically, I have a Python script that runs as a process and spawns a second process inside it. The main process then runs a database check, and as part of that check a table column's NULLABLE attribute is altered. The table is fairly big (GBs big) and keeps growing. When I send a TERM signal to the main process (the one running the database check), the second process it created should also terminate. That works every time except when the database check is in the middle of that NULLABLE alteration. In that case the main process terminates, but the signal handler is somehow ignored (the function handling the signal is never entered), so no SIGTERM is sent to the process created inside the main one, and I am left with one stuck process.

My question: is there a way to block signals (as in queuing them) while the database check is running, or is there another way of sending the TERM signal to the second process from Python? I have already modified the bash script that launches the Python script so that it checks for and SIGTERMs the leftover process after the main process terminates, but I would prefer not to handle it that way.
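
For reference, the setup looks roughly like this. It is only a simplified sketch: the child is assumed to be started with the multiprocessing module, and worker and run_db_check are placeholders for the real functions. The handler simply forwards SIGTERM to the child, which works fine except while the interpreter is stuck inside the blocking database call.

    # Simplified sketch: parent spawns a child and forwards SIGTERM to it.
    # worker and run_db_check are placeholders for the real code.
    import signal
    import time
    import multiprocessing

    def worker():
        # stand-in for the second process's work
        while True:
            time.sleep(1)

    def run_db_check():
        # stand-in for the long ALTER TABLE; in the real script the interpreter
        # is blocked inside a C call here, so the handler below does not get a
        # chance to run until the call returns
        time.sleep(60)

    def main():
        child = multiprocessing.Process(target=worker)
        child.start()

        def handle_term(signum, frame):
            # forward the signal to the child, then exit
            child.terminate()
            child.join()
            raise SystemExit(0)

        signal.signal(signal.SIGTERM, handle_term)

        run_db_check()
        child.terminate()
        child.join()

    if __name__ == '__main__':
        main()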

Or is there a way of interrupting the database check itself?

PS: SQLAlchemy is used for the database handling, under Python 2.6 on Ubuntu Linux.


Your MySQL command is a C call, which blocks the Python interpreter; the Python-level signal handler can only run once that call returns, which is why it appears to be ignored during the long ALTER.

The cleanest way is probably to use the "show processlist" and "kill" commands from mysql or mysql-python before you send TERM.

See here: http://www.techrepublic.com/article/how-to-examine-and-kill-mysql-client-processes/5211762
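
A minimal sketch of that approach, using a second connection through SQLAlchemy as in the question (string-style execute as in the SQLAlchemy versions current for Python 2.6). The connection URL and the ALTER TABLE filter are placeholders, and the killing connection needs the privilege to kill the other thread (same user, or SUPER).

    # Kill the long-running ALTER TABLE from a separate connection before
    # sending TERM. URL and filter pattern are placeholders.
    from sqlalchemy import create_engine

    def kill_long_running_alters(engine):
        conn = engine.connect()
        try:
            # processlist columns: Id, User, Host, db, Command, Time, State, Info
            for row in conn.execute('SHOW FULL PROCESSLIST'):
                thread_id, info = row[0], row[7]
                if info and info.strip().upper().startswith('ALTER TABLE'):
                    # KILL QUERY stops the statement but keeps the connection;
                    # plain KILL would drop the whole connection
                    conn.execute('KILL QUERY %d' % int(thread_id))
        finally:
            conn.close()

    if __name__ == '__main__':
        engine = create_engine('mysql://user:password@localhost/mydb')
        kill_long_running_alters(engine)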

