
Python atexit for cleaning up server running in different process -- port still in use

I have some integration testing code that spawns an HTTP server in a separate process to run requests against. Since test activity can pollute the server's state, I'd like to be able to start and stop fresh instances of it on demand.

This unfortunately isn't working: the port the server was listening on is still locked after my process exits, meaning that if I run the test twice in quick succession, the second run fails because the port is in use.

I've tried using atexit.register to hook up the shutdown method, but that isn't working either.

Here's the code for the server:

from BaseHTTPServer import BaseHTTPRequestHandler
import SocketServer
import atexit

class RestHTTPRequestHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/sanitycheck':
            self.send_response(200)
            self.send_header('Content-type', 'application/json')
            self.end_headers()
            self.wfile.write("{ 'text': 'You are sane.' }")
        else:
            self.wfile.write(self.path)

def kill_server(httpd):
    open("/tmp/log", "w").write("KILLING")
    httpd.shutdown()

def start_simple_server(port):
    httpd = SocketServer.TCPServer(("", port), RestHTTPRequestHandler)
    atexit.register(kill_server, httpd)

    httpd.serve_forever()

    return httpd

Nothing ever gets written to /tmp/log, which makes me think the atexit hook isn't getting called.

Here's how I instantiate the server:

p = Process(target=start_simple_server, args=(port,))
p.start()

And then, when I'm done and want to terminate it, I just call:

p.terminate()

Which does kill the process, and should (to my understanding) trigger the atexit call -- but it doesn't :(

Any thoughts?


Python's atexit hooks don't run when you terminate a process:

>>> import atexit
>>> def hook():
...   print "hook ran"
... 
>>> atexit.register(hook)
<function hook at 0x100414aa0>
>>> 
# in another terminal: kill <python process id>
>>> Terminated


I wound up taking a slightly different approach, inspired by some code from David Beazley. Server code:

from BaseHTTPServer import BaseHTTPRequestHandler
import SocketServer
import multiprocessing

class StoppableHTTPServerProcess(multiprocessing.Process):
    def __init__(self, address, handler):
        multiprocessing.Process.__init__(self)
        self.exit = multiprocessing.Event()

        self.server = StoppableHTTPServer(address, handler)

    def run(self):
        while not self.exit.is_set():
            self.server.handle_request()

    def shutdown(self):
        self.exit.set()

class RestHTTPRequestHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.wfile.write(self.path)


class StoppableHTTPServer(SocketServer.TCPServer):
    allow_reuse_address = True   # lets a new instance rebind the port immediately
    timeout = 0.5                # handle_request() returns after 0.5s so run() can re-check the flag

def start_simple_server(port):
    process = StoppableHTTPServerProcess(("", port), RestHTTPRequestHandler)
    return process

Calling code:

p = start_simple_server(port)
p.start()

And when I'm done...

p.shutdown()
