Dynamic per-request database connections in Django
I'm building a centralised Django application that will be interacting with a dynamic number of databases with basically identical schemas. These databases are also used by a couple of legacy applications, some of which are in PHP. Our solution to avoid multiple silos of db credentials is to store this info in generic setting files outside of the respective applications. Setting files could be created, altered or deleted without the Django application being restarted.
For every request to the Django application, there will be an HTTP header or a URL parameter which can be used to deduce which setting file to look at to determine which database credentials to use.
My first thought is to use custom Django middleware that would parse the settings files (possibly with caching) and create a new connection object on each request, patching it into django.db before any ORM activity.
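Roughly, the sketch I have in mind looks like the following. It's only a sketch under assumptions: the X-Tenant header, the settings directory, and the INI layout are all made up, and mutating connections.databases leans on Django internals, so it would need verifying against the version in use:

    import configparser
    import os

    from django.db import connections
    from django.http import HttpResponseBadRequest

    SETTINGS_DIR = "/etc/myapp/tenants"  # hypothetical location of the shared setting files

    def load_db_settings(tenant):
        """Parse one shared setting file into a Django DATABASES-style dict."""
        parser = configparser.ConfigParser()
        parser.read(os.path.join(SETTINGS_DIR, "%s.ini" % tenant))
        db = parser["database"]
        return {
            "ENGINE": "django.db.backends.mysql",
            "NAME": db["name"],
            "USER": db["user"],
            "PASSWORD": db["password"],
            "HOST": db.get("host", "localhost"),
            "PORT": db.get("port", "3306"),
        }

    class TenantDatabaseMiddleware:
        def __init__(self, get_response):
            self.get_response = get_response

        def __call__(self, request):
            # The tenant value comes straight from the client, so it
            # must be validated before being used in a file path.
            tenant = request.META.get("HTTP_X_TENANT") or request.GET.get("tenant")
            if not tenant or not tenant.isalnum():
                return HttpResponseBadRequest("missing or invalid tenant identifier")
            # Register (or refresh) a connection alias for this tenant.
            # Note: Django caches the actual connection object per alias
            # per thread, so if credentials can change at runtime the
            # stale connection would also need closing and dropping.
            connections.databases[tenant] = load_db_settings(tenant)
            request.tenant_db = tenant
            return self.get_response(request)

Views would then route queries explicitly, e.g. Model.objects.using(request.tenant_db), or a database router could pick the alias up from a thread local.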
Is there a more graceful method to handle this situation? Are there any thread safety issues I should consider with the middleware approach?
Rereading the settings file on every request is a heavy penalty to pay when it's unlikely that the file has changed.
My usual approach is to use inotify to watch for configuration file changes, rather than trying to read the file on every request. Additionally, I tend to keep a "current" configuration, parsed from the file, and only replace it with a new value once I've finished parsing the config file and I'm certain it's valid. You could resolve some of your concerns about thread safety by snapshotting the current configuration at the start of each incoming request, so that the configuration can't change mid-way through a request.
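For example, a rough sketch of that watch-and-swap pattern using the pyinotify package (one of several inotify bindings; the directory path, file naming, and parse step are assumptions):

    import configparser
    import os
    import threading

    import pyinotify

    SETTINGS_DIR = "/etc/myapp/tenants"  # hypothetical shared location

    _lock = threading.Lock()
    _configs = {}  # tenant name -> parsed settings dict

    def _parse(path):
        """Parse and validate one file; raise on bad input so the old value survives."""
        parser = configparser.ConfigParser()
        with open(path) as fh:
            parser.read_file(fh)
        return dict(parser["database"])  # further validation would go here

    def _reload(path):
        tenant = os.path.splitext(os.path.basename(path))[0]
        try:
            parsed = _parse(path)
        except Exception:
            return  # keep the previous, known-good configuration
        with _lock:
            # Replace the value, never mutate it in place, so a request
            # holding a reference keeps a consistent view.
            _configs[tenant] = parsed

    class _Handler(pyinotify.ProcessEvent):
        def process_IN_CLOSE_WRITE(self, event):
            _reload(event.pathname)
        process_IN_MOVED_TO = process_IN_CLOSE_WRITE

    def start_watcher():
        """Start a background thread that reloads files as they change."""
        wm = pyinotify.WatchManager()
        notifier = pyinotify.ThreadedNotifier(wm, _Handler())
        notifier.daemon = True
        notifier.start()
        wm.add_watch(SETTINGS_DIR, pyinotify.IN_CLOSE_WRITE | pyinotify.IN_MOVED_TO)

    def current_config(tenant):
        """Called once at the start of a request; the returned dict is read-only."""
        with _lock:
            return _configs.get(tenant)

The validate-before-swap step is what keeps a half-written or broken file from ever becoming the current configuration.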
You could start different instances with different settings.py files (by setting different DJANGO_SETTINGS_MODULE values) on different ports, and redirect the requests to the specific apps. Just my 2 cents.
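For illustration, each instance could get a thin settings module layered on a shared base (all module names and credentials here are made up):

    # settings_client_a.py -- one of these per client/database
    from settings_base import *  # the shared project settings

    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.mysql",
            "NAME": "client_a",
            "USER": "client_a",
            "PASSWORD": "not-the-real-password",
            "HOST": "db.example.com",
            "PORT": "3306",
        }
    }

    # Each instance is then started on its own port, e.g.:
    #   DJANGO_SETTINGS_MODULE=settings_client_a ./manage.py runserver 8001
    #   DJANGO_SETTINGS_MODULE=settings_client_b ./manage.py runserver 8002
    # with a front-end proxy routing each request to the right port.

The trade-off is that an instance has to be added and started whenever a new setting file appears, which works against the requirement that files can change without a restart.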