
Speed of models in the database vs. static dictionaries

I need some kind of information that is essentially static. There is not much of this information, but a lot of objects will use it.

Since there is not a lot of that information (a few dictionaries and some lists), I figure I have two options: create models to hold that information in the database, or write it as dictionaries/lists in some settings file. My question is: which is faster, reading that information from the database or from a settings file? In either case I need to access that information in a lot of places, which would mean a lot of database read calls. So which would be faster?


If they're truly never, ever going to change, then feel free to put them in your settings.py file as you would declare a normal Python dictionary.
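For example, a minimal sketch (STATIC_CHOICES is just an illustrative name, not something Django defines):

# settings.py
STATIC_CHOICES = {
    'draft': 'Draft',
    'published': 'Published',
}

# anywhere else in the project
from django.conf import settings

choices = settings.STATIC_CHOICES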

However, if you want your information to be modifiable through the normal Django methods, then use the database for persistent storage, and then make the most of Django's cache framework.

Save your data to the database as normal, and then the first time it is accessed, cache it:

from django.core.cache import cache

from myapp.models import MyModel

def some_view_that_accesses_data(request):
    my_data = cache.get('some_key')

    if my_data is None:
        # Cache miss: hit the database once, then cache the result.
        # list() forces evaluation so a plain list is cached, not a lazy queryset.
        my_data = list(MyModel.objects.all())
        cache.set('some_key', my_data)

    # ... normal view code ...

Make sure never to store None in the cache, as the Django docs warn:

We advise against storing the literal value None in the cache, because you won't be able to distinguish between your stored None value and a cache miss signified by a return value of None.
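If None is a value you might legitimately cache, one workaround (a sketch, not part of the original answer) is to pass a sentinel default to cache.get():

from django.core.cache import cache

_MISS = object()  # cache.get() returns this exact object only on a miss

my_data = cache.get('some_key', _MISS)
if my_data is _MISS:
    my_data = list(MyModel.objects.all())
    cache.set('some_key', my_data)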

Make sure you kill the cache on object deletion or change:

from django.core.cache import cache
from django.db.models.signals import post_save, post_delete

from myapp.models import MyModel

def kill_object_cache(sender, **kwargs):
    # Invalidate the cached data whenever a MyModel instance is saved or deleted
    cache.delete('some_key')

post_save.connect(kill_object_cache, sender=MyModel)
post_delete.connect(kill_object_cache, sender=MyModel)

I've got something similar to this in one of my apps, and it works great. Obviously you won't see any performance improvement if you then go and use the database cache backend, but this is a more Django-like (Djangonic?) approach than talking to memcached directly.

Obviously it's worth defining the cache key some_key in one place rather than littering it all over your code; the examples above are just intended to be easy to follow, not full-blown caching implementations.
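A sketch of that (the module path and constant name are arbitrary):

# myapp/cache_keys.py
MY_DATA_CACHE_KEY = 'some_key'

# then, wherever you touch the cache
from django.core.cache import cache
from myapp.cache_keys import MY_DATA_CACHE_KEY

my_data = cache.get(MY_DATA_CACHE_KEY)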


If the data is static, there is no need to keep going back to the database. Just read it the first time it is required and cache the result.
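A minimal in-process sketch using functools.lru_cache (the function name and MyModel are placeholders):

from functools import lru_cache

from myapp.models import MyModel

@lru_cache(maxsize=None)
def get_static_data():
    # Hits the database once per process; subsequent calls return the cached list
    return list(MyModel.objects.all())

data = get_static_data()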

If there is some reason you can't cache the result in your app, you can always use memcached to avoid hitting the database.

The advantage of using memcached is that if the data does change, you can simply update the value in memcached.

For example, with the python-memcached client (a sketch; 'foo' and the database lookup are placeholders):

import memcache

mc = memcache.Client(['127.0.0.1:11211'])

data = mc.get('foo')
if data is None:
    # Cache miss: fetch from the database and populate memcached
    data = load_foo_from_database()  # placeholder for your real query
    mc.set('foo', data)

# When the underlying data changes, just overwrite the cached value:
# mc.set('foo', new_data)


If you need fast access from multiple processes, then a database is the best option for you.

However, if you just want to keep data in memory and access it from multiple places in the same process, then Python dictionaries will be faster than accessing a DB.
