python global object cache
Little question concerning app architecture:
I have a Python script running as a daemon.
Inside it I have many objects, all inheriting from one class (let's call it 'entity').
I also have one main object, let's call it 'topsys'.
Entities are identified by the pair (id, type (= class, roughly)), and they are connected in many wicked ways. They are also created and deleted all the time, and they need to access other entities.
So I need a kind of storage, basically a dictionary of dictionaries (one for each type), holding all entities.
And the question is: what is better, attaching this dictionary to 'topsys' as an object property, or to the class entity as a property of the class? I would opt for the second (so entities do not need to know of the existence of 'topsys'), but I don't feel good about using properties directly on classes. Or maybe there is another way?
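Roughly, what I have in mind for the second option is something like this (the names here are just for illustration):
class entity(object):
    # registry of all live entities: {type name: {id: entity}}
    _registry = {}

    def __init__(self, entity_id):
        self.id = entity_id
        # file the new instance under its (type, id) pair
        entity._registry.setdefault(type(self).__name__, {})[entity_id] = self

    @classmethod
    def lookup(cls, type_name, entity_id):
        return cls._registry[type_name][entity_id]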
There's not enough detail here to be certain of what's best, but in general I'd store the actual object registry as a module-level (global) variable alongside the base class, and have a method in the base class to access it.
_entities = []

class entity(object):
    @staticmethod
    def get_entity_registry():
        # accessor so callers don't touch the module-level list directly
        return _entities
Alternatively, hide _entities entirely and expose a few methods, e.g. get_object_by_id, register_object, so you can change the storage of _entities itself more easily later on.
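For example, a minimal sketch of that alternative (the (type name, id) keying and the assumption that entities carry an 'id' attribute come from the question, not from any fixed API):
class entity(object):
    # private storage; its layout can change later without touching callers
    _registry = {}

    @classmethod
    def register_object(cls, obj):
        # assumes the object has an 'id' attribute, as described in the question
        cls._registry[(type(obj).__name__, obj.id)] = obj

    @classmethod
    def get_object_by_id(cls, type_name, obj_id):
        return cls._registry.get((type_name, obj_id))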
By the way, a tip in case you're not there already: you'll probably want to look into weakrefs when creating object registries like this.
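For instance (a sketch, not tied to the code above), a weakref.WeakValueDictionary drops an entry as soon as the entity is no longer referenced anywhere else, so the registry doesn't keep dead entities alive:
import weakref

_entities = weakref.WeakValueDictionary()

class entity(object):
    def __init__(self, entity_id):
        self.id = entity_id
        _entities[entity_id] = self

e = entity(1)
print(1 in _entities)   # True
del e
print(1 in _entities)   # False on CPython: refcounting clears the weak entry immediately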
There is no problem with using properties on classes. Classes are just objects, too.
In your case, with this little information available, I would go for a class property too, because not creating dependencies is great and will be one less thing to worry about later.
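A small illustration (not from the question's code): a class attribute is shared by the class, its subclasses and all instances, so it works fine as a common registry:
class entity(object):
    registry = {}              # plain class attribute, shared by all subclasses

class node(entity):
    pass

node.registry['nodes'] = {}
print(entity.registry)                      # {'nodes': {}} -- same dict
print(node.registry is entity.registry)     # True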