Large domain model in memory
I'm working on a project which has a large object graph in memory. The domain objects are really 'wide' and contain a whole load of data which is normally extraneous. I'm thinking about implementing some sort of lazy loading model.
I'm currently thinking along the lines of putting attributes on each of the lazy properties. I'd build dynamic proxy types at application start and send those over; the proxies would then look up the local extended properties service as necessary.
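Roughly what I have in mind, sketched with Castle DynamicProxy. The attribute, the IExtendedPropertiesService interface, and the Customer type are placeholder names for my own types, not anything that exists yet:

```csharp
using System;
using System.Collections.Generic;
using Castle.DynamicProxy;

[AttributeUsage(AttributeTargets.Property)]
public class LazyLoadAttribute : Attribute { }

// Stand-in for the "local extended properties service" mentioned above.
public interface IExtendedPropertiesService
{
    object Load(object entity, string propertyName);
}

public class LazyLoadInterceptor : IInterceptor
{
    private readonly IExtendedPropertiesService _service;
    private readonly HashSet<string> _loaded = new HashSet<string>();

    public LazyLoadInterceptor(IExtendedPropertiesService service) => _service = service;

    public void Intercept(IInvocation invocation)
    {
        // Only getters of properties marked [LazyLoad] are of interest.
        if (invocation.Method.Name.StartsWith("get_"))
        {
            string propertyName = invocation.Method.Name.Substring(4);
            var property = invocation.TargetType.GetProperty(propertyName);

            if (property != null &&
                Attribute.IsDefined(property, typeof(LazyLoadAttribute)) &&
                _loaded.Add(propertyName))
            {
                // First access: pull the value from the extended properties
                // service and write it through the property on the target.
                object value = _service.Load(invocation.InvocationTarget, propertyName);
                property.SetValue(invocation.InvocationTarget, value);
            }
        }

        invocation.Proceed();
    }
}

// One interceptor per proxy instance, created at startup, e.g.:
// var generator = new ProxyGenerator();
// var customer = generator.CreateClassProxy<Customer>(new LazyLoadInterceptor(service));
// Properties have to be virtual for a class proxy to intercept them.
```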
Has anyone done something like this before in .NET, with a clean implementation? Thanks
It depends on the characteristics of your object graph and its usage patterns. The lazy-loading proxy approach is reliable but doesn't scale well on its own: when the client asks for the same property across a large number of objects, each proxy makes its own remote call, and the sheer number of round trips to the server quickly becomes a performance problem.
Holding the object graph in memory on the server and allowing the client to bulk-load property data, with the lazy-loading proxy as a fall-back, worked much better for us.
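A rough sketch of that shape. IPropertyServer, its batch endpoint, and PropertyCache are illustrative names, not an existing API:

```csharp
using System;
using System.Collections.Generic;

public interface IPropertyServer
{
    // One remote call that returns the named property for many objects at once.
    IDictionary<Guid, object> GetPropertyBulk(IEnumerable<Guid> objectIds, string propertyName);

    // Single-object call used by the lazy proxy as a fall-back.
    object GetProperty(Guid objectId, string propertyName);
}

public class PropertyCache
{
    private readonly IPropertyServer _server;
    private readonly Dictionary<(Guid, string), object> _cache = new();

    public PropertyCache(IPropertyServer server) => _server = server;

    // Prime the cache with one round trip instead of N proxy calls.
    public void Prefetch(IEnumerable<Guid> objectIds, string propertyName)
    {
        foreach (var pair in _server.GetPropertyBulk(objectIds, propertyName))
            _cache[(pair.Key, propertyName)] = pair.Value;
    }

    // Called by the lazy-loading proxy; falls back to a single remote call
    // only when the value was not prefetched.
    public object Get(Guid objectId, string propertyName)
    {
        if (!_cache.TryGetValue((objectId, propertyName), out var value))
        {
            value = _server.GetProperty(objectId, propertyName);
            _cache[(objectId, propertyName)] = value;
        }
        return value;
    }
}
```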
Bulk loading also allows for size optimizations on the results stream where you have repeated, immutable values.
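One way to realize that optimization, sketched with illustrative types (not an existing wire format): dictionary-encode the bulk response so each distinct immutable value is serialized once and per-object entries carry only an index into the value table.

```csharp
using System;
using System.Collections.Generic;

public class BulkPropertyResult
{
    public string PropertyName { get; set; }
    public List<object> DistinctValues { get; set; }            // each value serialized once
    public Dictionary<Guid, int> ValueIndexByObjectId { get; set; }

    // Assumes non-null values; nulls could be mapped to a reserved index.
    public static BulkPropertyResult Encode(string propertyName, IDictionary<Guid, object> values)
    {
        var distinct = new List<object>();
        var indexOf = new Dictionary<object, int>();
        var map = new Dictionary<Guid, int>();

        foreach (var pair in values)
        {
            if (!indexOf.TryGetValue(pair.Value, out int index))
            {
                index = distinct.Count;
                distinct.Add(pair.Value);
                indexOf[pair.Value] = index;
            }
            map[pair.Key] = index;
        }

        return new BulkPropertyResult
        {
            PropertyName = propertyName,
            DistinctValues = distinct,
            ValueIndexByObjectId = map
        };
    }

    public object ValueFor(Guid objectId) => DistinctValues[ValueIndexByObjectId[objectId]];
}
```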
The builder pattern actually works pretty well in this situation. Defer the extraneous stuff to lazy loading, and farm off the actual loading and construction of those extraneous attributes to builder classes.
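A minimal sketch of that split, with hypothetical Order/OrderExtras types: the core object is built eagerly, and the wide, rarely-used parts are handed to a builder whose work only runs on first access via Lazy<T>.

```csharp
using System;

public class OrderExtras
{
    public string AuditTrail { get; set; }
    public byte[] Attachments { get; set; }
}

public interface IOrderExtrasBuilder
{
    // The expensive load and construction of the extraneous data lives here.
    OrderExtras Build(int orderId);
}

public class Order
{
    private readonly Lazy<OrderExtras> _extras;

    public int Id { get; }
    public decimal Total { get; }     // core data, loaded eagerly

    public Order(int id, decimal total, IOrderExtrasBuilder extrasBuilder)
    {
        Id = id;
        Total = total;
        // The builder is only invoked the first time Extras is touched.
        _extras = new Lazy<OrderExtras>(() => extrasBuilder.Build(id));
    }

    public OrderExtras Extras => _extras.Value;
}
```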
Additionally, you can consider using WeakReference(s) in conjunction with lazy loading; this can help protect you from an OutOfMemoryException.
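A small sketch of that combination (WeakLazy is a made-up helper, not a framework type): the loaded value can be collected under memory pressure and is simply reloaded on the next access.

```csharp
using System;

public class WeakLazy<T> where T : class
{
    private readonly Func<T> _load;
    private WeakReference<T> _reference;

    public WeakLazy(Func<T> load) => _load = load;

    public T Value
    {
        get
        {
            // Reload if never loaded, or if the GC has reclaimed the value.
            // Not thread-safe as written; add a lock under concurrency.
            if (_reference == null || !_reference.TryGetTarget(out T value))
            {
                value = _load();
                _reference = new WeakReference<T>(value);
            }
            return value;
        }
    }
}

// Usage: var notes = new WeakLazy<string>(() => service.Load(id, "Notes"));
```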