How to consume data in a large-scale business application
I am just wondering: what is the best approach to consuming data in a multi-tier, large-scale application? To be honest, I don't have enough experience yet to choose the proper way. We started our project thinking that RIA Services would distribute entities over the wire, and at first we were OK with RIA, but the problems started once we had to send complex DTOs as responses from the service and as parameters to service calls.
At this point, working with RIA Services feels like fighting the framework, so we decided to use raw WCF services with generated clients on the frontend. We tried this for about a week. It was a mostly positive experience, but in the end it also showed its downsides: the client does not get regenerated automatically, some classes were not reusable and Visual Studio had to regenerate them on the client side again, and so on.
I started searching for a solution and found a wonderful article on generating async clients at runtime. The idea is to place an interface describing the WCF service in a shared assembly, and then create a WCF client with auto-generated (emitted) methods to support async calls to the service. It has proven to be the best solution of those I have seen, but I have not had enough time to find its downsides yet, because...
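For context, the shared-interface approach looks roughly like this (all names here are illustrative, not from the actual project): the service contract lives in an assembly referenced by both the server and the Silverlight client, using the Begin/End async pattern that Silverlight requires, and the client talks through a channel rather than a generated proxy. This is a minimal sketch, not the emitted-client implementation from the article:

```csharp
using System;
using System.Runtime.Serialization;
using System.ServiceModel;

// Shared assembly: the contract both server and client compile against.
// Silverlight only supports the asynchronous Begin/End pattern.
[ServiceContract]
public interface IOrderService
{
    [OperationContract(AsyncPattern = true)]
    IAsyncResult BeginGetOrders(int customerId, AsyncCallback callback, object state);

    OrderDto[] EndGetOrders(IAsyncResult result);
}

[DataContract]
public class OrderDto
{
    [DataMember] public int Id { get; set; }
}

public static class ClientExample
{
    public static void Run()
    {
        // Client side: no svcutil-generated proxy, just a channel
        // created over the shared interface. Address is hypothetical.
        var factory = new ChannelFactory<IOrderService>(
            new BasicHttpBinding(),
            new EndpointAddress("http://example.com/OrderService.svc"));
        IOrderService proxy = factory.CreateChannel();

        proxy.BeginGetOrders(42, ar =>
        {
            OrderDto[] orders = proxy.EndGetOrders(ar);
            // Marshal back to the UI thread before touching view models.
        }, null);
    }
}
```

The emitted-client technique from the article automates the Begin/End plumbing on top of exactly this kind of shared contract.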
A new architect has joined our team, and now we are considering using his self-written implementation of a service bus. It looks much more mature, but it is built around a DuplexPushService, and I don't really know whether it will be as scalable and robust as the author says.
Why am I writing this? I just want to hear a story of a successful integration of a pattern or technology into a finished solution. What kind of technology do you use to support business logic? Can you describe its pros and cons? What would you do if you were starting a new Silverlight project right now?
I would appreciate your answers very much. Thank you for reading this wall of text, and sorry for the lack of code samples and links.
So the real question is how to consume data in a Silverlight application that should serve ~50k people.
The golden rule
- Do not do anything until performance really is a problem.
Bonus rule
- Design the DTOs so that only one WCF call is required per use case (if possible).
DTOs are not objects, so don't treat them as such on either end. Transform them into something more usable instead of using them directly.
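A rough sketch of both rules together (the type names are made up for illustration): one use-case-shaped DTO carries everything a screen needs in a single round trip, and the client transforms it at the boundary instead of binding to the wire format directly.

```csharp
using System.Runtime.Serialization;

// One DTO per use case: everything the "order details" screen
// needs, fetched in a single WCF call.
[DataContract]
public class OrderDetailsDto
{
    [DataMember] public int OrderId { get; set; }
    [DataMember] public string CustomerName { get; set; }
    [DataMember] public OrderLineDto[] Lines { get; set; }
}

[DataContract]
public class OrderLineDto
{
    [DataMember] public string Product { get; set; }
    [DataMember] public decimal Price { get; set; }
}

// The DTO is transformed at the service boundary; the rest of the
// client works with a real model, not the transport shape.
public class OrderDetails
{
    public int OrderId { get; private set; }
    public string CustomerName { get; private set; }
    public decimal Total { get; private set; }

    public static OrderDetails FromDto(OrderDetailsDto dto)
    {
        var model = new OrderDetails
        {
            OrderId = dto.OrderId,
            CustomerName = dto.CustomerName
        };
        foreach (var line in dto.Lines)
            model.Total += line.Price;
        return model;
    }
}
```

The point of the transform is that derived values (like the total) are computed once at the edge, so view models never reach into raw DTOs.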
When performance really is a problem
- Cache as much as you can in the Silverlight application.
- Use local storage and synchronization to keep data between sessions.
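In Silverlight, keeping data between sessions means isolated storage. A minimal sketch of a cache helper (the class and key names are hypothetical), using the `IsolatedStorageSettings` dictionary that Silverlight provides:

```csharp
using System.IO.IsolatedStorage;

// Hypothetical helper: persist the last server response so the app
// can start from cached data and refresh from the service lazily.
public static class LocalCache
{
    public static void Save<T>(string key, T value)
    {
        var settings = IsolatedStorageSettings.ApplicationSettings;
        settings[key] = value;   // the value type must be serializable
        settings.Save();         // flush to isolated storage on disk
    }

    public static bool TryLoad<T>(string key, out T value)
    {
        return IsolatedStorageSettings.ApplicationSettings
            .TryGetValue(key, out value);
    }
}
```

On startup the app would call `TryLoad` first, show whatever it gets, and then kick off the WCF call to refresh; isolated storage quotas are small by default, so this suits per-use-case DTOs rather than whole data sets.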
Similar to JGauffin. We take the DTOs and "bootstrap" them into "Models" which implement INotifyPropertyChanged, IDataErrorInfo, etc., and which we then use in our ViewModels. A purist would say that our Model is actually a lightweight ViewModel, but I think they're Models.

We take records from LINQ to SQL and transform them into CRUD objects (because of the old-school way we used to do it) and send them over a plain old Silverlight WCF service. That DTO becomes the ".Source" of the model, which all of its properties point to (set: Source.Property = value; get: return Source.Property;).

With EF, we make sure not to use Include because of our associations; we would end up in a recursive loop: A has a B, which has a reference to its parent, which has a B, and so on.
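The ".Source" wrapper described above might look something like this (a sketch with invented names, not the poster's actual code): the model holds the DTO, every property delegates to it, and change notification is raised for binding.

```csharp
using System.ComponentModel;

public class CustomerDto
{
    public string Name { get; set; }
}

// Model wrapping a DTO: the DTO stays the single source of the data,
// while the model adds the binding plumbing the views need.
public class CustomerModel : INotifyPropertyChanged
{
    public CustomerDto Source { get; private set; }

    public CustomerModel(CustomerDto source)
    {
        Source = source;
    }

    public string Name
    {
        get { return Source.Name; }
        set
        {
            Source.Name = value;
            OnPropertyChanged("Name");
        }
    }

    public event PropertyChangedEventHandler PropertyChanged;

    private void OnPropertyChanged(string name)
    {
        var handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(name));
    }
}
```

Because the wrapper never copies values out of the DTO, sending the model's `Source` back over the WCF service requires no reverse mapping step.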