What is the standard practice to build a webpage with services?
I have heard a lot about the benefits of SOA, services, etc. But what I fail to see is how it can be done with good performance.
I have a webpage with social network functionality and advertisements. Most of my code is entangled together, but I think the advertisements and the "recommended friends" features are pretty independent, perfect for an SOA-like approach.
I can make a REST HTTP-level API for these two services and call them for every page request of my main site.
I don't want users to wait too long for the page to load. If I get a bunch of data in my request (for instance, recommended user IDs) that I then have to process or use to look up related data, I feel I make things slower for no real benefit: it's a relatively slow HTTP call to get something I could have just processed locally. I've just moved my processing to the other side of a slow HTTP request.
The only other option I can see is to put iframes in my HTML templates with their src pointing to these external services. These services would return HTML directly, and the iframes would load in parallel.
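A third option between "call services synchronously one by one" and "iframes" is to fan the service calls out in parallel from the server before rendering. A minimal sketch of the idea, with hypothetical functions standing in for the two services and a sleep simulating network latency: the page waits roughly for the slowest call, not the sum of all calls.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for the two independent services; each sleeps
# to simulate the network latency of a real HTTP call.
def fetch_recommended_friends(user_id):
    time.sleep(0.1)  # simulated ~100 ms service call
    return ["alice", "bob"]

def fetch_advertisements(user_id):
    time.sleep(0.1)  # simulated ~100 ms service call
    return ["ad-1", "ad-2"]

def render_page(user_id):
    # Fan out both calls concurrently; the total wait is roughly the
    # latency of the slowest call, not the sum of both.
    with ThreadPoolExecutor(max_workers=2) as pool:
        friends = pool.submit(fetch_recommended_friends, user_id)
        ads = pool.submit(fetch_advertisements, user_id)
        return {"friends": friends.result(), "ads": ads.result()}

start = time.time()
page = render_page(42)
elapsed = time.time() - start  # ~0.1 s, not ~0.2 s
```

With sequential calls the two 100 ms services would cost 200 ms; in parallel the page pays only for the slowest one.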
It is said that a typical Amazon page is built from 150 services. I don't see 150 iframes there, so how do they do that and still get low latencies?
There seems to be a bit of mixing of terms in the question. First off, SOA is not simply the act of creating services (web services, it looks like in this case) for providing data. It is a fairly complex architectural concept (along the lines of DDD, MVC, etc.) that involves decomposing your application into a series of service-based tiers through which requests are made and served. So, as opposed to having large object graphs supporting a suite of business processes, you end up with a set of relatively atomic commands that result in fairly straightforward interactions with your overall model. One of the big benefits of this architecture is that it scales fairly well: rather than constantly having to rework your model and massage new commands/workflows into it, you can build new sets of services.
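To make the "relatively atomic commands" idea concrete, here is a hypothetical sketch (the class and method names are illustrative, not from the question): each service exposes small, self-contained operations instead of sharing one large object graph, so each can be versioned and scaled independently.

```python
# Hypothetical service facades: each command is small, stateless, and
# independent of the other service's model.
class RecommendationService:
    def recommend_friends(self, user_id, limit=5):
        # In a real system this would query a recommendation backend;
        # here it just fabricates IDs to keep the sketch self-contained.
        return [f"user-{user_id + i}" for i in range(1, limit + 1)]

class AdService:
    def pick_ads(self, user_id, slots=2):
        # Likewise, a stand-in for an ad-selection backend.
        return [f"ad-{slot}" for slot in range(slots)]

recs = RecommendationService().recommend_friends(10, limit=3)
ads = AdService().pick_ads(10)
```

The page-rendering code composes these commands; neither service needs to know about the other.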
All that having been said, service calls are fairly cheap. Consider the cost of making a service call (1k? 2k? less?) versus a full post-back or request of a web page (70k? 100+k?). If you require a full post and redirect for each command on the page, you are looking at a fairly high cost in terms of bandwidth and performance if you expect a lot of traffic. Systems built by companies such as Yahoo and Google benefit immensely from partitioning tasks into a series of commands that are executed asynchronously after the page has loaded, reducing perceived waiting time and overall network traffic.
The big key here is that these service calls are made asynchronously. As such, from the user's perspective, there is very little time spent waiting for a page to load. So, just like you see here on StackOverflow, you can continue scanning the text of the question while your upvote is being processed in the background.
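The fire-and-forget pattern behind that upvote example can be sketched server-side as well: the request handler enqueues the command and returns immediately, while a background worker does the actual processing. This is a minimal illustration using a queue and a worker thread (a real server would use a proper task queue).

```python
import queue
import threading

jobs = queue.Queue()
processed = []

def worker():
    # Background worker: drains the queue independently of request handling.
    while True:
        job = jobs.get()
        if job is None:  # sentinel to stop the worker
            break
        processed.append(job)  # e.g. persist the upvote to the database
        jobs.task_done()

threading.Thread(target=worker, daemon=True).start()

def handle_upvote(question_id):
    jobs.put(("upvote", question_id))  # cheap enqueue, no blocking work
    return "202 Accepted"              # respond before the work is done

status = handle_upvote(1234)
jobs.join()  # only for this demo; in a real server the worker keeps running
```

From the user's perspective the response is immediate; the actual processing happens in the background, exactly as with the asynchronous page updates described above.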
Is this the right way to go? That's up to you. If what you have works, stick with it. If it's worth the time and the ramp-up to implement a more complex solution because there is an active and perceptible degradation in performance caused by your current architecture, then maybe you should consider a change.