
Data Access Layer as a web service -- Is this a good idea?

I have been researching for a while and have actually created a prototype ASP.NET web service as a DAL for a few ASP.NET 2.0 web sites. I would like to ask for some insight/advice from more experienced developers out there who have successfully rolled out a DAL as a web service. What are the drawbacks/risks of deploying a DAL as a web service? What is the best way to secure or authenticate consumption of this web service? WCF is out of the question; I'll be coding in VS 2005.

Thank you.


Let's look at this from the standpoint of the Evolution of "Enterprise" Software Development projects:

  1. Begin with a very simple, well-organized, and perhaps new application with few maintenance issues (if you're lucky). Programmers might be recent grads, but the system is young or clean enough that they can be effective and quickly respond to change requests. Most database code may use stored procedures. There is no DBA involved, and there is no formal spec or road map.
  2. The application grows. There is a frequent need for multiple programmers to work in the same parts of the system at the same time. The new grads discover source control to help them share code among multiple programmers and move away from stored procedures in favor of n-Tier design or an ORM to make it easier to version the database code. This works well as long as each individual functional area is fairly isolated. A DBA may now begin helping tune queries. There is still no spec, but there might be a high-level road map or wish list.
  3. This eventually evolves into an interconnected system of apps that has grown organically rather than by design. Change requests become difficult, as changes in one area have subtle effects in others. To solve this problem of multiple apps talking to the same database and needing to share common and complex business logic, the programmers turn to a service oriented architecture (web services). Old data and business tiers are analyzed, combined, and refactored into a common set of web services. Most programmers now no longer even know how to connect to their database – only those working on the core services are allowed to do this, and even they tend to leave any actual SQL to the DBA team. If unit testing is not already in use it is now discovered as part of setting up a continuous integration system or issue tracker.
  4. The system continues to grow, but the business grows even faster. Things generally work; quality is good and performance is not great, but still acceptable. There is definitely a spec and issue tracker now; in fact, it's impossible to do anything that doesn't first have a tracking number associated with it. The problem is change rates are too slow. The layers of process between the programmers and application prevent them from keeping up with the business in a cost effective way. Someone discovers agile methods.
  5. Go back to step one.

To get serious again, the story above helps establish the context for web services and understand the problems they are intended to solve. We see from this context that web services really encompass both the data layer and the business layer. The purpose of a service layer is to enforce sharing a common set of rules among several applications. Leaving the business layer out of your service gives programmers the chance to write their own business code for each application, and that's really counter-productive to the purpose of using services in the first place.
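As a minimal sketch of that idea (all names here are hypothetical, not from the question), an ASMX service would expose a business operation such as "place an order" rather than raw table access, so every consuming application goes through the same rules:

```csharp
using System;
using System.Web.Services;

// Hypothetical example: the service boundary sits at the business
// operation, not at INSERT/UPDATE. Clients never see the database.
[WebService(Namespace = "http://example.com/services")]
public class OrderService : WebService
{
    [WebMethod]
    public int PlaceOrder(int customerId, int productId, int quantity)
    {
        // Business rules live behind the service, instead of being
        // re-implemented in each client application.
        if (quantity <= 0)
            throw new ArgumentOutOfRangeException("quantity");

        // ... validate credit, reserve stock, then write via the
        // private data-access code ...
        return SaveOrder(customerId, productId, quantity);
    }

    private int SaveOrder(int customerId, int productId, int quantity)
    {
        // Placeholder for the actual DAL call; returns the new order id.
        return 0;
    }
}
```

If instead the service only exposed `InsertOrderRow`, each client would be free to skip the credit check, which is exactly the duplication a service layer is meant to prevent.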

That said, it's possible that things end up stacked in layers, where you have raw data services that are private to certain parts of a business, and those "raw" services are in turn used to build the downstream services that comprise the business rules layer. It's hard to know for sure what businesses actually do. However, I get the sense this level of disconnect is less common.


I think the biggest drawback of such an approach is the additional overhead of calling a web service. If you need frequent queries/updates through the DAL, this can get quite slow.

My opinion is that such an approach is a kind of overengineering, unless you really need a physically separate DAL for different consumers and you need some additional validation/processing in the DAL (which is kind of wrong anyway).

Securing it can be quite simple. You can use SSL together with IIS authentication for your public service interface.
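As a rough sketch (values depend on your setup), SSL is enforced in IIS itself ("Require secure channel"), while a Web.config fragment like the following rejects anonymous callers at the ASP.NET level:

```xml
<!-- Hypothetical Web.config fragment for the service's virtual directory. -->
<configuration>
  <system.web>
    <authentication mode="Windows" />
    <authorization>
      <!-- "?" means anonymous users; deny them outright. -->
      <deny users="?" />
    </authorization>
  </system.web>
</configuration>
```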

So, those are my $0.03


The only real challenge I ever faced with exposing data over an ASMX-based web service was dreaming up all the methods required to get data efficiently. Sometimes it's hard to have the discipline to honor the tier between the application and the database.

If you are deploying to an intranet environment with Active Directory, Integrated Windows Authentication is an excellent way to control who can and cannot interact with a service. It is helpful to group service classes by consumer role, so that permissions can be controlled declaratively in Web.config. I tend to keep read methods in a different service class than insert, update, and delete methods.
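For instance (a sketch with hypothetical file and group names), keeping read and write methods in separate .asmx classes lets each get its own declarative permissions via `<location>` elements:

```xml
<!-- Hypothetical Web.config fragment: readers can reach the reporting
     service, but only editors can reach the maintenance service. -->
<configuration>
  <location path="ReportingService.asmx">
    <system.web>
      <authorization>
        <allow roles="DOMAIN\Readers,DOMAIN\Editors" />
        <deny users="*" />
      </authorization>
    </system.web>
  </location>
  <location path="MaintenanceService.asmx">
    <system.web>
      <authorization>
        <allow roles="DOMAIN\Editors" />
        <deny users="*" />
      </authorization>
    </system.web>
  </location>
</configuration>
```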

Avoid chatty service calls. Of course, it is wise to avoid chatty database calls in a two-tier system as well, but you'll pay the piper for chatty calls when you increase the number of tiers. Choose to send larger objects: for example, if you have a table with a few lookups, sending an object across the wire with pre-looked-up values will often save you a second or third call, and shouldn't place an undue burden on the system.
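A quick sketch of the pre-looked-up-values idea (type and method names are invented for illustration):

```csharp
// Hypothetical coarse-grained DTO: the lookups (customer name, status
// name) are resolved server-side, so the client gets everything the
// screen needs in one round trip.
public class OrderDetails
{
    public int OrderId;
    public int CustomerId;
    public string CustomerName;  // pre-looked-up from the Customers table
    public string StatusName;    // pre-looked-up from the status table
    public decimal Total;
}

// One service call:
//   OrderDetails details = service.GetOrderDetails(orderId);
//
// ...instead of three chatty calls, each paying full serialization
// and network overhead:
//   Order o    = service.GetOrder(orderId);
//   Customer c = service.GetCustomer(o.CustomerId);
//   Status s   = service.GetStatus(o.StatusId);
```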

I hope these ideas help.


I would recommend against this until you can move to WCF. Otherwise, you'll be passing all your data back and forth in text-based XML over HTTP, which will slow things down substantially. You'll also have very little choice about security, being limited to SSL for encryption.

WCF will allow you to send binary data over TCP/IP, named pipes or message queues, and will allow you a great deal of flexibility in terms of security.
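For comparison, should a move to WCF become possible later, a binary TCP endpoint is mostly a configuration choice. A rough sketch (service and contract names are placeholders):

```xml
<!-- Hypothetical WCF configuration: netTcpBinding uses a binary
     message encoding over TCP, with transport security by default,
     instead of text XML over HTTP. -->
<system.serviceModel>
  <services>
    <service name="MyApp.DataService">
      <endpoint address="net.tcp://localhost:8523/DataService"
                binding="netTcpBinding"
                contract="MyApp.IDataService" />
    </service>
  </services>
</system.serviceModel>
```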
