Why is establishing a connection with a database server considered a heavyweight, resource-intensive process?

How do we measure the 'resource-intensive' part?

As a follow-up, I have another question:

If pooling is enabled for a web application that runs in a web farm, what is the size of the pool?

Are 4 pools created if there are 4 servers in the farm? Is a separate connection pool maintained for every distinct process, app domain and connection string?
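For reference, this is the kind of connection string I have in mind (a minimal sketch; the server, database, and pool sizes are placeholders). My assumption is that the Pooling, Min Pool Size and Max Pool Size keywords govern the pool that is keyed on this exact string:

    // Minimal sketch using System.Data.SqlClient; server/database names and
    // pool sizes below are placeholders, not recommendations.
    using System;
    using System.Data.SqlClient;

    class PoolSettingsSketch
    {
        static void Main()
        {
            // Connections opened with the exact same string reuse one pool.
            // SqlClient defaults: Min Pool Size=0, Max Pool Size=100.
            var connStr = "Server=.;Database=AppDb;Integrated Security=true;" +
                          "Pooling=true;Min Pool Size=5;Max Pool Size=50";

            using (var conn = new SqlConnection(connStr))
            {
                conn.Open();   // drawn from (or added to) the pool for connStr
            }                  // Dispose/Close returns the connection to the pool
        }
    }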

Is there a comprehensive article that explains connection pooling in ADO.NET? I have already read the ones on codeproject, 4guysfromrolla, google, etc., but could not find answers to my questions there, so I am looking for something more thorough.

How is connection pooling handled in EF 4.1?
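
For illustration, this is roughly how I use EF 4.1 (a minimal sketch; the context and entity names are made up). My assumption is that DbContext opens and closes connections through the underlying ADO.NET provider, so the provider's connection pool and the pooling keywords in the connection string still apply; is that correct?

    // Minimal EF 4.1 sketch; "OrdersDb" refers to a connection string in
    // app.config/web.config where any pooling keywords would be specified.
    using System.Data.Entity;
    using System.Linq;

    public class Order
    {
        public int Id { get; set; }
    }

    public class OrdersContext : DbContext
    {
        public OrdersContext() : base("name=OrdersDb") { }
        public DbSet<Order> Orders { get; set; }
    }

    class EfPoolingSketch
    {
        static void Main()
        {
            using (var ctx = new OrdersContext())
            {
                // EF opens the provider connection for the query and closes it
                // afterwards; closing returns it to the ADO.NET pool.
                var count = ctx.Orders.Count();
            }
        }
    }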


Answer to the original question posed in the title:

An RDBMS is nothing more than a remote interface to the consuming application, regardless of its purpose (to store data), and the smoke and mirrors (data access frameworks such as O/RMs) hide the lack of locality. In reality, an object-oriented system with RDBMS integration is similar to an object-oriented system with web service integration: explicit remote boundaries exist and cannot be discounted. The perceived, demonized impedance mismatches are a symptom of misconceptions about distributed systems. An invisible and often abused line in the sand exists (the remote interface), and crossing it incurs a significant penalty in CPU and I/O cost.

See: http://www.softwareishardwork.com/Exposing%20the%20True%20Nature%20of%20Impedance%20Mismatches%20in%20Data%20Programming%20Models%20%5BBullington,%20D.%202010%5D.pdf
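
To put a number on that cost, here is a minimal sketch (System.Data.SqlClient assumed; server and database names are placeholders) that times repeated Open/Close calls with pooling disabled and then enabled: with Pooling=false each Open pays for the network handshake, login, and session setup, while with pooling on, subsequent Opens reuse a physical connection from the pool.

    // Minimal timing sketch; point the connection string at a reachable test
    // server before running. Numbers will vary by network and server load.
    using System;
    using System.Data.SqlClient;
    using System.Diagnostics;

    class OpenCostSketch
    {
        static void Main()
        {
            const string baseStr = "Server=.;Database=AppDb;Integrated Security=true;";

            Console.WriteLine("Pooling off: {0} ms", TimeOpens(baseStr + "Pooling=false", 20));
            Console.WriteLine("Pooling on : {0} ms", TimeOpens(baseStr + "Pooling=true", 20));
        }

        // Opens and closes a connection repeatedly and returns the elapsed time.
        static long TimeOpens(string connStr, int iterations)
        {
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < iterations; i++)
            {
                using (var conn = new SqlConnection(connStr))
                {
                    conn.Open();   // with Pooling=false this is a full connect every time
                }
            }
            sw.Stop();
            return sw.ElapsedMilliseconds;
        }
    }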
