
Performance considerations of destroying a DataContext vs. keeping it open for future DB access?

I'm using LINQ2SQL to handle my database needs in an ASP.NET MVC 3 project. I have a separate model that contains all of my database access in its own class, as follows:

public class OperationsMetricsDB
{
    public IEnumerable<client> GetAllClients()
    {
        OperationsMetricsDataContext db = new OperationsMetricsDataContext();
        var clients = from r in db.clients
                      orderby r.client_name ascending
                      select r;
        return clients;
    }

    public void AddClient(client newClient)
    {
        OperationsMetricsDataContext db = new OperationsMetricsDataContext();
        db.clients.InsertOnSubmit(newClient);
        db.SubmitChanges();
    }

    // ...about 50 more methods following the same pattern...
}

I have about 50 different methods in this class, and each one creates and then destroys its own copy of the DataContext. My reasoning was that this would save memory, because the DataContext is released as soon as the method is done with the connection. However, I have a feeling it may be better to keep a single DataContext open and reuse it, instead of disposing and re-establishing the connection over and over again, e.g.:

public class OperationsMetricsDB
{
    OperationsMetricsDataContext db = new OperationsMetricsDataContext();

    public IEnumerable<client> GetAllClients()
    {
        var clients = from r in db.clients
                      orderby r.client_name ascending
                      select r;
        return clients;
    }

    public void AddClient(client newClient)
    {
        db.clients.InsertOnSubmit(newClient);
        db.SubmitChanges();
    }

    // ...the other methods all share the same DataContext instance...
}

What is the best practice on this?


I personally use the Unit of Work pattern in conjunction with Repositories for this.

The UnitOfWork creates and manages the DataContext. It then passes the context to each repository when requested. Each time the caller wants to do a new set of operations with the database, they create a new UnitOfWork.

The interfaces would look something like:

public interface IUnitOfWork
{
    IRepository<T> GenerateRepository<T>() where T : class;
    void SaveChanges();
}

public interface IRepository<T> where T : class
{
    IQueryable<T> Find();
    T Create(T newItem);
    T Delete(T item);
    T Update(T item);
}

That ensures that the context's lifespan is exactly one Unit of Work long (which is longer than a single operation but shorter than the lifespan of the application).
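
To make that concrete, here is a minimal sketch of one way those interfaces could be backed by the question's OperationsMetricsDataContext. The class names (LinqToSqlUnitOfWork, LinqToSqlRepository) are illustrative placeholders, not part of the question, and error handling is omitted:

using System;
using System.Data.Linq;
using System.Linq;

public class LinqToSqlUnitOfWork : IUnitOfWork, IDisposable
{
    private readonly OperationsMetricsDataContext _context = new OperationsMetricsDataContext();

    public IRepository<T> GenerateRepository<T>() where T : class
    {
        // Every repository created here shares this unit of work's single DataContext.
        return new LinqToSqlRepository<T>(_context);
    }

    public void SaveChanges()
    {
        _context.SubmitChanges();
    }

    public void Dispose()
    {
        _context.Dispose();
    }
}

public class LinqToSqlRepository<T> : IRepository<T> where T : class
{
    private readonly Table<T> _table;

    public LinqToSqlRepository(OperationsMetricsDataContext context)
    {
        _table = context.GetTable<T>();
    }

    public IQueryable<T> Find()
    {
        return _table;
    }

    public T Create(T newItem)
    {
        _table.InsertOnSubmit(newItem);
        return newItem;
    }

    public T Delete(T item)
    {
        _table.DeleteOnSubmit(item);
        return item;
    }

    public T Update(T item)
    {
        // LINQ to SQL already tracks entities loaded through this context,
        // so pending changes are picked up by SubmitChanges without extra work here.
        return item;
    }
}

The caller then scopes one unit of work around each logical set of operations:

using (var unitOfWork = new LinqToSqlUnitOfWork())
{
    var clients = unitOfWork.GenerateRepository<client>();
    clients.Create(new client { client_name = "Acme" });
    unitOfWork.SaveChanges();
}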


It's not recommended to carry a DataContext around for a long time, so you are on the right path. As far as I know it uses connection pooling, so the performance hit of creating more than one DataContext over an application's lifetime is not too serious.

But I would not create a new context instance for every single method call of your data class.

I prefer to use it in a unit-of-work style. Within a web application, the processing of an HTTP request can be seen as a unit of work.

So my advice is to create one DataContext instance for the lifetime of one HTTP request and dispose it afterwards.
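
As a rough sketch of that idea (just one possible wiring, and the RequestDataContext helper name is made up for illustration), the context can be kept in HttpContext.Items and disposed when the request ends:

using System.Web;

public static class RequestDataContext
{
    private const string Key = "OperationsMetricsDataContext";

    // Lazily create one DataContext per HTTP request and cache it on the request.
    public static OperationsMetricsDataContext Current
    {
        get
        {
            var items = HttpContext.Current.Items;
            var db = items[Key] as OperationsMetricsDataContext;
            if (db == null)
            {
                db = new OperationsMetricsDataContext();
                items[Key] = db;
            }
            return db;
        }
    }

    // Call this from Application_EndRequest in Global.asax.cs.
    public static void DisposeCurrent()
    {
        var db = HttpContext.Current.Items[Key] as OperationsMetricsDataContext;
        if (db != null)
        {
            db.Dispose();
        }
    }
}

// In Global.asax.cs:
// protected void Application_EndRequest()
// {
//     RequestDataContext.DisposeCurrent();
// }

The OperationsMetricsDB methods can then use RequestDataContext.Current instead of newing up a context per call, so every query within the same request reuses one context.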


One context per request is usually fine for most applications.

http://blogs.microsoft.co.il/blogs/gilf/archive/2010/05/18/how-to-manage-objectcontext-per-request-in-asp-net.aspx
