
Why would IQueryable<T>.FirstOrDefault() return the same reference every time for one object type and not another?

I came across this while playing with LINQ to SQL and can't seem to see it in a light that makes any sense. I have an IQueryable as:

var personnel = from p in dc.Resources select p;
var p1 = personnel.FirstOrDefault();
var p2 = personnel.FirstOrDefault();
object.ReferenceEquals(p1, p2);

The above ReferenceEquals() call evaluates to false every time (as I would think it should, since a new T-SQL call is generated for each call to FirstOrDefault()). However, I also have another IQueryable (a different table in the same database) as:

var accounts = from a in dc.Accounts select a;
var a1 = accounts.FirstOrDefault();
var a2 = accounts.FirstOrDefault();
object.ReferenceEquals(a1, a2);

This time the ReferenceEquals() call evaluates to true every time. Any ideas how this is possible? (Note: I have checked to verify that a1, a2, p1 and p2 all evaluate to instances of their respective classes and are not null.)


Although LINQ to SQL will execute the query every time you call FirstOrDefault, for every object returned from the database it checks whether an entity with the same identity is already in its cache; if it is, it throws away the freshly fetched data and simply returns the cached object.
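A minimal sketch of that lookup, with hypothetical names (this is an illustration of the identity-map idea, not LINQ to SQL's actual internals):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical identity map, sketching what a DataContext does when it
// materializes a row: if an entity with the same primary key is already
// tracked, the fetched data is discarded and the cached instance returned.
class IdentityMap
{
    private readonly Dictionary<object, object> _cache =
        new Dictionary<object, object>();

    public T GetOrAdd<T>(object primaryKey, T fetched) where T : class
    {
        if (primaryKey == null)
            return fetched; // no key => no tracking, always a new instance

        object cached;
        if (_cache.TryGetValue(primaryKey, out cached))
            return (T)cached; // same key => same reference every time

        _cache.Add(primaryKey, fetched);
        return fetched;
    }
}
```

Under this model, repeated queries that materialize the same keyed row hand back the same reference (ReferenceEquals is true), while rows the context cannot key are fresh objects each time.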


There must be something about the first entity that is preventing the DataContext from giving you the cached version. The second example is the behavior you would normally see: the DataContext recognizes when you are requesting an object it has already loaded, and simply hands you the same reference when it can.
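One common cause, offered here as an assumption since the original table definitions aren't shown: LINQ to SQL's identity tracking only works for entities whose mapping declares a primary key. If the Resources mapping has no primary key, the DataContext cannot track those objects and materializes a fresh instance on every query, while Accounts, with a mapped key, is served from the cache. The mapping looks like:

```csharp
using System.Data.Linq.Mapping;

[Table(Name = "Accounts")]
public class Account
{
    // With IsPrimaryKey = true, the DataContext can track this entity
    // in its identity map and hand back the same reference each time.
    [Column(IsPrimaryKey = true)]
    public int Id { get; set; }

    [Column]
    public string Name { get; set; }
}
```

If the Resources class has only plain [Column] attributes with no IsPrimaryKey = true, that would explain why p1 and p2 are different references while a1 and a2 are the same.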
