Exchange Web Services (Managed API) vs. WebDav Performance Question
I'm new to Exchange (2007) development, so please bear with me. :-) There appears to be a myriad of technologies for Exchange development -- the latest being Exchange Web Services and its related Managed API. I need to write a program that can -- if necessary -- run on the Exchange servers to scan people's mailboxes for the purpose of purging messages that meet various criteria (irrelevant for this discussion).
It is my understanding that most of the other technologies -- WebDAV, MAPI, CDO -- are now deprecated with respect to Exchange 2007 and Exchange 2010. Since this is a greenfield application, I decided to use the Exchange Web Services Managed API.
I'm concerned about the number of items I can scan per hour. Since it is web-services based, there is a network hop involved, so I'd like to run this utility on the server with which I am communicating. Am I correct that I have to talk to a "Hub" server? I'm using Autodiscover, and it appears to resolve to a hub server no matter which mail server contains the actual message store I'm scanning.
When pulling multiple items down -- using ExchangeService.FindItems and specifying a page size of 500 -- I get pretty good throughput from my workstation to the hub server. I was able to retrieve 22,000 mail items in 47 seconds. That seems reasonable. However, it turns out that not all properties are "bound" when retrieved that way. Certain properties -- like ToRecipients and CcRecipients -- are not filled in. You have to explicitly bind them (individually) with a call to
Item.Bind(Server, Item.Id)
This is a separate round-trip to the server, and it drops throughput from about 460 items per second to 3 items per second -- which is unworkable.
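To illustrate, the per-item rebinding I'm describing looks roughly like this (a sketch; the ExchangeService setup and folder choice are assumptions, not my actual code):

```csharp
// Sketch of the slow pattern: FindItems returns the page quickly, but
// each Item.Bind call is a separate network round-trip to the hub
// server -- this is what drops throughput to ~3 items/second.
FindItemsResults<Item> results =
    service.FindItems(WellKnownFolderName.Inbox, new ItemView(500));

foreach (Item item in results)
{
    // Re-fetch the item so properties like ToRecipients are populated.
    Item bound = Item.Bind(service, item.Id);
    // ... examine bound.ToRecipients, etc., and decide whether to purge
}
```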
So -- a few other questions. Is there any way to force the missing properties to be bound during the call to FindItems? Failing that, is there a way to bind multiple items at once?
Finally, am I right in choosing Exchange Web Services at all for this type of work? I love the simplicity of the programming model and would not like to move to another technology if it is (a) more complex or (b) deprecated. If another technology will do this job better, and it is not deprecated, then I would consider using it if necessary. Your opinion and advice is appreciated.
You can use the service to load many properties for many items in one call to the server -- it is designed exactly for your problem. It is just unfortunate that the Managed API documentation is still pretty thin on the ground.
results = service.FindItems(...); // or whatever find call you are making
service.LoadPropertiesForItems(results, propertySet);
where propertySet is something like:
PropertySet propertySet = new PropertySet(BasePropertySet.IdOnly, ItemSchema.Subject, customDefinitions);
Use the various schema classes (ItemSchema, EmailMessageSchema, and so on) to load only the specific fields you want, to minimise load if you are fetching lots of records back.
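Putting it together, a paged scan that batches the property loads might look like this (a sketch only: the folder, page size, and the choice of recipient properties are assumptions taken from the question, not tested against your environment):

```csharp
// Page through the mailbox with FindItems, then fill in the needed
// properties for each whole page in a single round-trip via
// ExchangeService.LoadPropertiesForItems.
PropertySet propertySet = new PropertySet(
    BasePropertySet.IdOnly,
    ItemSchema.Subject,
    ItemSchema.ToRecipients,   // properties FindItems leaves unbound
    ItemSchema.CcRecipients);

int offset = 0;
FindItemsResults<Item> results;
do
{
    ItemView view = new ItemView(500, offset);
    results = service.FindItems(WellKnownFolderName.Inbox, view);

    // One call loads the requested properties for the entire page,
    // instead of one Item.Bind round-trip per item.
    service.LoadPropertiesForItems(results, propertySet);

    foreach (Item item in results)
    {
        // apply the purge criteria here
    }

    offset += results.Items.Count;
} while (results.MoreAvailable);
```

This keeps the round-trip count at two per page (one FindItems, one LoadPropertiesForItems) rather than one per item, which is what restores throughput to something close to your original 460 items/second.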