
Fastest .NET data structure for parallel search

Let's say we have a big read only list of (int, string) elements.

What would be the fastest way to get an item from that list?

I know a generic dictionary is fast, but as far as I know it uses only one CPU core, and today's computers have at least two.

As a side question: what would be the fastest way to search this collection for multiple items at once? For example, collection.GetItems(new int[]{1,2,3,4}), where 1, 2, 3, 4 are the keys.

Thanks!


A dictionary uses a hash table, so lookups amortize to O(1). Computing the hash of the key is very fast, and the lookup itself is a direct array offset plus, at worst, a walk down a short collision chain.
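As a concrete sketch of that baseline (the element values here are made up for illustration): build the dictionary once from the read-only list, then every lookup is a single hash probe.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class DictionaryLookupDemo
{
    static void Main()
    {
        // The big read-only (int, string) list from the question
        // (sample data for illustration).
        var items = new List<(int Key, string Value)>
        {
            (1, "one"), (2, "two"), (3, "three"), (4, "four")
        };

        // Build the dictionary once up front; each lookup afterwards is
        // amortized O(1): a hash, an array offset, a short chain walk.
        var index = items.ToDictionary(i => i.Key, i => i.Value);

        Console.WriteLine(index[3]); // prints "three"
    }
}
```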

Therefore I would not recommend optimizing the lookups unless a dictionary genuinely does not serve your needs and is measurably too slow. You could argue that there's a processor sitting there going to waste, but trying to leverage that processor to optimize a problem that may not exist will only complicate your code.

I would recommend building a lookup dictionary once and using it for each lookup.

The only consideration is memory. A dictionary adds a memory footprint to make the lookups fast - the typical space-vs-time trade-off.

If you need to keep memory low, still need faster lookups, and have spare processing power (multi-core), then a parallel search might be worth considering.

In that case, I would recommend you look into the Task Parallel Library. Here's an article: http://www.codeproject.com/KB/cs/TPL1.aspx
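For the side question, one hedged sketch of the hypothetical GetItems using PLINQ (which sits on top of the TPL): fan the keys out across cores, while the underlying dictionary is only ever read, which is safe from multiple threads as long as nothing modifies it. For a handful of keys the scheduling overhead usually outweighs the gain, so measure before committing to this.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class ParallelBatchLookup
{
    // Hypothetical GetItems from the question: look up several keys at once.
    static string[] GetItems(Dictionary<int, string> index, int[] keys)
    {
        return keys.AsParallel()
                   .AsOrdered()            // keep results in the keys' order
                   .Select(k => index[k])  // concurrent reads of an unmodified
                                           // Dictionary are thread-safe
                   .ToArray();
    }

    static void Main()
    {
        // Sample data for illustration.
        var index = new Dictionary<int, string>
        {
            { 1, "one" }, { 2, "two" }, { 3, "three" }, { 4, "four" }
        };

        var results = GetItems(index, new[] { 1, 2, 3, 4 });
        Console.WriteLine(string.Join(",", results)); // prints "one,two,three,four"
    }
}
```

Note that each individual lookup is still the same O(1) dictionary probe; parallelism only helps when the batch of keys is large enough to keep several cores busy.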

