C# - What are the available multi-dimensional data structures that perform well?
I need to store multi-dimensional numeric data in a manner that's easy to work with. I'm capturing data in real time (processing live video frames every 40ms/10ms), and once a frame is processed I would destroy the older data and let the GC reclaim it.
This data structure must be fast so it won't hurt my overall app performance. The faster the better (my algorithms are already time-consuming, from 1ms to 20ms, and I will be reading/writing this data a lot throughout my code).
What are my choices in terms of platform-supported data structures? I'm using VS 2010 and .NET 4.
The fastest way is probably:
- A one-dimensional array, indexed like a multi-dimensional array (array[x, y] <=> array[x + y * columns]), accessed through pointers if random reads/writes occur; see the sketch after this list.
- Reuse the array as much as possible; don't allocate new ones constantly. Especially if it's big: if the size is > 85K, it gets allocated on the Large Object Heap, which doesn't get compacted by the GC after a collection, so constantly reallocating the array eventually causes fragmentation.
- Access it in parallel as much as possible. Just make sure multiple threads never, ever write to the same element.
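As a rough illustration of the first two points, something like the following wrapper could be used; the FrameBuffer name, element type, and sizes are placeholders, not part of the original answer:

using System;

// Hypothetical frame buffer: one flat array, allocated once and reused for every frame.
class FrameBuffer
{
    private readonly int _columns;
    private readonly double[] _data;

    public FrameBuffer(int rows, int columns)
    {
        _columns = columns;
        _data = new double[rows * columns];   // single allocation, reused instead of reallocated
    }

    // array[x, y] <=> array[x + y * columns]
    public double this[int x, int y]
    {
        get { return _data[x + y * _columns]; }
        set { _data[x + y * _columns] = value; }
    }

    public void Clear() { Array.Clear(_data, 0, _data.Length); }   // wipe in place between frames
}

With unsafe code and pointers you could also skip the indexer's bounds checks, which is what the "accessed through pointers" point refers to.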
Of course, this is a generic answer, because the question is also pretty generic. Can't really go wrong with it, though.
Realtime or not, creating an object in .NET is pretty fast, so what about a ConcurrentQueue + tuples?
That way you can read/write thread-safely with pretty standard data structures. The following code took 3 seconds on my T500 inside Visual Studio with debugging on:
using System;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Linq;
Stopwatch sw = Stopwatch.StartNew();
var q = new ConcurrentQueue<Tuple<int, int, int>>();
// Enqueue 10 million tuples from parallel worker threads.
Enumerable.Range(0, 10000000).AsParallel().ForAll(i => q.Enqueue(Tuple.Create(i, i, i)));
Console.WriteLine(sw.ElapsedMilliseconds);
Console.ReadLine();
Funnily enough, the non-parallel version is even faster.
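For reference, the non-parallel version being compared would simply be the same loop without AsParallel (same setup assumed):

Stopwatch sw = Stopwatch.StartNew();
var q = new ConcurrentQueue<Tuple<int, int, int>>();
// Plain sequential enqueue of the same 10 million tuples.
for (int i = 0; i < 10000000; i++)
    q.Enqueue(Tuple.Create(i, i, i));
Console.WriteLine(sw.ElapsedMilliseconds);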
How about a simple multidimensional array? That seems like the obvious and fastest choice.
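For illustration only (the sizes are placeholders), a rectangular array for a frame of doubles looks like this:

// Rectangular (multidimensional) .NET array, indexed as frame[row, col].
double[,] frame = new double[480, 640];
frame[100, 200] = 1.0;
double v = frame[100, 200];

It is certainly the simplest option; the trade-off is that element access on [,] arrays is generally a bit slower than on a manually flattened 1D array, which is what the flat-array answer above is optimizing for.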