DataSet size best practices - are there any general rules?

I'm working on a desktop application that will produce several in-memory datasets as an intermediate step before the data is committed to a database.

Obviously I'm going to try to keep these as small as possible, but are there any guideline thresholds I shouldn't cross if the application is to work well on an 'average' machine?

Thanks for any help.


There is no "average" machine. Computers still in use span a wide range, including machines running DOS/Win3.1/Win9x with less than 64MB of installed RAM. If you don't set minimum hardware requirements for your application, at least decide on the oldest OS you plan to support, and use that OS's official minimum hardware requirements as a lower-bound assessment.

Generally, if your application is going to consume a considerable amount of RAM, you may want to let the user configure the upper bound on the application's memory usage.
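
For example, a minimal sketch of such a user-configurable cap, assuming the value lives in App.config under a hypothetical key `MaxDataSetMemoryMB` (requires a reference to System.Configuration):

```csharp
using System;
using System.Configuration;

static class MemoryLimits
{
    // Reads a hypothetical "MaxDataSetMemoryMB" app setting; falls back
    // to a conservative default if the setting is absent or invalid.
    public static long GetMaxDataSetBytes()
    {
        string raw = ConfigurationManager.AppSettings["MaxDataSetMemoryMB"];
        if (long.TryParse(raw, out long mb) && mb > 0)
            return mb * 1024L * 1024L;
        return 64L * 1024L * 1024L; // default: 64 MB
    }
}
```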

That said, if you decide to manage the upper bound dynamically based on real-time data, there are quite a few things you can do.

If you're developing a Windows application, you can use WMI to get the system's total memory and base your limit on that value (say, use up to 5% of total memory).
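
A sketch of that WMI query, assuming a reference to System.Management; the 5% budget is just the example figure from above:

```csharp
using System;
using System.Management;

static class SystemMemory
{
    // Queries WMI for the machine's total physical memory in bytes.
    public static ulong GetTotalPhysicalBytes()
    {
        using (var searcher = new ManagementObjectSearcher(
            "SELECT TotalPhysicalMemory FROM Win32_ComputerSystem"))
        {
            foreach (ManagementObject mo in searcher.Get())
                return (ulong)mo["TotalPhysicalMemory"];
        }
        return 0;
    }

    // e.g. let the in-memory datasets use at most 5% of physical RAM.
    public static ulong GetDataSetBudget() => GetTotalPhysicalBytes() / 20;
}
```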

In .NET, if your data structures are complex and it's hard to estimate how much memory you're consuming, you can query the garbage collector for the amount of allocated managed memory with GC.GetTotalMemory(false), or inspect the whole process via a System.Diagnostics.Process object.
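
For illustration, both measurements side by side; GC.GetTotalMemory covers the managed heap only, while Process.PrivateMemorySize64 covers the whole process, including unmanaged allocations:

```csharp
using System;
using System.Diagnostics;

static class MemoryUsage
{
    // Managed heap only; passing false avoids forcing a collection first,
    // so the figure is approximate but cheap to obtain.
    public static long ManagedBytes() => GC.GetTotalMemory(false);

    // Private bytes for the whole process, including unmanaged memory.
    public static long ProcessPrivateBytes()
    {
        using (Process p = Process.GetCurrentProcess())
            return p.PrivateMemorySize64;
    }
}
```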

