
Is it possible to estimate the RAM needed from Turing-machine space complexity?

For Turing machines, complexity can be measured in both space (cells used on the tapes) and time.

There are classes such as PSPACE and EXPSPACE.

Further, we can present algorithms that are definitely in PSPACE.

http://www.springerlink.com/content/3hqtq11mqjbqfj2g/

However, when I actually write programs, some run faster than others, and some have a smaller memory (RAM) footprint than others.

Presumably if I code a PSPACE algorithm to solve problem X and also an EXPSPACE algorithm to solve the same problem, the EXPSPACE program should use much more RAM than the PSPACE code.

Is there any way to estimate how much RAM will be involved, based on theoretical rating of the starting algorithm?


Presumably if I code a PSPACE algorithm to solve problem X and also an EXPSPACE algorithm to solve the same problem, the EXPSPACE program should use much more RAM than the PSPACE code.

You presume wrong.

These complexity classes describe the asymptotic growth of memory required to perform the algorithm. They tell you absolutely nothing about the actual amount of RAM required.

Basically, there is some problem size n above which the EXPSPACE algorithm will use more memory than the PSPACE one, but below that size you can't really say anything (just as an O(n²) algorithm might run faster than an O(n) algorithm for small n).
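
To make the constant-factor point concrete, here is a minimal Python sketch. Both routines and the factor of 1000 are invented purely for illustration: an O(n)-space routine with a large constant out-consumes an O(n²)-space routine until n passes the crossover.

```python
import tracemalloc

def alg_linear(n):
    # O(n) space, but with a large constant factor (1000 slots per unit of input)
    return [0.0] * (1000 * n)

def alg_quadratic(n):
    # O(n^2) space, small constant factor
    return [0.0] * (n * n)

def peak_bytes(f, n):
    """Measure the peak traced allocation while running f(n)."""
    tracemalloc.start()
    f(n)
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return peak

# Below n = 1000 the "worse" quadratic algorithm uses less RAM;
# above it, the asymptotics finally win.
for n in (10, 100, 1000, 2000):
    print(n, peak_bytes(alg_linear, n), peak_bytes(alg_quadratic, n))
```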


In principle no. In practice, sometimes.

First you have to understand what complexity analysis actually means (review the definitions in your textbook). PSPACE just means that the space required is bounded by a polynomial function of the input size. It doesn't tell you what that bounding function is, or what the actual space used is. So you can't work out anything about RAM just by knowing an algorithm is in PSPACE.
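
For reference, the textbook definition being invoked here can be written as

$$\mathrm{PSPACE} \;=\; \bigcup_{k \ge 1} \mathrm{SPACE}\!\left(n^{k}\right)$$

Membership only guarantees that *some* polynomial bound exists; it pins down neither the degree nor the constants, which is exactly why it says nothing about concrete RAM figures.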

If you know that an algorithm is in PSPACE, you might hypothesise that the space it uses isn't just bounded by a polynomial, but actually described by one. That hypothesis may not hold, but for many algorithms it does. You could then calculate (or measure) the space used for various input sizes and try to fit a polynomial to the data.
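
As a sketch of that fitting approach (the measurements below are invented, and the degree-2 hypothesis is an assumption you would have to justify from the algorithm itself):

```python
import numpy as np

# Hypothetical peak-RAM measurements (MB) at several input sizes.
sizes  = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
ram_mb = np.array([12.0, 41.0, 155.0, 610.0, 2430.0])

# Hypothesis: usage is *described* (not just bounded) by a degree-2
# polynomial. polyfit returns the least-squares coefficients.
model = np.poly1d(np.polyfit(sizes, ram_mb, deg=2))

print(model)        # the fitted polynomial
print(model(32.0))  # extrapolated RAM estimate for input size 32
```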

In general that's fairly futile (because without knowing the order of the polynomial, there are infinitely many possible fits). But in practice if you know that the space used is, say O(n), and you have some idea what kinds of input will produce worst-case space use, then you can make fairly accurate predictions. If it takes 10MB of RAM to process 1MB of input, and 20MB of RAM to process 2MB of input, then frequently it will take about 100MB of RAM to process 10MB of input. But you will only gain this insight from more detailed knowledge of the algorithm than just knowing it has polynomial space complexity.
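
A minimal sketch of that linear extrapolation, using the hypothetical numbers from the paragraph above:

```python
# Two hypothetical measurements: (input size MB, peak RAM MB).
a, b = (1.0, 10.0), (2.0, 20.0)

slope = (b[1] - a[1]) / (b[0] - a[0])   # MB of RAM per MB of input
intercept = a[1] - slope * a[0]

# The prediction is only as good as the linear (O(n)) hypothesis and the
# assumption that the measured inputs exercised worst-case space use.
print(slope * 10.0 + intercept)         # -> 100.0 MB for a 10 MB input
```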

