
Computer logic problem

Consider a multilevel computer in which all the levels are different. Each level has instructions that are m times as powerful as those of the level below it; that is, one level r instruction can do the work of m level r-1 instructions. If a level 1 program requires k seconds to run, how long would equivalent programs take at levels 2, 3, and 4, assuming n level r instructions are required to interpret a single level r+1 instruction?

This is the solution I came up with. Can anyone confirm or comment?


Level (r)    Level-1 instructions (m^(r-1))    Time
4            m^3                               t(q) = (m^3*q + n + n*m + n*m^2)(k/q)
3            m^2                               t(q) = (m^2*q + n + n*m)(k/q)
2            m                                 t(q) = (m*q + n)(k/q)
1            1                                 t(q) = k

To calculate the runtime t(q) for a given program containing q level-1 instructions, we must account for two things: the exponentially increasing number of level-1 instructions each level r instruction represents (shown as m^(r-1)), and the additional level-1 instructions required for interpretation at each layer on which the program executes (shown as n*m^(r-1)). For r > 2, the level-1 instructions spent on interpretation by the lower levels must also be added into the final equations. Finally, for each equation we can determine the number of seconds the program takes to run by multiplying the total number of level-1 instructions used by the execution time of one level-1 cycle, which is k/q.
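As a sanity check on my table, here is a small sketch that computes the same totals. The function name and the parameter order are my own, not part of the assignment:

```python
# Sketch of the table above (my own naming, not from the assignment).
# q = level-1 instructions in the program, m = power factor per level,
# n = interpretation cost per instruction, k = level-1 runtime in seconds.
def runtime(r, q, m, n, k):
    # each level-r instruction stands for m^(r-1) level-1 instructions
    total = m ** (r - 1) * q
    # plus n*m^(layer-1) level-1 instructions of interpretation overhead
    # contributed by every layer below level r
    for layer in range(1, r):
        total += n * m ** (layer - 1)
    # one level-1 instruction takes k/q seconds
    return total * (k / q)
```

For r = 1 the loop contributes nothing and the result collapses to k, matching the last row of the table.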

Disclaimer: This IS homework, but the assignment has already been handed in. I simply cannot get the semantics of this problem, and I would really like to understand it.


I think you are all making this too complicated. The problem statement says, in other words, that each layer runs m times faster than the layer above. Hence layer 2 completes programs in 1/m the time, layer 3 in 1/m * 1/m, and so on. So the final equation is just:

t(q) = k / (m ** q)


The problem simply states that if it takes k units of time at level 1, then it will take k/m units at level 2, and so on...


It's just a recursive function:

t(q, r) = q*k if r == 1
t(q, r) = q*t(m, r-1) + t(n, r-1)

Now explained:

This is obvious since it was stated in the question (I parameterized k as the elementary unit, k is the time for one level 1 instruction):
t(q, r) = q*k if r == 1

The amount of time it takes to execute a program of q level-r instructions:

t(q, r) = q*t(m, r-1) + t(n, r-1)
          (q times the amount of time it takes for m level r-1 instructions,
           plus the time it takes for n level r-1 instructions)
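The recursion transcribes directly into code. This is just a sketch of the definition above; the concrete values passed in below are placeholders of my own choosing:

```python
# t(q, r): time for q instructions at level r, per the recursion above.
# m = instructions-per-instruction power factor, n = interpretation cost,
# k = time for one level-1 instruction (the elementary unit).
def t(q, r, m, n, k):
    if r == 1:
        # base case: q level-1 instructions at k seconds each
        return q * k
    # each of the q level-r instructions does the work of m level r-1
    # instructions; interpretation adds the time of n level r-1 instructions
    return q * t(m, r - 1, m, n, k) + t(n, r - 1, m, n, k)
```

For example, with m = 2, n = 3, k = 1: t(5, 2) = 5*t(2, 1) + t(3, 1) = 10 + 3 = 13.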


I'm not sure the task definition is complete, because if it is, I don't see any sane way to solve it other than by simplifying it.

So here are a few things I'd assume:

  1. t(q, 1) = k (we are tasked to find t(q, r)), hence t(1, 1) = k/q. Why? Because unless we assume time depends only on the number of instructions rather than on the instruction types, we are in a reality where this task is not solvable: q could not be treated as a plain count, and we could not assume another set of instructions would take more or less time based on the instruction count alone. In conclusion, as far as my reading goes, this task relates time only to the number of instructions.
  2. If a program is native at some level r, then it takes n instructions of another level to interpret them (make them native). There is nothing in the task definition (as presented here) that forces you to interpret only level r+1 instructions. In fact, because we start from level one, "n level r instructions are required to interpret a single r+1 instruction" would be pretty useless if we could not assume what I'm saying above.
  3. Also, q level X instructions, after being interpreted, should always convert to q level Y instructions; otherwise we could never know the execution time at another level, because the number of instructions could vary greatly (as in the real world).

So here is the answer with these preconditions:

q=number of level one program instructions
t(q, r)=time necessary to execute q **level 1** instructions on level r comp
t(1, r)=time necessary to execute one instruction on level r comp
t(1, 1)=time necessary to execute one instruction on level 1 comp
t(q, 1)/m^(r-1)=time to execute q **native** instructions on level r comp

t(q, 1)=k
t(1, 1)=k/q
t(q,r)=(time to interpret q non-native instructions/convert them to native) + (time to actually execute the code)
t(q,r)=t(qn, 1)/m^(r-1) + t(q, 1)/m^(r-1)
t(q,r)=(time to execute qn native instructions) + (time to execute q native instructions)
t(q,r)=nt(q, 1)/m^(r-1) + t(q, 1)/m^(r-1)
t(q,r)=(n+1)t(q, 1)/m^(r-1)
t(q,r)=(n+1)k/m^(r-1)
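The same derivation can be written step by step in code, which also confirms the closed form at the end. The helper names are mine, introduced only for this sketch:

```python
# Derivation above, under the 3 assumptions listed.
def t_level1(i, q, k):
    # time to run i instructions on the level 1 machine, given that a
    # q-instruction program takes k seconds (so k/q seconds each)
    return i * (k / q)

def t(q, r, m, n, k):
    speedup = m ** (r - 1)
    # time to interpret q non-native instructions (qn native instructions)
    interpret = t_level1(q * n, q, k) / speedup
    # time to actually execute the code (q native instructions)
    execute = t_level1(q, q, k) / speedup
    return interpret + execute
```

Since t_level1(q*n, q, k) = n*k and t_level1(q, q, k) = k, the sum reduces to (n+1)*k/m^(r-1), matching the last line of the derivation.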

If any of the 3 assumptions is wrong, you need to introduce new unknown functions, so the answer would become just an equation of unknown functions. That would still be an answer, but a much more useless one than this. This one is just prettier :)

Btw, you should look at how similar tasks were solved in your university exercises to make sure this approach to the problem is correct.

