Calculation of bits needed
I need help with this
I was asked: for an unsigned integer range of 1 to 1 billion, how many bits are needed?
How do we calculate this?
Thank you
UPDATE!
This is what I wanted to know, because the interviewer said 17.
Take the log base 2 of 1 billion and round up.
Alternatively, you should know that a 32-bit integer covers just over 4 billion values, so 2 billion needs 31 bits and 1 billion needs 30 bits.
Another handy thing to know is that every 10 bits increases the number of values you can represent by a factor of just over 1000 (1024), so 1000 values need 10 bits, 1 million needs 20 bits, and 1 billion needs 30 bits.
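If you want to sanity-check that rule of thumb, a quick Python session (just an illustrative sketch, not part of the original answer) compares the relevant powers of two directly:
>>> 2**10, 2**20, 2**30
(1024, 1048576, 1073741824)
>>> 2**29 < 10**9 <= 2**30    # 29 bits fall short, 30 bits are enough
True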
Calculate log2(1000000000)
and round it up. It works out to 30 bits.
For example, in Python you can calculate it like this:
>>> import math
>>> math.ceil(math.log(1000000000, 2))
30.0
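If you prefer to avoid floating point, Python's built-in int.bit_length() gives the same result; this is only an alternative sketch, not part of the original answer:
>>> (1000000000).bit_length()    # smallest number of bits that can hold the value 1,000,000,000
30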
2^10 = 1024
2^10 * 2^10 = 2^20 = 1024*1024 = 1048576
2^10 * 2^10 * 2^10 = 2^30 = 1024 * 1024 * 1024 = 1,073,741,824 ~= 1,000,000,000
=> 30 bits
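The same doubling argument can be written as a tiny loop, shown here purely as an illustrative sketch: double the number of representable values until it covers 1 billion and count the doublings.
>>> bits, values = 0, 1
>>> while values < 1000000000:
...     bits += 1
...     values *= 2
...
>>> bits
30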