Storing pixels in a byte array
This was asked in one of my interviews and I can't seem to find any clue about it. The question is:
I have a 40 x 20 screen, and I want to store these pixels in a byte array so that I can reconstruct the screen again from the byte array. The size of the byte array is 100 bytes.
Another way of looking at it: how do we store a single pixel (x, y) using one bit? There are 40 * 20 = 800 pixels and we have 100 bytes.
Any hints/ideas or reference links will be helpful.
Thanks, Chander
One byte is 8 bits, so you can store 100 bytes · 8 bits/byte = 800 bits of information in these 100 bytes.
As each bit can have two values (0, 1), you can only represent two states with each bit. In the case of a screen, these two states could be 0 = black and 1 = white, or 0 = light off and 1 = light on.
And since you have 800 bits, you can represent your 40 · 20 = 800 pixels with these 800 bits, with each bit representing one of the two states.
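For example, a minimal sketch in Java of addressing one pixel as one bit (assuming row-major order and least-significant bit first within each byte; the class and method names are just for illustration):

class Screen {
    static final int W = 40, H = 20;
    byte[] bits = new byte[W * H / 8];               // 100 bytes = 800 bits, one per pixel

    void set(int x, int y, boolean on) {             // x in 0..39, y in 0..19
        int bit = y * W + x;                         // linear pixel index 0..799
        if (on) bits[bit / 8] |=  (1 << (bit % 8));  // switch the pixel's bit on
        else    bits[bit / 8] &= ~(1 << (bit % 8));  // switch it off
    }

    boolean get(int x, int y) {
        int bit = y * W + x;
        return ((bits[bit / 8] >> (bit % 8)) & 1) != 0;
    }
}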
I think the word you're looking for is bitmap. Each bit in the 100 bytes = 1 pixel on the screen (0 = black, 1 = white).
You don't need more than that because the shape of the output is a given.
Suppose the 40x20 screen is given as an input of the form byte[][] input = {{1, 0, 1, ...}, ...}, each entry being a pixel value of 0 or 1. To store it in a byte array you can just do:
byte[] byteArr = new byte[width * height / 8];        // 40 * 20 / 8 = 100 bytes
for (int i = 0; i < width; i++)
    for (int j = 0; j < height; j++) {
        int bit = i * height + j;                     // linear pixel index 0..799
        byteArr[bit / 8] |= input[i][j] << (bit % 8); // pack 8 pixels into each byte
    }
This means that each byte position just ORs together 8 bits of information from the original array into a single value. Converting it back, take each byteArr[index] and extract each bit at positions 0..7 by right shifting (>> pos) and masking with 1.
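A minimal sketch of that reconstruction under the same assumptions (Java, least-significant bit first, width = 40, height = 20), reversing the packing loop above:

byte[][] output = new byte[width][height];
for (int i = 0; i < width; i++)
    for (int j = 0; j < height; j++) {
        int bit = i * height + j;                                     // same linear index as when packing
        output[i][j] = (byte) ((byteArr[bit / 8] >> (bit % 8)) & 1);  // pull out one bit
    }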
Another question to ask is whether the bits are packed most-significant-first or least-significant-first within each byte (the bit-level analogue of big-endian vs. little-endian); the code above assumes least-significant bit first.
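For illustration, the two conventions differ only in the shift amount when extracting bit k (0..7) from a byte b; this is just a sketch of the two possibilities:

int lsbFirst = (b >> k) & 1;        // least-significant bit first (assumed above)
int msbFirst = (b >> (7 - k)) & 1;  // most-significant bit first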