
Working with bits

Program:

#include <bitset>
#include <iostream>
using namespace std;

typedef bitset<8> bits;

int main() {
    // char holds only 8 bits, so each wide hex constant is truncated to its low byte.
    char original = 0xF0;  // posted as 0xF0F0F0F0
    char Mask     = 0x00;  // posted as 0xFFFF0000
    char newBits  = 0xAA;  // posted as 0x0000AAAA

    /* AND: a 0 mask bit clears the bit; a 1 mask bit leaves the original bit unchanged. */
    cout << "Original o: " << bits(original) << endl;
    cout << "NewBits: " << bits(newBits) << endl;
    cout << "Mask m: " << bits(Mask) << endl;
    cout << "o & m with Mask: " << bits(original & Mask) << endl;  // all-zero mask clears every bit
    cout << "Result: " << bits((original & Mask) | newBits) << endl;  // OR copies newBits back in
}

Output:

Original o: 11110000
NewBits: 10101010
Mask m: 00000000
o & m with Mask: 00000000
Result: 10101010
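
(Side note: a char holds only 8 bits, so each of the wide hex constants above is truncated to its low byte; that is why Mask, posted as 0xFFFF0000, prints as all zeros. A minimal sketch of that truncation, with variable names of my own choosing:)

#include <bitset>
#include <iostream>
using namespace std;

int main() {
    unsigned int wide = 0xFFFF0000;   // the constant the question assigns to Mask
    unsigned char low = wide & 0xFF;  // keep only the low byte, as the char assignment does
    cout << bitset<8>(low) << endl;   // prints 00000000
}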

I understand the hex values and their truncated results, but since o & m == 0000 0000, shouldn't bits(o & m | newBits) also be 0000 0000 rather than 1010 1010?

What concept am I missing? Can anyone help, please?

Thanks


o & m = 0000 0000 and newBits = 1010 1010. ORing them bitwise gives 1010 1010, because 0|0=0, 0|1=1, 1|0=1, and 1|1=1: a result bit is 0 only when both input bits are 0. In other words, x | 0 == x, so ORing newBits with an all-zero value simply returns newBits unchanged.

   0000 0000
OR 1010 1010
------------
   1010 1010
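
To make x | 0 == x concrete, and to show the clear-then-set masking pattern this question is circling, here is a minimal C++ sketch; the merge helper and the nibble mask are my own illustration, not part of the original post:

#include <bitset>
#include <cstdint>
#include <iostream>
using namespace std;

// Overwrite the bits of 'value' selected by 'mask' with the matching bits of 'newbits':
// AND with ~mask clears the selected bits, OR copies in the replacements.
uint8_t merge(uint8_t value, uint8_t mask, uint8_t newbits) {
    return (value & ~mask) | (newbits & mask);
}

int main() {
    uint8_t original = 0xF0;  // 11110000
    uint8_t newBits  = 0xAA;  // 10101010

    // x | 0 == x: ORing with an all-zero value returns the other operand unchanged.
    cout << bitset<8>(original & 0x00) << endl;              // 00000000
    cout << bitset<8>((original & 0x00) | newBits) << endl;  // 10101010

    // Replace only the low nibble of 'original' with the low nibble of 'newBits'.
    cout << bitset<8>(merge(original, 0x0F, newBits)) << endl;  // 11111010
}

The expression (value & ~mask) | (newbits & mask) is the standard idiom for overwriting only the masked bits: the AND clears them, the OR writes the replacements.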
