
C pointers and bit setting question

As background to this question this was a test program I wrote to investigate some strange behavior we were seeing in a messaging program I work on. We have various structures that use bit flags to hold states about the messages, some of the flags use char, some use shorts. A utility we have to dump the messages to flat files was using a char* on these flag values to extract the settings, hence my deliberate use of a different pointer on flag2.

When this program is run on AIX UNIX 6.1 why is the output:

2 set

4 set

8 set

16 set

64 set

What happened to values 1 and 32?

#include <stdio.h>

#define SET(x,y) y |= (x)
#define UNSET(x,y) y &= (~x)

int main ( int argc, char *argv[] )
{
    unsigned char *g;
    unsigned short *h;
    unsigned long *i;

    char flag1; 
    short flag2; 
    long flag3;

    g = (char*) &flag2;

    SET(1, flag2);
    if ( 1 & *g )
        printf("1 set\n");
    UNSET(1, flag2);

    SET(2, flag2);
    if ( 2 & *g )
        printf("2 set\n");
    UNSET(2, flag2);

    SET(4, flag2);
    if ( 4 & *g )
        printf("4 set\n");
    UNSET(4, flag2);

    SET(8, flag2);
    if ( 8 & *g )
        printf("8 set\n");
    UNSET(8, flag2);

    SET(16, flag2);
    if ( 16 & *g )
        printf("16 set\n");
    UNSET(16, flag2);

    SET(32, flag2);
    if ( 32 & *g )
        printf("32 set\n");
    UNSET(32, flag2);

    SET(64, flag2);
    if ( 64 & *g )
        printf("64 set\n");
    UNSET(64, flag2);

    return 0;
}


AIX is a big-endian architecture, which means that g points to the most significant byte of flag2. Any change to the least significant byte is therefore invisible through *g.

flag2 is also never initialized, so it starts out holding arbitrary bits. The tests that appeared to succeed did so only because the corresponding bits happened to be set in that uninitialized high byte to begin with.

If you run on a little endian machine like x86, things should behave as expected.


I see a couple of things that could be the problem.

  1. You aren't initializing flag2

  2. You probably want a pointer of the matching type, e.g. h = &flag2, because a char * won't necessarily give you a reference to the least significant 8 bits.


From Wikipedia's article on pointers:

A 2005 draft of the C standard requires that casting a pointer derived from one type to one of another type should maintain the alignment correctness for both types (6.3.2.3 Pointers, par. 7):[3]

char *external_buffer = "abcdef";
int *internal_data;

internal_data = (int *)external_buffer;  // UNDEFINED BEHAVIOUR if "the resulting pointer
                                         // is not correctly aligned"

Bottom line: use the right pointer type

The behavior might work as expected, but isn't guaranteed.

The C++03 Standard, §5.3.3/1, says:

sizeof(char), sizeof(signed char) and sizeof(unsigned char) are 1; the result of sizeof applied to any other fundamental type (3.9.1) is implementation-defined. [Note: in particular, sizeof(bool) and sizeof(wchar_t) are implementation-defined.]

Look at this: C++ Size of primitives

