
Converting an ASCII character to int

gcc 4.5.1 c89

I have a buffer filled with characters that I need to compare:

This is a sample contents of the buffer:

vote buffer [ 51 ]
vote buffer [ 32 ]
vote buffer [ 49 ]
vote buffer [ 32 ]
vote buffer [ 50 ]
vote buffer [ 32 ]
vote buffer [ 53 ]
vote buffer [ 32 ]

I am trying to get the int equivalent of these chars in the buffer so I can compare them.

#define NUMBER_OF_CANDIDATES 7

if (vote_data.vote_buff[i] > NUMBER_OF_CANDIDATES || vote_data.vote_buff[i] < 1) {
    /* Do something */
}

As you can see, the if statement never behaves as intended, because the character values (49, 50, 51, ...) are far outside the candidate range.

I have tried casting to (int). However, that didn't solve the problem.

I guess I could calculate the value from the ASCII character set. However, I would rather not add more complexity if I can help it.

Many thanks for any advice,


If you just want to convert single characters to int, you can use c - '0' (which is equivalent to c - 48 in ASCII). If you want to convert strings of more than a single character, use sscanf().
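For example, here is a minimal sketch of both approaches; the string "3 1 2 5" simply mirrors the space-separated digits shown in the question's buffer dump and is otherwise an assumption:

#include <stdio.h>

int main(void)
{
    char c = '3';                 /* ASCII 51 */
    int single = c - '0';         /* 51 - 48 == 3 */

    const char *buf = "3 1 2 5";  /* space-separated digits, as in the buffer dump */
    int first;

    if (sscanf(buf, "%d", &first) == 1)   /* parse the first number from the string */
        printf("single: %d, first: %d\n", single, first);

    return 0;
}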


You can use atoi to convert strings to integers. In C, char is just a small integer type, so a char value can be used directly as an int.

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int n = 65;                      /* ASCII code of 'A' */
    const char *String = "1234";

    /* atoi() parses the string as a decimal integer */
    printf("String: %s - Integer: %d\n", String, atoi(String));

    /* a char is an integer, so the same value can print both ways */
    printf("int %d is char: %c\n", n, n);

    return 0;
}


There is nothing built into the standard library to turn a single char into an int. This is because most ints don't fit in a char. There are, however, several ways to turn a string into an int, because that is much more commonly done. You can easily use these by copying each char into the first element of a length-2 array whose second element is 0, and using that as input to atoi(), sscanf(), or strtol(). (I'd recommend one of the last two in a real program, as they allow error checking.)

char buffer[2] = {0, 0};   /* one digit plus the terminating NUL */
int i;

for (i = 0; i < vote_count; ++i) {
    int vote;
    buffer[0] = vote_data.vote_buff[i];   /* make a one-character string */
    vote = atoi(buffer);
    /* handle vote */
}
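Since the answer recommends sscanf() or strtol() when error checking matters, here is a minimal sketch of that variant; char_to_vote is a hypothetical helper name, not something from the question:

#include <errno.h>
#include <stdlib.h>

/* Hypothetical helper: convert one digit character to its numeric value,
   returning -1 for anything that is not a decimal digit (e.g. the spaces). */
static int char_to_vote(char c)
{
    char buffer[2];
    char *end;
    long value;

    buffer[0] = c;
    buffer[1] = '\0';

    errno = 0;
    value = strtol(buffer, &end, 10);
    if (end == buffer || errno != 0)
        return -1;

    return (int)value;
}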

Subtracting '0' from the character value is certainly a workable option; the C standard guarantees that the digits '0' through '9' are contiguous, so any conforming character set has them in order.
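Applied to the check from the question, that approach looks something like this (a sketch reusing the vote_buff and NUMBER_OF_CANDIDATES names from the original snippet):

int vote = vote_data.vote_buff[i] - '0';   /* '3' (51) becomes 3 */

if (vote < 1 || vote > NUMBER_OF_CANDIDATES) {
    /* out of range: the space separators (32 - '0' == -16) also land here */
}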

Best yet would be to change the interface so that your routine doesn't receive an array of char, but an array of int. The external interfaces of the program should be responsible for sanitizing input and turning it into something easy to process. Currently, there is no way to easily change the program to support more than ten candidates. Letting the input routines store ints fixes this.
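As a rough sketch of that idea (read_votes and MAX_VOTES are hypothetical names, not from the question), the input routine could validate each character at the boundary and hand the rest of the program plain ints:

#include <stdio.h>

#define MAX_VOTES 64   /* hypothetical capacity */

/* Hypothetical input routine: read digit characters from a stream,
   sanitize them at the boundary, and store plain ints for the rest
   of the program to work with. Returns the number of votes stored. */
static int read_votes(FILE *in, int votes[], int max_votes)
{
    int count = 0;
    int c;

    while (count < max_votes && (c = fgetc(in)) != EOF) {
        if (c >= '0' && c <= '9')
            votes[count++] = c - '0';   /* store the numeric value, not the character */
        /* anything else (spaces, newlines) is simply skipped */
    }
    return count;
}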
