Network byte order conversion with "char"
I've always been taught that if an integer is larger than a char, you must solve the byte ordering problem. Usually, I'll just wrap it in the hton[l|s] and convert it back with ntoh[l|s]. But I'm confused why this doesn't apply to single byte characters.
I'm sick of wondering why this is, and would love for a seasoned network programmer to help shed some light on why byte ordering only applies to multibyte integers.
Ref: https://beej.us/guide/bgnet/html/multi/htonsman.html
What you are looking for is endianness.
A big-endian architecture stores the bytes of a multibyte data type with the most significant byte first. For example, the 32-bit value 0x0A0B0C0D is laid out in memory as 0A 0B 0C 0D, while a little-endian architecture stores them in reverse: 0D 0C 0B 0A.
When data is transferred from one machine to another, the bytes of a single data type must be reordered to correspond with the endianness of the destination machine.
But when a data type only consists of one byte, there is nothing to reorder.
Your networking stack will handle the bits inside the bytes correctly; you need only concern yourself with getting the bytes in the right order.
Exactly how many ways can you order the bytes in a single char?
Consider what each function does, then apply that to the size of the type you intend to convert. Consider the following:
#include <stdio.h>
#include <stdint.h>      /* uint16_t, uint8_t */
#include <netinet/in.h>  /* htons */

int main (void) {
    uint16_t i = 42;
    uint8_t  c = 42; /* a single byte */

    printf ("(uint16_t ) %08X (%d)\n", i, i);
    printf ("( htons   ) %08X (%d)\n", htons(i), htons(i));
    printf ("( uint8_t ) %08X (%c)\n", c, c);
    printf ("( htons   ) %08X (%c)\n", htons(c), htons(c));
    return 0;
}
On a little-endian host this prints:

(uint16_t ) 0000002A (42)
( htons   ) 00002A00 (10752)
( uint8_t ) 0000002A (*)
( htons   ) 00002A00 ()
You don't read individual bits off the wire, just bytes. Regardless of the endianness, a single byte is the same backwards and forwards just like the word "I" is the same backwards and forwards.
In addition to all the other ways folks have put it: Endianness is about the order of bytes in an integer, not about the order of bits in a byte. The order of bits in a byte is the same even across big-endian and little-endian machines. The only difference is the order in which the bytes themselves are used to store an integer (or short or what-have-you). And so because there's only one byte in a char, there's no difference in how that value is stored, either in a big-endian or little-endian architecture.
ezpz's code is incorrect:
uint16_t i = 65534;
printf ("(uint16_t ) %08X (%d)\n", i, i);
Returns...
(uint16_t ) 0000FFFE (-2)
You should use the unsigned conversion specifier %u instead, so the value is not interpreted as a signed int:
uint16_t i = 65534;
printf ("(uint16_t ) %08X (%u)\n", i, i);
Returns...
(uint16_t ) 0000FFFE (65534)