
How to guarantee the bit widths of char and short when communicating with an external device

Hello, I am writing a library for communicating with an external device over an RS-232 serial connection.

Often I have to send a command that includes an 8-bit (1-byte) character or a 16-bit (2-byte) number. How do I do this in a portable way?

Main problem

From reading other questions it seems that the standard does not guarantee that 1 byte = 8 bits (the byte is defined in §1.7/1 of the Standard):

The fundamental storage unit in the C++ memory model is the byte. A byte is at least large enough to contain any member of the basic execution character set and is composed of a contiguous sequence of bits, the number of which is implementation-defined.

How can I guarantee the number of bits in a char? My device expects exactly 8 bits, not at least 8 bits.

I realise that almost all implementations have 1 byte = 8 bits, but I am curious how to guarantee it.

Short -> 2-byte check

I hope you don't mind, but I would also like to run my proposed solution for the short -> 2-byte conversion by you. I am new to byte conversions and cross-platform portability.

To guarantee the number of bytes in the short, I guess I am going to need to do the following (a rough sketch of the byte extraction appears after the list):

  1. Check sizeof(short). If sizeof(short) == 2, convert to bytes and check the byte ordering (as here).

  2. If sizeof(short) > 2, convert the short to bytes, check the byte ordering (as here), then check that the most significant bytes are empty and remove them.

    Is this the right thing to do? Is there a better way?
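
For concreteness, here is a minimal sketch of the shift-and-mask extraction described above (the helper name pack_u16 and the output buffer are purely illustrative; it assumes the value fits in 16 bits and always emits the low byte first, so no host byte-order check is needed):

// Illustrative helper: copy the low 16 bits of `value` into out[0..1],
// least significant byte first, regardless of sizeof(short) or of the
// host's endianness.
void pack_u16(unsigned short value, unsigned char out[2])
{
    out[0] = static_cast<unsigned char>(value & 0xFF);        // bits 0..7
    out[1] = static_cast<unsigned char>((value >> 8) & 0xFF); // bits 8..15
}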

Many thanks


AFAIK, communication with the serial port is platform/OS dependent anyway, so when you write the low-level part of it you will already know the platform, its endianness and its CHAR_BIT. Seen that way, the question is moot.

Also, don't forget that the UART hardware transmits 7- or 8-bit words, so the word size on the wire does not depend on the system architecture.

EDIT: As I mentioned, the UART's word size is fixed (let's consider mode 3 with 8 data bits, as it is the most common); the hardware itself won't send more than 8 bits, so one send command transmits exactly 8 bits regardless of the machine's CHAR_BIT. So, by using one send call per byte, as in

unsigned short i;       // value to transmit
send(i & 0xFF);         // low byte: the UART frames exactly 8 bits per word
send((i >> 8) & 0xFF);  // high byte

you can be sure it will do the right thing.
It would also be a good idea to look at what exactly boost.asio does.
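
If the low-level layer is, say, a POSIX serial port, the whole idea might look like the sketch below (purely illustrative: serial_fd is assumed to be an already opened and configured descriptor, and send_octet / send_u16 are made-up names; error handling is omitted):

#include <unistd.h>  // POSIX write()

// Hypothetical helper: push a single 8-bit word out of an already
// configured serial port file descriptor.
void send_octet(int serial_fd, unsigned char octet)
{
    write(serial_fd, &octet, 1);  // the UART frames exactly one 8-bit word
}

// Hypothetical helper: send the low 16 bits of `value`, low byte first.
void send_u16(int serial_fd, unsigned short value)
{
    send_octet(serial_fd, value & 0xFF);         // low byte first
    send_octet(serial_fd, (value >> 8) & 0xFF);  // then the high byte
}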


This thread seems to suggest you can use CHAR_BIT from <climits>. This page even suggests that 8 is the minimum number of bits in a char... I don't know how the quote from the standard relates to this.

For fixed-size integer types, if you are using MSVC2010 or GCC, you can rely on C99's <stdint.h> (even in C++) to define (u)int8_t and (u)int16_t, which are guaranteed to be exactly 8 and 16 bits wide respectively.
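
For instance, a command frame can then be declared entirely in exact-width types (a sketch; the struct and field names are just for illustration):

#include <stdint.h>  // or <cstdint> with C++11

// Illustrative command frame: every field has an exact, portable width.
// If a platform cannot provide exact-width types, uint8_t/uint16_t are
// simply not defined and this fails to compile, which is what we want.
struct Command
{
    uint8_t  opcode;    // exactly 8 bits
    uint16_t argument;  // exactly 16 bits
};

You would still serialise the fields one at a time rather than writing the struct out wholesale, since padding and byte order inside the struct are not pinned down by the standard.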


CHAR_BIT from the <climits> header tells you the number of bits in a char. This is at least 8. Also, a short int uses at least 16 bits for its value representation. This is guaranteed by the minimum value ranges:

type             can at least represent
---------------------------------------
unsigned char                   0...255
signed char                  -127...127
unsigned short                0...65535
signed short             -32767...32767
unsigned int                  0...65535
signed int               -32767...32767

see here

Regarding portability, whenever I write code that relies on CHAR_BIT==8 I simply write this:

#include <climits>

#if CHAR_BIT != 8
#error "I expect CHAR_BIT==8"
#endif

As you said, this is true on almost all platforms, and if it isn't in a particular case, the code simply won't compile. That's enough portability for me. :-)
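
If C++11 is available, the same guard can also be written as a static_assert, which gives a friendlier diagnostic (just an equivalent alternative to the #error check above):

#include <climits>

// Compile-time check: refuses to build on any platform where char is not
// exactly 8 bits wide.
static_assert(CHAR_BIT == 8, "this code assumes CHAR_BIT == 8");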
