C read call blocking on serial port operation

I am trying to write a C program in Linux to send and receive data from a microcontroller over the serial port. As a test, I have configured the microcontroller to immediately echo all characters sent. I have verified that this works in minicom and also by using "cat" and "echo" to send and receive data.

However, when I try to do the same in a C program, my read call blocks forever. I am setting the serial port to non-canonical mode, with a VMIN of 1 and a VTIME of 0. My minicom test proves that the microcontroller is returning characters as they are typed, so I expect read to return soon after the write call has sent characters. I have compared my code to several online examples, and I haven't found anything that I am missing. I have tried several permutations of the code below with no luck. Can someone spot the problem?

#include <stdio.h>
#include <stdlib.h>
#include <fcntl.h>
#include <string.h>
#include <errno.h>
#include <unistd.h>
#include <termios.h>

#define UART_SPEED B115200

char buf[512];

void init_serial (int fd)
{
    struct termios termios;
    int res;

    res = tcgetattr (fd, &termios);
    if (res < 0) {
        fprintf (stderr, "Termios get error: %s\n", strerror (errno));
        exit (-1);
    }

    cfsetispeed (&termios, UART_SPEED);
    cfsetospeed (&termios, UART_SPEED);

    /* No software flow control; ignore bytes with parity errors */
    termios.c_iflag &= ~(IGNPAR | IXON | IXOFF);
    termios.c_iflag |= IGNPAR;

    /* 8 data bits, no parity, 1 stop bit; enable receiver, ignore modem status lines */
    termios.c_cflag &= ~(CSIZE | PARENB | CSTOPB | CREAD | CLOCAL);
    termios.c_cflag |= CS8;
    termios.c_cflag |= CREAD;
    termios.c_cflag |= CLOCAL;

    /* Non-canonical mode, no echo; read() blocks until at least one byte arrives */
    termios.c_lflag &= ~(ICANON | ECHO);
    termios.c_cc[VMIN] = 1;
    termios.c_cc[VTIME] = 0;

    res = tcsetattr (fd, TCSANOW, &termios);
    if (res < 0) {
        fprintf (stderr, "Termios set error: %s\n", strerror (errno));
        exit (-1);
    }
}

int main (int argc, char **argv)
{
    int fd;
    int res;
    int i;

    if (argc < 2) {
        fprintf (stderr, "Please enter device name\n");
        return -1;
    }

    fd = open (argv[1], O_RDWR | O_NOCTTY);
    if (fd < 0) {
        fprintf (stderr, "Cannot open %s: %s\n", argv[1], strerror(errno));
        return -1;
    }

    init_serial (fd);

    res = write (fd, "P=20\r\n", 6);
    if (res < 0) {
        fprintf (stderr, "Write error: %s\n", strerror(errno));
        return -1;
    }
    tcdrain (fd);

    res = read (fd, buf, 512);
    printf ("%d\n", res);
    if (res < 0) {
        fprintf (stderr, "Read error: %s\n", strerror(errno));
        return -1;
    }

    for (i=0; i<res; i++) {
        printf ("%c", buf[i]);
    }

    return 0;
}


You might want to insert some delays, or loop waiting for input.

After setting the bit rate, some types of UART hardware take one or two characters at the new speed to synchronize to it. It is possible that the first few characters of the write are being lost.

After the six-character write, the read is issued immediately. With VMIN set to 1 and VTIME set to 0, read() blocks until at least one byte arrives, so if the echoed characters never come back it blocks forever. It is also possible that not all the characters from the write() have finished being transmitted before the read() is issued, let alone leaving any time for the remote device to respond.

For example, one solution is:

init_serial (fd);
usleep (100000);   // delay 0.1 seconds (Linux) so term parameters have time to change

res = write (fd, "P=20\r\n", 6);
if (res < 0) {
    fprintf (stderr, "Write error: %s\n", strerror(errno));
    return -1;
}
tcdrain (fd);
usleep (250000);  // delay 0.25 seconds for the device to respond and return data

res = read (fd, buf, 512);

Another approach would be to continue reading until a sufficient number of characters arrive or a reasonable amount of time passes.
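
A minimal sketch of that loop, assuming the port has already been configured as above and that the reply ends with a newline (the read_reply name and its timeout parameter are only illustrative), could use select() so each wait for more data has a bounded duration:

#include <sys/select.h>
#include <sys/time.h>
#include <unistd.h>
#include <string.h>

/* Keep reading until 'want' bytes arrive, a newline is seen, or no further
 * data shows up within 'timeout_ms' milliseconds.  Returns the number of
 * bytes collected (possibly 0), or -1 on error. */
ssize_t read_reply (int fd, char *buf, size_t want, int timeout_ms)
{
    size_t got = 0;

    while (got < want) {
        fd_set rfds;
        struct timeval tv;
        ssize_t n;
        int res;

        FD_ZERO (&rfds);
        FD_SET (fd, &rfds);
        tv.tv_sec  = timeout_ms / 1000;
        tv.tv_usec = (timeout_ms % 1000) * 1000;

        res = select (fd + 1, &rfds, NULL, NULL, &tv);
        if (res < 0)
            return -1;          /* select() failed */
        if (res == 0)
            break;              /* no more data arrived in time; return what we have */

        n = read (fd, buf + got, want - got);
        if (n < 0)
            return -1;          /* read() failed */
        got += n;

        if (memchr (buf, '\n', got) != NULL)
            break;              /* complete line received */
    }
    return got;
}

Called as read_reply (fd, buf, sizeof buf, 500), this returns as soon as the echoed line arrives or after at most half a second of silence, instead of blocking indefinitely in a single read().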
