
Stuck with sigwait

I did something wrong in my code, where another process sends a SIGUSR2 signal to it:

sigset_t sigset;
sigemptyset(&sigset);
sigaddset(&sigset, SIGILL);
sigaddset(&sigset, SIGUSR2);
sigwait(&sigset, &received);

Xcode reports that a SIGUSR2 (31) signal was received, but received = SIGILL (4) (or whichever is the lowest-numbered signal in the set).

Why is that? Where am I wrong?

Now, it looks like this:

    sigset_t sigset;
    sigemptyset(&sigset);
    sigaddset(&sigset, SIGILL);
    sigaddset(&sigset, SIGUSR2);
    sigprocmask(SIG_BLOCK, &sigset, 0);
    sigwait(&sigset, &received);
    if(received == SIGUSR2) {
        //...
    } else if(received == SIGILL) {
        //...
    }

Still not working.


Sometimes the debugger can get in the way. I have seen debuggers interfere with signal handling before. Try running the code without the debugger involved.

The following code works perfectly on OS X:

#include <signal.h>
#include <stdio.h>

int main()
{
    sigset_t set;
    int sig;

    /* Build the set of signals we want to wait for. */
    sigemptyset(&set);
    sigaddset(&set, SIGUSR1);
    sigaddset(&set, SIGUSR2);

    /* Block them first so they stay pending until sigwait() retrieves them. */
    sigprocmask(SIG_BLOCK, &set, NULL);

    /* Wait synchronously; sig is set to the signal that was delivered. */
    sigwait(&set, &sig);
    printf("Got signal %d\n", sig);
    return 0;
}


As stated in the related question sigwait in Linux (Fedora 13) vs OS X, you need to block the signal using sigprocmask() (for single-threaded applications) or pthread_sigmask() (for multi-threaded applications).
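For a multi-threaded program the idea is the same, except the mask is set with pthread_sigmask() and the sigwait() call usually lives in a dedicated thread. A minimal sketch of that pattern (the signal_thread helper and the single-signal set are illustrative, not taken from the question):

    #include <pthread.h>
    #include <signal.h>
    #include <stdio.h>

    /* Dedicated thread that waits for the blocked signals. */
    static void *signal_thread(void *arg)
    {
        sigset_t *set = arg;
        int sig;

        if (sigwait(set, &sig) == 0)
            printf("signal thread got signal %d\n", sig);
        return NULL;
    }

    int main(void)
    {
        sigset_t set;
        pthread_t tid;

        sigemptyset(&set);
        sigaddset(&set, SIGUSR2);

        /* Block in the main thread BEFORE creating other threads, so every
           thread inherits the mask and the signal stays pending for sigwait(). */
        pthread_sigmask(SIG_BLOCK, &set, NULL);

        pthread_create(&tid, NULL, signal_thread, &set);
        pthread_join(tid, NULL);
        return 0;
    }

Compile with -pthread; the point is simply that the mask is installed before any thread that should not receive the signal is created.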

Checking sigwait's return value for errors would not be bad either.
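For instance, the bare sigwait() call in the example above could be written like this (assuming <string.h> is also included for strerror()):

    int err = sigwait(&set, &sig);
    if (err != 0) {
        /* Unlike most calls, sigwait() returns the error number directly
           rather than setting errno. */
        fprintf(stderr, "sigwait: %s\n", strerror(err));
        return 1;
    }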
