GStreamer RTP full-duplex video in/out with single rtpbin
I'm trying to create a GStreamer pipeline with rtpbin to stream webcam video both ways (a videophone). However, I can't even get rtpbin to work with a simple snippet like the one below, which just takes the webcam source and streams it out, while another udpsrc captures the RTP packets and displays them. Everything runs on localhost. When split into two pipelines and launched separately, it works; combined into one pipeline, it does not. I suspect it has something to do with threading, but I'm stuck here, as no queue placement has worked for me so far. Basically, what I need is to display the incoming video stream and stream my webcam video out to the remote party.
gst-launch -v \
    gstrtpbin name=rtpbin \
    udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263" port=5000 ! rtpbin. \
    rtpbin. ! rtph263depay ! ffdec_h263 ! ffmpegcolorspace ! xvimagesink \
    v4l2src ! video/x-raw-yuv, framerate=30/1, width=320, height=240 ! videoscale ! videorate ! "video/x-raw-yuv,width=352,height=288,framerate=30/1" ! ffenc_h263 ! rtph263pay ! rtpbin. \
    rtpbin. ! udpsink port=5000
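For reference, the two halves that do work when launched as separate processes are roughly the receive and send parts of the same command:

Receiver:

gst-launch -v gstrtpbin name=rtpbin \
    udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263" port=5000 ! rtpbin. \
    rtpbin. ! rtph263depay ! ffdec_h263 ! ffmpegcolorspace ! xvimagesink

Sender:

gst-launch -v gstrtpbin name=rtpbin \
    v4l2src ! video/x-raw-yuv, framerate=30/1, width=320, height=240 ! videoscale ! videorate ! "video/x-raw-yuv,width=352,height=288,framerate=30/1" ! ffenc_h263 ! rtph263pay ! rtpbin. \
    rtpbin. ! udpsink port=5000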
OK, I have to answer myself: it was enough to add sync=false async=false to the udpsink. (Presumably the combined pipeline was deadlocking on preroll: xvimagesink could not preroll until RTP packets arrived, and udpsink would not send any until the whole pipeline was playing; async=false lets the udpsink reach PLAYING without waiting for preroll.)
gst-launch -v \
    gstrtpbin name=rtpbin \
    udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263" port=5000 ! queue ! rtpbin. \
    rtpbin. ! rtph263depay ! ffdec_h263 ! ffmpegcolorspace ! xvimagesink \
    v4l2src ! video/x-raw-yuv, framerate=30/1, width=320, height=240 ! videoscale ! videorate ! "video/x-raw-yuv,width=352,height=288,framerate=30/1" ! ffenc_h263 ! rtph263pay ! rtpbin. \
    rtpbin. ! udpsink port=5000 sync=false async=false
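To actually reach a remote party rather than loop back on localhost, the same pipeline should only need a host= on the udpsink (and, if you prefer, separate send and receive ports). A rough sketch, using 192.168.1.2 and ports 5000/5002 as placeholder values:

gst-launch -v \
    gstrtpbin name=rtpbin \
    udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263" port=5000 ! queue ! rtpbin. \
    rtpbin. ! rtph263depay ! ffdec_h263 ! ffmpegcolorspace ! xvimagesink \
    v4l2src ! video/x-raw-yuv, framerate=30/1, width=320, height=240 ! videoscale ! videorate ! "video/x-raw-yuv,width=352,height=288,framerate=30/1" ! ffenc_h263 ! rtph263pay ! rtpbin. \
    rtpbin. ! udpsink host=192.168.1.2 port=5002 sync=false async=false

The remote side would run the mirror image, listening on 5002 and sending back to this host on 5000.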