My goal is to send two video streams from one computer to another and limit the bandwidth being used as much as possible. I have no problem doing this in one pipeline if I just send each source to a different port. However, I figured it would be better to mux the streams together when transmitting and demux when receiving. I have tried using mpegtsmux and tsdemux, but I can't find how to split the streams apart on the receiving end.
Here is the transmitting pipeline:

gst.parse_launch('mpegtsmux name="muxer" ! udpsink host=' + self.homeIP + ' port=1234 v4l2src device=/dev/video' + str(self.cam) + ' ! video/x-raw-yuv, framerate=' + str(self.fps) + '/1, width=640, height=480 ! x264enc pass=qual quantizer=20 tune=zerolatency ! muxer. v4l2src device=/dev/video' + str(self.cam + 1) + ' ! video/x-raw-yuv, framerate=' + str(self.fps) + '/1, width=640, height=480 ! x264enc pass=qual quantizer=20 tune=zerolatency ! muxer.')

And here is a receiving pipeline that receives the video from the first stream:

gst.parse_launch('udpsrc port=1234 caps="video/mpegts, systemstream=(boolean)true, packetsize=(int)188" ! tsdemux ! ffdec_h264 ! xvimagesink sync=false')

I need to figure out how I can access both streams. Or is streaming to separate ports fine?
On Thu, 2016-04-28 at 19:26 -0700, hoene wrote:
> My goal is to send two video streams from one computer to another and
> limit the bandwidth being used as much as possible. I have no problem
> doing this in one pipeline if I just send each source to a different
> port. However, I figured it would be better to mux the streams together
> when transmitting and demux when receiving. I have tried using
> mpegtsmux and tsdemux, but I can't find how to split the streams apart
> on the receiving end.
>
> Here is the transmitting pipeline:
> [...]
>
> gst.parse_launch('udpsrc port=1234 caps="video/mpegts,
> systemstream=(boolean)true, packetsize=(int)188" ! tsdemux ! ffdec_h264 !
> xvimagesink sync=false')
>
> I need to figure out how I can access both streams.

tsdemux adds a new source pad for each elementary stream it finds; in your case it will do that twice, and then you can link the next elements to it. gst.parse_launch() is doing a lot behind the scenes to make this easier, so what you could do there is something like

... ! tsdemux name=d ! queue ! ffdec_h264 ! xvimagesink  d. ! queue ! ffdec_h264 ! xvimagesink

Also, you seem to be using GStreamer 0.10, which is not a good idea. That release series has been unmaintained for more than 3 years now, and you're unlikely to get any support for any problems you find. Use one of the stable 1.x releases, e.g. the latest 1.8.1.

> Or is streaming to separate ports fine?

That depends on whether your codecs can be streamed without a container and whether you need synchronization between the streams. You should ideally also use RTP for sending media via UDP, in which case it wouldn't matter that you send via different ports, as long as you set up the streams correctly.

--
Sebastian Dröge, Centricular Ltd · http://www.centricular.com
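The two-branch tsdemux receiver suggested above could be sketched as a GStreamer 1.0 launch description (a sketch only: the 0.10 ffdec_h264 decoder is replaced by its 1.0 counterpart avdec_h264, and the sink choice is an assumption). Building the string in a helper makes it easy to inspect before handing it to the parser:

```python
def make_ts_receiver(port=1234):
    """Return a gst-launch-style description that demuxes both elementary
    streams from one MPEG-TS UDP stream (GStreamer 1.0 element names)."""
    caps = 'video/mpegts, systemstream=(boolean)true, packetsize=(int)188'
    return (
        # one udpsrc feeding the demuxer, which is named so it can be
        # referenced twice ("d.") for its two dynamically-added source pads
        f'udpsrc port={port} caps="{caps}" ! tsdemux name=d '
        # first elementary stream
        'd. ! queue ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false '
        # second elementary stream
        'd. ! queue ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false'
    )
```

On 1.0 the resulting string would be passed to Gst.parse_launch(); the queue after each demuxer pad is what keeps the two branches from starving each other.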
Thank you for the response. I tried to come up with some pipelines that encode to MJPEG and then payload the frames as RTP packets. The following seems to work, but I was wondering if you had any other suggestions on what I can improve in the pipeline. I will also port everything to gstreamer-1.0 once everything works as needed.
gst.parse_launch('v4l2src device=/dev/video' + str(self.cam) + ' ! video/x-raw-yuv, framerate=' + str(self.fps) + '/1, width=640, height=480 ! ffmpegcolorspace ! jpegenc ! rtpjpegpay ! udpsink host=' + self.homeIP + ' port=1234 v4l2src device=/dev/video' + str(self.cam + 1) + ' ! video/x-raw-yuv, framerate=' + str(self.fps) + '/1, width=640, height=480 ! ffmpegcolorspace ! jpegenc ! rtpjpegpay ! udpsink host=' + self.homeIP + ' port=1235')

gst.parse_launch('udpsrc port=1234 caps="application/x-rtp, payload=127" ! rtpjpegdepay ! jpegdec ! xvimagesink sync=false udpsrc port=1235 caps="application/x-rtp, payload=127" ! rtpjpegdepay ! jpegdec ! xvimagesink sync=false')
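Since the two sender branches above are identical except for the device and port, they could be generated by a small helper. This is an illustrative sketch, not the poster's actual class: the parameter names (host, fps) merely mirror the self.homeIP and self.fps attributes used above.

```python
def mjpeg_sender_branch(device_index, host, port, fps=30):
    """Build one v4l2 -> MJPEG -> RTP -> UDP branch (0.10 element names,
    matching the thread; on 1.0, ffmpegcolorspace becomes videoconvert)."""
    return (
        'v4l2src device=/dev/video%d ! '
        'video/x-raw-yuv, framerate=%d/1, width=640, height=480 ! '
        'ffmpegcolorspace ! jpegenc ! rtpjpegpay ! '
        'udpsink host=%s port=%d' % (device_index, fps, host, port)
    )

def mjpeg_sender_pipeline(first_cam, host, fps=30):
    # Two independent branches in one pipeline description,
    # separated by a space, exactly as in the launch string above.
    return (mjpeg_sender_branch(first_cam, host, 1234, fps) + ' ' +
            mjpeg_sender_branch(first_cam + 1, host, 1235, fps))
```

The assembled string would then be handed to gst.parse_launch() as before; keeping branch construction in one place avoids the copy-paste divergence that long launch strings invite.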
On Fri, 2016-04-29 at 18:58 -0700, hoene wrote:
> Thank you for the response. I tried to come up with some pipelines
> using MJPEG and then encoding into rtp packets. The following seems to
> work but I was wondering if you had any other suggestions on what I
> can improve in the pipeline. I will also port everything to
> gstreamer-1.0 after everything works as needed.

In general it's a good idea to use an rtpjitterbuffer on the receiver side, before the depayloader. IIRC the element was called gstrtpjitterbuffer in 0.10.

--
Sebastian Dröge, Centricular Ltd · http://www.centricular.com
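The suggested jitter buffer slots in between the udpsrc and the depayloader. A minimal sketch of the adjusted receiver branch, using the 1.0 element name rtpjitterbuffer; the latency value (in milliseconds) is an assumed example to tune for your network, not a value from the thread:

```python
def mjpeg_receiver_branch(port, latency_ms=100):
    """One RTP/MJPEG receiver branch with a jitter buffer inserted
    before the depayloader, as recommended above."""
    return (
        'udpsrc port=%d caps="application/x-rtp, payload=127" ! '
        # reorders and smooths incoming RTP packets; latency is in ms
        'rtpjitterbuffer latency=%d ! '
        'rtpjpegdepay ! jpegdec ! xvimagesink sync=false' % (port, latency_ms)
    )
```

Two such branches (ports 1234 and 1235) concatenated with a space would reproduce the receiving pipeline above, with buffering added.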