I have two Nvidia Jetson Nanos on my local network, each running its own GStreamer pipeline. On the first one (i.e. the slave), the pipeline looks like this:

gst-launch-1.0 -e \
  v4l2src ! 'video/x-raw,width=(int)1920,height=(int)1080' ! videoconvert ! videorate ! \
  'video/x-raw,framerate=(fraction)30/1' ! nvvidconv ! \
  'video/x-raw(memory:NVMM),width=1920,height=1080' ! omxh264enc ! \
  flvmux streamable=true ! rtmpsink location=rtmp://MASTER_LOCAL_IP/live/CAMERA_ID

It's a simple pipeline that takes a USB camera input, encodes it in H.264, muxes it, and sends it to an nginx RTMP server on the second Jetson Nano.

On the second Jetson Nano (i.e. the master), the pipeline looks like this (note: the original post was missing the "!" between rtmpsrc and flvdemux; it is restored here):

gst-launch-1.0 -e \
  v4l2src ! queue ! 'video/x-raw,width=(int)1920,height=(int)1080' ! videoconvert ! videorate ! \
  'video/x-raw,framerate=(fraction)30/1' ! nvvidconv flip-method=0 top=0 left=0 bottom=1080 right=1920 ! \
  'video/x-raw(memory:NVMM),width=(int)1920,height=(int)1080' ! omxh264enc bitrate=4000 ! \
  'video/x-h264' ! h264parse ! \
  input-selector cache-buffers=true sync-streams=true sync-mode=1 name=i ! queue ! \
  flvmux streamable=true name=mux ! rtmpsink location=rtmp://random/rtmp/server/url \
  rtmpsrc location=rtmp://SLAVE_LOCAL_IP/live/CAMERA_ID ! flvdemux ! queue ! i. \
  alsasrc device=hw:2,0 ! queue ! audioconvert ! audioresample ! audio/x-raw,rate=44100 ! \
  voaacenc bitrate=192000 ! 'audio/mpeg,mpegversion=4' ! aacparse ! queue ! mux.

This is a more complex pipeline. It takes a USB camera input, encodes it in H.264, and sends it to an input-selector. It also retrieves the RTMP stream from the slave, demuxes it, and feeds it to the same input-selector. In addition, it captures audio from an ALSA device and encodes it in AAC. The input-selector output and the audio are then muxed together and sent to a remote RTMP server.

FYI, my real code uses the Python GStreamer API; for the sake of this question I translated it into these gst-launch commands.
In my Python code, some logic switches the active-pad of the input-selector every 5 seconds, alternating between the master input source and the slave one. While the pipeline is running on the master Nano, I get a lot of flvmux errors:

flvmux gstflvmux.c:1082:gst_flv_mux_buffer_to_tag_internal:<flvmux:sink_0> Got backwards dts! (0:25:45.133000000 < 0:25:45.234000000)

And every 2 x (switch period) = 10 s, I get this error:

v4l2src gstv4l2src.c:976:gst_v4l2src_create:<v4l2src0> lost frames detected: count = 10 - ts: 0:28:18.428984294

This second error is perfectly periodic: if I change the switch period to 15 seconds, for example, I get it every 30 seconds (2 x 15 s). All these errors coincide with a lot of audio clicks/pops when I listen to the stream on the remote RTMP server. If I don't switch the input-selector, I don't get those glitches.

I have been working on this issue for weeks without any luck. Any help would be appreciated. Thanks.

--
Sent from: http://gstreamer-devel.966125.n4.nabble.com/
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
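For reference, the 5-second switching logic looks roughly like this (a minimal sketch, assuming the selector is named "i" as in the launch line above and exposes the default input-selector sink pads sink_0/sink_1; MASTER_PIPELINE stands in for the full launch string and is not defined here):

```python
# Hedged sketch of the periodic active-pad switch on the master Nano.
# Pad names and the "i" element name match the gst-launch command above;
# everything else is illustrative.

def toggle_pad_name(current: str) -> str:
    """Return the other input-selector sink pad name ("sink_0" <-> "sink_1")."""
    return "sink_1" if current == "sink_0" else "sink_0"


def main():
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)
    # MASTER_PIPELINE is the master's full launch description (omitted here).
    pipeline = Gst.parse_launch(MASTER_PIPELINE)
    selector = pipeline.get_by_name("i")
    state = {"pad": "sink_0"}

    def switch():
        # Flip to the other source and make it the selector's active pad.
        state["pad"] = toggle_pad_name(state["pad"])
        selector.set_property("active-pad", selector.get_static_pad(state["pad"]))
        return True  # keep the GLib timeout firing

    pipeline.set_state(Gst.State.PLAYING)
    GLib.timeout_add_seconds(5, switch)
    GLib.MainLoop().run()
```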
I see exactly the same problem with input-selector and flvmux. Even without the second input being active in the input-selector, flvmux keeps printing the backwards-DTS error. Any solution?