Hello,

I have an RTSP server with a pipeline that works in C code. I want to add ONVIF timestamps in the RTP extension header. There is an element in gst-plugins-bad that does this, rtponviftimestamp (gstrtponviftimestamp). I tried to add it to my pipeline, but the program crashes when a client wants to connect. I tried to debug the client connection using debug level 4, but found no explicit errors.

As a next step I used the test-launch application from gst-rtsp-server, which lets me pass a pipeline on the command line. With this, the pipeline including rtponviftimestamp works.

I'm using gstreamer 1.16.2, gst-rtsp-server 1.16.2 and gst-plugins-bad 1.16.2.

What should I do next? Are there any suggestions?

Kind regards,
Goran Broeckaert
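For reference, a minimal sketch of the kind of setup described above: a gst-rtsp-server media factory whose launch string includes rtponviftimestamp. The pipeline, the mount point and the placement of name=pay0 are assumptions for illustration, not the actual application.

    #include <gst/gst.h>
    #include <gst/rtsp-server/rtsp-server.h>

    int
    main (int argc, char *argv[])
    {
      GstRTSPServer *server;
      GstRTSPMountPoints *mounts;
      GstRTSPMediaFactory *factory;
      GMainLoop *loop;

      gst_init (&argc, &argv);
      loop = g_main_loop_new (NULL, FALSE);

      server = gst_rtsp_server_new ();
      mounts = gst_rtsp_server_get_mount_points (server);

      factory = gst_rtsp_media_factory_new ();
      /* placeholder pipeline: H.264 payloader followed by rtponviftimestamp,
       * which adds the ONVIF NTP timestamp RTP header extension */
      gst_rtsp_media_factory_set_launch (factory,
          "( videotestsrc ! x264enc ! h264parse ! rtph264pay "
          "! rtponviftimestamp ntp-offset=0 name=pay0 )");
      gst_rtsp_media_factory_set_shared (factory, TRUE);

      gst_rtsp_mount_points_add_factory (mounts, "/test", factory);
      g_object_unref (mounts);

      gst_rtsp_server_attach (server, NULL);
      g_main_loop_run (loop);

      return 0;
    }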
Hey, if using rtponviftimestamp results in a crash, you should report an issue with a sample application and a stacktrace captured when the crash occurs.

--
Mathieu Duponchelle · https://www.centricular.com

On 6/17/20 3:16 PM, Goran Broeckaert wrote:
Hello,

Sorry, I didn't word it properly. It's not that my application crashes; it's just that I do not get any images on the receiving end.

I have done some further research and have been able to narrow it down: when sending the packets using unicast, the same pipeline works properly, while using multicast it does not. In the multicast application, if I remove the rtponviftimestamp element from my pipeline, it works fine.

Thanks for the help.

Kind regards,
Goran Broeckaert

On Wed, 17 Jun 2020 at 15:16, Goran Broeckaert <[hidden email]> wrote:
I suppose that's one way to fix the problem, but rtponviftimestamp should ideally work well no matter the transport :) If you can report an issue with a minimal test case, that will be useful :)

On 6/18/20 1:36 PM, Goran Broeckaert wrote:
Hello,
Right now I have a simple pipeline: "videotestsrc ! x264enc ! h264parse ! rtph264pay name=pay0". I use the test-launch application, which is an example from gst-rtsp-server. I made some modifications to the application to have both options: transmitting using unicast and multicast. I'm using version 1.16.2 of gstreamer, gst-rtsp-server and gst-plugins-bad on Ubuntu 20.04.

The simple pipeline works perfectly: I can use ffplay to retrieve the stream (both unicast and multicast). If I modify the pipeline to the following: "videotestsrc ! x264enc ! h264parse ! rtph264pay ! rtponviftimestamp ntp-offset=0 name=pay0", I can retrieve the stream using unicast, but when I do it with multicast it does not work. I've used debugging level 4, but I couldn't find anything that says where it goes wrong.

How should I proceed with debugging this?

Thanks for your help!

Kind regards,
Goran Broeckaert

PS: I've uploaded the patch that modifies test-launch.c to add the multicast option to pastebin: https://pastebin.com/qjpaYwCL .
On Friday 19 June 2020 at 13:23 +0200, Goran Broeckaert wrote:
> How should I proceed with debugging this?

It is most likely the x264enc configuration that isn't adequate.

In general, you might want to consider lower profiles than what would be picked by default (notably avoiding YCbCr 4:4:4, as this is rarely supported by HW decoders). You can control the sub-sampling by controlling the input color format:

    videotestsrc ! video/x-raw,format=I420 ! x264enc ...

Then, the best results for streaming are obtained with CBR (constant bitrate), which is the default for this encoder. But you may want to adapt the "bitrate" property (careful, this one is in kbit/s) if you see degraded quality.

Another thing you want to control is the speed and the latency of the compression algorithm. To simplify this, there is a "tune" property; the "zerolatency" or "fastdecode" tuning should work fairly well for your use case.

Finally, you want to control the H.264 profile. A very conservative choice would be to select "constrained-baseline", a format that is pretty much guaranteed to be supported, but to gain a bit more features you may want to select the "main" profile. That is done with:

    ... ! x264enc tune=zerolatency ! video/x-h264,profile=constrained-baseline ! ...

As if that weren't enough, with multicast clients will connect at random points in time. The delay before they can display anything is variable and depends on the distance between two keyframes. You can control that distance on the encoder using the "key-int-max" property. A value of 60 will insert a keyframe every 60 frames; at 30 fps this means that in the worst case it will take 2 s to display the video.

    ... ! x264enc tune=zerolatency key-int-max=60 ! ...

Hope this is useful and works for you,
Nicolas

p.s. Recent x264 libraries enable slices in zerolatency mode, which is not always well supported. I believe that setting threads=1 will disable this, as the number of slices is selected based on the number of allowed encoding threads.
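Putting these suggestions together with the earlier server sketch, the launch string might end up roughly as follows; the bitrate and keyframe interval are placeholder values, and the placement of name=pay0 simply mirrors the pipeline quoted above.

    /* Sketch: x264enc tuned for low-latency multicast streaming, assuming
     * the "factory" from the earlier gst-rtsp-server example. Bitrate is
     * in kbit/s; threads=1 should avoid sliced output in zerolatency mode
     * (see the p.s. above). */
    gst_rtsp_media_factory_set_launch (factory,
        "( videotestsrc ! video/x-raw,format=I420 "
        "! x264enc tune=zerolatency key-int-max=60 bitrate=2000 threads=1 "
        "! video/x-h264,profile=constrained-baseline "
        "! h264parse ! rtph264pay "
        "! rtponviftimestamp ntp-offset=0 name=pay0 )");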