Hi all,
I am trying to use GStreamer to simultaneously stream a video source via RTSP and record it (and, in the future, run more parallel processes). My main problem with the gst-rtsp-server examples is that the RTSP component wants to control the whole pipeline, and I don't want it to impact any of the other branches.

My approach (I would be happy to hear of a simpler way) is to use a primary, always-running pipeline that branches to an appsink, and to let the RTSP server create its own pipeline with an appsrc, which I connect to the appsink in the GstRTSPMediaFactory's "media-configure" signal handler.

As a minimal example, my "main" pipeline can be "videotestsrc ! appsink" (later with a tee to a filesink), and the RTSP factory bin can be "appsrc ! x264enc ! rtph264pay". In this case it works: I can connect a media player to the RTSP stream and see the test image, although memory usage quickly explodes.

If I move the x264enc element into the "main" pipeline, which I want to do so the encoder can be shared between the parallel functions, I never see a picture, and gst-play reports "Could not receive any udp packets". In neither case do I see errors on the server side with GST_DEBUG=*:3, and in both cases I set the appsrc caps to the caps of the last element in the "main" pipeline before the appsink.

I checked that the RTSP pipeline's clock is slaved to the main pipeline's clock (this seemed to happen automatically), but I suspect I still have synchronization issues. Perhaps that also explains the memory explosion when passing raw buffers, since presumably the appsink receives frames much faster than the appsrc can push them onwards. In any case, I am most interested in first getting the H.264 appsink -> appsrc link to work.

I pasted a short code listing at pastebin.com/Lgc7BF1p (not sure of the preferred etiquette for this ML).

Thanks for any advice!
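Roughly, what I'm attempting looks like this (a simplified sketch, not my actual code; the pipeline strings, element names like "sink"/"src", and the helper function are illustrative):

```python
# Sketch of the appsink -> appsrc bridge, with x264enc kept in the main
# pipeline so the encoder can later be shared by other branches.
MAIN_PIPELINE = ("videotestsrc is-live=true ! x264enc tune=zerolatency "
                 "! appsink name=sink emit-signals=true max-buffers=4 drop=true")
FACTORY_LAUNCH = ("( appsrc name=src is-live=true format=time "
                  "! h264parse ! rtph264pay name=pay0 pt=96 )")

def wire_bridge(appsink, media):
    """Connect the main pipeline's appsink to the RTSP media's appsrc.

    Intended to be called from the factory's "media-configure" handler.
    Assumes both pipelines share the same clock, so buffer timestamps
    from the main pipeline remain meaningful in the RTSP pipeline.
    """
    from gi.repository import Gst  # lazy import; sketch only

    # gst_bin_get_by_name() searches recursively inside the media's bin.
    appsrc = media.get_element().get_by_name("src")
    # Forward the negotiated caps so appsrc advertises the same stream.
    appsrc.set_property("caps",
                        appsink.get_static_pad("sink").get_current_caps())

    def on_new_sample(sink):
        sample = sink.emit("pull-sample")
        if sample is None:
            return Gst.FlowReturn.ERROR
        return appsrc.emit("push-buffer", sample.get_buffer())

    appsink.connect("new-sample", on_new_sample)
```

Note the drop=true / max-buffers on the appsink, and that appsrc also has block and max-bytes properties to bound its internal queue; an unbounded appsrc queue is one plausible cause of the runaway memory I see with raw frames.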
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
hi folks,
I have the same question that Oliver asked.
In gst-rtsp-server/examples, all of the sample code uses gst_rtsp_media_factory_set_launch() to have the factory build its pipelines from a gst-launch string. But what if I already have a pipeline, and I want to assign the RTSP URI to that pipeline? Thanks, Xin
From: gstreamer-devel <[hidden email]> on behalf of Oliver <[hidden email]>
Sent: 21 May 2017 19:39
To: [hidden email]
Subject: Synchronizing h264 stream from appsink to appsrc for RTSP
Hey,
One suggestion I have seen for this is to use intervideosrc and intervideosink (from gst-plugins-bad) in your gst_rtsp_media_factory_set_launch() string, and then mark the factory as shared with gst_rtsp_media_factory_set_shared().
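Roughly, something like this (an untested sketch; the channel name "cam0" and the helper name are made up):

```python
# The always-running pipeline tees into an intervideosink "channel";
# the RTSP factory's launch string reads the same channel back with
# intervideosrc. The channel name "cam0" is illustrative.
RECORD_PIPELINE = ("videotestsrc is-live=true ! tee name=t "
                   "t. ! queue ! intervideosink channel=cam0 "
                   "t. ! queue ! fakesink")  # fakesink stands in for filesink
RTSP_LAUNCH = ("( intervideosrc channel=cam0 ! videoconvert "
               "! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )")

def configure_factory(factory):
    # gst_rtsp_media_factory_set_launch / _set_shared in the C API
    factory.set_launch(RTSP_LAUNCH)
    factory.set_shared(True)
```

One caveat: the intervideo elements carry raw video only, so with this approach the encoder has to live in the RTSP launch string rather than being shared with the recording branch.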
But if you don't need your pipeline to run while no clients are connected, you shouldn't need any of this.
You are also probably going to run into issues with your pipeline blocking, so you may need to connect the intervideosink dynamically via blocking pad probes, or use a valve, or a leaky queue.
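For the blocking part, the leaky-queue and valve variants could look like this (illustrative fragments only, assuming a tee named "t" as in the pipeline above):

```python
# Leaky queue: drop the oldest buffers instead of stalling the tee when
# the intervideosink branch is not being consumed.
LEAKY_BRANCH = ("t. ! queue max-size-buffers=1 leaky=downstream "
                "! intervideosink channel=cam0")

# Valve: keep the branch closed until a client actually connects.
VALVE_BRANCH = ("t. ! queue ! valve name=rtsp_valve drop=true "
                "! intervideosink channel=cam0")

def open_valve(pipeline):
    # Flip the valve open, e.g. from a "media-configure" handler.
    pipeline.get_by_name("rtsp_valve").set_property("drop", False)
```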
Hope that helps!
Cheers,
Michael

On 6/2/2017 3:24 AM, Liu Xin wrote: