Hey so I've been working on freeseer to add rtmp streaming functionality, but I've hit a snag.
I've been able to set up, with gst-launch and a short Python script, a pipeline that works (i.e., streams RTMP content to a server). However, for the RTMP plugin, I create an output bin that uses rtmpsink. While there doesn't appear to be anything wrong with the bin that I create (it works in the short prototype script), when it's put into the pipeline, the pipeline won't play. It seems to get as far as creating the rtmp object, and then stops.

Now, very likely this is a bug in my code, but I'm wondering: what are some good angles to approach this? Running the app with GST_DEBUG=bin*:6, it looks like my bin receives an async-start message from rtmpsink when I'm setting the state to STATE_PAUSED, but never receives an async-done message.

Thanks.

_______________________________________________
gstreamer-devel mailing list
[hidden email]
http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
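[Editorial sketch: a stalled async state change like this can often be narrowed down with GStreamer's built-in diagnostics. The commands below are illustrative, not from the thread — "app.py" stands in for the actual Freeseer entry point, and the debug categories are one reasonable choice.]

```shell
# Raise the debug level on rtmpsink itself as well as on bins, and ask
# GStreamer to write pipeline graphs so stuck elements can be inspected.
# Note: graphs are only emitted when the application calls
# gst_debug_bin_to_dot_file(); gst-launch does this on state changes.
export GST_DEBUG=rtmpsink:6,bin*:6
export GST_DEBUG_DUMP_DOT_DIR=/tmp
python app.py

# Render a dumped graph to see element states and links (needs graphviz):
dot -Tpng /tmp/*PAUSED*.dot -o pipeline.png
```

A missing async-done usually means the sink never finished prerolling, i.e. no buffer reached it. Setting async=false on the sink (a GstBaseSink property) makes the state change complete immediately, which can help distinguish "sink is broken" from "no data arrives at the sink".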
Hi Mike.
I'm also working on a similar RTMP streaming setup. Can you post your pipeline? For example:

gst-launch-0.10 ksvideosrc ! ffmpegcolorspace ! timeoverlay ! x264enc tune=zerolatency ! mux. audiotestsrc ! faac ! audio/mpeg, mpegversion=4, profile=lc ! mux. flvmux name=mux ! rtmpsink location="rtmp://192.168.0.233/live/gstreamer_test"
The following pipeline was working for me:
gst-launch videotestsrc ! queue ! x264enc ! flvmux name=muxer ! rtmpsink location='rtmp://<streaming_url>' audiotestsrc ! queue ! audioconvert ! muxer.

However, a similar pipeline was not working within our application. We discovered that the issue was that we were using a tee and splitting the pipeline to x264enc and to another bin that included an ffmpegcolorspace and an autovideosink (something I had overlooked...). The issue we were encountering was described here. We found that we could resolve it by either a) making the queue in the other output bin (before the ffmpegcolorspace and autovideosink) leaky (as described in the above link), or b) setting tune=zerolatency. I'm not sure which of the magics the zerolatency tuning uses fixes the problem, but to say something wrong in the hopes of being corrected, I would guess that one of the zerolatency settings stops x264enc from taking its time and prevents buffers from filling up (please correct me, I'm curious).

Cheers,
Mike
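[Editorial sketch of the fix described above. The pipeline shape and the RTMP URL placeholder are illustrative; the point is the leaky queue on the preview branch.]

```shell
# One source split by a tee: one branch encodes and streams via rtmpsink,
# the other shows a local preview. "leaky=downstream" makes the preview
# queue drop old buffers instead of blocking the tee when the
# autovideosink branch falls behind - otherwise the full queue stalls
# the whole pipeline while x264enc buffers frames for its lookahead.
gst-launch-0.10 videotestsrc ! tee name=t \
    t. ! queue ! x264enc ! flvmux ! rtmpsink location='rtmp://<streaming_url>' \
    t. ! queue leaky=downstream ! ffmpegcolorspace ! autovideosink
```

A blocked tee branch blocks every branch, which matches the symptom Mike describes: the sink never prerolls, so async-done never arrives.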
From what I know, tune=zerolatency sets the x264enc encoder to minimize the buffer so that there is a lag between audio and video.
I'm also working with RTMP and x264. During several tests I realized that small changes can sometimes make the difference.
On Tuesday, 27 November 2012, at 06:54 -0800, orione1974 wrote:
> From what I know, "tune = zerolatency" sets the coder x264enc to
> minimize the

To be more precise, zerolatency removes all the buffering, so that for each frame that comes in, a frame comes out. Latency is also introduced by queues; in this scenario I see no use for queues.

Nicolas
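[Editorial sketch: roughly what the tuning implies in terms of individual x264enc properties. This is an approximation for illustration, not an authoritative or exhaustive expansion of tune=zerolatency.]

```shell
# Low latency comes from disabling the encoder's frame reordering and
# lookahead: no B-frames (nothing waits for a future reference frame),
# no rate-control or sync lookahead (no buffered future frames), and
# sliced threads (threads split one frame rather than pipelining many).
gst-launch-0.10 videotestsrc ! x264enc \
    bframes=0 b-adapt=false rc-lookahead=0 sync-lookahead=0 sliced-threads=true \
    ! fakesink
```

Each of these removes a reason for x264enc to hold on to input frames before emitting output, which is why the tuning also relieved the blocked-tee situation described earlier in the thread.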
In my previous post I made a mistake. It should read:
sets the x264enc encoder to minimize the buffer so that there is NO lag between audio and video.
On Tuesday, 27 November 2012, at 08:42 -0800, orione1974 wrote:
> sets the x264enc encoder to minimize the buffer so that there is NO lag
> between audio and video.

I think you should also enable audio/video synchronisation.

Nicolas
Hi Nicolas.
Can you send me an example? In my example pipeline, are audio and video not synchronized?

I'll take this opportunity to ask a further question: if I want to change the synchronization between audio and video, is there a GStreamer component that, if needed, delays the audio with respect to the video?
On Thursday, 29 November 2012, at 07:13 -0800, orione1974 wrote:
> Hi Nicolas.
>
> Can you send me an example? In my example pipeline, are audio and video
> not synchronized?

Most likely yes, because the only reason the encoder latency can cause the audio and video to "lag" is that they are not synchronized at playback.

> I'll take this opportunity to ask a further question: if I want to change
> the synchronization between audio and video, is there a GStreamer
> component that, if needed, delays the audio with respect to the video?

Yes, synchronisation in GStreamer takes place in the sink elements (see the sync property), and is done based on the buffer timestamps. It's a bit more complex than just waiting on the audio side, though.

Nicolas
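[Editorial sketch for the delay question: GStreamer sinks inherit a ts-offset property (nanoseconds) from GstBaseSink, so one branch can be rendered later than the other by offsetting its sink. The 200 ms value and the sink elements below are illustrative.]

```shell
# Render audio 200 ms later than its timestamps indicate, effectively
# delaying audio relative to video. ts-offset comes from GstBaseSink,
# so any sink supports it; sync=true (the default) enables the
# clock-based synchronisation Nicolas describes.
gst-launch-0.10 \
    videotestsrc ! ffmpegcolorspace ! autovideosink \
    audiotestsrc ! audioconvert ! autoaudiosink ts-offset=200000000
```

If the auto* wrapper bin on your platform does not proxy the property to its inner sink, set ts-offset on the concrete sink instead (e.g. alsasink on Linux).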