Hello,
I've got an RTSP server streaming combined audio and video: video comes from the Raspberry Pi camera module element, rpicamsrc; audio comes from a USB-serial connected Ogg encoder, injected using appsrc.

rpicamsrc -> videocaps -> h264parse -> rtph264pay pt=96
appsrc -> oggdemux -> vorbisdec -> audioconvert -> audioresample -> alawenc -> rtppcmapay pt=97

It works, but the audio/appsrc branch is ~4 seconds late. How can I sync this? Any suggestions?
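For context, a pipeline like this is typically served with gst-rtsp-server through a media factory launch string. The sketch below is illustrative only: the caps, the /stream mount point and the pay0/pay1 names are assumptions, not taken from the post.

    #include <gst/gst.h>
    #include <gst/rtsp-server/rtsp-server.h>

    /* Illustrative sketch: serve the described pipeline from a
     * gst-rtsp-server media factory. Caps, mount point and payloader
     * names are placeholders. */
    static void
    setup_factory (GstRTSPServer *server)
    {
      GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points (server);
      GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new ();

      gst_rtsp_media_factory_set_launch (factory,
          "( rpicamsrc ! video/x-h264,width=1280,height=720,framerate=30/1 "
          "! h264parse ! rtph264pay pt=96 name=pay0 "
          "appsrc name=mysrc is-live=true ! oggdemux ! vorbisdec "
          "! audioconvert ! audioresample ! alawenc "
          "! rtppcmapay pt=97 name=pay1 )");

      gst_rtsp_mount_points_add_factory (mounts, "/stream", factory);
      g_object_unref (mounts);
    }

The appsrc handle would then be fetched from the constructed media (for example via the factory's "media-configure" signal) so that data can be pushed into it.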
On Mon, 2016-10-31 at 06:47 -0700, bomba wrote:
> Hello,
>
> I've got an RTSP server streaming combined audio and video:
>
> Video comes from the Raspberry Pi camera module element, rpicamsrc.
> Audio comes from a USB-serial connected Ogg encoder, injected using
> appsrc.
>
> rpicamsrc -> videocaps -> h264parse -> rtph264pay pt=96
> appsrc -> oggdemux -> vorbisdec -> audioconvert -> audioresample ->
> alawenc -> rtppcmapay pt=97
>
> It works, but the audio/appsrc branch is ~4 seconds late.
>
> How can I sync this? Any suggestions?

How do you provide data to appsrc, do you set timestamps on buffers?
What properties do you set on appsrc?

--
Sebastian Dröge, Centricular Ltd · http://www.centricular.com
I connect to the RTSP stream from another computer using gst-play-1.0. The audio comes ~4 seconds after the video; I haven't measured it more precisely than that.

For this test, I read chunks from stdin and push them by emitting "push-buffer".

I've got my appsrc_feed() function:

    if (data->sourceid == 0)
      data->sourceid = g_idle_add ((GSourceFunc) read_data, data);

That calls back my read_data() function:

    g_signal_emit_by_name (data->appsrc, "push-buffer", buffer, &ret);

I don't set timestamps.

    g_object_set (si.appsrc, "name", "mysrc", NULL);
    g_object_set (si.appsrc, "is-live", TRUE, NULL);
    g_object_set (si.appsrc, "max-bytes", BUFSIZ, NULL);  /* reading BUFSIZ chunks from stdin */
    g_object_set (si.appsrc, "min-percent", APPSRC_MIN_PERCENT, NULL);  /* 25% of BUFSIZ */
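Pieced together, the feed path presumably looks something like the sketch below. The AppData struct, the EOS handling and the buffer allocation are filled-in assumptions, not the poster's actual code; note that no timestamps are set anywhere, which is the point under discussion.

    #include <gst/gst.h>
    #include <stdio.h>

    typedef struct {
      GstElement *appsrc;
      guint       sourceid;
    } AppData;

    /* Idle callback: read one chunk from stdin and push it into appsrc. */
    static gboolean
    read_data (AppData *data)
    {
      gchar chunk[BUFSIZ];
      gsize len = fread (chunk, 1, sizeof (chunk), stdin);

      if (len == 0) {                      /* EOF on stdin */
        GstFlowReturn fret;
        g_signal_emit_by_name (data->appsrc, "end-of-stream", &fret);
        data->sourceid = 0;
        return FALSE;                      /* remove the idle source */
      }

      GstBuffer *buffer = gst_buffer_new_allocate (NULL, len, NULL);
      gst_buffer_fill (buffer, 0, chunk, len);

      GstFlowReturn ret;
      g_signal_emit_by_name (data->appsrc, "push-buffer", buffer, &ret);
      gst_buffer_unref (buffer);           /* the signal takes its own ref */

      return ret == GST_FLOW_OK;
    }

    /* "need-data" handler: start the idle feed loop when appsrc asks. */
    static void
    appsrc_feed (GstElement *appsrc, guint unused_size, AppData *data)
    {
      if (data->sourceid == 0)
        data->sourceid = g_idle_add ((GSourceFunc) read_data, data);
    }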
On Wed, 2016-11-02 at 02:36 -0700, bomba wrote:
> Sebastian Dröge-3 wrote
> > How do you provide data to appsrc, do you set timestamps on buffers?
>
> For this test, I read chunks from stdin and push them by emitting
> "push-buffer".
>
> I've got my appsrc_feed() function:
>
>     if (data->sourceid == 0)
>       data->sourceid = g_idle_add ((GSourceFunc) read_data, data);
>
> That calls back my read_data() function:
>
>     g_signal_emit_by_name (data->appsrc, "push-buffer", buffer, &ret);
>
> I don't set timestamps.

Ok but where does the data come from? Without setting timestamps,
synchronization between streams can't possibly work. You either have to
set appropriate timestamps so it goes together with the video, or if
it's live data that was captured "now", you can try the "do-timestamp"
property on appsrc (which will timestamp it with "now").

--
Sebastian Dröge, Centricular Ltd · http://www.centricular.com
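Concretely, the two options look roughly like this. These are snippets, not complete code: si.appsrc, buffer and ret are the variables from the earlier post, and pts is a placeholder for a real timestamp.

    /* Option 1: have appsrc stamp each buffer with the pipeline
     * clock's "now" at the moment it is pushed; only correct when the
     * data really is captured "now". The format property should be
     * time so the timestamps are interpreted as such. */
    g_object_set (si.appsrc,
        "do-timestamp", TRUE,
        "format", GST_FORMAT_TIME,
        NULL);

    /* Option 2: set the timestamp yourself before pushing; pts is a
     * placeholder for whatever clock the data source provides. */
    GST_BUFFER_PTS (buffer) = pts;
    g_signal_emit_by_name (data->appsrc, "push-buffer", buffer, &ret);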
The media comes from a hardware Ogg encoder, and it has its own timestamps. The data comes out of the hardware encoder ~1 second late, so setting "do-timestamp" would not produce the desired result. I was thinking about "min-latency" and "max-latency" instead.
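(Those do exist as properties on appsrc, in nanoseconds. A minimal, untested sketch of the idea, with the 1 s figure taken from the encoder delay above and 2 s as arbitrary headroom:)

    /* Untested idea: declare the encoder's ~1 s delay as extra
     * latency on appsrc; both properties are in nanoseconds. */
    g_object_set (si.appsrc,
        "min-latency", (gint64) GST_SECOND,
        "max-latency", (gint64) (2 * GST_SECOND),
        NULL);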
On Thu, 2016-11-03 at 07:27 -0700, bomba wrote:
> Sebastian Dröge-3 wrote
> > Ok but where does the data come from? Without setting timestamps,
> > synchronization between streams can't possibly work. You either have
> > to set appropriate timestamps so it goes together with the video, or
> > if it's live data that was captured "now", you can try the
> > "do-timestamp" property on appsrc (which will timestamp it with
> > "now").
>
> The media comes from a hardware Ogg encoder, and it has its own
> timestamps. The data comes out of the hardware encoder ~1 second late,
> so setting "do-timestamp" would not produce the desired result. I was
> thinking about "min-latency" and "max-latency" instead.

And then timestamp the buffers accordingly with the timestamps coming
from the hardware encoder translated to the GStreamer pipeline's running
time (and especially to the clock used by the pipeline). Those timestamps
are still 1s "late" though; the min-latency setting compensates for that
(and ensures that video is in sync with that).

--
Sebastian Dröge, Centricular Ltd · http://www.centricular.com
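A sketch of that translation (illustrative only: how the encoder timestamp enc_ts is obtained, and that it is already in nanoseconds, are assumptions about this particular encoder):

    #include <gst/gst.h>

    /* Anchor the hardware encoder's timestamps to the pipeline's
     * running time: record the running time and the encoder timestamp
     * of the first buffer, then keep the encoder's spacing for all
     * later buffers. Assumes the pipeline is PLAYING so a clock is
     * available. */
    static GstClockTime first_running_time = GST_CLOCK_TIME_NONE;
    static GstClockTime first_enc_ts = GST_CLOCK_TIME_NONE;

    static void
    push_with_translated_ts (GstElement *appsrc, GstBuffer *buffer,
                             GstClockTime enc_ts)
    {
      if (first_running_time == GST_CLOCK_TIME_NONE) {
        GstClock *clock = gst_element_get_clock (appsrc);
        first_running_time =
            gst_clock_get_time (clock) - gst_element_get_base_time (appsrc);
        gst_object_unref (clock);
        first_enc_ts = enc_ts;
      }

      GST_BUFFER_PTS (buffer) = first_running_time + (enc_ts - first_enc_ts);

      GstFlowReturn ret;
      g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
      gst_buffer_unref (buffer);
    }

With the PTS anchored like this and min-latency covering the ~1 s the encoder is behind, the server can schedule the audio against the video correctly.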