I have successfully implemented the following recording pipeline in C code, which captures audio and video from my computer's mic and camera and creates a movie file. I'm using GStreamer 1.6.4 with your Windows pre-built binaries, Windows 7 x86, and Visual Studio 2010.

  appsrc is-live=TRUE do-timestamp=TRUE min-latency=0 !
      videorate ! videoconvert ! vp8enc ! queue ! mux.
  directsoundsrc device-name="Microphone Array" ! audiorate !
      audioconvert ! opusenc ! queue ! mux.
  oggmux name=mux ! filesink location=C:/movie.ogg sync=TRUE
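In C this corresponds to roughly the following (a sketch only; my actual code may create and link the elements differently, and the name=vidsrc on the appsrc is added here just so it can be looked up afterwards):

  GError *error = NULL;
  GstElement *pipeline = gst_parse_launch (
      "appsrc name=vidsrc is-live=TRUE do-timestamp=TRUE min-latency=0 ! "
      "videorate ! videoconvert ! vp8enc ! queue ! mux. "
      "directsoundsrc device-name=\"Microphone Array\" ! audiorate ! "
      "audioconvert ! opusenc ! queue ! mux. "
      "oggmux name=mux ! filesink location=C:/movie.ogg sync=TRUE", &error);
  GstElement *appsrc = gst_bin_get_by_name (GST_BIN (pipeline), "vidsrc");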
Since it's not included with your standard plugins, I'm using your source code for the directsoundsrc element directly in my Visual Studio project, so it's compiled in with my other source files and I register it as a static plugin at the beginning of my program. The mic audio output format is 24-bit, 2 channels, at a 48000 Hz sample rate. According to my research, the directsoundsrc element should be providing the clock for the pipeline; its "provide-clock" property is true by default.
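The registration call itself looks roughly like this (a sketch; the exact name and visibility of the init function in the directsoundsrc sources may differ from what I show here):

  #include <gst/gst.h>

  /* Init function from the directsoundsrc plugin sources (name illustrative;
     it may need to be made non-static so it can be referenced here). */
  extern gboolean directsoundsrc_plugin_init (GstPlugin * plugin);

  static gboolean
  register_directsoundsrc (void)
  {
    return gst_plugin_register_static (GST_VERSION_MAJOR, GST_VERSION_MINOR,
        "directsoundsrc", "DirectSound capture source",
        directsoundsrc_plugin_init, "1.6.4", "LGPL",
        "directsoundsrc", "GStreamer", "https://gstreamer.freedesktop.org/");
  }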
I have a separate thread that pushes 640 x 480 raw video frames from the camera into the appsrc at a fixed rate of 30 frames per second. I followed the appsrc example given in section 19.2.1.3 of your GStreamer Application Development Manual.
I also provide a user input to send an EOS to the pipeline for a graceful shutdown. Here are the appsrc settings:
  g_object_set (G_OBJECT (appsrc), "caps",
      gst_caps_new_simple ("video/x-raw",
          "format", G_TYPE_STRING, "BGR",
          "width", G_TYPE_INT, 640,
          "height", G_TYPE_INT, 480,
          "framerate", GST_TYPE_FRACTION, 30, 1, NULL),
      NULL);
  g_object_set (G_OBJECT (appsrc), "stream-type", 0,
      "format", GST_FORMAT_TIME, NULL);

Here's how the timestamps are handled in the appsrc "need-data" callback, where the timestamp variable is initialized to zero:

  GST_BUFFER_PTS (buffer) = timestamp;
  // set for 30 fps
  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 30);
  timestamp += GST_BUFFER_DURATION (buffer);

The program compiles and executes OK, and a movie file is created. I have successfully created several 10 second test movie files.
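For completeness, the whole callback is essentially the following sketch (simplified; how the raw BGR frame reaches the callback is only illustrative here):

  static void
  need_data_cb (GstElement * appsrc, guint unused_size, gpointer user_data)
  {
    static GstClockTime timestamp = 0;
    const guint8 *frame_data = user_data;     /* latest raw BGR frame (illustrative) */
    const gsize frame_size = 640 * 480 * 3;   /* BGR: 3 bytes per pixel */
    GstBuffer *buffer;
    GstFlowReturn ret;

    buffer = gst_buffer_new_allocate (NULL, frame_size, NULL);
    gst_buffer_fill (buffer, 0, frame_data, frame_size);

    GST_BUFFER_PTS (buffer) = timestamp;
    GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 30);
    timestamp += GST_BUFFER_DURATION (buffer);

    g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
    gst_buffer_unref (buffer);
  }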
I'm using the following pipeline with your launch tool for playback:

  gst-launch-1.0 --gst-debug=*:3 filesrc location=C:/movie.ogg !
      oggdemux name=demux
      demux. ! queue ! vp8dec ! videoconvert ! d3dvideosink
      demux. ! queue ! opusdec ! audioconvert ! directsoundsink
Playback also runs OK without errors; however, the audio and video are not in sync. Both start at the same time, but they are never synchronized: the video looks fine but plays very fast (like a fast-forward) and then stops, while the audio plays all the way through from the beginning, correctly at normal speed. I tried moving both queues to after the decoders and adding videorate and audiorate after their respective queues, with no difference in the playback results.
I also tried

  gst-launch-1.0 playbin uri=file:///C:/movie.ogg

and got the same results as with the playback pipeline above. If I run the following with the same appsrc conditions as above, the video frames are displayed correctly in real time:

  appsrc is-live=TRUE ! videorate ! videoconvert ! d3dvideosink
I have searched your developer's website quite a bit over the last few days and gleaned a few tidbits, which I have already incorporated into my recording program (as shown above) as best I understand things right now. I think the problem might have something to do with timestamps on the record side, but at this point it's not clear what to do next. It appears that the movie file is already broken before I try to play it back; nothing I have tried on the playback side so far has corrected the sync problem. I would greatly appreciate any help, and I can give more info if needed.
Regards,
Bill Salibrici
On Wednesday, 18 May 2016 at 18:16 +0000, William Salibrici wrote:
> appsrc is-live=TRUE do-timestamp=TRUE min-latency=0 !

Using do-timestamp here can only prevent proper A/V sync. You need to provide adequate timestamps.

Nicolas
Nicolas,
Thank you for your reply. Can you give me some ideas of how to provide adequate timestamps? What does that look like in this case? I have read Chapter 14, "Clocks and synchronization in GStreamer", of the manual, but I'm not clear on how to apply it to my recording example. In my example, the need-data callback provides timestamps for the video buffers, and the directsoundsrc provides the clock for the pipeline and, I'm assuming, the timestamps for the audio buffers as well. What else would you recommend I do here? Thanks again for your help - it is much appreciated!

Bill
On Wednesday, 18 May 2016 at 19:45 +0000, William Salibrici wrote:
> Can you give me some ideas of how to provide adequate timestamps?

The timestamps should be the time at which each frame was captured according to the pipeline clock, minus the configured base time.

Nicolas
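For illustration, that advice in the need-data callback would look roughly like this (a sketch only; it assumes the pipeline handle is reachable from the callback, and the 30 fps duration is carried over from the original code):

  GstClock *clock = gst_element_get_clock (GST_ELEMENT (pipeline));
  if (clock) {
    GstClockTime now = gst_clock_get_time (clock);
    GstClockTime base_time = gst_element_get_base_time (GST_ELEMENT (pipeline));

    /* Stamp the buffer with its capture time on the pipeline clock,
       relative to when the pipeline started running. */
    GST_BUFFER_PTS (buffer) = now - base_time;
    GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 30);
    gst_object_unref (clock);
  }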