Hi,
I have a server-client video streaming setup using GStreamer and RTP. So basically:

[source storage] --> [server] ---------> [client] --> [frames rendered on screen]

Is there any way to know, for each rendered frame, what the original playing time was in the server's video source? For example:

- At 10 fps, after 300 seconds of playing, the "playing time" would be 300.0 seconds.
- But if the network is slow, the playing time could very well be only 276.4 seconds.
- Or maybe some frames could be lost if using UDP.
- Also, if the source storage has a variable framerate, I cannot use "10 fps" or any other average fps for any calculations.

How could this be achieved? Thanks!

Saludos,
Bruno González

_______________________________________________
Jabber: stenyak AT gmail.com
http://www.stenyak.com

_______________________________________________
gstreamer-devel mailing list
[hidden email]
http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
On 01/24/2012 03:49 PM, Bruno Gonzalez wrote:
Hi,

video is synchronized to the clock. That is, after 30 s of playing time the video is more or less at 30 s. If the network is slow, it is likely that some frames have been skipped.

Stefan
Thanks for the input.
Unfortunately, I still don't have any hints as to how to get the play time in the original source file. Doing back-of-the-envelope calculations on the client is not an option; I need the actual playing time, with a precision of hundredths of a second. Any ideas?
On Tue, Jan 24, 2012 at 16:08, Stefan Sauer <[hidden email]> wrote:
Saludos,
Bruno González
Hi,
When I record an H.264 video and mux it using mp4mux (from plugins-good-0.10.30/gst/isomp4), on playback VLC gives me the following warning:

[00000292] main input debug: `testMeVideo.mp4' successfully opened
[00000304] mp4 demuxer debug: track[Id 0x1] using Sync Sample Box (stss)
[00000304] mp4 demuxer debug: stts gives 1 --> 0 (sample number)
[00000304] mp4 demuxer debug: track[Id 0x2] does not provide Sync Sample Box (stss)
[00000306] ffmpeg decoder warning: AVC: Consumed only 90 bytes instead of 11574 (h264@0x848d7f0)
[00000306] ffmpeg decoder warning: AVC: nal size -298012672 (h264@0x848d7f0)
[00000306] ffmpeg decoder warning: no frame! (h264@0x848d7f0)
[00000306] ffmpeg decoder warning: cannot decode one frame (11578 bytes)
[00000306] ffmpeg decoder warning: AVC: Consumed only 90 bytes instead of 11574 (h264@0x848d7f0)
[00000306] ffmpeg decoder warning: AVC: nal size -298012672 (h264@0x848d7f0)
[00000306] ffmpeg decoder warning: no frame! (h264@0x848d7f0)
[00000306] ffmpeg decoder warning: cannot decode one frame (11578 bytes)
[00000338] faad decoder warning: decoded zero sample
[00000306] ffmpeg decoder warning: warning: first frame is no keyframe (h264@0x848d7f0)

So, no keyframe.

Digging into the plugin code, in gstqtmux.c...
At the top of gst_qt_mux_add_buffer, we have this little snip:

    last_buf = pad->last_buf;
    if (G_UNLIKELY (qtmux->dts_method == DTS_METHOD_REORDER)) {
      buf = gst_qt_mux_get_asc_buffer_ts (qtmux, pad, buf);
      if (!buf && !last_buf) {
        GST_DEBUG_OBJECT (qtmux, "no reordered buffer");
        return GST_FLOW_OK;
      }
    }

    if (last_buf == NULL) {
    #ifndef GST_DISABLE_GST_DEBUG
      if (buf == NULL) {
        GST_DEBUG_OBJECT (qtmux, "Pad %s has no previous buffer stored and "
            "received NULL buffer, doing nothing",
            GST_PAD_NAME (pad->collect.pad));
      } else {
        GST_LOG_OBJECT (qtmux,
            "Pad %s has no previous buffer stored, storing now",
            GST_PAD_NAME (pad->collect.pad));
      }
    #endif
      pad->last_buf = buf;
      goto exit;
    } else
      gst_buffer_ref (last_buf);

It looks to me that we *always* throw away the first buffer coming in, since there is no last_buf yet. Since the first frame I encode is my IDR frame, and it gets tossed, I reckon that this would be the reason VLC is whining about no keyframe.

Am I barking up the right tree here? I want to ask before I start hacking the plugin, since this seems fundamental, and qtmux has been around for a while, so I'm guessing I must be missing something.

Any insight would be appreciated.

Thanks,
Paul
Nevermind. I see towards the end of gst_qt_mux_add_buffer, we are actually pushing last_buf, not the current buf. So, not sure why I'm not getting the keyframe, with my last hypothesis shot to heck!

Paul Stuart wrote:
> Hi,
> When I record an H.264 video and mux it using mp4mux (from
> plugins-good-0.10.30/gst/isomp4), on playback VLC gives me the following
> warning:
> [snip]
> It looks to me that we *always* throw away the first buffer coming in
> since there is no last_buf yet. Since the first frame I encode is my IDR
> frame, and it gets tossed, I reckon that this would be the reason VLC is
> whining about no keyframe.
> [snip]
On 01/24/2012 05:24 PM, Bruno Gonzalez wrote:
> Thanks for the input.

Use queries to get the duration and current playback position.

Stefan
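In C, the queries Stefan mentions look roughly like the sketch below (GStreamer 0.10 signatures, matching the plugin versions discussed in this thread). This is untested against a live pipeline; both calls may legitimately fail and return FALSE, e.g. duration on a live source:

```c
#include <gst/gst.h>

/* Sketch: ask an element (or the whole pipeline) for its current
 * position and duration in time format. */
static void
report_position (GstElement * pipeline)
{
  GstFormat fmt = GST_FORMAT_TIME;
  gint64 pos, dur;

  if (gst_element_query_position (pipeline, &fmt, &pos))
    g_print ("position: %" GST_TIME_FORMAT "\n", GST_TIME_ARGS (pos));

  if (gst_element_query_duration (pipeline, &fmt, &dur))
    g_print ("duration: %" GST_TIME_FORMAT "\n", GST_TIME_ARGS (dur));
}
```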
I expect I cannot get duration, since the RTSP source can sometimes be live video (unknown duration).
However, when I try to get the position using a Query on the rtspsrc element of the pipeline, I always get "-1" as the result. I've tried a position query with bytes, buffers, time, percentage and everything, but none returns a usable value.
Furthermore, the rtspsrc clock is not usable either, since it's just a counter of time passed on the client side. So when playing at 2x, it doesn't match the server time, nor does it account for jitter.
What can be influencing the queries to return -1? Am I doing something wrong? The code I'm using looks like this (using gstreamer-sharp): Element rtspsrc = m_pipeline.GetByName("rtspsrc"); Query query = Query.NewPosition(Gst.Format.Time); rtspsrc.Query(query); long value; Gst.Format format; query.ParsePosition(out format, out value); writeTrace("Query results: " + format + ", " + value); Is that correct? On Wed, Jan 25, 2012 at 10:07, Stefan Sauer <[hidden email]> wrote:
Saludos,
Bruno González
On 01/26/2012 10:50 AM, Bruno Gonzalez wrote:
> I expect I cannot get duration, since the RTSP source can sometimes be live video (unknown duration).

Yes, that can be the case. Well, investigate the code and figure out why you get a -1. In most cases reporting a position should be possible. Trying the different formats is less useful, as e.g. percent is calculated from position and duration. Dump a pipeline graph of your pipeline to see which elements are used, and check their query implementations to see why they don't answer the position query.

Stefan
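Dumping the graph Stefan refers to can be done with the debug utilities (assuming a 0.10 build with debugging enabled): set the GST_DEBUG_DUMP_DOT_DIR environment variable and call the dump macro from the application, then render the .dot file with graphviz:

```c
#include <gst/gst.h>

/* Writes <GST_DEBUG_DUMP_DOT_DIR>/client-pipeline.dot describing the
 * elements, pads and negotiated caps in the bin. Render it with
 * graphviz, e.g.: dot -Tpng client-pipeline.dot > client-pipeline.png */
static void
dump_graph (GstElement * pipeline)
{
  GST_DEBUG_BIN_TO_DOT_FILE (GST_BIN (pipeline),
      GST_DEBUG_GRAPH_SHOW_ALL, "client-pipeline");
}
```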
On Sun, Jan 29, 2012 at 2:47 PM, Stefan Sauer <[hidden email]> wrote:
I was trying to query the element closest to the origin of the data, which is rtspsrc in my client pipeline. However, if I query the whole pipeline, all queries work. My only problem is that the reported time is the timestamp on the client side, not the exact timestamp of the original video source. For example, when playing fast-forward or fast-reverse ("scale" != 1 on the RTSP element, in the server pipeline), the reported times are not what they should be. E.g.: after 10 seconds of playing at 2x, the reported time is 10 seconds instead of 20 seconds.
Is there any way to get the actual time on the client side? Thanks for any hints.