I know this has been talked about a thousand times on this list, and I have read most of those threads, but I am still seeing what I think is strange behavior. Here's what I'm trying to do: I want to intercept every frame of an H.264 30 fps stream (constant rate, life is good) and get its frame number and timestamp. As I understand the docs, those should be in the GstBuffer.dts and GstBuffer.offset fields.

I was going down the pad probe route but found that the identity element is, per the docs, a much easier and better route. So my first pipeline looked like this:

... ! omxh264enc bitrate=10000000 ! video/x-h264, stream-format=(string)byte-stream ! h264parse ... ! identity name=nvcamid0 ! muxer

Then I connected my callback to the "handoff" signal and saw the following:

dts    : offset
0      : 816445297
106961 : 839200766
130469 : 849786717
151008 : 876858645
165810 : 909811526

I don't quite understand what the offset is here. I don't see how it can be a frame number, but it is increasing monotonically (it doesn't look like a per-buffer byte offset either). Also, offset_end is MAXUINT64 (essentially NONE).

Another thread with a similar request as mine can be found here:

In that case, the pipeline was augmented to push the h264parse output back into raw video with frame alignment. I tried that via:

... ! omxh264enc bitrate=10000000 ! video/x-h264, stream-format=(string)byte-stream ! h264parse ... ! tee name=tp tp. ! queue ! muxer.video_0 \ tp. ! queue ! video/x-h264, stream-format=avc ! identity name=nvcamid0

And now of course I see no callback being called and nothing gets recorded.

Anyway, can someone kindly explain what the offset means above? And do I need to push my h264parse output back into a raw video format to get each frame number? Again, my goal is to capture the timestamp and frame number (and soon a geotag) as a separate stream and eventually write it to disk as a metadata file.

Thanks!

-aps
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
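One plausible reason the second pipeline produces no callbacks is that the identity branch of the tee never terminates in a sink, so the pipeline cannot preroll. The following is a sketch only, not a verified fix: the elided "..." portions are kept as-is from the pipeline above, and the trailing fakesink is the assumed correction.

```shell
# Sketch, assuming the elements from the original pipeline (omxh264enc, muxer).
# Every tee branch must end in a sink; fakesink sync=false lets the tap
# branch consume buffers without throttling the recording branch.
gst-launch-1.0 ... ! omxh264enc bitrate=10000000 ! \
    'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! \
    tee name=tp \
    tp. ! queue ! muxer.video_0 \
    tp. ! queue ! identity name=nvcamid0 ! fakesink sync=false
```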
On Fri, Jun 14, 2019, 11:07 AM pisymbol . <[hidden email]> wrote:
I guess I'll take a look at the source. But does anyone know what dts/offset represent at this stage in the pipeline? My interest, again, is per frame.

-aps
The DTS is the "data timestamp", not really something you care about. What you care about is the PTS. The offset is the frame number for decoded frames, not encoded ones; I am not sure it is standardized for encoded frames at all.

Thibault

On Fri, Jun 14, 2019 at 6:12 PM pisymbol . <[hidden email]> wrote:
s/data timestamp/decoding timestamp/

On Fri, Jun 14, 2019 at 6:42 PM Thibault Saunier <[hidden email]> wrote:
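To make the decoding-vs-presentation distinction concrete, here is a small pure-Python sketch (no GStreamer required). The 30 fps duration matches the stream discussed above; the GOP layout (I P B B) is hypothetical.

```python
FPS = 30
DUR_NS = 1_000_000_000 // FPS  # per-frame duration on GStreamer's nanosecond clock

# Hypothetical decode order for a tiny GOP with B-frames: the P frame is
# decoded before the B frames that reference it, so its DTS comes earlier
# even though it is presented last.
decode_order = ["I0", "P3", "B1", "B2"]
display_index = {"I0": 0, "B1": 1, "B2": 2, "P3": 3}

for dts_index, name in enumerate(decode_order):
    dts = dts_index * DUR_NS            # decoding timestamp: decode order
    pts = display_index[name] * DUR_NS  # presentation timestamp: display order
    print(f"{name}: dts={dts:>9} pts={pts:>9}")
```

DTS increases monotonically in decode order, while PTS is monotonic only in display order, which is why PTS is the timestamp to record per frame.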
On Fri, Jun 14, 2019 at 6:42 PM Thibault Saunier <[hidden email]> wrote:
Alrighty then. So you are telling me there is no way to intercept each frame? I find that hard to believe, actually.

-aps
That is not what I am saying. I am just saying that you do not have the frame number set on encoded buffers.

On Fri, Jun 14, 2019 at 7:33 PM pisymbol . <[hidden email]> wrote:
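For reference, the MAXUINT64 value seen on encoded buffers is GStreamer's "offset not set" sentinel, GST_BUFFER_OFFSET_NONE (defined as an all-ones guint64). A tiny pure-Python sketch of checking for it; describe_offset is a hypothetical helper, not GStreamer API:

```python
GST_BUFFER_OFFSET_NONE = 2**64 - 1  # guint64 all-ones sentinel, i.e. MAXUINT64

def describe_offset(offset):
    """Interpret a GstBuffer.offset value; hypothetical helper, not GStreamer API."""
    if offset == GST_BUFFER_OFFSET_NONE:
        return "offset not set (GST_BUFFER_OFFSET_NONE)"
    return f"frame/sample offset {offset}"

print(describe_offset(2**64 - 1))  # what the encoded buffers above carried
print(describe_offset(42))
```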
On Fri, Jun 14, 2019 at 7:40 PM Thibault Saunier <[hidden email]> wrote:
But if I try to dump the offset before encoding (i.e. adding a "tap" right after capturing from nvcamerasrc), I get max(uint64), which doesn't make sense to me given what you just said. And the other thread I referenced seemed to decode out the raw frames to get them, no? I can do that (I fixed my tee command), but the frame numbers still look funny to me (I expect 1, 2, 3, 4, etc.).

-aps
On Fri, Jun 14, 2019 at 7:46 PM pisymbol . <[hidden email]> wrote:
Actually, even simpler: the PTS timestamps look good. Can I just number each timestamped frame 1, 2, 3, etc.? They should correlate to the encoded stream, no?

-aps
After h264parse the stream is "parsed", so 1 buffer == 1 frame; so yes, that would work in simple cases.

On Fri, Jun 14, 2019 at 7:55 PM pisymbol . <[hidden email]> wrote:
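The numbering suggestion above can be sketched in plain Python. The PTS values here are hypothetical nanosecond timestamps, spaced ~33.33 ms apart as a constant 30 fps stream would produce; rounding absorbs small timestamp jitter.

```python
FPS = 30
FRAME_DUR_NS = 1_000_000_000 / FPS  # ~33.33 ms per frame in nanoseconds

def frame_number(pts_ns, first_pts_ns):
    """Map a buffer's PTS to a 1-based frame number, assuming a constant
    30 fps stream; rounding absorbs small per-frame timestamp jitter."""
    return round((pts_ns - first_pts_ns) / FRAME_DUR_NS) + 1

# Hypothetical PTS values: three frames one frame-duration apart,
# the last with ~120 ns of jitter.
pts_values = [816445297, 849778630, 883112084]
first = pts_values[0]
print([frame_number(p, first) for p in pts_values])  # -> [1, 2, 3]
```

This only holds while the stream really is constant-rate; a dropped or duplicated frame would show up as a gap or repeat in the numbering, which is arguably useful for a metadata file anyway.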
On Fri, Jun 14, 2019 at 8:00 PM Thibault Saunier <[hidden email]> wrote:
Thanks again, Thibault! That's what I thought, and it seems to be working now. Happy day!!! :-)

-aps