Calculate the real world time at which a frame was captured?


George Hawkins
If I use the filesrc element to play back a video file, I can retrieve data like this on a per-frame basis:

    index=0, timestamp=832000000, stream-time=0
    index=1, timestamp=870000000, stream-time=38000000
    ...

But what is the first timestamp above relative to? How can I retrieve a real-world start time that I can combine with this timestamp to calculate the real-world time at which the frame was captured?

I control the original capture process as well as the playback, but I haven't found a way to record, and later recover, the start time I need to combine with these timestamps.

Currently, I capture the video file like so:

    gst-launch-1.0 nvarguscamerasrc \
        ! 'video/x-raw(memory:NVMM), width=3280, height=2464, framerate=21/1' \
        ! nvjpegenc \
        ! matroskamux \
        ! filesink location=out.mkv

I can change the container and video format if that makes it easier to encode and recover the start time later. I can obviously get an _approximate_ start time by recording the time at which the pipeline started - but I'd prefer something more precise (and _if possible_ I'd prefer the value to be encoded somewhere in the resulting video file rather than stored separately).
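
For concreteness, the approximate approach is just a sketch like this (the recorded value leads the first frame's actual capture by the pipeline's startup latency, and it lives outside the video file):

    # Record the wall clock (UTC, nanoseconds since the epoch) just before
    # launching, in a sidecar file next to the recording:
    date -u +%s%N > out.mkv.start
    gst-launch-1.0 nvarguscamerasrc \
        ! 'video/x-raw(memory:NVMM), width=3280, height=2464, framerate=21/1' \
        ! nvjpegenc \
        ! matroskamux \
        ! filesink location=out.mkv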

I've used GST_DEBUG to see whether anything resembling a start time shows up when replaying the file, but I didn't spot anything.

And if I look at the file with a tool like mediainfo the only date I see is:

    Encoded date : UTC 2019-07-24 19:20:42

TL;DR - when recording my video file, how do I capture and later recover a value that can be combined with a relative timestamp value (like the one for index 0 above) to give the real-world time at which the frame was captured?

For reference, I retrieved the timestamp values above from the command line like so:

    # multifilesink posts an element message per buffer when
    # post-messages=true; GST_DEBUG=GST_BUS:5 logs those messages:
    $ GST_DEBUG=GST_BUS:5 gst-launch-1.0 filesrc location=out.mkv \
        ! matroskademux \
        ! multifilesink post-messages=true location=/dev/null &> gst-bus-debug.log

    # Extract the per-frame index/timestamp/stream-time fields:
    $ sed -n 's/.*gst_bus_source_dispatch.*, \(index=.*\)/\1/p' gst-bus-debug.log

Regards,

/George


Re: Calculate the real world time at which a frame was captured?

pisymbol .


On Sun, Jul 28, 2019 at 8:10 AM George Hawkins <[hidden email]> wrote:
> [...]
>
> TL;DR - when recording my video file, how do I capture and later recover a value that can be combined with a relative timestamp value (like the one for index 0 above) to give the real-world time at which the frame was captured?

Why can't you generate an epoch (UTC) timestamp on a per-frame basis? That's what I do with 'nvcamerasrc', at least: I read the timestamp it stores from the ISP, and I also generate my own epoch timestamp when the frame is received in the pipeline (see identity's "handoff" signal).

I mean, you also have to define "start time" precisely here.
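
A minimal sketch of the handoff approach, assuming PyGObject (the pipeline string is illustrative, not your capture pipeline; identity emits "handoff" out of the box since its signal-handoffs property defaults to true):

    #!/usr/bin/env python3
    import time
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)

    # Tap frames on their way to the sink via an identity element.
    pipeline = Gst.parse_launch(
        "videotestsrc is-live=true ! identity name=tap ! fakesink"
    )

    def on_handoff(element, buf):
        # buf.pts is the buffer's pipeline timestamp in nanoseconds;
        # time.time() is the host wall clock at handoff, which trails the
        # sensor's actual capture instant by the upstream latency.
        print(f"pts={buf.pts} wallclock={time.time():.6f}")

    pipeline.get_by_name("tap").connect("handoff", on_handoff)
    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()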

-aps


Re: Calculate the real world time at which a frame was captured?

George Hawkins

On Mon, Jul 29, 2019 at 2:58 PM "pisymbol ." <[hidden email]> wrote:
> Why can't you generate an epoch (UTC) timestamp on a per-frame basis? That's what I do with 'nvcamerasrc', at least: I read the timestamp it stores from the ISP, and I also generate my own epoch timestamp when the frame is received in the pipeline (see identity's "handoff" signal).
>
> I mean, you also have to define "start time" precisely here.
>
> -aps

Thanks for the reply, Aps - and sorry for being slow in following up. As GStreamer is used in quite a lot of machine-vision projects, I thought this would be a common requirement: record video and then, at a later stage, work with the video alongside time signals from another source.

E.g. one might continuously record video of the night sky and later want to view particular frames when some other information source indicates there might have been something interesting in-frame at, say, 3:24 AM.

So I didn't want to reinvent the wheel. But in the end, I did as you suggested and coded up a new element, using identity as a template.

The learning curve for getting started with coding GStreamer elements is fairly steep - but ultimately, the actual code required to implement what I wanted was trivial.

On the off-chance that it might be useful to someone else, the results can be found here: https://github.com/george-hawkins/gst-absolutetimestamps

The README is far longer than the few lines of code required to print out the timestamps I wanted :)
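
For anyone who would rather not build a full element, a pad probe gives the same per-buffer hook - a minimal sketch assuming PyGObject (this is not the element from the repository above):

    #!/usr/bin/env python3
    import time
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)

    # Illustrative stand-in for a capture pipeline.
    pipeline = Gst.parse_launch("videotestsrc is-live=true ! fakesink name=sink")

    def on_buffer(pad, info):
        buf = info.get_buffer()
        # Pair each buffer's pipeline timestamp with the host wall clock.
        print(f"pts={buf.pts} wallclock={time.time():.6f}")
        return Gst.PadProbeReturn.OK

    pipeline.get_by_name("sink").get_static_pad("sink").add_probe(
        Gst.PadProbeType.BUFFER, on_buffer
    )

    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()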


Re: Calculate the real world time at which a frame was captured?

Nicolas Dufresne


On Sun, Aug 4, 2019 at 6:25 PM, George Hawkins <[hidden email]> wrote:
> [...] E.g. one might continuously record video of the night sky and later want to view particular frames when some other information source indicates there might have been something interesting in-frame at, say, 3:24 AM.

TimeCode can be used for that. I believe there is an element to insert wall time as a TimeCode, but I'm no expert.
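
The element in question is probably timecodestamper. A sketch, with the caveat that its properties have changed across GStreamer releases, so check gst-inspect-1.0 timecodestamper for your version (source=rtc exists in recent releases; whether the timecode survives into the file depends on the muxer - qtmux can write a timecode track):

    gst-launch-1.0 videotestsrc num-buffers=100 \
        ! video/x-raw,framerate=21/1 \
        ! timecodestamper source=rtc \
        ! x264enc \
        ! qtmux \
        ! filesink location=out.mov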

Last time I needed that, I simply encoded the start time in the filename and parsed it in my app to offset the playback position.
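
In shell terms, the idea is something like this (hypothetical naming scheme; the recorded start still precedes the first frame's capture by the pipeline's startup latency):

    # Carry the start time (UTC, nanoseconds since the epoch) in the name:
    START=$(date -u +%s%N)
    gst-launch-1.0 videotestsrc num-buffers=100 \
        ! x264enc \
        ! matroskamux \
        ! filesink location="recording-${START}.mkv"
    # On playback: frame wall time ~= START + stream-time (both in ns),
    # i.e. a constant offset rather than interpolation.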




Re: Calculate the real world time at which a frame was captured?

pisymbol .


On Sun, Aug 4, 2019 at 7:09 PM Nicolas Dufresne <[hidden email]> wrote:


> TimeCode can be used for that. I believe there is an element to insert wall time as a TimeCode, but I'm no expert.
>
> Last time I needed that, I simply encoded the start time in the filename and parsed it in my app to offset the playback position.

But then you need to interpolate, right?

I wound up tapping the pipeline using identity's handoff, which works fine... and it flushes a meta file with frame number -> timestamp, geocode, etc. In the future, I think I would like to add a data stream to the MKV instead - that seems more elegant.

-aps
