Hi Folks,
I am looking to create a 'time shift pipeline' using GStreamer. By this I mean recording a scene with cameras, writing the resulting bitstream to disk, and then re-reading the same video (after some delay) for further OpenCV processing. We are looking to record videos on Tegra hardware (TX1/TX2) and then process the recorded videos further. Could someone please give any pointer / GStreamer code for a time shift pipe? Thanks
Hi,
On Thu, Aug 24, 2017 at 4:55 PM, pchaurasia <[hidden email]> wrote:
> Could someone please give any pointer / gstreamer code for a time shift pipe?

Fluendo seems to have it:
https://gstconf.ubicast.tv/videos/time-shifting-with-gstreamer/
https://core.fluendo.com/gstreamer/svn/trunk/gst-fluendo-timeshift/
https://github.com/kkonopko/gst-fluendo-timeshift

I haven't used it, so no idea how it works.
--
yashi
In reply to this post by pchaurasia
Hi, how large is your time shift? You could use splitfilesrc / splitfilesink in two separate pipelines. If the time shift is short, you could do it with a queue (setting its threshold parameters) plus a tee to save the stream to disk.
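For illustration, a rough, untested sketch of the queue + tee idea: one branch is held back by the queue's min-threshold-time, the other is written to disk. The source, sink, delay and file name here are placeholders to adapt to your setup:

    # tee the live source; delay one branch by 10 s, record the other
    gst-launch-1.0 -e videotestsrc is-live=true ! tee name=t \
        t. ! queue max-size-buffers=0 max-size-bytes=0 max-size-time=0 \
                   min-threshold-time=10000000000 ! autovideosink sync=false \
        t. ! queue ! x264enc ! matroskamux ! filesink location=capture.mkv

Setting the queue's max-size-* properties to 0 makes it unbounded, so a long delay only costs memory; min-threshold-time (in nanoseconds) is what provides the shift.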
Regards
Hi Keith,
Our time shift is of variable duration. Cameras are our producers and the 'OpenCV processing' is our consumer. The consumer is typically 3x slower. We want to operate the pipeline for at least 30 minutes, so we need a good amount of buffering. Are there any examples of splitfilesrc / splitfilesink that I can learn from? Thanks,
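For what it's worth, a speculative sketch of a split-file recorder and a separate reader, using splitmuxsink / splitmuxsrc (the closest current elements I'm aware of for this job). The camera source, encoder, segment length and file pattern are assumptions and untested on Tegra:

    # recorder: write 60-second, self-contained MP4 segments
    gst-launch-1.0 -e nvcamerasrc fpsRange="30.0 30.0" \
        ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' \
        ! omxh264enc ! h264parse \
        ! splitmuxsink location=/home/ubuntu/seg%05d.mp4 max-size-time=60000000000

    # reader, started later: stitch the segments back into one stream
    gst-launch-1.0 splitmuxsrc location=/home/ubuntu/seg*.mp4 ! decodebin ! nvoverlaysink

As far as I know, splitmuxsrc only scans the files that exist when it starts, so if recording is still in progress the reader may need to be restarted (or driven per segment) to pick up newer files.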
In reply to this post by Yasushi SHOJI-2
Hi Yasushi SHOJI-2,
Thanks for your help and the pointer. I am starting to look into it. Thanks
Hi Folks,
After spending some time on this, I would like to seek some ideas.

1. I interacted with the Fluendo folks; their time-shift element works only with 'live' streams and would need some customization to work with pre-stored videos.

2. I am thinking of developing a simple time shift app. It would read the camera, encode, store the video in an MP4 file, read back the same file, and decode.

a) I am looking to construct such a pipeline. My first attempt (I am working on a Jetson TX2) was to make a simple encode pipeline followed by a decode pipeline, like the command line below. It does not work, but I am quoting it here to illustrate my point/aim:

gst-launch-1.0 -e nvcamerasrc fpsRange="30.0 30.0" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! omxh265enc ! 'video/x-h265, stream-format=(string)byte-stream' ! h265parse ! matroskamux ! filesink location=/home/ubuntu/junk.mkv filesrc location=/home/ubuntu/junk.mkv ! decodebin ! nvoverlaysink -e

b) When I try encode followed by decode without a file, it seems to work, but there are lots of frame drops and jerks:

gst-launch-1.0 -e nvcamerasrc fpsRange="30.0 30.0" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! omxh265enc ! 'video/x-h265, stream-format=(string)byte-stream' ! h265parse ! matroskamux ! decodebin ! nvoverlaysink -e

What would be the right way to encode --> write to a file --> read from the same file --> decode --> feed to the OpenCV pipe? What is the best way to write to and read from the same file concurrently using a GStreamer pipe? How could I feed multiple files generated by hlssink to a decoder element in a subsequent part of the GStreamer pipeline? Would the decodebin element not complain if data from file_00 is interrupted midway and later supplied from file_01?

Thanks
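One possible arrangement, offered only as an untested sketch (element choices, segment duration, paths and the appsink caps are assumptions): rather than writing one growing MP4/MKV — which is hard to read back while it is still being written, since the muxer typically only finalizes its index on EOS — record short, self-contained MP4 segments with splitmuxsink and have a second pipeline pick them up with splitmuxsrc:

    # writer: camera -> H.265 -> 10-second MP4 segments
    gst-launch-1.0 -e nvcamerasrc fpsRange="30.0 30.0" \
        ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' \
        ! omxh265enc ! h265parse \
        ! splitmuxsink location=/home/ubuntu/shift_%05d.mp4 max-size-time=10000000000

    # reader, started later: decode and convert frames for OpenCV
    gst-launch-1.0 splitmuxsrc location=/home/ubuntu/shift_*.mp4 ! decodebin \
        ! nvvidconv ! 'video/x-raw, format=(string)BGRx' \
        ! videoconvert ! 'video/x-raw, format=(string)BGR' ! appsink

The reader string (without the gst-launch-1.0 prefix and shell quoting) is roughly what one would hand to cv2.VideoCapture with the CAP_GSTREAMER backend, assuming OpenCV was built with GStreamer support. Regarding the hlssink question: as far as I know, MPEG-TS segments can simply be concatenated and fed to a single demuxer/decoder, whereas MP4 segments each need their own demux — which is the case splitmuxsrc handles for you — so decodebin would indeed complain if MP4 data switched from file_00 to file_01 mid-stream.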