Hi all,
I wrote my own appsrc code that takes an image from OpenCV and streams it to file or to RTP. While RTP works pretty well, I have an issue with "filesink": the resulting video is "accelerated" 2x. I am trying to verify that I am timestamping correctly, but I cannot figure out what is wrong. My pipeline is the following:

appsrc name=videosrc do-timestamp=1 is-live=1 format=time ! video/x-raw,format=I420,framerate=30/1,width=1280,height=720 ! queue ! videorate ! videoconvert ! x264enc ! video/x-h264, stream-format=byte-stream, bitrate=8000000, psy-tune=film ! h264parse ! mp4mux ! filesink location=test.mp4 sync=false

Thank you for your help,
Walter
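A quick sanity check on the symptom (a hypothetical illustration, not the poster's code): a file that plays back exactly 2x too fast usually means the buffer timestamps advance at half the real rate, e.g. as if a 30 fps source had been stamped at 60 fps.

```python
# Hypothetical numbers: 300 frames of a 10-second, 30 fps capture.
FPS_REAL = 30
FRAMES = 300
GST_SECOND = 1_000_000_000  # GStreamer counts time in nanoseconds


def pts_for_frame(n, assumed_fps):
    """PTS in nanoseconds if frames are stamped at a fixed assumed rate."""
    return n * GST_SECOND // assumed_fps


correct_last = pts_for_frame(FRAMES - 1, 30)  # last PTS ~ 10 s
wrong_last = pts_for_frame(FRAMES - 1, 60)    # last PTS ~ 5 s

# Half the timestamp span over the same frames => the muxed file
# "contains" 5 s of video and plays back twice as fast.
speedup = correct_last / wrong_last
```

This matches the observation later in the thread that a 10-second recording ends up with only about 5 seconds of timestamps.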
Walter Lucetti
www.myzhar.com |
Apparently, the timestamps are incorrect. What you could do is insert an identity element before the videorate element and examine the timestamps by setting silent=false on that element. The next step would be to examine the timestamps before the encoder and see where things go wrong.
What I did was print out the timestamp of the buffer at each "push frame" event... and the timestamps were indeed wrong: at the end of a 10-second recording I got only about 5 seconds of timestamps. What I noticed is that my source does not have a stable frame rate, varying from 30 fps down to 25 fps... I thought of using std::chrono to calculate the time elapsed from one push to the next and use that "real" timestamp, setting do-timestamp to false. Is there a more clever way to do this?

--
Walter Lucetti
email: [hidden email]
web: www.robot-home.it - www.opencv.it
project: http://myzharbot.robot-home.it

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
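The wall-clock approach described above can be sketched as follows (a minimal Python sketch, assuming appsrc has do-timestamp=false and format=time; the `WallClockStamper` name and the fake clock are illustrative, not from the thread). `time.monotonic_ns` plays the role of `std::chrono::steady_clock` in the poster's C++ code:

```python
import time


class WallClockStamper:
    """Stamp each buffer with the monotonic time elapsed since the first
    frame, instead of assuming a fixed 1/framerate increment."""

    def __init__(self, now_ns=time.monotonic_ns):
        self._now_ns = now_ns  # injectable clock, for testing
        self._start = None

    def pts(self):
        now = self._now_ns()
        if self._start is None:
            self._start = now  # first frame defines timestamp zero
        return now - self._start


# Deterministic demo with a fake clock: frames arriving at an unsteady
# 30-40 ms cadence still get monotonically increasing, real-time PTS.
ticks = iter([1_000_000_000, 1_033_000_000, 1_073_000_000, 1_103_000_000])
stamper = WallClockStamper(now_ns=lambda: next(ticks))
pts_values = [stamper.pts() for _ in range(4)]
```

Each returned value would be assigned to the buffer's PTS (and DTS) before pushing it into appsrc.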
Using std::chrono to calculate the "real" timestamp seems to work well. But now I have another problem: after a few seconds during which the stream seems well synchronised, it starts losing frames. The "need-data" signal is emitted more and more rarely, as if the pipeline cannot handle all the data. Could it be a buffering problem related to file writing?
It is definitely a timestamping issue. You are feeding the pipeline buffers whose timestamps do not match the pipeline's running time. Perhaps this little Python snippet can help you out?
buf = Gst.Buffer.new_allocate(None, len(rawData), None)
buf.fill(0, rawData)
clock = self.elementSource.get_clock()
base_time = self.elementSource.base_time
abs_time = clock.get_time()
buf.pts = abs_time - base_time
buf.dts = abs_time - base_time

Two more things:
1. You can also set tune=zerolatency on x264enc.
2. You are lacking any queues in your pipeline; I suggest you add one at least before and after the encoder.
This is exactly what I did in my C++ code... you confirmed its validity ;) I had not set zerolatency because I thought it would not be important for a filesink... I will set it. The queue advice is a good remark: I focused on pipeline correctness and forgot the queues! Thank you very much Arjen, you are really kind and helpful!

Walter
If it is still not working, I would opt for trying a completely different approach (just to see if that works).
For example, you could swap the appsrc for a filesrc and use a named pipe to feed the frames in one by one (assuming that your frames are e.g. RGB/YUV/anything supported by GStreamer). Use and configure a videoparse element (just after the filesrc) to parse the raw incoming stream into the appropriate format and width/height/framerate.
Yes, I continue to have problems, but at this point I think the issue is in my appsrc code. I am not using a GMainLoop and appsrc runs in the main thread, so I think it waits for GUI events before requesting a new frame... I am going to investigate in this direction. I noticed that scaling the frame down before H.264 encoding makes the resulting video a little better. I would like to find a good example of an appsrc getting frames from an external process... but I cannot find one.
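One common way around the main-thread problem described above is to decouple frame capture from the GUI thread entirely. The sketch below (hypothetical, not the poster's code) uses a bounded queue and two threads; plain byte strings stand in for OpenCV frames, and `pushed.append` stands in for the call that would push a buffer into appsrc:

```python
import queue
import threading

frames = queue.Queue(maxsize=30)  # bounded, so a slow sink applies backpressure
pushed = []


def grab_loop(n_frames):
    """Capture thread: grabs frames (here, fake ones) off the GUI thread."""
    for i in range(n_frames):
        frames.put(b"frame-%d" % i)  # real code: cv::Mat / numpy frame data
    frames.put(None)                 # sentinel: end of stream


def push_loop():
    """Feeder thread: drains the queue toward the pipeline."""
    while True:
        frame = frames.get()
        if frame is None:
            break                    # real code: send EOS to appsrc
        pushed.append(frame)         # real code: push-buffer on appsrc


grabber = threading.Thread(target=grab_loop, args=(5,))
pusher = threading.Thread(target=push_loop)
grabber.start(); pusher.start()
grabber.join(); pusher.join()
```

With this split, the GUI event loop never blocks frame delivery, and the bounded queue keeps memory in check if the encoder falls behind.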
Hello Myzhar,
Did you find any working solution? I am currently facing a similar issue.