Hello All!
I am streaming video from a Raspberry Pi to a live display on a PC. I am using rpicamsrc as my GStreamer source. I want the stream to be as low-latency as possible, so I am sending H.264 frames over WiFi.

Here is my pipeline on the Raspberry Pi side of things:

gst-launch-1.0 -e rpicamsrc bitrate=1000000 rotation=180 \
    ! 'video/x-h264,width=640,height=480' \
    ! h264parse \
    ! queue \
    ! rtph264pay config-interval=1 pt=96 \
    ! gdppay \
    ! udpsink host=MY_IP port=5000

On the display side, the PC splits the stream so it can both save the video and display it. That pipeline looks like this:

gst-launch-1.0 -e udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! tee name=t \
    t. ! queue ! avdec_h264 ! videoconvert ! autovideosink sync=false \
    t. ! queue ! h264parse ! mp4mux ! filesink location=tee.mp4

This all works OK. But what I need is a timestamp of the absolute (wall-clock) time at which each frame is actually displayed. I'd then like to save the displayed frames and their associated times in whatever video format best carries this information (ultimately, I'd like to run the result through a program like MATLAB and end up with a JPEG of each frame plus its associated timestamp).

I am very new to this process, so if anybody has any insight as to how to get this working, it'd be a HUGE help! Thanks in advance!

-Brett
You are transmitting H.264 in RTP over UDP. Adding metadata to this stream would be pretty tricky. You could look into multiplexing the H.264 stream into an MPEG-2 transport stream instead. You can then add KLV data to that stream, which can carry arbitrary payloads (e.g. your wall-clock time). This will require some coding, though. Perhaps you could do a trick with the RTP timestamps, but I am not too sure about that.
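As a rough, untested sketch, the transport-stream muxing itself would look something like this on the sending side (reusing the rpicamsrc settings from your first post). The KLV injection is the part that needs code: gst-launch cannot generate time-varying meta/x-klv buffers, so those would have to be fed to the muxer from an application, e.g. via an appsrc.

gst-launch-1.0 -e rpicamsrc bitrate=1000000 rotation=180 \
    ! 'video/x-h264,width=640,height=480' \
    ! h264parse \
    ! mpegtsmux \
    ! udpsink host=MY_IP port=5000

On the receiving end you would then use tsdemux, which exposes the KLV data as a meta/x-klv stream alongside the video.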
In reply to this post by waymond91
Hi, if you want to display the clock timestamp on top of the video you can use the timeoverlay element (or clockoverlay for wall-clock time). After that you can read every saved frame, with the timestamp rendered at a known location in the image. This is an easy and somewhat naive workaround; let me know if it helps.

Link to the element: https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-timeoverlay.html
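For example, something along these lines on your receiving PC should burn the PC's wall-clock time into every frame (an untested sketch; the rest of the pipeline is copied from your receive pipeline, and time-format uses strftime syntax):

gst-launch-1.0 -e udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! queue ! avdec_h264 ! videoconvert \
    ! clockoverlay time-format="%Y-%m-%d %H:%M:%S" \
    ! autovideosink sync=false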
I think this may be my best option given the amount of time I have available.
I just did a quick test, and I think I can use clockoverlay or timeoverlay to get the information onto the stream. Then, using the Python tesseract library, I can extract the timestamp back into a string.

Now I just have one more problem. Again, here is how my sink pipeline is currently set up:

gst-launch-1.0 -e udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! tee name=t \
    t. ! queue ! avdec_h264 ! videoconvert ! timeoverlay ! autovideosink sync=false \
    t. ! queue ! h264parse ! mp4mux ! filesink location=tee.mp4

However, I'd like the tee to kick in right after the timeoverlay is applied and right before the autovideosink. That way the overlay will be identical for both the autovideosink and the filesink. For example:

gst-launch-1.0 -e udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! queue ! avdec_h264 ! videoconvert ! timeoverlay \
    ! tee name=t \
    t. ! autovideosink sync=false \
    t. ! somehow ! convert ! to_mp4 ! filesink location=tee.mp4

Is this possible? Any ideas where I should start? Thank you so much for your help, guys!
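(For the final MATLAB/JPEG step, my current plan, still untested, is to pull the frames back out of the saved file afterwards with something like:

gst-launch-1.0 filesrc location=tee.mp4 \
    ! qtdemux ! h264parse ! avdec_h264 ! videoconvert \
    ! jpegenc ! multifilesink location=frame%05d.jpg

and then OCR the overlay region of each JPEG to recover its timestamp.)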
Ok, I have been working at it. My biggest problem is that I don't really know how to probe the format my stream is in at different stages of the pipeline. That makes it difficult to figure out which filter element I should use. Any advice on this?

I know how to use the gst-inspect-1.0 tool, but filters like videoconvert and sinks like filesink and autovideosink accept and output such a wide range of formats that it's not clear to me what is actually happening.

Since I know the autovideosink is working well, I am simply dropping the tee for now and looking for a way to write the stream to a file after the timeoverlay. I feel like this is very doable, I am just inexperienced! Here is how it looks:

gst-launch-1.0 -e udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! queue ! avdec_h264 ! videoconvert ! timeoverlay ! filesink location=tee.mp4

Unfortunately, this throws no errors when I run the stream, but the resulting video simply won't play back. Are there filters available that make this possible? i.e.:

gst-launch-1.0 -e udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! queue ! avdec_h264 ! videoconvert ! timeoverlay ! something ! else ! here ! filesink location=tee.mp4

Even just a way to probe the exact format of what's coming out of the timeoverlay would be a huge help. Inspecting the timeoverlay pads with gst-inspect-1.0 shows:

  Capabilities:
    video/x-raw
        format: { BGRx, RGBx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, I420, YV12, AYUV, YUY2, UYVY, v308, Y41B, Y42B, Y444, NV12, NV21, A420, YUV9, YVU9, IYU1, GRAY8 }
        width: [ 1, 2147483647 ]
        height: [ 1, 2147483647 ]
        framerate: [ 0/1, 2147483647/1 ]
    video/x-raw(ANY)
        format: { I420, YV12, YUY2, UYVY, AYUV, RGBx, BGRx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, Y41B, Y42B, YVYU, Y444, v210, v216, NV12, NV21, NV16, NV61, NV24, GRAY8, GRAY16_BE, GRAY16_LE, v308, RGB16, BGR16, RGB15, BGR15, UYVP, A420, RGB8P, YUV9, YVU9, IYU1, ARGB64, AYUV64, r210, I420_10LE, I420_10BE, I422_10LE, I422_10BE, Y444_10LE, Y444_10BE, GBR, GBR_10LE, GBR_10BE, NV12_64Z32, A420_10LE, A420_10BE, A422_10LE, A422_10BE, A444_10LE, A444_10BE }

But how do I know which one it actually is??

Cheers!
Brett
In reply to this post by waymond91
Ok! I am very close! Sorry if I am spamming the thread!
So I can save the file with the timeoverlay like this:

gst-launch-1.0 -e udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! queue ! avdec_h264 ! videoconvert ! timeoverlay ! qtmux ! filesink location=tee.mp4

And I can display the live stream like this:

gst-launch-1.0 -e udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! queue ! avdec_h264 ! videoconvert ! timeoverlay ! autovideosink sync=false

BUT, when I try to do both at the same time:

gst-launch-1.0 -e udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! queue ! avdec_h264 ! videoconvert ! timeoverlay \
    ! tee name=t \
    t. ! autovideosink sync=false \
    t. ! qtmux ! filesink location=tee.mp4

I get the following error:

WARNING: erroneous pipeline: could not link t to qtmux0

On a hunch, I tried adding a queue between the tee and qtmux. That doesn't throw errors, but it crashes after running for a couple of milliseconds: the first frame is displayed and no data is saved. I don't know why this would or wouldn't work, but I've included the error message to be thorough.

ERROR: from element /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage: Output window was closed
Additional debug info:
xvimagesink.c(555): gst_xv_image_sink_handle_xevents ():
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage
An error happened while waiting for EOS
Execution ended after 0:00:41.633882529

Any ideas, gang? Thanks in advance!
In reply to this post by waymond91
Yes, that is possible. Why are you gdp-payloading and rtp-payloading on top of each other? That doesn't add much in your case, so I would get rid of the gdp layer (also in your sending pipeline).

gst-launch-1.0 -e udpsrc port=5000 \
    ! rtph264depay \
    ! queue ! avdec_h264 ! videoconvert ! timeoverlay \
    ! queue ! tee name=t \
    t. ! queue ! autovideosink sync=true \
    t. ! queue ! x264enc tune=zerolatency speed-preset=ultrafast ! queue ! mp4mux ! queue ! filesink location=tee.mp4 sync=false

I would put sync=true on your video sink for smoother playout. You can leave it at false if you don't care about that. (I didn't test the exact pipeline, so you might have to fiddle around with it a bit to get it to work.)
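The matching sender would then just be your original pipeline with the gdppay dropped (again an untested sketch):

gst-launch-1.0 -e rpicamsrc bitrate=1000000 rotation=180 \
    ! 'video/x-h264,width=640,height=480' \
    ! h264parse \
    ! queue \
    ! rtph264pay config-interval=1 pt=96 \
    ! udpsink host=MY_IP port=5000

Note that without gdp the receiver no longer gets the caps in-band, so you may need to set them on udpsrc yourself, e.g. udpsrc port=5000 caps="application/x-rtp,media=video,encoding-name=H264,payload=96,clock-rate=90000".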
In reply to this post by waymond91
qtmux and your video sink probably don't agree on a common video format. There are also a couple of other problems in your pipeline:

1. Add a queue at the start of each branch of the tee.
2. Add a videoconvert in each branch of the tee as well, so that both the display and the muxer can get a format they accept.
3. You probably want to encode your video before muxing it into qtmux, e.g. with x264enc. You may need larger queues to account for the encoder's large default latency, or use x264enc tune=zerolatency for minimal latency.
4. gst-launch-1.0 has the -v option, which will display the caps chosen by the various pads on elements (among other things); see the example below. If you need information about the actual negotiation, the GStreamer debug logs are your friend.
5. Also, mixing gdp and rtp like this is probably not going to work very well at all. You can forgo the gdp encapsulation.
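For example, to see exactly what is coming out of the timeoverlay (an illustrative sketch based on your current receive pipeline; the actual format, width and height will depend on your stream):

gst-launch-1.0 -v -e udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! queue ! avdec_h264 ! videoconvert ! timeoverlay ! autovideosink sync=false

and look for a line such as:

/GstPipeline:pipeline0/GstTimeOverlay:timeoverlay0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, ...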
Cheers
-Matt