Raspberry Pi Low-Latency Stream with GStreamer

Raspberry Pi Low-Latency Stream with GStreamer

waymond91
Hello all!
I am trying to set up a Raspberry Pi as a low-latency network camera (max. latency 200 ms).
Ideally, the Raspberry Pi would read the camera and send the video over WiFi to a host PC.
The host PC would both display the video stream and save it to a file.
The key part I am really struggling with: I would like each displayed frame to be saved to a file together with a timestamp of when that frame was actually displayed.
That said, this is my first time doing any video processing whatsoever.
I was able to find and build a GStreamer source element for the Pi camera from the git repository here:
https://github.com/thaytan/gst-rpicamsrc

This seems to work. On the Raspberry Pi I can start a network pipeline with the following command:

$ gst-launch-1.0 rpicamsrc bitrate=1000000 \
    ! 'video/x-h264,width=640,height=480' \
    ! h264parse \
    ! queue \
    ! rtph264pay config-interval=1 pt=96 \
    ! gdppay \
    ! udpsink host=[MY IP] port=5000

On my host PC I can display a live stream with about 150 ms of latency with:

$ gst-launch-1.0 udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! avdec_h264 \
    ! videoconvert \
    ! autovideosink sync=false

Or save the stream with:
$ gst-launch-1.0 udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! avdec_h264 \
    ! videoconvert \
    ! filesink location=video.h264

So I am beginning to feel like I am on track.
Ultimately, however, I feel like I need to get the H.264 into some other format (MP4?) so that I can actually start displaying and saving individual frames and associating them with the current time (or at least the time between frames).
At the very least, I would like to be able to save and display the H.264 stream with a single pipeline.

On my host side, I have tried a tee element like so:
$ gst-launch-1.0 udpsrc port=5000 \
  ! gdpdepay \
  ! rtph264depay \
  ! avdec_h264 \
  ! videoconvert \
  ! tee name=t \
  t. ! autovideosink sync=false \
  t. ! filesink location=tee.h264

However, the video stream freezes shortly after initialization. Any ideas on how I can fix this?
Any ideas on how I can save the decoded, displayed frames along with the time each one was displayed? I don't mind doing a little post-processing to get this lined up correctly :P
Any ideas on how to cut latency?

Thanks again for the help!
PS Sorry I couldn't get code tags to work...




RE: Raspberry Pi Low-Latency Stream with GStreamer

Gary Thomas-2
I suggest you add queue elements downstream from the tee. I do something similar with RPi and the receive pipeline I use is:

gst-launch-1.0 tcpclientsrc host=pizero1 port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! h264parse \
    ! tee name=parsed \
    parsed. ! queue ! avdec_h264 ! autovideosink sync=false \
    parsed. ! queue ! mpegtsmux ! filesink location=cam.ts

Gary
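
For reference, a rough, untested adaptation of Gary's layout to the udpsrc/gdppay sender shown earlier in the thread; the tee splits the parsed H.264 before it is decoded, so the recorded file stays compressed (the port, gdpdepay, and payloader settings are taken from the original pipelines, and cam.ts is just an example filename):

# display and record in one pipeline: tee after h264parse, with a queue on each branch
$ gst-launch-1.0 udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! h264parse \
    ! tee name=parsed \
    parsed. ! queue ! avdec_h264 ! videoconvert ! autovideosink sync=false \
    parsed. ! queue ! mpegtsmux ! filesink location=cam.ts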


RE: Raspberry Pi Low-Latency Stream with GStreamer

Matteo Valdina-3
Hi,
Do you really need gdppay? RTP on its own should be enough.

As for your tee: you need to add a queue element after the tee (one per branch).
As for latency: the queues need some tweaking to lower it further, but if you are already at 150 ms, that is good.

Best
Matteo
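
A rough, untested sketch of the sender side without gdppay (keeping the original rpicamsrc and payloader settings); note that the receiving udpsrc will then need its RTP caps set explicitly, which comes up in the next message:

# same sender as before, minus gdppay: the RTP packets go straight to udpsink
$ gst-launch-1.0 rpicamsrc bitrate=1000000 \
    ! 'video/x-h264,width=640,height=480' \
    ! h264parse \
    ! queue \
    ! rtph264pay config-interval=1 pt=96 \
    ! udpsink host=[MY IP] port=5000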



RE: Raspberry Pi Low-Latency Stream with GStreamer

waymond91
Thank you guys for the help! Adding the queue seemed to do the trick!
Now my receive side pipeline looks like this:
gst-launch-1.0 udpsrc port=5000 \
  ! gdpdepay \
  ! rtph264depay \
  ! avdec_h264 \
  ! videoconvert \
  ! tee name=t \
  t. ! queue ! autovideosink sync=false \
  t. ! queue ! filesink location=tee.h264

I am quite happy with this solution because all the decoding takes place
before the tee. Is the video being displayed and saved synchronously?

Any ideas on how to timestamp and save an individual frame as it is displayed? Would converting the stream to MJPEG before displaying and saving the video add a lot of latency? Or should I just be streaming MJPEG in the first place?

I tried removing the gdppay from the source device and the gdpdepay from the
sink device, but now I get the following error from the sink device:

GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
Input buffers need to have RTP caps set on them. This is usually achieved by
setting the 'caps' property of the upstream source element (often udpsrc or
appsrc), or by putting a capsfilter element before the depayloader and
setting the 'caps' property on that.

From the post on freedesktop, gdppay seems to wrap buffers in some sort of serialization format specific to GStreamer. I am not sure why I need it, or whether finding a way around it would help with performance.
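
For what it's worth, the fix the error message describes is to put the RTP caps directly on udpsrc once gdppay/gdpdepay are gone. A minimal, untested sketch of the display pipeline with explicit caps (clock-rate=90000 is the standard value for H.264 over RTP, and payload=96 matches the pt=96 set on rtph264pay in the sender):

# receiver without gdpdepay: udpsrc carries the RTP caps itself
$ gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" \
    ! rtph264depay \
    ! avdec_h264 \
    ! videoconvert \
    ! autovideosink sync=false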




Re: Raspberry Pi Low-Latency Stream with GStreamer

Michael MacIntosh
You can use the "last-sample" property on the sink to get a sample, which has a timestamp on it. I personally then convert it to a JPEG using another pipeline with an appsrc, although there are probably better ways if you are more familiar with GTK than I am. I have no idea how you would do that from the command-line interface, though.
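
Not what Michael describes, but a rough command-line approximation (untested, and the filenames are just examples): clockoverlay can burn the local wall-clock time into each decoded frame, and a second tee branch with jpegenc and multifilesink can write every frame to its own numbered JPEG, so the display time is at least visible in the saved images:

# overlay the wall-clock time, show the frames, and also dump each one as a JPEG
$ gst-launch-1.0 udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! avdec_h264 \
    ! videoconvert \
    ! clockoverlay \
    ! tee name=t \
    t. ! queue ! autovideosink sync=false \
    t. ! queue ! videoconvert ! jpegenc ! multifilesink location=frame-%05d.jpg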

Also it looks like you are saving raw video buffers... You might want to
move where your tee is, for instance:

gst-launch-1.0 udpsrc port=5000 \
   ! gdpdepay \
   ! rtph264depay \
   ! tee name=t \
   t. ! queue ! avdec_h264 ! videoconvert ! autovideosink sync=false \
   t. ! queue ! filesink location=tee.h264

That assumes you don't want raw video; with your current pipeline (decoding before the tee), any video feed at a decent resolution will fill your storage very quickly.
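
If the goal is a recording that stays compressed and keeps per-frame timestamps, one option (not from this thread, untested) is to mux the tee'd H.264 into a container such as Matroska instead of writing a bare .h264 file; running gst-launch-1.0 with -e makes Ctrl-C send EOS so the muxer can finalize the file:

# tee the still-encoded H.264: one branch decodes for display, the other muxes to an .mkv
$ gst-launch-1.0 -e udpsrc port=5000 \
    ! gdpdepay \
    ! rtph264depay \
    ! tee name=t \
    t. ! queue ! avdec_h264 ! videoconvert ! autovideosink sync=false \
    t. ! queue ! h264parse ! matroskamux ! filesink location=cam.mkv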

I am not personally familiar with gdppay and gdpdepay.

Good luck!

Cheers,
Michael.
