Hello!
I'm experimenting with the SRT [1] protocol and noticed that GStreamer has a collection of SRT elements, so I've been trying to get them to work. However, I can't get any streaming to actually start.

With my distro not offering any SRT packages, I've compiled a full GStreamer instance from git sources (so up to date with 1.17 as of yesterday morning) against libsrt 1.4.1.

As a first step, I've been following a guide from Collabora [2] to experiment with it, and I'm using the following pipelines as server and client respectively (on the same physical machine):

> gst-launch-1.0 videotestsrc ! video/x-raw,height=800,width=600 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264,profile=high ! srtsink uri=srt://:8888/

> gst-launch-1.0 srtsrc uri=srt://127.0.0.1:8888/ ! decodebin ! autovideosink

Both pipelines enter the PLAYING state, and I can see the two peers connect and perform the handshake, but the playback counter on the server never begins counting, and I never get a video window opening on the client side.

With GST_DEBUG="srt:9", I can see that the server seems to be stuck waiting for a request from a caller after the client has "connected":

> 0:00:02.768093620 16974 0x555ed162d8f0 DEBUG srtobject gstsrtobject.c:679:thread_func:<srtsink0> Waiting a request from caller

Reading through the code, I can't figure out how the caller is supposed to submit the request that will presumably kick off the sending of the stream. The client just seems to sit there waiting for frames.

Does anyone have experience with these elements who might be able to shed some light on what the problem is?

Best Regards,
Sam

[1]: https://github.com/Haivision/srt
[2]: https://www.collabora.com/news-and-blog/blog/2018/02/16/srt-in-gstreamer/

--
Project R&D Engineer, BCS Internet Distribution
BBC Research and Development
5th Floor Dock House, MediaCityUK
On Tue, 21 Apr 2020 at 12:22, Samuel Hurst <[hidden email]> wrote:
> Hello!

Hi Sam,

I don't have direct experience with GStreamer SRT itself, but I have used SRT extensively in other applications. One thing that might be worth trying is testing one pipeline at a time.

I would tend to put the destination as srt://127.0.0.1:8888 rather than leaving out the IP address when testing locally. And to receive, try using VLC. VLC will play SRT; e.g. in this case, open a network stream in VLC with the URL srt://@:8888 or even srt://127.0.0.1:8888.

SRT communication is set up between a caller and a listener, and they have to have established a connection before any stream will transmit.

If you download the source from GitHub, you can build srt-live-transmit and use that to send and/or receive an SRT stream and convert it from/to UDP.

Cheers,
Simon
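As a concrete sketch of that suggestion (the port numbers here are arbitrary placeholders, not values from the thread):

# Receive an SRT stream as a caller and forward it as plain UDP
srt-live-transmit srt://127.0.0.1:8888 udp://127.0.0.1:5000

# Or the other direction: listen for UDP input and re-serve it over SRT
srt-live-transmit udp://:5000 srt://:8888

The forwarded UDP leg can then be opened in VLC as udp://@:5000, which helps separate SRT transport problems from encoding or decoding problems.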
Hi Simon,
On 21/04/2020 12:31, Simon Brown wrote:
> I would tend to put the destination as srt://127.0.0.1:8888 rather than
> leaving out the IP address when testing locally. And to receive, try
> using VLC. VLC will play SRT; e.g. in this case, open a network stream
> in VLC with the URL srt://@:8888 or even srt://127.0.0.1:8888.

I tried setting a specific address on both ends and got nothing. However, when setting just the server to a specific address and leaving the client without one, the server at least started sending packets, which I'm guessing contained the video stream, although the client still didn't do anything useful.

I also tried to get VLC to consume the stream from GStreamer, but VLC just sat there flashing the seek bar without receiving the stream. I didn't see GStreamer produce any packets in that case either.

> SRT communication is set up between a caller and a listener, and they
> have to have established a connection before any stream will transmit.
>
> If you download the source from GitHub, you can build srt-live-transmit
> and use that to send and/or receive an SRT stream and convert it
> from/to UDP.

Cheers, I'll give that idea a go.

Best Regards,
Sam
On 21/04/2020 16:55, Samuel Hurst wrote:
> On 21/04/2020 12:31, Simon Brown wrote:
>> SRT communication is set up between a caller and a listener, and they
>> have to have established a connection before any stream will transmit.
>>
>> If you download the source from GitHub, you can build srt-live-transmit
>> and use that to send and/or receive an SRT stream and convert it
>> from/to UDP.
>
> Cheers, I'll give that idea a go.

Thanks again for the hint, Simon. I managed to take another look at this today, and by replacing srtsink with udpsink and putting srt-live-transmit in the middle, I got this working.

Is requiring this extra hop in the middle expected? I know the rtmp/rtmp2 elements don't support sending RTMP streams directly from rtmpsink to rtmpsrc; you have to run an RTMP server in the middle. Admittedly I'm quite new to SRT as a protocol, so I might have the wrong end of the stick. Alternatively, could this be a bug in the srtsink element?

Best Regards,
Sam
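For reference, a rough sketch of that working chain (the exact port and the mpegtsmux step are assumptions rather than details from the thread):

# Sender: the original encode chain, but ending in udpsink instead of srtsink
gst-launch-1.0 videotestsrc ! video/x-raw,height=800,width=600 ! videoconvert ! x264enc tune=zerolatency ! mpegtsmux ! udpsink host=127.0.0.1 port=5000

# Relay: pick up the local UDP stream and serve it as an SRT listener
srt-live-transmit udp://:5000 srt://:8888

# Receiver: unchanged from the original client pipeline
gst-launch-1.0 srtsrc uri=srt://127.0.0.1:8888/ ! decodebin ! autovideosink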
As I said, I'm not familiar with the SRT elements in GStreamer, but I'd have thought that if your pipeline works with a UDP sink, then sending directly via srtsink should also work - you shouldn't need srt-live-transmit in the middle if the GStreamer srtsink implements SRT correctly. In the same way, you can use srt-live-transmit to retransmit as UDP and pick that up with VLC, but if you ask VLC to pick up the SRT stream from the source directly, it is quite happy to do so.

The only other thought I've had - and I'm sure you've checked - is that srtsink isn't about subtitles, is it? The biggest pain about the SRT abbreviation is that it has two completely different meanings in broadcast media.

Regards,
Simon
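One quick way to check is to inspect the element. On my understanding the Haivision SRT elements live in a GStreamer plugin simply called "srt", but the exact element set can vary by version, so treat the second command as an assumption:

# Show srtsink's description, pads and properties (mode, uri, latency, ...)
gst-inspect-1.0 srtsink

# List everything provided by the srt plugin (srtsink and srtsrc; older
# releases may instead ship srtserversink/srtclientsrc)
gst-inspect-1.0 srt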
On a fine day, Tue, 21.04.2020 at 12:17, Samuel Hurst wrote:
> Hello!
>
> I'm experimenting with the SRT [1] protocol and noticed that GStreamer
> has a collection of SRT elements, so I've been trying to get them to
> work. However, I can't get any streaming to actually start.
>
> With my distro not offering any SRT packages, I've compiled a full
> GStreamer instance from git sources (so up to date with 1.17 as of
> yesterday morning) against libsrt 1.4.1.
>
> As a first step, I've been following a guide from Collabora [2] to
> experiment with it, and I'm using the following pipelines as server and
> client respectively (on the same physical machine):
>
> > gst-launch-1.0 videotestsrc ! video/x-raw,height=800,width=600 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264,profile=high ! srtsink uri=srt://:8888/
> > gst-launch-1.0 srtsrc uri=srt://127.0.0.1:8888/ ! decodebin ! autovideosink
>
> Both pipelines enter the PLAYING state, and I can see the two peers
> connect and perform the handshake, but the playback counter on the
> server never begins counting, and I never get a video window opening on
> the client side.
>
> With GST_DEBUG="srt:9", I can see that the server seems to be stuck
> waiting for a request from a caller after the client has "connected":
>
> > 0:00:02.768093620 16974 0x555ed162d8f0 DEBUG srtobject gstsrtobject.c:679:thread_func:<srtsink0> Waiting a request from caller
>
> Reading through the code, I can't figure out how the caller is supposed
> to submit the request that will presumably kick off the sending of the
> stream. The client just seems to sit there waiting for frames.
>
> Does anyone have experience with these elements who might be able to
> shed some light on what the problem is?

If you change srt://:8888 to srt://127.0.0.1:8888, then IIRC you make it a caller, so be careful with that.

SRT has a caller and a listener - the listener waits for callers. You can think of the listener as the server and the caller as the client in most scenarios. So one side needs to be a listener and the other a caller. You should be able to enforce this in the URL, e.g. srt://127.0.0.1:8888?mode=listener, or via the mode property on the srt elements. There's also a rendezvous mode (both sides set that mode then) for firewall-punching use cases and such. Often one side is an actual server that doesn't have firewall trouble and is thus the listener.

In my experience, UDP on the same machine can be rather troublesome with an out-of-the-box kernel configuration, so I suggest also trying it with the sending and receiving pipelines on different computers. I've also had a bit better luck using the router-assigned local NAT IP instead of the loopback address for quick tests with both sides on the same machine.

Mart
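Applied to the original pipelines, making the roles explicit might look like the following. The mode query parameter follows Mart's description, so treat it as a sketch rather than verified syntax:

# Server side: explicitly a listener on port 8888
gst-launch-1.0 videotestsrc ! video/x-raw,height=800,width=600 ! videoconvert ! x264enc tune=zerolatency ! srtsink uri="srt://:8888?mode=listener"

# Client side: explicitly a caller connecting to the listener
gst-launch-1.0 srtsrc uri="srt://127.0.0.1:8888?mode=caller" ! decodebin ! autovideosink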