complicated setup

jouke
Hello all,

First of all, thanks for accepting me on your mailing list. I'm working on a client-server application to pilot a number of rpi cams to film the experiments we do in our lab.
The idea is that the rpi camera is piloted using Python's picamera package. This results in an H264 stream, which I then send to a central server with GStreamer using something like this:
- Gst.parse_launch("appsrc name=source ! h264parse ! mpegtsmux ! udpsink host='...' port=5000")
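Roughly, the picamera side pushes the encoded frames into appsrc through a custom file-like output. A minimal sketch of that pattern (the output class, host address, caps and resolution below are illustrative assumptions, not the exact code from this setup):

import picamera
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)

# Placeholder host/port for the central server.
pipeline = Gst.parse_launch(
    "appsrc name=source is-live=true do-timestamp=true "
    "! h264parse ! mpegtsmux ! udpsink host=192.168.1.10 port=5000"
)
appsrc = pipeline.get_by_name("source")
# picamera emits an Annex-B H264 byte-stream; timestamp buffers in time format.
appsrc.set_property("caps", Gst.Caps.from_string("video/x-h264,stream-format=byte-stream"))
appsrc.set_property("format", Gst.Format.TIME)
pipeline.set_state(Gst.State.PLAYING)

class AppSrcOutput:
    """File-like object: picamera calls write() with encoded H264 data,
    which gets wrapped in a Gst.Buffer and pushed into the appsrc."""
    def write(self, data):
        appsrc.emit("push-buffer", Gst.Buffer.new_wrapped(bytes(data)))
        return len(data)
    def flush(self):
        appsrc.emit("end-of-stream")

with picamera.PiCamera(resolution=(1280, 720), framerate=30) as camera:
    camera.start_recording(AppSrcOutput(), format="h264")
    camera.wait_recording(60)  # stream for a minute, for example
    camera.stop_recording()

pipeline.set_state(Gst.State.NULL)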

The server, on command, records the streams and forwards one of them to a pilot (so we can adjust the cameras in the lab); a rough sketch of that relay follows the pipelines below.
If I simply do:
- gst-launch-1.0 udpsrc port=5000 ! video/mpegts ! udpsink host="..." port=5000
I can read the stream on the pilot by using
- gst-launch-1.0 udpsrc port=5000 ! decodebin ! autovideosink
but the video has massive latency and the quality is really bad.
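For the relay part mentioned above, one way the server side could look is a tee that records an incoming stream to disk while forwarding a copy to the pilot. A rough sketch (the port, the output filename and the pilot's address are placeholders):

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)

# Receive one camera's UDP stream, record it to disk and forward a copy to the pilot.
relay = Gst.parse_launch(
    "udpsrc port=5000 ! tee name=t "
    "t. ! queue ! filesink location=camera1.ts "
    "t. ! queue ! udpsink host=192.168.1.20 port=5001"
)
relay.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()  # Ctrl-C to stop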

I'm sure I'm doing something wrong, but I can't figure out what.

Thanks in advance,

Jouke

Re: complicated setup

Arjen Veenhuizen
You will have to add RTP to the mix to get it running properly.

E.g. on the sending side:
appsrc ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink
Note the caps of the udpsink (should read something like "application/x-rtp,payload=33,....")
and on the receiving side:
udpsrc caps="<insert caps from the sending side>" ! rtpmp2tdepay ! tsdemux ! decodebin ! autovideosink
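Spelled out from Python, the two RTP-wrapped pipelines could look roughly like this. This is a self-contained loopback sketch: videotestsrc/x264enc stand in for the appsrc-fed camera stream, the addresses are placeholders, and the receiver caps are what rtpmp2tpay typically negotiates; copy the exact string printed on the sending side (e.g. with gst-launch-1.0 -v) if it differs.

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)

# Sender: pay the MPEG-TS stream as RTP (payload type 33) before it leaves the box.
sender = Gst.parse_launch(
    "videotestsrc is-live=true ! x264enc tune=zerolatency "
    "! h264parse ! mpegtsmux ! rtpmp2tpay "
    "! udpsink host=127.0.0.1 port=5000"
)

# Receiver: caps below are the usual rtpmp2tpay output; use the sender's exact caps if they differ.
receiver = Gst.parse_launch(
    "udpsrc port=5000 "
    'caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T,payload=33" '
    "! rtpmp2tdepay ! tsdemux ! decodebin ! autovideosink"
)

receiver.set_state(Gst.State.PLAYING)
sender.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()  # Ctrl-C to stop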

Re: complicated setup

jouke
Thank you Arjen, that worked.
Now that I have several cameras running, I need to add one audio source (a mic), stream it to the server, and mux it with the different video streams. This kind of works, but video and audio are very much out of sync. Is there a way to avoid this? Note that the audio comes from a separate microphone!

Jouke