Hi there, I need a bit of help with a long GStreamer pipeline for sending/receiving video and audio. I have two pipelines which work successfully...

Sending Video
gst-launch gstrtpbin name=rtpbin latency=0 \
  ksvideosrc device-index=0 typefind=true ! typefind ! ffmpegcolorspace ! videoscale \
    ! video/x-raw-yuv, width=640, height=480 ! videorate ! video/x-raw-yuv, framerate=15/1 \
    ! ffenc_mpeg4 ! rtpmp4vpay send-config=true ! rtpbin.send_rtp_sink_0 \
  rtpbin.send_rtp_src_0 ! udpsink port=5502 host=192.168.10.175 \
  rtpbin.send_rtcp_src_0 ! udpsink port=5510 host=192.168.10.175 sync=false async=false \
  udpsrc port=5510 ! rtpbin.recv_rtcp_sink_0 \
  autoaudiosrc samplesperbuffer=1000 ! alawenc ! rtppcmapay ! rtpbin.send_rtp_sink_1 \
  rtpbin.send_rtp_src_1 ! udpsink port=5504 host=192.168.10.175 \
  rtpbin.send_rtcp_src_1 ! udpsink port=5512 host=192.168.10.175 sync=false async=false \
  udpsrc port=5512 ! rtpbin.recv_rtcp_sink_1
Receiving Video

gst-launch gstrtpbin name=rtpbin2 latency=0 \
  udpsrc caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1" port=5502 ! rtpbin2.recv_rtp_sink_0 \
  rtpbin2. ! rtpmp4vdepay ! ffdec_mpeg4 ! ffmpegcolorspace ! autovideosink \
  udpsrc port=5510 ! rtpbin2.recv_rtcp_sink_0 \
  rtpbin2.send_rtcp_src_0 ! udpsink host=192.168.10.175 port=5510 sync=false async=false \
  udpsrc caps=application/x-rtp,media=(string)audio,clock-rate=(int)8000,encoding-name=(string)PCMA port=5504 ! rtpbin2.recv_rtp_sink_1 \
  rtpbin2. ! rtppcmadepay ! alawdec ! autoaudiosink buffer-time=10000 \
  udpsrc port=5512 ! rtpbin2.recv_rtcp_sink_1 \
  rtpbin2.send_rtcp_src_1 ! udpsink host=192.168.10.175 port=5512 sync=false async=false
I wanted to combine them so that I could have sending and receiving in one pipeline. Is this possible? Right now I simply launch two GStreamer instances (one for sending, one for receiving). The problem is that in the receiving instance I would like to access the local camera and show it as a local preview window alongside the received video. That would work fine except that you can't have two GStreamer instances accessing the camera at the same time. I thought that if I could combine the above into one pipeline, I could use a tee to split the camera output, so only one instance would ever use the camera.
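For reference, here is roughly what I have in mind for the video part of the sender. This is only a sketch and untested: the tee name "camsplit", the queue elements on each branch, and the autovideosink preview branch are my additions; the audio part would stay exactly as above.

gst-launch gstrtpbin name=rtpbin latency=0 \
  ksvideosrc device-index=0 ! ffmpegcolorspace ! videoscale \
    ! video/x-raw-yuv, width=640, height=480 ! videorate ! video/x-raw-yuv, framerate=15/1 \
    ! tee name=camsplit \
  camsplit. ! queue ! ffmpegcolorspace ! autovideosink \
  camsplit. ! queue ! ffenc_mpeg4 ! rtpmp4vpay send-config=true ! rtpbin.send_rtp_sink_0 \
  rtpbin.send_rtp_src_0 ! udpsink port=5502 host=192.168.10.175 \
  rtpbin.send_rtcp_src_0 ! udpsink port=5510 host=192.168.10.175 sync=false async=false \
  udpsrc port=5510 ! rtpbin.recv_rtcp_sink_0

My understanding is that each tee branch needs its own queue so the preview and the encoder don't block each other, but as I said, I haven't verified this.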
As a first step, if I simply combine the two working pipelines into something like:

gst-launch \
  gstrtpbin name=rtpbin latency=0 \
  ksvideosrc device-index=0 typefind=true ! typefind ! ffmpegcolorspace ! videoscale \
    ! video/x-raw-yuv, width=640, height=480 ! videorate ! video/x-raw-yuv, framerate=15/1 \
    ! ffenc_mpeg4 ! rtpmp4vpay send-config=true ! rtpbin.send_rtp_sink_0 \
  rtpbin.send_rtp_src_0 ! udpsink port=5502 host=192.168.10.175 \
  rtpbin.send_rtcp_src_0 ! udpsink port=5510 host=192.168.10.175 sync=false async=false \
  udpsrc port=5510 ! rtpbin.recv_rtcp_sink_0 \
  autoaudiosrc samplesperbuffer=1000 ! alawenc ! rtppcmapay ! rtpbin.send_rtp_sink_1 \
  rtpbin.send_rtp_src_1 ! udpsink port=5504 host=192.168.10.175 \
  rtpbin.send_rtcp_src_1 ! udpsink port=5512 host=192.168.10.175 sync=false async=false \
  udpsrc port=5512 ! rtpbin.recv_rtcp_sink_1 \
  gstrtpbin name=rtpbin2 latency=0 \
  udpsrc caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1" port=5502 ! rtpbin2.recv_rtp_sink_0 \
  rtpbin2. ! rtpmp4vdepay ! ffdec_mpeg4 ! ffmpegcolorspace ! autovideosink \
  udpsrc port=5510 ! rtpbin2.recv_rtcp_sink_0 \
  rtpbin2.send_rtcp_src_0 ! udpsink host=192.168.10.175 port=5510 sync=false async=false \
  udpsrc caps=application/x-rtp,media=(string)audio,clock-rate=(int)8000,encoding-name=(string)PCMA port=5504 ! rtpbin2.recv_rtp_sink_1 \
  rtpbin2. ! rtppcmadepay ! alawdec ! autoaudiosink buffer-time=10000 \
  udpsrc port=5512 ! rtpbin2.recv_rtcp_sink_1 \
  rtpbin2.send_rtcp_src_1 ! udpsink host=192.168.10.175 port=5512 sync=false async=false
it doesn't appear to work: gst-launch starts successfully, but I can't receive video on either end. Has this been attempted before? Is there a best way to do it? (I presume people normally use two pipelines?)
Kind regards, Andy Savage
Hi,
On Mon, Sep 20, 2010 at 7:10 AM, Andy Savage <[hidden email]> wrote:
I tried something like that a few months ago and it didn't work (even when using an application instead of gst-launch). As I didn't have much time to dig into it, I just kept using two separate pipelines.
Can't you put a tee in the sending pipeline and attach, for instance, an xvimagesink to it? If you're not sending data (no RTP involved) it's even simpler, as you just need a v4lsrc ! xvimagesink kind of pipe.

Regards
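P.S. Untested, but just to illustrate the idea, something along these lines. I've kept your ksvideosrc and used autovideosink since you seem to be on Windows, where xvimagesink isn't available; the fakesink only stands in for the rest of your RTP branch, and the queues on the tee branches are needed:

gst-launch ksvideosrc device-index=0 ! ffmpegcolorspace ! tee name=t \
  t. ! queue ! autovideosink \
  t. ! queue ! ffenc_mpeg4 ! rtpmp4vpay send-config=true ! fakesink

And if you're not sending anything at all, the preview alone is just:

gst-launch ksvideosrc device-index=0 ! ffmpegcolorspace ! autovideosink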