rtpbin + mpegtsmux


rtpbin + mpegtsmux

Gary Thomas
I'm trying to stream MPEG-TS data via RTP/RTCP using these pipelines:

Server:
   gst-launch -v gstrtpbin name=rtpbin v4l2src ! video/x-raw-yuv,width=720,height=480 !
     x264enc ! mpegtsmux ! rtpmp2tpay ! rtpbin.send_rtp_sink_0
     rtpbin.send_rtp_src_0 ! udpsink port=5000 host=192.168.1.101 ts-offset=0 name=vrtpsink
     rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=192.168.1.101 sync=false async=false name=vrtcpsink
     udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0

Client:
   gst-launch -v gstrtpbin name=rtpbin latency=200
     udpsrc caps=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T-ES port=5000 !
        rtpbin.recv_rtp_sink_0 rtpbin. ! rtpmp2tdepay ! mpegtsdemux name=demux ! ffdec_h264 ! xvimagesink demux.
     udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 rtpbin.send_rtcp_src_0 !
        udpsink port=5005 host=192.168.1.101 sync=false async=false

When the server starts up, I get these notices:
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)UYVY, width=(int)720, height=(int)480, fr
amerate=(fraction)30/1
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)UYVY, width=(int)720, height=(int)48
0, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)UYVY, width=(int)720, height=(int)
480, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstTIVidenc1:tividenc10.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)UYVY, width=(int)720, height=(int)48
0, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstTIVidenc1:tividenc10.GstPad:src: caps = video/x-h264, framerate=(fraction)30/1, width=(int)720, height=(int)4
80
/GstPipeline:pipeline0/MpegTsMux:mpegtsmux0.GstPad:sink_64: caps = video/x-h264, framerate=(fraction)30/1, width=(int)720, height=(int)
480
/GstPipeline:pipeline0/MpegTsMux:mpegtsmux0.GstPad:src: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188
/GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, enco
ding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=(uint)41726
/GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0.GstPad:sink: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188
/GstPipeline:pipeline0/MpegTsMux:mpegtsmux0.GstPad:src: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188, streamhea
der=(buffer)< 47400030a600fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff
fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff
ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000b00d0001c100000001e020a2c32941, 474020308b00f
fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff
fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff
fffff0002b0280001c10000e040f00c050448444d5688040ffffcfc1be040f00a050848444d56ff1b443ffba2e249 >
/GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0.GstPad:sink: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188, str
eamheader=(buffer)< 47400030a600fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff
fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff
ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000b00d0001c100000001e020a2c32941, 4740203
08b00ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff
fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff
fffffffffff0002b0280001c10000e040f00c050448444d5688040ffffcfc1be040f00a050848444d56ff1b443ffba2e249 >
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtp_sink: caps = application/x-rtp, media=(string)video,
clock-rate=(int)90000, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=
(uint)41726
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtp_sink_0: caps = application/x-rtp, media=(string)video, clock-rate=(int)900
00, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=(uint)41726
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtp_sink_0.GstProxyPad:proxypad1: caps = application/x-rtp, media=(string)vide
o, clock-rate=(int)90000, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-ba
se=(uint)41726
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtp_src_0: caps = application/x-rtp, media=(string)video, clock-rate=(int)9000
0, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=(uint)41726
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtp_src: caps = application/x-rtp, media=(string)video, c
lock-rate=(int)90000, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=(
uint)41726
/GstPipeline:pipeline0/GstUDPSink:vrtpsink.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-
name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base=(uint)41726
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtp_src_0.GstProxyPad:proxypad2: caps = application/x-rtp, media=(string)video
, clock-rate=(int)90000, encoding-name=(string)MP2T-ES, ssrc=(uint)4255958994, payload=(int)33, clock-base=(uint)3087999769, seqnum-base
=(uint)41726
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0: caps = application/x-rtcp
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = application/x-rtcp
/GstPipeline:pipeline0/GstUDPSink:vrtcpsink.GstPad:sink: caps = application/x-rtcp
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0.GstProxyPad:proxypad3: caps = application/x-rtcp

Similarly for the client:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0: ntp-ns-base = 3493752949569988000
New clock: GstSystemClock
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:recv_rtp_sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)MP2T-ES
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:recv_rtp_sink_0: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:recv_rtp_sink_0.GstProxyPad:proxypad0: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)MP2T-ES
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:recv_rtp_src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)MP2T-ES
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)MP2T-ES
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)MP2T-ES
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpPtDemux:rtpptdemux0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES
/GstPipeline:pipeline0/GstRtpMP2TDepay:rtpmp2tdepay0.GstPad:src: caps = video/mpegts, packetsize=(int)188, systemstream=(boolean)true
/GstPipeline:pipeline0/GstRtpMP2TDepay:rtpmp2tdepay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP2T-ES, payload=(int)33
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:recv_rtp_src_0_4255958994_33.GstProxyPad:proxypad3: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)MP2T-ES, payload=(int)33
/GstPipeline:pipeline0/GstMpegTSDemux:demux.GstPad:sink: caps = video/mpegts, packetsize=(int)188, systemstream=(boolean)true
/GstPipeline:pipeline0/GstMpegTSDemux:demux: pat-info = ((GValueArray*) 0xb3f02c80)
/GstPipeline:pipeline0/GstMpegTSDemux:demux: pmt-info = ((MpegTsPmtInfo*) 0xb3f02520)
/GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:sink: caps = video/x-h264
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0: caps = application/x-rtcp
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = application/x-rtcp
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtcp
/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0.GstProxyPad:proxypad2: caps = application/x-rtcp
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:sync_src: caps = application/x-rtcp
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:rtcp_sink: caps = application/x-rtcp
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:rtcp_src_-39008302: caps = application/x-rtcp
/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink_rtcp: caps = application/x-rtcp

Sadly, even though the server is pumping out data, I don't see
anything at the client (my xvimagesink window never opens up).

Any ideas what I'm doing wrong or how to diagnose this?

Thanks

Note: I'm also a bit unsure how to write these pipelines if I want
to put audio data into the .TS container as well.  Any pointers
on this would be most helpful.
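For what it's worth, here is an untested sketch of how an audio branch might feed the same mux: give mpegtsmux a name and link both branches to it. The audio elements (alsasrc, ffenc_mp2, mad) are assumptions; substitute whatever your board actually provides.

```shell
# Sketch only (GStreamer 0.10 syntax): name the muxer so the video and
# audio branches can both link to it with "mux."
gst-launch -v gstrtpbin name=rtpbin \
    mpegtsmux name=mux ! rtpmp2tpay ! rtpbin.send_rtp_sink_0 \
    v4l2src ! video/x-raw-yuv,width=720,height=480 ! x264enc ! mux. \
    alsasrc ! audioconvert ! ffenc_mp2 ! mux. \
    rtpbin.send_rtp_src_0 ! udpsink port=5000 host=192.168.1.101 name=vrtpsink \
    rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=192.168.1.101 sync=false async=false name=vrtcpsink \
    udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0
# On the client, mpegtsdemux exposes one pad per elementary stream, e.g.:
#   ... mpegtsdemux name=demux  demux. ! queue ! ffdec_h264 ! xvimagesink \
#                               demux. ! queue ! mad ! alsasink
```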

--
------------------------------------------------------------
Gary Thomas                 |  Consulting for the
MLB Associates              |    Embedded world
------------------------------------------------------------

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/gstreamer-devel

Re: rtpbin + mpegtsmux

Gary Thomas
On 09/17/2010 05:01 PM, Gary Thomas wrote:

> I'm trying to stream MPEG-TS data via RTP/RTCP using these pipelines:
>
> Server:
>     gst-launch -v gstrtpbin name=rtpbin v4l2src ! video/x-raw-yuv,width=720,height=480 !
>       x264enc ! mpegtsmux ! rtpmp2tpay ! rtpbin.send_rtp_sink_0
>       rtpbin.send_rtp_src_0 ! udpsink port=5000 host=192.168.1.101 ts-offset=0 name=vrtpsink
>       rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=192.168.1.101 sync=false async=false name=vrtcpsink
>       udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0
>
> Client:
>     gst-launch -v gstrtpbin name=rtpbin latency=200
>       udpsrc caps=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T-ES port=5000 !
>          rtpbin.recv_rtp_sink_0 rtpbin. ! rtpmp2tdepay ! mpegtsdemux name=demux ! ffdec_h264 ! xvimagesink demux.
>       udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 rtpbin.send_rtcp_src_0 !
>          udpsink port=5005 host=192.168.1.101 sync=false async=false
>
> When the server starts up, I get these notices:
> ..snip..
>
> Sadly, even though the server is pumping out data, I don't see
> anything at the client (my xvimagesink window never opens up)
>
> Any ideas what I'm doing wrong or how to diagnose this?
>
> Thanks
>
> Note: I'm also a bit unsure how to write these pipelines if I want
> to put audio data into the .TS container as well.  Any pointers
> on this would be most helpful.
>

Followup - I tried this between two [similar] x86 desktop systems
and it worked!  It seems to fail only when the server machine is
my embedded OMAP board (running a recent kernel and the same GStreamer
modules as the x86 systems).  How can I figure out where in the
pipeline it's failing?
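One way to bisect this is to validate each stage of the OMAP pipeline in isolation. A sketch (the num-buffers count and file path are arbitrary):

```shell
# 1) Does the board produce a valid TS at all? Capture a few seconds to
#    a file and play that file back on the known-good desktop.
gst-launch v4l2src num-buffers=300 ! video/x-raw-yuv,width=720,height=480 ! \
    x264enc ! mpegtsmux ! filesink location=/tmp/test.ts
# 2) Does plain UDP (no rtpbin, no RTCP) get through from the board?
gst-launch v4l2src ! video/x-raw-yuv,width=720,height=480 ! x264enc ! \
    mpegtsmux ! rtpmp2tpay ! udpsink port=5000 host=192.168.1.101
# If (1) plays but (2) doesn't, the problem is in the RTP/network stage;
# if (1) already fails, it's the capture/encode/mux stage on the board.
```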



Re: rtpbin + mpegtsmux

Marco Ballesio
In reply to this post by Gary Thomas
Hi,

On Sat, Sep 18, 2010 at 2:01 AM, Gary Thomas <[hidden email]> wrote:
I'm trying to stream MPEG-TS data via RTP/RTCP using these pipelines:

Server:
  gst-launch -v gstrtpbin name=rtpbin v4l2src ! video/x-raw-yuv,width=720,height=480 !
    x264enc ! mpegtsmux ! rtpmp2tpay ! rtpbin.send_rtp_sink_0
    rtpbin.send_rtp_src_0 ! udpsink port=5000 host=192.168.1.101 ts-offset=0 name=vrtpsink
    rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=192.168.1.101 sync=false async=false name=vrtcpsink
    udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0

The feeling is that the muxing/payloading is wrong. Try replacing mpegtsmux ! rtpmp2tpay with rtph264pay (and make the corresponding change, rtph264depay, on the receiver).
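A sketch of that variant, carrying over the ports and hosts from the original pipelines (untested):

```shell
# Sender: payload the raw H.264 directly, with no TS layer in between.
gst-launch -v gstrtpbin name=rtpbin \
    v4l2src ! video/x-raw-yuv,width=720,height=480 ! x264enc ! \
    rtph264pay ! rtpbin.send_rtp_sink_0 \
    rtpbin.send_rtp_src_0 ! udpsink port=5000 host=192.168.1.101
# Receiver: the udpsrc caps must now advertise H264 instead of MP2T-ES.
gst-launch -v gstrtpbin name=rtpbin latency=200 \
    udpsrc port=5000 \
      caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" ! \
    rtpbin.recv_rtp_sink_0 rtpbin. ! rtph264depay ! ffdec_h264 ! xvimagesink
```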
 
Client:
  gst-launch -v gstrtpbin name=rtpbin latency=200
    udpsrc caps=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T-ES port=5000 !
       rtpbin.recv_rtp_sink_0 rtpbin. ! rtpmp2tdepay ! mpegtsdemux name=demux ! ffdec_h264 ! xvimagesink demux.
    udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 rtpbin.send_rtcp_src_0 !
       udpsink port=5005 host=192.168.1.101 sync=false async=false



..snip..

If you have more than a few lines to report in a thread, sending them as an attachment (or even using a tool like pastebin) will improve readability, and will save my poor old eepc 701 from going crazy with the web client ;)

Regards.
 
Sadly, even though the server is pumping out data, I don't see
anything at the client (my xvimagesink window never opens up)

Any ideas what I'm doing wrong or how to diagnose this?

Thanks

Note: I'm also a bit unsure how to write these pipelines if I want
to put audio data into the .TS container as well.  Any pointers
on this would be most helpful.


Re: rtpbin + mpegtsmux

Marc Leeman
In reply to this post by Gary Thomas
> Sadly, even though the server is pumping out data, I don't see
> anything at the client (my xvimagesink window never opens up)
>
> Any ideas what I'm doing wrong or how to diagnose this?

You're probably not sending the data the decoder needs before it can
start decoding (the SPS/PPS parameter sets, NAL types 7/8). Try starting
the receiver before the sender and see if you get video decoded.

If so, add a recent h264parse to the chain; it should re-insert
the correct parameter data into the stream so the decoder can start decoding.
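As a sketch of that suggestion (whether your h264parse has a config-interval property depends on the release; check with gst-inspect h264parse first):

```shell
# Re-insert SPS/PPS (NAL 7/8) periodically BEFORE muxing into TS, so a
# receiver that joins late still gets the decoder configuration.
gst-launch -v gstrtpbin name=rtpbin \
    v4l2src ! video/x-raw-yuv,width=720,height=480 ! x264enc ! \
    h264parse config-interval=1 ! mpegtsmux ! rtpmp2tpay ! rtpbin.send_rtp_sink_0 \
    rtpbin.send_rtp_src_0 ! udpsink port=5000 host=192.168.1.101
```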

Why are you first putting your data into TS and then again in RTP?

--
  greetz, marc
Measure twice, cut once.
crichton 2.6.26 #1 PREEMPT Tue Jul 29 21:17:59 CDT 2008 GNU/Linux


Re: rtpbin + mpegtsmux

Gary Thomas
On 09/18/2010 10:51 AM, Marc Leeman wrote:

>> Sadly, even though the server is pumping out data, I don't see
>> anything at the client (my xvimagesink window never opens up)
>>
>> Any ideas what I'm doing wrong or how to diagnose this?
>
> You're probably not sending the data for the decoder to start decoding
> (NAL 7/8). Try starting the decoder before the sender and see if you get
> video decoded.
>
> If so, add a recent h264parser in the chain and it should remultiplex
> the correct data into the stream for the decoder to start decoding.

I am already using the very latest release of everything.
   gstreamer         0.10.30
   gst-plugins-good  0.10.25
   gst-plugins-base  0.10.30
   gst-plugins-bad   0.10.20
   gst-plugins-ugly  0.10.16

> Why are you first putting your data into TS and then again in RTP?

Because that's what the customer wants :-)

My understanding is that TS is a container that will eventually carry
both video and audio, and is not network-worthy by itself, hence the
RTP (Real-time Transport Protocol) layer.


Re: rtpbin + mpegtsmux

Gary Thomas
On 09/20/2010 05:36 AM, Gary Thomas wrote:

> On 09/18/2010 10:51 AM, Marc Leeman wrote:
>> ..snip..
>
> I am already using the very latest release of everything.
> gstreamer 0.10.30
> gst-plugins-good 0.10.25
> gst-plugins-base 0.10.30
> gst-plugins-bad 0.10.20
> gst-plugins-ugly 0.10.16
>
>> Why are you first putting your data into TS and then again in RTP?
>
> Because that's what the customer wants :-)
>
> My understanding is that TS is a container that will eventually contain
> both video and audio and is not network worthy by itself, hence the RTP
> (RealTime [network] protocol)
>

That said, I've also tried this with a raw H.264 stream and the same
thing happens.

As I've pointed out, these pipelines don't even work reliably on
my desktop system.  Using just the raw H.264 stream, I stream out
and in on my desktop over the local network (127.0.0.1).  It may work,
even for a while, but after some time the receiver no longer gets new
frames (motion stops).

Is there some way to get useful debug information on this?  I don't
see any messages about the RTP stream until debug level 4, and then
it's too low-level to interpret easily.  I'd like to know when packets
come in, how they are parsed and passed on, where the keyframes are,
etc.  That sort of data doesn't seem to show up in the debug output.
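Two things that usually help here, sketched below (the category names are the 0.10 ones; list what your build actually has with gst-launch --gst-debug-help): per-category debug levels instead of one global level, and an identity element to report every buffer passing a given point.

```shell
# Raise only the RTP-related categories; everything else stays quiet:
GST_DEBUG=rtpsession:5,rtpjitterbuffer:5,rtpmp2tdepay:5 gst-launch -v ...
# Or drop an identity into the pipeline; with silent=false and -v it
# emits a last-message line for every buffer (size, timestamp, flags):
gst-launch -v udpsrc ... ! rtpbin.recv_rtp_sink_0 rtpbin. ! \
    identity silent=false ! rtpmp2tdepay ! mpegtsdemux ...
```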



Re: rtpbin + mpegtsmux

Marc Leeman
> > Because that's what the customer wants :-)

Is this what the customer really wants (getting video reliably over the
network), or is it what you've been told the customer wants? :-)

> > My understanding is that TS is a container that will eventually contain
> > both video and audio and is not network worthy by itself, hence the RTP
> > (RealTime [network] protocol)

You might have some problems with the timestamps in the RTP
header versus those inside the TS container. If even one of them is
slightly off, you'll run into problems.

> That said, I've also tried this with a raw H264 stream and the same
> thing happens.
>
> As I've pointed out, these pipelines do not even work reliably on
> my desktop system all the time.  Using just the raw H264 stream, I
> stream out and in on my desktop, using the local network (127.0.0.1)
> While it may work, even for a while, after some time the receiver no
> longer gets new frames (motion stops).
>
> Is there some way to get useful debug information on this?  I don't
> see any messages about the RTP stream until level 4 and then it's
> too low level to interpret easily.  I'd like to know when packets
> come in, how they are parsed, passed on, etc, where the keyframes
> are, etc.  This sort of data doesn't seem to show up in the debug
> data.
We've been doing quite a lot of H.264 streaming ourselves, and I can't
say we've had many problems.

There are a number of things you need to take into account.

There are a number of encoders that only send NAL 7/8 (SPS/PPS) once.
You can configure the RTP payloader to re-multiplex those at a regular
interval in your stream.

In your case that will not work, since it's not H264 you're sending but
H264-in-TS. That's why you can also instruct the H264 parser to
re-include those parameter sets in the stream (before you add the
MPEG-TS layer).
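[Editor's note: as a sketch of the above (untested; it assumes an h264parse new enough to have the config-interval property, which older gst-plugins-bad releases lack), the sender pipeline from the start of the thread would gain one element between encoder and mux:]

```shell
# Hypothetical variant of the earlier sender: h264parse re-inserts
# SPS/PPS every second before the stream enters mpegtsmux.
gst-launch -v gstrtpbin name=rtpbin \
  v4l2src ! video/x-raw-yuv,width=720,height=480 ! \
  x264enc ! h264parse config-interval=1 ! mpegtsmux ! rtpmp2tpay ! rtpbin.send_rtp_sink_0 \
  rtpbin.send_rtp_src_0 ! udpsink port=5000 host=192.168.1.101 ts-offset=0 name=vrtpsink \
  rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=192.168.1.101 sync=false async=false name=vrtcpsink \
  udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0
```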

Our focus is mainly on RTP/H264, but AFAIK streaming is stable, both
from hardware sources and from x264 sources (and file-based ones).

--
  greetz, marc
E = MC ** 2 +- 3db
crichton 2.6.26 #1 PREEMPT Tue Jul 29 21:17:59 CDT 2008 GNU/Linux


Re: rtpbin + mpegtsmux

Gary Thomas
On 09/20/2010 08:07 AM, Marc Leeman wrote:
>>> Because that's what the customer wants :-)
>
> Is this what the customer really wants (getting video reliably over the
> network) or is it what you've been told the customer wants :-)

That's always the $64,000 question!

>>> My understanding is that TS is a container that will eventually contain
>>> both video and audio and is not network worthy by itself, hence the RTP
>>> (RealTime [network] protocol)
>
> You might have some problems with the timestamps that are in the RTP
> header and those that are in TS. If only one is slightly off; you'll run
> into problems.

Any hints on how to diagnose this?

>> That said, I've also tried this with a raw H264 stream and the same
>> thing happens.
>>
>> As I've pointed out, these pipelines do not even work reliably on
>> my desktop system all the time.  Using just the raw H264 stream, I
>> stream out and in on my desktop, using the local network (127.0.0.1)
>> While it may work, even for a while, after some time the receiver no
>> longer gets new frames (motion stops).
>>
>> Is there some way to get useful debug information on this?  I don't
>> see any messages about the RTP stream until level 4 and then it's
>> too low level to interpret easily.  I'd like to know when packets
>> come in, how they are parsed, passed on, etc, where the keyframes
>> are, etc.  This sort of data doesn't seem to show up in the debug
>> data.
>
> We've been doing quite a lot of H.264 streaming ourselves, and I can't
> say we've had many problems.
>
> There are a number of things you need to take into account.
>
> There are a number of encoders that only send NAL 7/8 (SPS/PPS) once.
> You can configure the RTP payloader to re-multiplex those at a regular
> interval in your stream.
>
> In your case that will not work, since it's not H264 you're sending but
> H264-in-TS. That's why you can also instruct the H264 parser to
> re-include those parameter sets in the stream (before you add the
> MPEG-TS layer).
>
> Our focus is mainly on RTP/H264, but AFAIK streaming is stable, both
> from hardware sources and from x264 sources (and file-based ones).

In this light, I'm going to concentrate a bit more on pure H264 streaming.

I have had a little success today, but it's still not great.  I have to
start the client before the server, so I'm guessing that I have the
"only one NAL" issue you mention above.  How can I change that
behaviour?

When it does run, I see messages like this on the client/receiver:

WARNING: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2686): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
There may be a timestamping problem, or this computer is too slow.

I tried adjusting the latency/jitterbuffer on the receiver, but it
didn't seem to change much.

Thanks for the help

--
------------------------------------------------------------
Gary Thomas                 |  Consulting for the
MLB Associates              |    Embedded world
------------------------------------------------------------


Re: rtpbin + mpegtsmux

Marc Leeman
In reply to this post by Gary Thomas


> I have had a little success today, but it's still not great.  I have to
> have the client started before the server starts, so I'm guessing that
> I have the "only one NAL" issue you mention above.  How can I change the
> behaviour as you mention above?

I think I already mailed you on Saturday :-)

barco@mgsserver001:~$ gst-inspect rtph264pay |grep config-interval
  config-interval     : Send SPS and PPS Insertion Interval in seconds
(sprop parameter sets will be multiplexed in the data stream when
detected.) (0 = disabled)
barco@mgsserver001:~$ gst-inspect h264parse |grep config-interval
  config-interval     : Send SPS and PPS Insertion Interval in seconds
(sprop parameter sets will be multiplexed in the data stream when
detected.) (0 = disabled)

> When it does run, I see messages like this on the client/receiver:
>
> WARNING: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: A lot of buffers are being dropped.
> Additional debug info:
> gstbasesink.c(2686): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
> There may be a timestamping problem, or this computer is too slow.

Is this pure RTP or RTP/TS?

--
  greetz, marc
The opposite of a correct statement is a false statement. But the opposite
of a profound truth may well be another profound truth.
                -- Niels Bohr
crichton 2.6.26 #1 PREEMPT Tue Jul 29 21:17:59 CDT 2008 GNU/Linux


Re: rtpbin + mpegtsmux

Marc Leeman
In reply to this post by Gary Thomas
> Do I need this on both ends?  My version of h264parse (0.10.19) doesn't support
> that option.

No; the option was added first in rtph264pay and later in h264parse. If
you use it in the parser with an interval of 1 s, also setting it in the
payloader will do nothing extra.

It was added in the parser for exactly the kind of 'strange'
combinations that you are trying: sending H264 in TS, in TS/RTP or even
in ES over the network.

You only need this NAL insertion on the sending side, since this is
essential data the decoder needs before it can start decoding.

>> Is this pure RTP or RTP/TS?
>
> These results are RTP/TS.  When I do just RTP, I never get anything other than the initial [incomplete] frame.

This is probably what I was talking about: try sending pure RTP first
and see if you get the sender/receiver working correctly. IIRC, when
receiving this combination, the timestamps on the buffers are set from
RTP and then afterwards adjusted with the TS timing info.
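[Editor's note: a minimal pure-RTP pair to try first might look like the sketch below. The receiver caps, in particular any sprop-parameter-sets, should really be copied from the sender's -v output; config-interval=1 makes the payloader repeat SPS/PPS so a late-starting receiver can still lock on.]

```shell
# Sender (sketch): raw H264 over RTP, parameter sets repeated in-band.
gst-launch -v v4l2src ! video/x-raw-yuv,width=720,height=480 ! \
  x264enc ! rtph264pay config-interval=1 ! \
  udpsink port=5000 host=192.168.1.101

# Receiver (sketch, run on 192.168.1.101): depayload and decode;
# the caps must match what the sender's -v output reports.
gst-launch -v udpsrc port=5000 \
  caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" ! \
  rtph264depay ! ffdec_h264 ! xvimagesink
```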

I've also been warned several times that the mpegtsmux/mpegtsdemux is
not completely correct with timestamps.

> Note: I'm using a special H264 encoder (based on the DSP in my processor, not x264enc)

That does not really matter, as long as the timestamps from your encoder
are correct; we're doing basically the same thing with all kinds of
boards.

> I tried to use x264enc, but my sensor only does UYVY:
>  gst-launch -vv v4l2src ! video/x-raw-yuv,width=720,height=480 ! ffmpegcolorspace ! x264enc ! filesink location=/tmp/xx
> Pipeline:pipeline0/GstV4l2Src:v4l2src0: Device '/dev/video0' cannot capture in the specified format
> Additional debug info:
> gstv4l2object.c(1971): gst_v4l2_object_set_format (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
> Tried to capture in YUYV, but device returned format UYVY

UYVY does not seem to be supported by ffmpegcolorspace

> I tried to recode it, but I get an error I don't understand:
>  gst-launch -vv v4l2src ! video/x-raw-yuv,width=720,height=480 ! ffmpegcolorspace ! 'video/x-raw-yuv,format=(fourcc)YUY2'  ! x264enc ! filesink location=/tmp/xx
> WARNING: erroneous pipeline: could not link ffmpegcsp0 to x264enc0

It can't match the caps, so it can't link ffmpegcolorspace to x264enc:
x264enc only supports I420 (see gst-inspect x264enc).
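[Editor's note: a sketch of the conversion that should link, assuming ffmpegcolorspace accepts UYVY input on this system: ask for I420 after the colorspace converter instead of YUY2, since I420 is what x264enc negotiates.]

```shell
# UYVY from the sensor -> ffmpegcolorspace -> I420, which x264enc accepts.
gst-launch -vv v4l2src ! 'video/x-raw-yuv,format=(fourcc)UYVY,width=720,height=480' ! \
  ffmpegcolorspace ! 'video/x-raw-yuv,format=(fourcc)I420' ! \
  x264enc ! filesink location=/tmp/xx
```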

> Sorry for all the questions, but my exploration of the documentation has not
> been very fruitful (I'm not always this challenged...)

nah, there is a lot of information; you just need to find out where. IRC
has always been helpful for me.


--
  greetz, marc
One man's "magic" is another man's engineering.  "Supernatural" is a null word.
                -- Robert Heinlein
crichton 2.6.26 #1 PREEMPT Tue Jul 29 21:17:59 CDT 2008 GNU/Linux
