Dear all,
I have gst-rtsp-server running on a Raspberry Pi, feeding it with a pipeline from an XML file:

<machine0_pipeline>rpicamsrc preview=false ! video/x-raw, width=1280,height=720,framerate=30/1 ! tee name=t t. ! queue ! omxh264enc target-bitrate=1000000 control-rate=variable ! video/x-h264,width=1280, height=720, framerate=30/1, profile=(string)high ! rtph264pay config-interval=1 name=pay0 pt=96 t. ! queue ! videoconvert ! videoscale ! video/x-raw, width=640,height=480 ! omxh264enc target-bitrate=1000000 control-rate=variable ! video/x-h264,width=640, height=480, framerate=30/1, profile=(string)high ! rtph264pay config-interval=1 name=pay1 pt=96</machine0_pipeline>

I want to receive only one of the two streams, so on the client I select it in a select-stream signal handler. rtspsrc's select-stream signal passes the stream index as a guint (not a pad), so the handler compares indices:

static gboolean
on_select_stream_cb (GstElement *element, guint idx, GstCaps *caps, gpointer data)
{
  guint wanted = GPOINTER_TO_UINT (data);  /* index of the stream we want */
  return idx == wanted;
}

GStreamer opens a window showing a single frame of the stream corresponding to the selected index, but the video does not advance any further. Could you please tell me what I am doing wrong?

-- Sent from: http://gstreamer-devel.966125.n4.nabble.com/

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
I did the same on a Debian machine with this pipeline:

filesrc location=/home/horai/Downloads/jellyfish-3-mbps-hd-h264.mkv ! matroskademux ! h264parse ! vaapih264dec ! tee name=t t. ! queue ! videoconvert ! vaapih264enc ! h264parse ! rtph264pay name=pay1 pt=96 t. ! queue ! videoconvert ! videoscale ! video/x-raw, width=640,height=480 ! vaapih264enc ! h264parse ! rtph264pay name=pay0 pt=96

I can select the stream on the client according to the payload name, pay0 or pay1, but I still get only a single frame. I started from this example:

https://github.com/GStreamer/gst-rtsp-server/blob/master/examples/test-mp4.c

replacing its pipeline (which serves a file containing video and sound) and pointing the payloaders at the branches of the tee. Does anybody know what to fix in order to play the stream? The server yields the following report, but the window still shows only a single frame:

structure: application/x-rtp-source-stats, ssrc=(uint)3839301980, internal=(boolean)true, validated=(boolean)true, received-bye=(boolean)false, is-csrc=(boolean)false, is-sender=(boolean)true, seqnum-base=(int)1754, clock-rate=(int)90000, octets-sent=(guint64)109430, packets-sent=(guint64)84, octets-received=(guint64)109430, packets-received=(guint64)84, bitrate=(guint64)0, packets-lost=(int)-84, jitter=(uint)0, sent-pli-count=(uint)0, recv-pli-count=(uint)0, sent-fir-count=(uint)0, recv-fir-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, have-sr=(boolean)true, sr-ntptime=(guint64)16240433297496360504, sr-rtptime=(uint)3516624902, sr-octet-count=(uint)109430, sr-packet-count=(uint)84;

Sender stats:

structure: application/x-rtp-source-stats, ssrc=(uint)3839301980, internal=(boolean)true, validated=(boolean)true, received-bye=(boolean)false, is-csrc=(boolean)false, is-sender=(boolean)false, seqnum-base=(int)1754, clock-rate=(int)90000, octets-sent=(guint64)109430, packets-sent=(guint64)84, octets-received=(guint64)109430, packets-received=(guint64)84, bitrate=(guint64)0, packets-lost=(int)-84, jitter=(uint)0, sent-pli-count=(uint)0, recv-pli-count=(uint)0, sent-fir-count=(uint)0, recv-fir-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, have-sr=(boolean)true, sr-ntptime=(guint64)16240433297496360504, sr-rtptime=(uint)3516624902, sr-octet-count=(uint)109430, sr-packet-count=(uint)84;

structure: application/x-rtp-source-stats, ssrc=(uint)3858573619, internal=(boolean)false, validated=(boolean)true, received-bye=(boolean)false, is-csrc=(boolean)false, is-sender=(boolean)false, seqnum-base=(int)-1, clock-rate=(int)-1, rtcp-from=(string)127.0.0.1:32855, octets-sent=(guint64)0, packets-sent=(guint64)0, octets-received=(guint64)0, packets-received=(guint64)0, bitrate=(guint64)0, packets-lost=(int)0, jitter=(uint)0, sent-pli-count=(uint)0, recv-pli-count=(uint)0, sent-fir-count=(uint)0, recv-fir-count=(uint)0, sent-nack-count=(uint)0, recv-nack-count=(uint)0, have-sr=(boolean)false, sr-ntptime=(guint64)0, sr-rtptime=(uint)0, sr-octet-count=(uint)0, sr-packet-count=(uint)0, sent-rb=(boolean)false, sent-rb-fractionlost=(uint)0, sent-rb-packetslost=(int)0, sent-rb-exthighestseq=(uint)0, sent-rb-jitter=(uint)0, sent-rb-lsr=(uint)0, sent-rb-dlsr=(uint)0, have-rb=(boolean)true, rb-fractionlost=(uint)0, rb-packetslost=(int)0, rb-exthighestseq=(uint)1837, rb-jitter=(uint)4, rb-lsr=(uint)2617580954, rb-dlsr=(uint)249822, rb-round-trip=(uint)61;
Dear all,
We arrived at these findings after changing the approach and modifying the server pipeline this way:

rpicamsrc preview=false bitrate=1000000 ! video/x-h264, width=1280,height=720,framerate=30/1 ! tee name=t t. ! queue ! h264parse ! rtph264pay name=pay0 pt=96 t. ! queue ! h264parse ! omxh264dec ! video/x-raw ! videoscale ! video/x-raw, width=640,height=480,framerate=30/1 ! omxh264enc target-bitrate=1000000 control-rate=variable ! video/x-h264,width=640, height=480, framerate=30/1, profile=(string)high ! h264parse ! rtph264pay name=pay1 pt=96

It does not work either, so I decomposed both branches into standalone pipelines:

1. Works:

gst-launch-1.0 -v rpicamsrc ! video/x-h264, width=1280,height=720,framerate=30/1 ! h264parse ! omxh264dec ! video/x-raw ! videoscale ! video/x-raw, width=640,height=480,framerate=30/1 ! autovideosink

2. Works:

gst-launch-1.0 -v rpicamsrc ! video/x-h264, width=1280,height=720,framerate=30/1 ! h264parse ! omxh264dec ! autovideosink

3. Both branches together behind the tee (with sinks swapped in for the payloaders) does not work:

gst-launch-1.0 -v rpicamsrc preview=false bitrate=1000000 ! video/x-h264, width=1280,height=720,framerate=30/1 ! tee name=t t. ! queue ! h264parse ! omxh264dec ! autovideosink t. ! queue ! h264parse ! omxh264dec ! video/x-raw ! videoscale ! video/x-raw, width=640,height=480,framerate=30/1 ! fakesink
Partially resolved. The correct pipeline for the server is this:

rpicamsrc preview=false bitrate=1000000 ! video/x-h264,width=1280,height=720,framerate=30/1 ! tee name=t t. ! queue ! h264parse ! rtph264pay name=pay1 pt=96 t. ! queue ! h264parse ! omxh264dec ! video/x-raw ! videoscale ! video/x-raw, width=640,height=480,framerate=30/1 ! omxh264enc target-bitrate=1000000 control-rate=variable ! video/x-h264, profile=(string)high ! h264parse ! rtph264pay name=pay0 pt=96

Pipeline to play the first tee branch on the client:

gst-launch-1.0 -v rtspsrc location="rtsp://user:password@192.168.0.102:8554/test" latency=0 ! rtph264depay ! h264parse ! vaapih264dec ! autovideosink

To play the other tee branch, you have to swap name=pay1 and name=pay0 on the server side and restart both the server and the client. That means our server side is fine; we just have to learn how to use the select-stream signal handler in order to request the appropriate payload from a C application. If anyone is willing to help, that would be perfect. It would also be nice to rewrite the pipeline using the v4l2 codecs.

Thank you
Dear all,
To verify the select-stream feature I found an example here (containing both a server and a client side, though I prefer to use the stable gst-rtsp-server repository): https://github.com/James-Holland

It works fine when the server pipeline is modified this way (note the spaces between the concatenated string fragments, so the launch description stays valid):

gst_rtsp_media_factory_set_launch (factory1, "( "
    "rpicamsrc preview=false bitrate=2000000 ! video/x-h264, width=1640,height=922,framerate=30/1 ! h264parse ! rtph264pay config-interval=1 name=pay0 pt=96 "
    "videotestsrc ! video/x-raw,width=900,height=900,framerate=2/1 ! "
    "x264enc ! rtph264pay name=pay1 pt=96 "
    "videotestsrc ! video/x-raw,width=1400,height=1480,framerate=1/1 ! "
    "x264enc ! rtph264pay name=pay2 pt=96 "
    ")");

I also used the client example from https://github.com/James-Holland and modified its parse_launch() pipeline this way:

rtspsrc name=rtspsrc0 location=\"rtsp://user:password@192.168.0.103:8554/test\" latency=0 protocols=7 ! queue ! capsfilter caps=\"application/x-rtp,media=video\" ! rtph264depay name=depay ! h264parse ! vaapih264dec ! autovideosink

With this I can comfortably select stream 0, 1, or 2. BUT I cannot select a stream when this pipeline is used on the server side; the client always freezes on a single frame:

rpicamsrc preview=false bitrate=1000000 ! video/x-raw, width=1280,height=720,framerate=30/1 ! tee name=t t. ! queue ! omxh264enc target-bitrate=1000000 control-rate=variable ! video/x-h264,width=1280, height=720, framerate=30/1, profile=(string)high ! rtph264pay config-interval=1 name=pay0 pt=96 t. ! queue ! videoconvert ! videoscale ! video/x-raw, width=640,height=480 ! omxh264enc target-bitrate=1000000 control-rate=variable ! video/x-h264,width=640, height=480, framerate=30/1, profile=(string)high ! rtph264pay config-interval=1 name=pay1 pt=96

The only way I can run both tee streams on the client side is to first start the pay0 stream from the command line:

gst-launch-1.0 -v rtspsrc location="rtsp://user:password@192.168.0.103:8554/test" latency=0 ! rtph264depay ! h264parse ! vaapih264dec ! autovideosink

and then start the pay1 stream with this client example: https://github.com/James-Holland/rtspclient using the parse_launch pipeline:

rtspsrc name=rtspsrc0 location=\"rtsp://user:password@192.168.0.103:8554/test\" latency=0 protocols=7 ! queue ! capsfilter caps=\"application/x-rtp,media=video\" ! rtph264depay name=depay ! h264parse ! vaapih264dec ! autovideosink

It looks like I am missing some initial negotiation when the select-stream handler is triggered, as the stream cannot start. Please, could anyone help me resolve this issue?

Thank you
Ivo
Hello Ivo,
As you wrote, you now have two streams, pay0 and pay1, and the server sends both video streams at the same time. A possible solution may be to create a stream ID and tell the client that it should use just one payload. The client should also have async-handling set on the sink:

g_object_set (sink, "async-handling", TRUE, NULL);

Here is the relevant part of the documentation (quoted from waylandsink):

async-handling ("async-handling", gboolean): The bin will handle asynchronous state changes. Flags: Read / Write. Default value: false
Maybe the right way is to use the example test-appsrc.c and add the pipeline to the factory through the media callback. I would say that rpicamsrc is much more efficient (faster) than appsrc, but this means some work to change the code :) I don't know whether appsrc can handle video already encoded as H.264 or only raw video, but finding out is not a heavy task :)
Dear sir,
Thank you for your suggestion. rpicamsrc seems to have more benefits than any other source element. As far as I understand, the most convenient source element following all the desired standards would be v4l2src. Just to note, if I understand it correctly, rpicamsrc is written against the proprietary MMAL API and therefore does not follow the V4L2 standard; in any case, I tried v4l2src and rpicamsrc is still far more suitable for the Raspberry Pi camera. I hope that in the near future the V4L2 kernel driver for the Raspberry Pi will improve camera compatibility.

Anyway, I don't think my problem lies in the source. If I use rpicamsrc together with videotestsrc on two payloads, pay0 and pay1, in the server pipeline, I can select whichever stream I want. Since I cannot place rpicamsrc twice in the server pipeline, I have to use tee to split the stream coming from the single rpicamsrc element. I don't know why the RTSP server cannot keep streaming upon client connection and sends only a single frame.

The positive aspect is that I can select the stream, but the stream from the server has to be started externally beforehand with a command-line pipeline. What do I do so differently in my client code compared to the command-line pipeline? The stream freezes on a single frame when the select-stream signal handler is present in my code; when the handler is commented out, the client behaves the same way as the command-line pipeline, but then I cannot select the desired stream.

One thing I am not happy with is that in this case the server sends both streams simultaneously while only one is displayed, so I assume the problem could lie in some timestamp or synchronization issue, or even in the fact that gst-rtsp-server might not be very friendly with a tee element inside the pipeline.

Whoever resolves it, I'll buy them a beer.