Hi all!

Currently I'm trying to stream audio+video over RTP. This schematic should explain what I want to do (the pipelines are not complete, for clarity):

Sender Side:
=========

Audio-Source ---> audioencode ---> audiopayload ---->
                                                      ----> RtpMuxer ----> RtpBin (Rtp Src) ----> UdpSink
Video-Source ---> videoencode ---> videopayload ----->

Receiver Side:
==========

                                                    ------> audiodepayload ---> audiodecode ---> audio-render
UdpSrc ---> RtpBin (Rtp Sink) ---> RtpDemuxer --->
                                                    ------> videodepayload ---> videodecode ---> video-render

Is this construct possible? I'm using the "rtpmux" and "gstrtpptdemux" elements.

My problem is that only one pad is created on the receiver side, but I want 2 pads.

If I connect the video-source pipe to rtpmux sink_0 AND the audio-source pipe to rtpmux sink_1, then on the receiver side only the audio pad (src_8) is created (the second one connected on the sender side). If I instead remove only the link between the audio-source pipe and rtpmux on the sender side, then the video pad (src_96) is created on my receiver. But never both: I only get the "pad-added" signal from my demux element once.

Does anybody know what my problem is?

Greetings,
Thomas
Hi All,

I would like to build and install only specific plug-ins from the gst-plugins-base/good/bad/ugly packages. Please let me know the best way to do it. Is there any file where I can configure this option?

Thanks & Regards,
Sreejesh R B
Sr. Project Lead
Multitech Software Systems
Bangalore, India.
On Tue, Jul 1, 2008 at 7:28 AM, Sreejesh <[hidden email]> wrote:
> Hi All,
>
> I would like to build and install only specific plug-ins from the
> gst-plugins-base/good/bad/ugly packages.

Some of the plugins can be explicitly enabled/disabled with configure arguments - see the output of ./configure --help for details. In many other cases you can't do this. However, you can obviously just copy only the plugins you're interested in to your target, rather than installing everything.

Mike
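For example, a hedged sketch of that approach (the --disable-* flag names below are hypothetical; the real list differs per module and release, so always check ./configure --help first):

# In a gst-plugins-good source checkout; the flag names are illustrative only.
./configure --help | grep -- --disable     # list which plugins can be switched off
./configure --disable-flac --disable-wavpack
make
make install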
In reply to this post by sreejesh-2
After ./configure has finished (without any disable options), you can simply change into the directory of the specific plugin you want to install and run "make; make install" there (a sketch follows below).

2008/7/1 Sreejesh <[hidden email]>:
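A hedged walkthrough of that (the subdirectory name is just an example from the gst-plugins-good layout; pick whichever plugin directory you actually need):

# Hypothetical example in a gst-plugins-good checkout
./configure
cd gst/rtp          # or any other plugin subdirectory
make
make install        # may need root, depending on the configured --prefix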
In reply to this post by Thomas Winkler-5
Is there anybody who knows this problem? Any ideas?

From: [hidden email] [mailto:[hidden email]] On behalf of Thomas Winkler

[original question quoted in full - snipped, see the first post above]
On Tue, 2008-07-08 at 12:47 +0200, Thomas Winkler wrote:
> This schematic should explain what I want to do (the pipelines are not
> complete, for clarity):
>
> Sender Side:
> =========
>
> Audio-Source ---> audioencode ---> audiopayload ---->
>                                                       ----> RtpMuxer ----> RtpBin (Rtp Src) ----> UdpSink
> Video-Source ---> videoencode ---> videopayload ----->

You're doing it wrong, you should have one RTP session per media type.

Rtp muxer is for one specific case (muxing DTMF events into a regular sound stream).

--
Olivier Crête
[hidden email]
Collabora Ltd
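For what it's worth, a rough, untested sketch of what that DTMF case could look like (element and pad names are assumptions based on the rtpmux/dtmf plugins of that era; verify them with gst-inspect before relying on this):

# Both branches use clock-rate 8000, which is the precondition rtpmux needs.
# The DTMF branch only emits packets when the application pushes "dtmf-event"
# upstream events into rtpdtmfsrc, so this is only a skeleton.
gst-launch rtpmux name=mux ! udpsink host=127.0.0.1 port=5004 \
    audiotestsrc is-live=true ! mulawenc ! rtppcmupay pt=0 ! mux.sink_0 \
    rtpdtmfsrc ! mux.sink_1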
Ok, thanks.
In this case there is no chance to mux audio and video into one stream while using one udpsink, is there? I had understood rtpmux to be an element which combines 2 RTP streams (audio and video) into one RTP stream...

Greetings,
Thomas

> -----Original Message-----
> From: [hidden email] [mailto:gstreamer-[hidden email]]
> On behalf of Olivier Crête
> Sent: Tuesday, 8 July 2008 16:08
> To: [hidden email]
> Subject: Re: [gst-devel] RtpMux and RtpDemux
>
> On Tue, 2008-07-08 at 12:47 +0200, Thomas Winkler wrote:
>
> > [schematic snipped]
>
> You're doing it wrong, you should have one RTP session per media type.
>
> Rtp muxer is for one specific case (muxing DTMF events into a regular
> sound stream).
>
> --
> Olivier Crête
> [hidden email]
> Collabora Ltd
On Tue, 2008-07-08 at 17:15 +0200, Thomas Winkler wrote:
> Ok, thanks.
> In this case there is no chance to mux audio and video into one stream
> while using one udpsink, is there?

Well, it's not recommended. You could use MPEG; I think you can mux audio and video into an MPEG stream and then payload that. But the way RTP was designed is to separate audio and video.

> I had understood rtpmux to be an element which combines 2 RTP streams
> (audio and video) into one RTP stream...

Nope, nope, that's not what it was designed for. It only works if both streams use the same clock-rate, etc.

--
Olivier Crête
[hidden email]
Collabora Ltd
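For reference, a minimal sender-side sketch of the "one RTP session per media type" layout (untested; encoders, payloaders, host and ports are placeholders, and gstrtpbin is the 0.10-era session manager):

# Two independent RTP sessions inside one gstrtpbin, one per media type,
# each sent to its own UDP port. All element choices are illustrative only.
gst-launch gstrtpbin name=rtpbin \
    audiotestsrc ! mulawenc ! rtppcmupay ! rtpbin.send_rtp_sink_0 \
    rtpbin.send_rtp_src_0 ! udpsink host=192.168.100.2 port=5000 \
    videotestsrc ! x264enc ! rtph264pay ! rtpbin.send_rtp_sink_1 \
    rtpbin.send_rtp_src_1 ! udpsink host=192.168.100.2 port=5002

The receiver mirrors this: one udpsrc with the matching caps per stream, feeding recv_rtp_sink_0 / recv_rtp_sink_1 of a gstrtpbin, with the depayloaders attached from the pad-added signal.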
In reply to this post by Olivier Crête-2
On Tuesday, 08 July 2008, Olivier Crête wrote:
> You're doing it wrong, you should have one RTP session per media type.

I have also heard such recommendations. But where do they come from, and what is the rationale for them? In multicast it may be of some use, but in point-to-point connections I see nothing wrong with sending 2 streams over one socket (except for the fact that everyone is used to the 'one stream - one socket' approach). In fact I see some benefits in sending two streams over one connection: you could use the congestion control mechanisms of certain protocols, such as DCCP, for the whole transmission rather than separately for each stream. That said, I would be very interested in having the pipeline described by Thomas working.

--
Regards,
Tomasz Grobelny
2008/7/8 Tomasz Grobelny <[hidden email]>:
> On Tuesday, 08 July 2008, Olivier Crête wrote:
>> You're doing it wrong, you should have one RTP session per media type.
>
> I have also heard such recommendations. But where do they come from, and
> what is the rationale for them?

I guess RFC 3550 (RTP), section 5.2 "Multiplexing RTP Sessions", is the place to look at.

--
Damien Lespiau
In reply to this post by Thomas Winkler-5
Seriously, someone capable should do some HOWTOs / tutorials about these RTP/UDP pipelines, in every/most possible combinations. 90% of the messages seem to be about that. ;) Sometimes just reading the API and looking at diagrams doesn't help the not-so-experienced.

From wood and paint it red. ( <- proverb or something.)
Italian Spiderman wrote:
> Seriously, someone capable should do some HOWTOs / tutorials about these
> RTP/UDP pipelines, in every/most possible combinations.
> 90% of the messages seem to be about that. ;)
> Sometimes just reading the API and looking at diagrams doesn't help the
> not-so-experienced.

+1 vote for this. RTP, RTCP, RTSP + an RTP push and pull server and client example would be useful. Even the simplest one would be enough, with a few gst-launch command line examples.

--
Levente

"Si vis pacem para bellum!"
In reply to this post by munky-2
hi,
Italian Spiderman wrote:
> Seriously, someone capable should do some HOWTOs / tutorials about these
> RTP/UDP pipelines, in every/most possible combinations.
> 90% of the messages seem to be about that. ;)
> Sometimes just reading the API and looking at diagrams doesn't help the
> not-so-experienced.

We have an FAQ on the wiki: http://gstreamer.freedesktop.org/wiki/FAQ
Maybe someone can start an RTP/RTSP section.

Stefan
In reply to this post by Olivier Crête-2
Hi Olivier.
Is there a way to mux 2 video streams and send them over RTP on the server side, and receive them on the client with VLC? I know that it works if I dump to a .ts file, but when I send it over RTP it's a different story.

Here is the client command, running on Ubuntu 11.04:

vlc udp://@:5000

Here is my server pipeline:

gst-launch --gst-debug=2 videotestsrc name="alpha" pattern="snow" \
    ! 'video/x-raw-yuv, format=(fourcc)I420, width=240, height=120, framerate=(fraction)25/1' \
    ! x264enc bitrate=512 ! muxer. \
    videotestsrc name="beta" \
    ! 'video/x-raw-yuv, format=(fourcc)I420, width=640, height=480, framerate=(fraction)25/1' \
    ! textoverlay text="Hello and welcome to GStreamer!" valignment=3 \
    ! x264enc bitrate=512 ! muxer. \
    flutsmux name="muxer" ! rtpmp2tpay ! \
    udpsink host=192.168.100.2 port=5000 ts-offset=0

I am getting some warnings saying that the latency cannot be set, and in VLC nothing opens. I've also tried to replace flutsmux with mpegtsmux, but no luck either. I'm pretty sure it's coming from the mux.

If anyone could help with this one, I'd be very grateful, as I've already spent a good while on it.

Thanks a lot,
Biloute
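One possible mismatch worth checking here (a hedged guess, not something tested against this setup): the sender wraps the transport stream in RTP via rtpmp2tpay, while udp://@:5000 tells VLC to expect a raw TS on that port. Two variants that may line up better:

# Variant 1: keep rtpmp2tpay on the sender and open the stream as RTP in VLC
vlc rtp://@:5000

# Variant 2: drop rtpmp2tpay so a raw transport stream goes out over UDP,
# matching the original udp:// URL on the client
... ! mpegtsmux name=muxer ! udpsink host=192.168.100.2 port=5000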