Dear all,
Some months ago I started developing an example to send Vorbis audio and Theora video through RTP in a "standalone" form, without RTSP or any other session control.

I finally succeeded (with the valuable help of the #gstreamer IRC channel) and you can find the implementation here:

http://www.ctr.unican.es/asignaturas/dec/Doc/ogg_audio_video_rtp_sender_receiver.tar.gz

I would propose to have it added to the RTP tests directory.

The main difficulty to overcome was that Vorbis and Theora receivers need some initial content-dependent information which is not available in the incoming RTP payload. Without that information the receivers cannot decode the stream.

NOTE: This happens with some other formats as well, such as H.264, according to what I was told on the IRC channel.

The information is the "configuration" field of the caps GstStructure, which consists of a long base64 blob. This information is typically sent beforehand via a session protocol such as RTSP.

However, I stubbornly decided not to use any RTSP or HTTP server at all.

The solution I came up with is dumping the information to a file that can then be copied to the client by other means. So in order to use my demo program you need to:

1. Start the sender in order to obtain the audio_caps and video_caps files.

2. Transfer the audio_caps and video_caps files by other means (e.g. SSH).

3. Start the client, providing the audio_caps and video_caps files (a sketch of this exchange is shown right after this message).

Since all I wanted was a proof of concept, this solution satisfied my needs.

I hope this will help other GStreamer beginners, and of course I am open to suggestions on how to achieve the same task in an easier way.

Regards,

Miguel Telleria

--
   (O-O)
---oOO-(_)-OOo-----------------------------------------------------
 Miguel TELLERIA DE ESTEBAN             http://www.mtelleria.com
 Email: miguel at mtelleria.com          Tel GSM: +34 650 801098
                                         Tel Fix: +34 942 280174

 Miembro de http://www.linuca.org    Membre du http://www.bxlug.be
 ¿Usuario captivo o libre?  http://www.obtengalinux.org/windows/
 Free or captive user?      http://www.getgnulinux.org/windows/
-------------------------------------------------------------------
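A minimal sketch of the caps-file exchange described above, written against the GStreamer 1.x C API. Element names, the port, the localhost address and the "audio_caps" file name are illustrative assumptions rather than the contents of the linked tarball, and only the Vorbis/audio branch is shown; the Theora branch is analogous. In a real setup the sender and receiver halves run as separate processes on different machines, with the caps file copied between them by hand.

/* Sketch: sender dumps the negotiated payloader caps -- which carry the
 * base64 "configuration" field -- into a file; the receiver loads that
 * file and applies it to udpsrc.  All names/ports here are placeholders. */
#include <gst/gst.h>

static void
dump_pay_caps (GstElement *pipeline, const gchar *payname, const gchar *filename)
{
  GstElement *pay = gst_bin_get_by_name (GST_BIN (pipeline), payname);
  GstPad *srcpad = gst_element_get_static_pad (pay, "src");
  GstCaps *caps = gst_pad_get_current_caps (srcpad);   /* negotiated caps */
  gchar *str = gst_caps_to_string (caps);

  g_file_set_contents (filename, str, -1, NULL);        /* e.g. "audio_caps" */

  g_free (str);
  gst_caps_unref (caps);
  gst_object_unref (srcpad);
  gst_object_unref (pay);
}

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);

  /* Sender: Vorbis over RTP, no RTSP.  The payloader is named "apay" so
   * its caps can be read back once negotiation has happened. */
  GstElement *sender = gst_parse_launch (
      "audiotestsrc ! audioconvert ! vorbisenc ! rtpvorbispay name=apay "
      "! udpsink host=127.0.0.1 port=5000", NULL);
  gst_element_set_state (sender, GST_STATE_PLAYING);
  /* Block until the state change (and hence caps negotiation) completes. */
  gst_element_get_state (sender, NULL, NULL, GST_CLOCK_TIME_NONE);
  dump_pay_caps (sender, "apay", "audio_caps");

  /* Receiver side (normally another process/host): read the caps file
   * copied over by hand and set it on udpsrc so the depayloader gets
   * the configuration blob. */
  gchar *capsstr = NULL;
  g_file_get_contents ("audio_caps", &capsstr, NULL, NULL);
  GstCaps *caps = gst_caps_from_string (capsstr);

  GstElement *receiver = gst_parse_launch (
      "udpsrc name=asrc port=5000 ! rtpvorbisdepay ! vorbisdec "
      "! audioconvert ! autoaudiosink", NULL);
  GstElement *asrc = gst_bin_get_by_name (GST_BIN (receiver), "asrc");
  g_object_set (asrc, "caps", caps, NULL);
  gst_element_set_state (receiver, GST_STATE_PLAYING);

  gst_caps_unref (caps);
  g_free (capsstr);
  g_main_loop_run (g_main_loop_new (NULL, FALSE));
  return 0;
}

The key point is that the payloader's negotiated source caps already contain the configuration field, so serializing them with gst_caps_to_string() and restoring them with gst_caps_from_string() on udpsrc is all the out-of-band signalling the receiver needs.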
On 21.05.2010 18:21, Miguel Telleria de Esteban wrote:
> Dear all,
>
> Some months ago I started developing an example to send Vorbis audio
> and Theora video through RTP in a "standalone" form, without RTSP or
> any other session control.
>
> I finally succeeded (with the valuable help of the #gstreamer IRC
> channel) and you can find the implementation here:
>
> http://www.ctr.unican.es/asignaturas/dec/Doc/ogg_audio_video_rtp_sender_receiver.tar.gz
>
> I would propose to have it added to the RTP tests directory.

Just add it locally and create a patch. File a bug in Bugzilla and add
your patch there.

Thanks,
Stefan

> [rest of the original message snipped]
On Fri, May 21, 2010 at 6:21 PM, Miguel Telleria de Esteban
<[hidden email]> wrote:
>
> NOTE: This happens with some other formats as well, such as H.264,
> according to what I was told on the IRC channel.

This information can usually be carried over a standard RTP stream; it
is the (in)famous distinction between in-band and out-of-band so-called
"codec-data" in GStreamer that makes things a little messy.

For x264enc you should overcome this by using byte-stream=true but,
basically, if the encoder does not support the distinction you must
hack it or rely on a signalling/control protocol to get your
application working.

Regards

> [rest of the original message snipped]
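A hedged illustration of the x264enc hint above, using GStreamer 1.x element and property names; the test source, host and port are placeholders. With byte-stream=true the encoder emits Annex-B output, so the SPS/PPS parameter sets travel inside the RTP stream itself and the receiver only needs the generic application/x-rtp caps on udpsrc, nothing content-dependent.

/* Sketch: H.264 over RTP with in-band parameter sets.  All hosts, ports
 * and the test source are placeholders, not a definitive setup. */
#include <gst/gst.h>

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);

  /* Sender: byte-stream=true makes x264enc emit Annex-B H.264, so SPS/PPS
   * are carried in the stream rather than only in the out-of-band caps. */
  GstElement *sender = gst_parse_launch (
      "videotestsrc ! videoconvert ! x264enc byte-stream=true tune=zerolatency "
      "! rtph264pay ! udpsink host=127.0.0.1 port=5004", NULL);
  gst_element_set_state (sender, GST_STATE_PLAYING);

  /* Receiver: only generic RTP caps on udpsrc, no sprop-parameter-sets. */
  GstElement *receiver = gst_parse_launch (
      "udpsrc port=5004 caps=\"application/x-rtp,media=video,"
      "clock-rate=90000,encoding-name=H264,payload=96\" "
      "! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink", NULL);
  gst_element_set_state (receiver, GST_STATE_PLAYING);

  g_main_loop_run (g_main_loop_new (NULL, FALSE));
  return 0;
}

A receiver that joins late simply waits for the next keyframe carrying SPS/PPS instead of needing any file or signalling exchange.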
On Thu, May 27, 2010 at 9:29 AM, Marco Ballesio <[hidden email]> wrote:
> On Fri, May 21, 2010 at 6:21 PM, Miguel Telleria de Esteban
> <[hidden email]> wrote:
>>
>> NOTE: This happens with some other formats as well, such as H.264,
>> according to what I was told on the IRC channel.
>
> This information can usually be carried over a standard RTP stream; it
> is the (in)famous distinction between in-band and out-of-band so-called
> "codec-data" in GStreamer that makes things a little messy.
>
> For x264enc you should overcome this by using byte-stream=true but,
> basically, if the encoder does not support the distinction you must
> hack it or rely on a signalling/control protocol to get your
> application working.

Thinking again about this: if an encoder always sends the codec-data in
the caps, it should be easy to modify all the payloaders and add an
option to them to convert it from out-of-band to in-band, so as to make
any encoder work in the absence of a signalling/control protocol.

Has anybody already worked on something like this?

Regards

> [rest of the quoted message snipped]
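This is essentially what a config-interval-style property on an RTP payloader provides. A minimal sketch, assuming the Vorbis payloader exposes such a property (rtph264pay does; rtpvorbispay is assumed here to offer the same knob), with a placeholder port and test source:

/* Sketch: configuration headers multiplexed into the RTP stream itself,
 * assuming the payloader supports a "config-interval" property. */
#include <gst/gst.h>

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);

  /* Sender: re-send the Vorbis configuration headers in-band every 2 s. */
  GstElement *sender = gst_parse_launch (
      "audiotestsrc ! audioconvert ! vorbisenc "
      "! rtpvorbispay config-interval=2 ! udpsink host=127.0.0.1 port=5000",
      NULL);
  gst_element_set_state (sender, GST_STATE_PLAYING);

  /* Receiver: no "configuration" field in the caps, only generic RTP caps;
   * the depayloader is expected to pick up the in-band config packets. */
  GstElement *receiver = gst_parse_launch (
      "udpsrc port=5000 caps=\"application/x-rtp,media=audio,"
      "clock-rate=44100,encoding-name=VORBIS\" "
      "! rtpvorbisdepay ! vorbisdec ! audioconvert ! autoaudiosink", NULL);
  gst_element_set_state (receiver, GST_STATE_PLAYING);

  g_main_loop_run (g_main_loop_new (NULL, FALSE));
  return 0;
}

With the configuration resent in-band every couple of seconds, a late-joining receiver just waits for the next config packet instead of relying on an out-of-band caps file.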