receiving and displaying video streams in a non-GTK application

receiving and displaying video streams in a non-GTK application

Paul E. Rybski
Hi,
        I've just recently discovered GStreamer and have been exploring its use
for transmitting audio and video from one Linux machine to another.  So
far I've only been exploring the use of gst-launch, but now I want to try
to receive the video stream in a separate application rather than using
xvimagesink.  This is complicated by the fact that I'm using FLTK
(http://www.fltk.org/) rather than GTK for the GUI.  (The reasons for
using FLTK are essentially historical, and it's currently impractical for
me to consider throwing away my old legacy GUI code and rewriting it
from scratch in GTK.)  I can see two different paths I could try to
follow to achieve my goal:

1) My first option is to try to integrate the GStreamer API directly
into my FLTK application.  I've started to look at the documentation for
how to construct GStreamer pipelines in a C application, but one thing that
currently escapes me is how I get access to the raw uncompressed frames
of video at the end of the pipeline.  The way I understand it, I should
be able to construct my pipeline so that the application receives the video
stream from a socket and decodes it (I'm using smokeenc), but then I'm
completely unclear as to how I might copy the image into a buffer that I
can feed into an FLTK widget for drawing.  I'm also completely unclear
on how easy or difficult it would be to integrate the GTK main event loop
with the FLTK main event loop, as the GStreamer API seems to be heavily
wedded to GTK.  I have no experience programming with GTK at the moment
either.

2) My second option is to keep the client gst-launch command as it
stands now, but instead of piping the video to xvimagesink, I create a
new local socket (or pipe) and shove the frames of video into it
(perhaps encoded as JPEGs), and then have my FLTK application receive the
data from this pipe, decode each JPEG, and display it.  This seems
somewhat easier to achieve because then all I need to do is figure
out how the data is encoded into the socket so I can write the code to
decode it.
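
Concretely, I imagine the client side of option 2 looking something
like this (untested; it assumes a FIFO created beforehand with mkfifo
and uses multipartmux so that the individual JPEGs are delimited in
the stream):

gst-launch-0.10 udpsrc port=5000 ! smokedec ! ffmpegcolorspace ! \
    jpegenc ! multipartmux ! filesink location=/tmp/videofifo

My FLTK application would then open /tmp/videofifo, split the
multipart stream on its boundary markers, and decode each JPEG.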

Any thoughts, advice, or experiences that people could share on this?
I'd kind of like to do the first option because it's conceptually
simpler for the end-user of my system, but I'm concerned that I might
end up needing to rewrite my entire GUI in GTK, which I'd rather not
do at this time.

Here are the gst-launch commands that I'm using right now.

Server:

gst-launch-0.10 -vm oggmux name=mux ! filesink location=movie.ogg v4lsrc
! video/x-raw-yuv,width=320,height=240 ! tee name=t_vnet ! queue !
ffmpegcolorspace ! smokeenc qmin=1 qmax=50 ! udpsink port=5000
host=localhost sync=false t_vnet. ! queue ! videorate !
'video/x-raw-yuv' ! theoraenc ! mux. alsasrc device=hw:0,0 !
audio/x-raw-int,rate=48000,channels=2,depth=16 ! tee name=t_anet ! queue
! audioconvert ! flacenc ! udpsink port=4000 host=localhost sync=false
t_anet. ! queue ! audioconvert ! vorbisenc ! mux.


Client:

gst-launch-0.10 -vm tee name=vid -vm udpsrc port=5000 ! smokedec !
xvimagesink vid. ! tee name=aud udpsrc port=4000 ! flacdec ! audioconvert
! audioresample ! alsasink sync=false aud.


I'm on Ubuntu 8.04 LTS 64-bit using the GStreamer packages that come
with that distro.  I've found that these commands also work for me on
Ubuntu 10.04 LTS 64-bit.

Thanks,

-Paul

--
Paul E. Rybski, Ph.D., Systems Scientist
The Robotics Institute, Carnegie Mellon University
Phone: 412-268-7417, Fax: 412-268-7350
Web: http://www.cs.cmu.edu/~prybski


Re: receiving and displaying video streams in a non-GTK application

Andrey Nechypurenko
Hi Paul,

> 1) My first option is to try to integrate the gstreamer API directly
> into my fltk application.  I've started to look at the documentation for
> how to encode gstreamer pipelines in a C application but one thing that
> currently escapes me is how I get access to the raw uncompressed frames
> of video at the end of the pipeline.  The way I understand it, I should
> be able to encode my pipeline so that the application receives the video
> stream from a socket and decodes it (I'm using smokeenc) but then I'm
> completely unclear as to how I might copy the image into a buffer that I
> can feed into an FLTK widget for drawing.

I would suggest taking a look at the appsink [1,2] and fakesink [3]
elements. appsink is probably the preferred way; however, fakesink
could be used as well via its "handoff" signal.
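
Here is a rough, untested sketch of what pulling decoded frames with
appsink could look like in 0.10. The caps and element names are taken
from your smoke pipeline; everything else is illustrative:

#include <gst/gst.h>
#include <gst/app/gstappsink.h>

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  /* Decode the UDP smoke stream and convert to packed RGB so the
   * frames can be handed to an FLTK widget as-is. */
  GError *err = NULL;
  GstElement *pipeline = gst_parse_launch (
      "udpsrc port=5000 ! smokedec ! ffmpegcolorspace ! "
      "video/x-raw-rgb,bpp=24,depth=24 ! appsink name=sink", &err);
  if (pipeline == NULL) {
    g_printerr ("parse error: %s\n", err->message);
    return 1;
  }

  GstElement *sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  for (;;) {
    /* Blocks until the next decoded frame arrives; NULL on EOS. */
    GstBuffer *buf = gst_app_sink_pull_buffer (GST_APP_SINK (sink));
    if (buf == NULL)
      break;
    guint8 *rgb = GST_BUFFER_DATA (buf);  /* packed 24-bit RGB */
    guint size = GST_BUFFER_SIZE (buf);
    /* ...copy rgb/size into your FLTK-side frame buffer here... */
    gst_buffer_unref (buf);
  }

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (sink);
  gst_object_unref (pipeline);
  return 0;
}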

> I'm also completely unclear
> how easy or difficult it would be to integrate the GTK main event loop
> with the FLTK main event loop as the gstreamer API seems to be heavily
> wedded to GTK.  I have no experience programming with GTK at the moment
> either.

I think the simplest way would be to run the GStreamer (GLib) event
loop in a separate thread. Using appsink or fakesink as mentioned
above, you will get access to the raw frames. You will then need a
thread-safe mechanism to pass the raw buffers from the GStreamer
thread to your UI thread. For example, there is a set of examples on
how to integrate Qt with GStreamer [4] where a similar technique is
used. In particular, qglwtextureshare [5] shows how to run the
GStreamer event loop in a separate thread and interact with the Qt
GUI (note that this example uses a GL texture sharing mechanism
instead of passing raw buffers through memory). In addition, this
example illustrates how to construct the pipeline in essentially the
same way as with gst-launch, using the gst_parse_launch() function.
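
For illustration, an untested sketch of such a mechanism using only
GLib primitives (the function names are made up; on the FLTK side you
could call poll_frame() from an Fl::add_timeout() callback, for
example):

#include <gst/gst.h>

/* Queue carrying decoded frames from the gstreamer thread to the UI
 * thread. gst_init() must have been called already so that GLib
 * threading is initialized. */
static GAsyncQueue *frame_queue = NULL;

/* Runs the GLib main loop so bus messages etc. are serviced,
 * independently of the FLTK event loop. */
static gpointer
gst_thread_func (gpointer data)
{
  GMainLoop *loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (loop);
  return NULL;
}

/* Call from the gstreamer side (appsink loop or fakesink handoff)
 * for every decoded frame. */
void
push_frame (GstBuffer *buf)
{
  g_async_queue_push (frame_queue, gst_buffer_ref (buf));
}

/* Call from the UI thread; returns NULL when no new frame is
 * pending. The caller must unref the returned buffer. */
GstBuffer *
poll_frame (void)
{
  return (GstBuffer *) g_async_queue_try_pop (frame_queue);
}

void
start_gst_thread (void)
{
  frame_queue = g_async_queue_new ();
  g_thread_create (gst_thread_func, NULL, FALSE, NULL);
}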

> 2) My second option is to keep the client gst-launch command as it
> stands now but instead of piping the video to xvimagesink, I create a
> new local socket (or pipe)

I personally would not suggest this approach, because it is more
complex than the first one.

> Any thoughts, advice, or experiences that people could share with this?

As I understand it, you are working on robotics and remotely
controlled vehicles, so it might be interesting for you to take a
look at this project [6,7]. There you can find a complete example of
how to control a vehicle over wireless/Internet. In particular, there
is GStreamer-based video capture and encoding from an on-board
camera, networking infrastructure to stream video/sensor data to the
driver cockpit and to transmit control signals back to the vehicle
using the Ice middleware [8], and an SDL/OpenGL-based UI that
displays live video with hardware acceleration using the
appsink/fakesink/thread approach I mentioned above.

Regards,
Andrey.

[1] http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-appsink.html
[2] http://cgit.freedesktop.org/gstreamer/gst-plugins-base/tree/tests/examples/app?id=e17b42181c2cbcc389f87a35539f7a1b07d3dd54
[3] http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-plugins/html/gstreamer-plugins-fakesink.html
[4] http://cgit.freedesktop.org/gstreamer/gst-plugins-gl/tree/tests/examples/qt?id=fab824ea01f43c3fecaa2fed5e9e828774db5b24
[5] http://cgit.freedesktop.org/gstreamer/gst-plugins-gl/tree/tests/examples/qt/qglwtextureshare?id=fab824ea01f43c3fecaa2fed5e9e828774db5b24
[6] http://www.gitorious.org/veter/pages/Home
[7] http://veter-project.blogspot.com/
[8] http://www.zeroc.com


Re: receiving and displaying video streams in a non-GTK application

Stefan Sauer
On 29.09.2010 17:21, Paul E. Rybski wrote:

> Hi,
> I've just recently discovered gstreamer and have been exploring its use
> for transmitting audio and video from one linux machine to another.  So
> far I've only been exploring the use of gst-launch but now want to try
> to receive the video stream in a separate application rather than using
> xvimagesink.  This is complicated by the fact that I'm using FLTK
> (http://www.fltk.org/) rather than GTK for the GUI.  (The reasons for
> using FLTK are essentially historical and it's currently impractical for
> me to consider throwing away my old legacy GUI code and rewriting it
> from scratch in GTK.)  I can see two different paths that I can try to
> follow to achieve my goal:
>
> 1) My first option is to try to integrate the gstreamer API directly
> into my fltk application.  I've started to look at the documentation for
> how to encode gstreamer pipelines in a C application but one thing that
> currently escapes me is how I get access to the raw uncompressed frames
> of video at the end of the pipeline.  The way I understand it, I should
> be able to encode my pipeline so that the application receives the video
> stream from a socket and decodes it (I'm using smokeenc) but then I'm
> completely unclear as to how I might copy the image into a buffer that I
> can feed into an FLTK widget for drawing.  I'm also completely unclear
> how easy or difficult it would be to integrate the GTK main event loop
> with the FLTK main event loop as the gstreamer API seems to be heavily
> wedded to GTK.  I have no experience programming with GTK at the moment
> either.
>  
If there is a drawable widget in FLTK that is backed by an X window,
then you should be able to use the xoverlay interface just fine. A
quick Google search turned up this:
http://www.fltk.org/doc-1.0/osissues.html


        Window fl_xid(const Fl_Window *)
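
Untested, but something along these lines should do it (assuming your
video sink is xvimagesink and you link against
gstreamer-interfaces-0.10; the function name is just illustrative):

#include <FL/Fl_Window.H>
#include <FL/x.H>                       // fl_xid()
#include <gst/gst.h>
#include <gst/interfaces/xoverlay.h>

void
attach_video_to_window (GstElement *xvsink, Fl_Window *win)
{
  /* The window must be shown (realized) before it has an X window
   * id to hand over. */
  win->show ();
  /* Tell xvimagesink to render straight into the FLTK window --
   * no copying of frames through application memory at all. */
  gst_x_overlay_set_xwindow_id (GST_X_OVERLAY (xvsink), fl_xid (win));
}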

Stefan
