Need help using gst-launch to display a PNG image, Part 3+

Need help using gst-launch to display a PNG image, Part 3+

Bill West
Hi All,

I followed Tim's advice on using gst-launch to display a PNG file, and Luciana's fix for getting my PNG to "mix" with a video stream (thanks again to you both), but doing that, I discovered that the PNG file, which has a transparent alpha component, is showing opaque, so the video never shows through.  (I know it's there, though, because if I draw the video on top of the PNG instead of underneath it, it shows fine.)

The command I'm using is:

gst-launch-0.10 filesrc location=overlay.png ! pngdec ! ffmpegcolorspace ! videoscale ! imagefreeze ! videomixer name=mix ! autovideosink  videotestsrc ! video/x-raw-yuv, width=320, height=240 ! mix.

Looking at the documentation, the only possible solution I could find is an element called "alphacolor", but no matter where I place it in the pipeline it doesn't work, so I'm stumped again.

So, that's my main question:  how do I get a PNG file WITH transparent sections to properly render?

And, if I could ask a few more questions, I'd appreciate it (my boss says I'm taking too long at this :)...

2.  We have ALSA audio, and I see there's an element called "alsasink", though I couldn't find much documentation on it.  Is there a way with alsasink to specify the output channel?  (Our card has 8, so 4 stereo pairs.)

3.  Is there a way to have the other PNG images fade in and fade out of the window to/from black?

4.  Is there a way to loop between two particular frames on a video?

5.  Is videobox the best way to position my video behind the overlay PNG?  I sort of got it to work, but there were artifacts: everything to the left of and above the video is black.

6.  I thought someone must have written a translator that takes a gst-launch command line and converts it to the equivalent C code, but couldn't find one.  Did I just not look in the right place?

Sorry for the pile of questions, but as I say, the heat is on, and these are all problems I'd better solve soon, or I'm back to using straight X11 and OpenGL, and I guess libxine to pull apart my video into frames.

Thanks, thanks, thanks!

Bill.


Re: Need help using gst-launch to display a PNG image, Part 3+

Tim-Philipp Müller
On Thu, 2010-08-12 at 15:56 -0700, Bill West wrote:

> The command I'm using is:
> gst-launch-0.10 filesrc location=overlay.png ! pngdec !
> ffmpegcolorspace ! videoscale ! imagefreeze ! videomixer name=mix !
> autovideosink  videotestsrc ! video/x-raw-yuv, width=320, height=240 !
> mix.
>

> Looking at the documentation, the only possible solution I could find
> is an element called "alphacolor", but nowhere placed in the pipeline
> does it work, so I'm stumped again.

Try the 'alpha' element.

Something like this works for me:

$ gst-launch-0.10 filesrc location=foo.png ! pngdec ! alpha alpha=0.5 !
videoscale ! imagefreeze ! videomixer name=mix ! ffmpegcolorspace !
autovideosink    videotestsrc ! alpha alpha=0.5 ! mix.

or if the png already has an alpha channel:

gst-launch-0.10 filesrc location=alpha.png ! pngdec ! videoscale !
imagefreeze ! videomixer name=mix ! ffmpegcolorspace ! autovideosink
videotestsrc ! alpha alpha=0.5 ! video/x-raw-rgb,bpp=32,depth=32 ! mix.


>
> 2.  we have ALSA audio, and I see there's an element called
> "alsasink", though I couldn't find much documentation on it.  Is there
> a way with alsasink to specify the output channel?  (our card has 8,
> so 4 stereo pairs).

No. You can specify an output device, though, using the "device" property.
And you can probably do some .asoundrc magic to route stuff to the right
outputs (i.e. expose the 4 stereo pairs as 4 stereo devices or 8 mono
devices or whatever).
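
For example, a completely untested .asoundrc sketch (the card and channel
numbers here are just assumptions) that exposes channels 0+1 of an 8-channel
card hw:0 as a separate stereo device "pair0":

  pcm.pair0 {
      type plug
      slave {
          pcm "hw:0"
          channels 8
      }
      ttable.0.0 1    # client left  -> card channel 0
      ttable.1.1 1    # client right -> card channel 1
  }

which you could then pick with alsasink device=pair0 (and define pair1..pair3
the same way with ttable entries for channels 2+3, 4+5, 6+7).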

>
> 3.  Is there a way to have the other PNG images fade in and fade out
> of the window to/from black?

Yes, but for this it would be best to write some code (python script or
C or whatever).

The "alpha" property of the alpha element is controllable (see
GstController interface), and videomixer's pads also have properties
like "xpos", "ypos", "zorder" and "alpha", which are also controllable
and can be changed at runtime.
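
Untested sketch of a two-second fade-in with the 0.10 controller API. It
assumes the overlay branch has an alpha element placed after imagefreeze (so
the changing value is applied to every repeated frame) and that "fade" is that
element, pulled out of the pipeline by name as described for question 6 below:

  /* link against gstreamer-controller-0.10, include
   * <gst/controller/gstcontroller.h>, and call
   * gst_controller_init (NULL, NULL) once after gst_init () */
  GstController *ctrl;
  GstInterpolationControlSource *cs;
  GValue val = { 0, };

  ctrl = gst_controller_new (G_OBJECT (fade), "alpha", NULL);
  cs = gst_interpolation_control_source_new ();
  gst_controller_set_control_source (ctrl, "alpha", GST_CONTROL_SOURCE (cs));
  gst_interpolation_control_source_set_interpolation_mode (cs, GST_INTERPOLATE_LINEAR);

  g_value_init (&val, G_TYPE_DOUBLE);
  g_value_set_double (&val, 0.0);                                 /* transparent at t=0 */
  gst_interpolation_control_source_set (cs, 0, &val);
  g_value_set_double (&val, 1.0);                                 /* opaque at t=2s */
  gst_interpolation_control_source_set (cs, 2 * GST_SECOND, &val);

For a fade back out you would add further control points going back to 0.0,
or control the "alpha" property on the videomixer pad in the same way.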


> 4.  Is there a way to loop between two particular frames on a video?
>
> 5.  Is videobox the best way to position my video behind the overlay
> PNG?  I got it to work kind of, but there were artifacts, like
> everything to the left and above the video is black.

videobox? No, just use videomixer and set xpos/ypos/zorder/alpha via the
pad properties.
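
In code that would look roughly like this (sketch; assumes the pipeline came
from gst_parse_launch() as below, the mixer is named "mix", and the video
branch ended up on its sink_1 pad):

  GstElement *mix = gst_bin_get_by_name (GST_BIN (pipeline), "mix");
  GstPad *pad = gst_element_get_static_pad (mix, "sink_1");

  /* put the video at (160,120), at the bottom of the stack, behind the overlay */
  g_object_set (pad, "xpos", 160, "ypos", 120, "zorder", 0, NULL);

  gst_object_unref (pad);
  gst_object_unref (mix);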

>
> 6.  I thought someone must have written a translator that takes a
> gst-launch command line and converts it to the equivalent C code, but
> couldn't find one.  Did I just not look in the right place?

There isn't one, and it's probably not a good idea either (RIP Glade
code generator).

You can use gst_parse_launch() from C code though (give elements a name
with name=foo and then use gst_bin_get_by_name() on the pipeline to
retrieve the elements by name to manipulate them).
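
For example (rough sketch, reusing the second pipeline from above):

  GError *err = NULL;
  GstElement *pipeline, *mix;

  pipeline = gst_parse_launch (
      "filesrc location=alpha.png ! pngdec ! videoscale ! imagefreeze ! "
      "videomixer name=mix ! ffmpegcolorspace ! autovideosink  "
      "videotestsrc ! alpha alpha=0.5 ! video/x-raw-rgb,bpp=32,depth=32 ! mix.",
      &err);
  if (pipeline == NULL)
    g_error ("parse error: %s", err->message);

  mix = gst_bin_get_by_name (GST_BIN (pipeline), "mix");
  /* ... set properties on "mix" or its pads here ... */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);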

Cheers
 -Tim





Re: Need help using gst-launch to display a PNG image, Part 3+

Donald Poole
Tim-Philipp Müller <t.i.m <at> zen.co.uk> writes:

> [Tim's reply quoted in full; snipped, see above]

Hello All,

I have been reading this thread for the last couple of days because I am trying
to do the same thing that Bill is doing.  But instead of overlaying the PNG image
on the videotestsrc, I'm trying to overlay it on a live video stream from a
network video encoder (an AXIS M7001, to be specific).  It's an MJPEG HTTP stream
using SOUP.  I first used this pipeline to make sure I could get something
working:

$ gst-launch filesrc location=images/cross_hair.png ! pngdec ! videoscale !
imagefreeze ! ffmpegcolorspace ! videomixer name=mix ! ffmpegcolorspace !
xvimagesink videotestsrc ! alpha alpha=1 ! mix.

I had to use another ffmpegcolorspace to get it to work, but it displays the PNG
image overlaid on the videotestsrc with the alpha blending.  Next, I tried
replacing the videotestsrc with souphttpsrc so that I could get the video stream
to show underneath the PNG image.  So I tried this pipeline, but received the
following errors:

$ gst-launch filesrc location=images/cross_hair.png ! pngdec ! videoscale !
imagefreeze ! ffmpegcolorspace ! videomixer name=mix ! ffmpegcolorspace !
xvimagesink souphttpsrc location=http://10.200.30.30/axis-cgi/mjpg/video.cgi !
alpha alpha=1 ! mix.
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstAlpha:alpha0: not negotiated
Additional debug info:
gstbasetransform.c(2073): gst_base_transform_handle_buffer ():
/GstPipeline:pipeline0/GstAlpha:alpha0:
not negotiated
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...

I'm fairly new to GStreamer, so these error messages still look rather arcane to
me.  Is there something I'm not understanding about getting the PNG to overlay
on my live video the way it does on the videotestsrc?  I thank everyone in
advance for any information and advice.

Thanks,
Donald

