Problem using gnlcomposition


Problem using gnlcomposition

Hardeep Singh
Hi

I am new to GStreamer development (and also Python!) and beginning to like it. I am trying to merge two video files using gnonlin components. The problem is a very common one: concatenating two video files together.

Gnlcomposition does that perfectly when test sources are used with gnlsource as the data source for gnlcomposition. However, if we use two file sources (either via gnlsource or gnlfilesource), the output video consists only of the second file. It looks like some problem with gnlcomposition's handling of multiple file sources. I used the latest gnonlin release from this month. Is there a fix or workaround for this? Or am I doing something wrong?

Here is the code for merging the files.

import sys
import pygst
pygst.require("0.10")
import gst
import gobject

class Merger:

    #Constructor
    def __init__(self):
        '''
        Constructor
        '''
        self.mainloop = gobject.MainLoop()
        self.player = gst.Pipeline("player")
       
        self.conv = gst.element_factory_make ("ffmpegcolorspace","ffmpeg-colorspace")
        mpeg4Encoder = gst.element_factory_make ("ffenc_mpeg4","encoder")
        mpeg4Mux = gst.element_factory_make ("ffmux_mp4","muxer")
        #imageSink = gst.element_factory_make ("xvimagesink","imageSink")
       
        sink = gst.element_factory_make ("filesink","sink")
        sink.set_property("location","/home/developer/merge.mp4")
       
        comp = gst.element_factory_make("gnlcomposition", "mycomposition")
        gnlfilesource1 = gst.element_factory_make("gnlfilesource", "video1")
        gnlfilesource2 = gst.element_factory_make("gnlfilesource", "video2")                      

        gnlfilesource1.set_property("location", sys.argv[1])
        gnlfilesource1.set_property("start", 0 * gst.SECOND)
        gnlfilesource1.set_property("duration", 6 * gst.SECOND)
        gnlfilesource1.set_property("media-start", 0 * gst.SECOND)
        gnlfilesource1.set_property("media-duration", 3 * gst.SECOND)
        gnlfilesource1.set_property('caps', gst.caps_from_string('video/x-raw-yuv,width=190,height=240,framerate=30/1'))
           
        gnlfilesource2.set_property("location", sys.argv[2])
        gnlfilesource2.set_property("start", 3 * gst.SECOND)
        gnlfilesource2.set_property("duration", 6 * gst.SECOND)
        gnlfilesource2.set_property("media-start", 0 * gst.SECOND)
        gnlfilesource2.set_property("media-duration", 3 * gst.SECOND)
        gnlfilesource2.set_property('caps', gst.caps_from_string('video/x-raw-yuv,width=190,height=240,framerate=30/1'))
       
        comp.add(gnlfilesource1,gnlfilesource2)

        #put all elements in a bin
        self.player.add(comp, self.conv, mpeg4Encoder, mpeg4Mux, sink)
        gst.element_link_many(self.conv, mpeg4Encoder, mpeg4Mux, sink)

        bus = self.player.get_bus()
        bus.add_signal_watch ()
        bus.connect("message",self.onMessage)

        comp.connect("pad-added", self.onDynamicPad)
   

    #functions
    def run(self):
        # Now set to playing and iterate.
        print "Setting to Playing"
        self.player.set_state(gst.STATE_PLAYING)
        self.mainloop.run()
        #clean up
        print "Returned, stopping playback"
        self.player.set_state (gst.STATE_NULL)
        print "Deleting pipeline\n"
   
    def onMessage(self,bus, message):
        t = message.type
        if t == gst.MESSAGE_EOS:
            #print "Message EOS"
            self.mainloop.quit()
        elif t == gst.MESSAGE_ERROR:
            err, debug = message.parse_error()
            print "Error: %s" % err, debug
            self.mainloop.quit()
           
    def onDynamicPad(self,compbin,pad):
        print "onDynamicPad called"
        convSinkPad = self.conv.get_compatible_pad(pad, pad.get_caps())
        pad.link(convSinkPad)


if __name__ == '__main__':
    # usage: python merge.py <first-video> <second-video>
    merger = Merger()
    merger.run()



Thanks



Re: Problem using gnlcomposition

Edward Hervey
Administrator
Hi,

On Thu, 2010-03-25 at 05:33 +0530, Hardeep Singh wrote:

> Hi
>
> I am new to gstreamer development( and also Python!) and beginning to
>  like it. I am trying to merge two video files using gnonlin
>  components. The problem is a very common one, concatenating two video
>  files together.
>
> Gnlcomposition does that perfectly when test sources are used with
>  gnlsource as data source for gnlcomposition. However if we use two
>  file sources( either via gnlsource or gnlfilesource), the output video
>  consists only of the second file. It looks like some problem with
>  handling multiple file sources by gnlcomposition. I used the latest
>  gnonlin libraries release this month. Is there a fix or work around
>  for this? Or I am doing something wrong?
>
> Here is the code for merging the files.
>
> import sys
> import pygst
> import gst
> import gobject
>
> class Merger:
>
>     #Constructor
>     def __init__(self):
>         '''
>         Constructor
>         '''
>         self.mainloop = gobject.MainLoop()
>         self.player = gst.Pipeline("player")
>        
>         self.conv = gst.element_factory_make ("ffmpegcolorspace","ffmpeg-colorspace")
>         mpeg4Encoder = gst.element_factory_make ("ffenc_mpeg4","encoder")
>         mpeg4Mux = gst.element_factory_make ("ffmux_mp4","muxer")
>         #imageSink = gst.element_factory_make ("xvimagesink","imageSink")
>        
>         sink = gst.element_factory_make ("filesink","sink")
>         sink.set_property("location","/home/developer/merge.mp4")
>        
>         comp = gst.element_factory_make("gnlcomposition", "mycomposition")
>         gnlfilesource1 = gst.element_factory_make("gnlfilesource", "video1")
>         gnlfilesource2 = gst.element_factory_make("gnlfilesource", "video2")                      
>
>         gnlfilesource1.set_property("location", sys.argv[1])
>         gnlfilesource1.set_property("start", 0 * gst.SECOND)
>         gnlfilesource1.set_property("duration", 6 * gst.SECOND)
>         gnlfilesource1.set_property("media-start", 0 * gst.SECOND)
>         gnlfilesource1.set_property("media-duration", 3 * gst.SECOND)

 You've set a media-duration which is different from the duration.
You're attempting to play back 3 seconds of media from your file over 6
seconds (if you connect a video sink to the output of the composition
you will see it playing back slower).

 Unless you *want* that behaviour, you should set duration and
media-duration to the same values.
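 For instance, assuming the same gnlfilesource1 object from the code above
and that the first clip should occupy 3 seconds of the timeline, that would
be something like:

        gnlfilesource1.set_property("duration", 3 * gst.SECOND)
        gnlfilesource1.set_property("media-duration", 3 * gst.SECOND)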


>         gnlfilesource1.set_property('caps', gst.caps_from_string('video/x-raw-yuv,width=190,height=240,framerate=30/1'))

 Unless your file contains multiple video streams, you don't need to be
that specific about the stream. You can just use 'video/x-raw-yuv'.
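 For example, with the same source object as above, the caps could simply be:

        gnlfilesource1.set_property('caps', gst.caps_from_string('video/x-raw-yuv'))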

>            
>         gnlfilesource2.set_property("location", sys.argv[2])
>         gnlfilesource2.set_property("start", 3 * gst.SECOND)

 And here is your problem :)

 You asked the first clip to be positioned from 0 (start) to 6s
(start+duration) ... but then you put the second clip to be played from
3s (start) to 9s (start+duration).

 Since the clips are overlapping and have the same priority, this is
causing total havoc.
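 A non-overlapping layout, assuming each clip should contribute 3 seconds
and the second one should start right where the first one ends, would look
roughly like:

        gnlfilesource1.set_property("start", 0 * gst.SECOND)
        gnlfilesource1.set_property("duration", 3 * gst.SECOND)

        gnlfilesource2.set_property("start", 3 * gst.SECOND)
        gnlfilesource2.set_property("duration", 3 * gst.SECOND)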

>         gnlfilesource2.set_property("duration", 6 * gst.SECOND)
>         gnlfilesource2.set_property("media-start", 0 * gst.SECOND)
>         gnlfilesource2.set_property("media-duration", 3 * gst.SECOND)
>         gnlfilesource2.set_property('caps', gst.caps_from_string('video/x-raw-yuv,width=190,height=240,framerate=30/1'))
>        
>         comp.add(gnlfilesource1,gnlfilesource2)
>
>         #put all elements in a bin
>         self.player.add(comp, self.conv, mpeg4Encoder, mpeg4Mux, sink)
>         gst.element_link_many(self.conv, mpeg4Encoder, mpeg4Mux, sink)
>
>         bus = self.player.get_bus()
>         bus.add_signal_watch ()
>         bus.connect("message",self.onMessage)
>
>         comp.connect("pad-added", self.onDynamicPad)
>    
>
>     #functions
>     def run(self):
>         # Now set to playing and iterate.
>         print "Setting to Playing"
>         self.player.set_state(gst.STATE_PLAYING)
>         self.mainloop.run()
>         #clean up
>         print "Returned, stopping playback"
>         self.player.set_state (gst.STATE_NULL)
>         print "Deleting pipeline\n"
>    
>     def onMessage(self,bus, message):
>         t = message.type
>         if t == gst.MESSAGE_EOS:
>             #print "Message EOS"
>             self.mainloop.quit()
>         elif t == gst.MESSAGE_ERROR:
>             err, debug = message.parse_error()
>             print "Error: %s" % err, debug
>             self.mainloop.quit()
>            
>     def onDynamicPad(self,compbin,pad):
>         print "onDynamicPad called"
>         convSinkPad = self.conv.get_compatible_pad(pad, pad.get_caps())
>         pad.link(convSinkPad)

  The rest of your code seems correct.

  So to sum up:

  * Make sure you set duration and media-duration to the same values,
unless you want to change the playback rate.

  * Make sure clips with the same priority don't overlap in time.


  Tell us if that fixes your issues,

    Edward


Re: Problem using gnlcomposition

Hardeep Singh
Hi,



Thanks for your reply.

I made the changes listed below, but the problem remains when writing the
encoded file; however, I can see the merged video on screen using the
xvimagesink plugin.

Changes:
1) Kept duration and media-duration the same, i.e. 3 seconds.
2) Introduced a gap of 1 second between the end of the first stream and the
start of the second stream, so the second stream now starts at 4 seconds
(also tried with 3 seconds).
3) Cleaned up the caps filter.
4) Commented out the priority-setting code so the sources use the default
priority, which would be the same anyway (also tried specifying the same
priority explicitly).

Observations:
1) If I replace the ffmpeg encoding/writing to file with xvimagesink, then
the merging of the video works. It seems like the problem is with encoding
the resulting stream.
2) Even in this case, the gap of 1 second or more between the start times
of the streams does not appear as black (or some other default image). I am
not sure if this is even the expected behaviour by design.
 
Here is the GST_DEBUG level-2 log. The log starts at the end of decoding and encoding the first file and then switches to the second file. I have left out the similar repeated encoding-error messages and included the end of the log. Any ideas what can be done? I am not sure this is a gnlcomposition issue. Should gnlcomposition be supplying proper timestamps in the composed video? I didn't find a fix by searching Google. Please suggest.


:00:01.256419819  5860  0xa64c400 ERROR                 ffmpeg :0:: header damaged
0:00:01.256516309  5860  0xa64c400 WARN                  ffmpeg gstffmpegdec.c:2169:gst_ffmpegdec_frame:<ffdec_mpeg40> ffdec_mpeg4: decoding error (len: -1, have_data: 0)
0:00:01.268266134  5860  0xa64c400 ERROR                 ffmpeg :0:: header damaged
0:00:01.268358391  5860  0xa64c400 WARN                  ffmpeg gstffmpegdec.c:2169:gst_ffmpegdec_frame:<ffdec_mpeg40> ffdec_mpeg4: decoding error (len: -1, have_data: 0)
0:00:01.280180810  5860  0xa64c400 ERROR                 ffmpeg :0:: header damaged
0:00:01.280321670  5860  0xa64c400 WARN                  ffmpeg gstffmpegdec.c:2169:gst_ffmpegdec_frame:<ffdec_mpeg40> ffdec_mpeg4: decoding error (len: -1, have_data: 0)
0:00:01.288291885  5860  0xa42eb40 WARN               gnlsource gnlsource.c:545:gnl_source_change_state:<video2> Couldn't find a valid source pad
0:00:01.294969099  5860  0xa42eb40 WARN          GST_SCHEDULING gstpad.c:4603:gst_pad_get_range:<source:src> getrange failed unexpected
0:00:01.295057946  5860  0xa42eb40 WARN          GST_SCHEDULING gstpad.c:4715:gst_pad_pull_range:<decodebin21:sink> pullrange failed unexpected
0:00:01.295112804  5860  0xa42eb40 WARN          GST_SCHEDULING gstpad.c:4603:gst_pad_get_range:<source:src> getrange failed unexpected
0:00:01.295141566  5860  0xa42eb40 WARN          GST_SCHEDULING gstpad.c:4715:gst_pad_pull_range:<decodebin21:sink> pullrange failed unexpected
0:00:01.300732647  5860  0xa5f0908 WARN                 qtdemux qtdemux.c:4399:qtdemux_parse_trak:<qtdemux1> unknown version 00000000
0:00:01.304895239  5860  0xa5f0908 WARN               gnlsource gnlsource.c:221:element_pad_added_cb:<video2> We already have (pending) ghost-ed a valid source pad (ghostpad:'':'', pendingblock:1
0:00:01.306105978  5860  0xa8087d8 WARN                    faad gstfaad.c:345:gst_faad_setcaps:<faad1> buggy faad version, wrong nr of channels 2 instead of 1
0:00:01.315703657  5860 0xb6cade48 ERROR                 ffmpeg :0:: header damaged
0:00:01.315852115  5860 0xb6cade48 WARN                  ffmpeg gstffmpegdec.c:2169:gst_ffmpegdec_frame:<ffdec_mpeg41> ffdec_mpeg4: decoding error (len: -1, have_data: 0)
0:00:01.319513949  5860 0xb6cade48 ERROR                 ffmpeg :0:: Error, Invalid timestamp=0, last=90
0:00:01.319609271  5860 0xb6cade48 ERROR                 ffmpeg gstffmpegenc.c:685:gst_ffmpegenc_chain_video:<encoder> ffenc_mpeg4: failed to encode buffer
0:00:01.319662395  5860 0xb6cade48 ERROR                 ffmpeg :0:: header damaged
0:00:01.319689864  5860 0xb6cade48 WARN                  ffmpeg gstffmpegdec.c:2169:gst_ffmpegdec_frame:<ffdec_mpeg41> ffdec_mpeg4: decoding error (len: -1, have_data: 0)
0:00:01.321465267  5860 0xb6cade48 ERROR                 ffmpeg :0:: Error, Invalid timestamp=1, last=90
0:00:01.321559781  5860 0xb6cade48 ERROR                 ffmpeg gstffmpegenc.c:685:gst_ffmpegenc_chain_video:<encoder> ffenc_mpeg4: failed to encode buffer
0:00:01.321608665  5860 0xb6cade48 ERROR                 ffmpeg :0:: header damaged
0:00:01.321636356  5860 0xb6cade48 WARN                  ffmpeg gstffmpegdec.c:2169:gst_ffmpegdec_frame:<ffdec_mpeg41> ffdec_mpeg4: decoding error (len: -1, have_data: 0)
0:00:01.323454629  5860 0xb6cade48 ERROR                 ffmpeg :0:: Error, Invalid timestamp=2, last=90
0:00:01.323540002  5860 0xb6cade48 ERROR                 ffmpeg gstffmpegenc.c:685:gst_ffmpegenc_chain_video:<encoder> ffenc_mpeg4: failed to encode buffer
0:00:01.323584935  5860 0xb6cade48 ERROR                 ffmpeg :0:: header damaged
0:00:01.323611885  5860 0xb6cade48 WARN                  ffmpeg gstffmpegdec.c:2169:gst_ffmpegdec_frame:<ffdec_mpeg41> ffdec_mpeg4: decoding error (len: -1, have_data: 0)
0:00:01.324909930  5860 0xb6cade48 ERROR                 ffmpeg :0:: Error, Invalid timestamp=3, last=90
0:00:01.325040951  5860 0xb6cade48 ERROR                 ffmpeg gstffmpegenc.c:685:gst_ffmpegenc_chain_video:<encoder> ffenc_mpeg4: failed to encode buffer
...
...
...
0:00:01.542786038  5860 0xb6cade48 ERROR                 ffmpeg :0:: Error, Invalid timestamp=89, last=90
0:00:01.542918119  5860 0xb6cade48 ERROR                 ffmpeg gstffmpegenc.c:685:gst_ffmpegenc_chain_video:<encoder> ffenc_mpeg4: failed to encode buffer
0:00:01.542990577  5860 0xb6cade48 ERROR                 ffmpeg :0:: header damaged
0:00:01.543026786  5860 0xb6cade48 WARN                  ffmpeg gstffmpegdec.c:2169:gst_ffmpegdec_frame:<ffdec_mpeg41> ffdec_mpeg4: decoding error (len: -1, have_data: 0)
0:00:01.543141192  5860 0xb6cade48 ERROR                 ffmpeg :0:: Error, Invalid timestamp=90, last=90
0:00:01.543182888  5860 0xb6cade48 ERROR                 ffmpeg gstffmpegenc.c:685:gst_ffmpegenc_chain_video:<encoder> ffenc_mpeg4: failed to encode buffer

Thanks


Re: Problem using gnlcomposition

Edward Hervey
Administrator
On Thu, 2010-03-25 at 23:30 +0530, Hardeep Singh wrote:
> Hi,
>
>
> Please see reply below.
>
[...]
> Hi,
>
> Thanks for your reply.
>
> I did the changes mentioned below, but the problem remains in writing
> the encoded file, however i can see the merged video on screen using
> xvimagesink plugin.

  Perfect, that means your code is fine from a gnonlin point of view.

  The problem you're facing is that most elements don't know how to
handle segments. GstBaseSink (and any implementations like xvimagesink)
*do* know how to handle that properly. Basically the encoder is choking
on that.

  You can solve that by inserting, just after your gnlcomposition, an
identity element with the 'single-segment' property set to True. This
will effectively:
  * Consume all incoming segments and only output one
  * Change the timestamps of the buffers to running time, making them
appear as a continuous stream of data.
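  Applied to the pipeline from the original post, a rough sketch (the
'ident' element name is just illustrative) would be:

        # identity with single-segment=true collapses the composition's
        # segments into one and retimestamps buffers to running time
        self.ident = gst.element_factory_make("identity", "ident")
        self.ident.set_property("single-segment", True)

        self.player.add(comp, self.ident, self.conv, mpeg4Encoder, mpeg4Mux, sink)
        gst.element_link_many(self.ident, self.conv, mpeg4Encoder, mpeg4Mux, sink)

    def onDynamicPad(self, compbin, pad):
        # link the composition's dynamic pad to identity instead of
        # linking it directly to the colorspace converter
        pad.link(self.ident.get_pad("sink"))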

  That should solve your issues.

    Edward


Re: Problem using gnlcomposition

Hardeep Singh
Thanks, that solved the problem. Works fine now :)

