VideoConcat


VideoConcat

Ruben Gonzalez Uvigo
Hi all,

I am writing a video concatenation script with gnonlin, to merge two videos.

I am currently using the below pipeline :

gst-launch  \
  gnlcomposition. \(  name=myaudiocomp \
    gnlsource. \( name=asource1 start=0  duration=2000000000  media-start=0 media-duration=2000000000  \
       audiotestsrc wave=3 \
    \) \
    gnlsource. \( name=asource2 start=2000000000  duration=2000000000  media-start=0 media-duration=2000000000  \
       audiotestsrc wave=2 \
    \) \
  \) \
  myaudiocomp. ! queue ! audioconvert ! audioresample ! faac ! mux.    \
  gnlcomposition. \(  name=myvideocomp \
    gnlsource. \( name=vsource1 start=0  duration=2000000000  media-start=0 media-duration=2000000000  \
       videotestsrc pattern=1  \
    \) \
    gnlsource. \( name=vsource2 start=2000000000  duration=2000000000  media-start=0 media-duration=2000000000  \
       videotestsrc pattern=0  \
    \) \
  \) \
  myvideocomp. ! queue ! videorate ! ffmpegcolorspace ! videoscale ! x264enc ! mux. \
  ffmux_mp4 name=mux ! filesink location=outfile.mp4
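The `start`/`duration` arithmetic in this pipeline follows a simple rule: each clip's `start` is the sum of the durations of the clips before it, while `media-start` stays 0 because every clip plays from its beginning. A small illustrative helper (my own sketch, not part of gnonlin) that computes these values:

```python
#!/usr/bin/env python
# Sketch of the gnlsource timing bookkeeping used in the pipeline
# above. All names here are illustrative; gnonlin is not involved.

GST_SECOND = 1000000000  # nanoseconds, as in GStreamer


def concat_schedule(durations_ns):
    """Given clip durations in ns, return (start, duration,
    media-start, media-duration) tuples for back-to-back playback."""
    schedule = []
    cursor = 0
    for dur in durations_ns:
        # Each clip starts where the previous one ended; media-start
        # stays 0 because we play every clip from its beginning.
        schedule.append((cursor, dur, 0, dur))
        cursor += dur
    return schedule


if __name__ == "__main__":
    # Two 2-second clips, as in the pipeline above.
    for entry in concat_schedule([2 * GST_SECOND, 2 * GST_SECOND]):
        print(entry)
```

For the two 2-second clips above this yields `(0, 2000000000, 0, 2000000000)` and `(2000000000, 2000000000, 0, 2000000000)`, matching the values hard-coded in the pipeline.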


My problem is that I need to use a filesrc instead of a test source. Do I need four filesrc elements, or two? Should I use a filesrc and decodebin inside each gnlsource?
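One possible shape for the file-based version (an untested sketch; the paths are placeholders, and property names can differ between gnonlin versions) is to use gnlfilesource, which handles the decoding internally so no explicit filesrc/decodebin pair is needed. With separate audio and video compositions that would mean four gnlfilesource elements, two per composition, each with a caps filter selecting the stream type. The video composition might look like:

```shell
gst-launch \
  gnlcomposition. \( name=myvideocomp \
    gnlfilesource location=file:///path/clip1.mp4 caps="video/x-raw-yuv" \
        start=0 duration=2000000000 media-start=0 media-duration=2000000000 \
    gnlfilesource location=file:///path/clip2.mp4 caps="video/x-raw-yuv" \
        start=2000000000 duration=2000000000 media-start=0 media-duration=2000000000 \
  \) \
  myvideocomp. ! queue ! ffmpegcolorspace ! autovideosink
```

The audio composition would be analogous, with an audio caps filter on each gnlfilesource.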

Sorry for my English. Best regards,
Ruben

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/gstreamer-devel

Re: VideoConcat

IamTrying
If you are not doing live full-duplex broadcasting, just use `input-selector`; it works well with multiple sources. It does not work well for live broadcasting cases, though.

1) Create a file py.py

2) Copy and paste the following into it:

#!/usr/bin/env python
# PyGTK/PyGST (0.10) demo: two sources feed an input-selector,
# and a button switches between them at runtime.

import sys, os
import pygtk, gtk, gobject
import pygst
pygst.require("0.10")
import gst


class GTK_Main:

    def __init__(self):
        window = gtk.Window(gtk.WINDOW_TOPLEVEL)
        window.set_title("Test")
        window.set_default_size(100, 100)
        window.connect("destroy", gtk.main_quit, "WM destroy")
        vbox = gtk.VBox()
        window.add(vbox)
        self.movie_window = gtk.DrawingArea()
        vbox.add(self.movie_window)
        hbox = gtk.HBox()
        vbox.pack_start(hbox, False)
        hbox.set_border_width(10)
        hbox.pack_start(gtk.Label())
        self.button = gtk.Button("Start")
        self.button.connect("clicked", self.start_stop)
        hbox.pack_start(self.button, False)
        self.button2 = gtk.Button("Switch now")
        self.button2.connect("clicked", self.switch_source)
        hbox.pack_start(self.button2, False)
        hbox.add(gtk.Label())
        window.show_all()

        # The pad we will switch to next; sink0 is active by default.
        self.mySwitch = "sink1"
        self.player = gst.parse_launch(
            "multifilesrc location=pipe1 caps=\"image/jpeg,framerate=100/1\" "
            "! decodebin2 ! videoscale ! video/x-raw-yuv,width=1280,height=720 "
            "! queue ! s.sink0 "
            "videotestsrc ! videoscale ! video/x-raw-yuv,width=1280,height=720 "
            "! queue ! s.sink1 "
            "input-selector name=s ! tee name=t "
            "! queue ! xvimagesink name=gl sync=false "
            "t. ! queue ! ffmpegcolorspace ! x264enc ! rtph264pay "
            "! udpsink buffer-size=512000 host=12.0.0.8 port=558 name=udpuploadersink0")
        bus = self.player.get_bus()
        bus.add_signal_watch()
        bus.enable_sync_message_emission()

    # Start/Stop button
    def start_stop(self, w):
        if self.button.get_label() == "Start":
            self.button.set_label("Stop")
            self.player.set_state(gst.STATE_PLAYING)
        else:
            self.player.set_state(gst.STATE_NULL)
            self.button.set_label("Start")

    # Switch button: block the selector, then switch to the other pad.
    def switch_source(self, widget, data=None):
        print "Switching to: " + self.mySwitch
        sel = self.player.get_by_name("s")
        new_pad = sel.get_static_pad(self.mySwitch)
        stop_time = sel.emit('block')
        start_time = new_pad.get_property('running-time')
        sel.emit('switch', new_pad, stop_time, start_time)
        # Toggle the target pad for the next press.
        if self.mySwitch == "sink0":
            self.mySwitch = "sink1"
        else:
            self.mySwitch = "sink0"


# Initialise GDK threading before creating any widgets.
gtk.gdk.threads_init()
GTK_Main()
gtk.main()


3) Run it, then use the buttons to start/stop the pipeline and switch sources.


Sorry, but this does not work for broadcasting; still, you can play with it for any local use.