Hello,

I am writing an application that needs to make a snapshot of an image video stream when a button is clicked. The application is written in Python and I use pygst.

I would like confirmation of the way to do that: I use the add_buffer_probe function on a pad of a bin, and in the callback associated with add_buffer_probe I save the buffer to a JPEG file and deactivate the probe.

Is this the correct way to handle that kind of feature with Python GStreamer?

Thanks in advance,
Nico
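In outline, the approach described above is roughly the following (a minimal, untested sketch; the encoder element, pad and file names are only placeholders):

    import pygst
    pygst.require("0.10")
    import gst

    class Snapshotter(object):
        def __init__(self, encoder):
            # 'encoder' stands for a JPEG-encoding element already in the pipeline
            self.encoder = encoder
            self.probe_id = None

        def capture(self):
            # attach a one-shot buffer probe on the encoder's src pad
            pad = self.encoder.get_static_pad("src")
            self.probe_id = pad.add_buffer_probe(self.on_buffer)

        def on_buffer(self, pad, buf):
            # first buffer that passes: detach the probe and write the JPEG data
            pad.remove_buffer_probe(self.probe_id)
            open("snapshot.jpg", "wb").write(buf)  # gst.Buffer supports the buffer protocol
            return True                            # let the buffer continue downstream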
On Sat, 2009-05-02 at 19:07 +0200, Nicolas Bertrand wrote:
Hi,

> I am writing an application that needs to make a snapshot of an image
> video stream when a button is clicked. The application is written in
> Python and I use pygst.
>
> I would like confirmation of the way to do that: I use the
> add_buffer_probe function on a pad of a bin, and in the callback
> associated with add_buffer_probe I save the buffer to a JPEG file and
> deactivate the probe.
>
> Is this the correct way to handle that kind of feature with Python GStreamer?

It's generally much easier to figure out what exactly you're doing, and whether that's a good way of doing things or not, with the exact code at hand. Any chance you could post some of your code that captures the essence of your app, ideally as a stand-alone program?

Cheers
-Tim
> It's generally much easier to figure out what exactly you're doing, and
> whether that's a good way of doing things or not, with the exact code at
> hand. Any chance you could post some of your code that captures the
> essence of your app, ideally as a stand-alone program?

Hi,

Here is a standalone Python program. It displays a stream (videotestsrc) and also allows making a snapshot of the stream (Capture button); the resulting image is written to snapshot.jpg.

The snapshot is made via the capture method of the my_gst class. Is that the correct way to do that kind of operation, i.e. storing an image from the stream in a file?

    #!/usr/bin/env python
    # -*- coding: utf-8 -*-
    # -*- mode: python -*-
    # vi:si:ai:et:sw=4:sts=4:ts=4

    import gobject
    import gtk
    import pygst
    pygst.require("0.10")
    import gst


    class VideoWidget(gtk.DrawingArea):
        def __init__(self):
            gtk.DrawingArea.__init__(self)
            self.imagesink = None
            self.unset_flags(gtk.DOUBLE_BUFFERED)

        def do_expose_event(self, event):
            if self.imagesink:
                self.imagesink.expose()
                return False
            else:
                return True

        def set_sink(self, sink):
            assert self.window.xid
            self.imagesink = sink
            self.imagesink.set_xwindow_id(self.window.xid)


    class my_gui(gtk.Window):
        def __init__(self, cb_capture, cb_start):
            gtk.Window.__init__(self)
            self.set_default_size(330, 325)
            self.set_title("Snapshot Example")
            self.connect("destroy", lambda w: gtk.main_quit())

            Vbox = gtk.VBox()
            self.area = VideoWidget()
            self.area.set_size_request(300, 300)
            Vbox.pack_start(self.area)

            Button = gtk.Button('Start/Stop Stream')
            Button.connect('clicked', cb_start)
            Vbox.pack_start(Button)

            Button = gtk.Button('Capture')
            Button.connect('clicked', cb_capture)
            Vbox.pack_start(Button)

            self.add(Vbox)


    class my_gst(object):
        def __init__(self, drawarea):
            ElementList = []
            self.pipe = gst.Pipeline()
            self.videowidget = drawarea

            MyVideoSrc = gst.element_factory_make('videotestsrc', 'VideoInput')
            ElementList.append(MyVideoSrc)
            Myffmpeg = gst.element_factory_make('ffmpegcolorspace')
            ElementList.append(Myffmpeg)
            MyTee = gst.element_factory_make("tee", "MyTee")
            ElementList.append(MyTee)

            #
            # Display branch
            #
            queueDisplay = gst.element_factory_make("queue", "queueDisplay")
            ElementList.append(queueDisplay)
            VideoSink = gst.element_factory_make('xvimagesink')
            ElementList.append(VideoSink)

            #
            # To-file image branch
            #
            queueFile = gst.element_factory_make("queue", "queueFile")
            ElementList.append(queueFile)
            self.Myjpegenc = gst.element_factory_make('jpegenc', 'Jpegenc')
            ElementList.append(self.Myjpegenc)
            ImageSink = gst.element_factory_make('fakesink')
            ElementList.append(ImageSink)

            #
            # Add elements to pipeline
            #
            for elem in ElementList:
                self.pipe.add(elem)

            #
            # Link pipeline elements
            #
            gst.element_link_many(MyVideoSrc, Myffmpeg, MyTee)
            # link video branch
            gst.element_link_many(MyTee, queueDisplay, VideoSink)
            # link image branch
            gst.element_link_many(MyTee, queueFile, self.Myjpegenc, ImageSink)

            self.on_eos = None
            bus = self.pipe.get_bus()
            bus.enable_sync_message_emission()
            bus.add_signal_watch()
            bus.connect('sync-message::element', self.on_sync_message)
            bus.connect('message', self.on_message)
            self.playing = False

        def on_message(self, bus, message):
            t = message.type
            if t == gst.MESSAGE_ERROR:
                err, debug = message.parse_error()
                print "Error: %s" % err, debug
                if self.on_eos:
                    self.on_eos()
                self.playing = False
            elif t == gst.MESSAGE_EOS:
                if self.on_eos:
                    self.on_eos()
                self.playing = False

        def on_sync_message(self, bus, message):
            if message.structure is None:
                return
            if message.structure.get_name() == 'prepare-xwindow-id':
                # Sync with the X server before giving the X window id to the sink
                gtk.gdk.display_get_default().sync()
                self.videowidget.set_sink(message.src)
                message.src.set_property('force-aspect-ratio', True)

        def play(self):
            gst.info("playing player")
            self.pipe.set_state(gst.STATE_PLAYING)
            self.playing = True

        def stop(self):
            self.pipe.set_state(gst.STATE_NULL)
            gst.info("stopped player")
            self.playing = False

        def capture(self):
            # add a probe on jpegenc's src pad to capture one image
            pad = self.Myjpegenc.get_static_pad("src")
            self.capture_probe = pad.add_buffer_probe(self.make_capture)

        def make_capture(self, pad, buffer):
            # remove the probe so only one buffer is captured
            pad.remove_buffer_probe(self.capture_probe)
            # save the buffer to a file
            image_file = open('./snapshot.jpg', 'wb')
            image_file.write(buffer)
            image_file.close()
            return True


    if __name__ == "__main__":
        gobject.threads_init()

        def cb_capture(wdg):
            print "Button click"
            g.capture()

        def cb_start(wdg):
            if not g.playing:
                g.play()
            else:
                g.stop()

        w = my_gui(cb_capture, cb_start)
        g = my_gst(w.area)
        w.show_all()
        gtk.main()
Sirs,

Taking advantage of this topic: I've been looking into how to save a real-time input in a buffer for later treatment.

What I need exactly is to read sound from alsasrc and save it in a buffer. I have it done saving to a file, but I didn't find any API to get the data into a buffer instead. Can I get some help?

Thanks in advance!

-------------------
Guilherme Longo
Dept. Eng. da Computação
Unaerp

Nicolas Bertrand wrote:
> Here is a standalone Python program. It displays a stream (videotestsrc)
> and also allows making a snapshot of the stream (Capture button); the
> resulting image is written to snapshot.jpg.
>
> The snapshot is made via the capture method of the my_gst class. Is that
> the correct way to do that kind of operation, i.e. storing an image from
> the stream in a file?
Just a little comment:

I have my program dumping the output to stdout.

    pipeline = gst_pipeline_new ("audio-player");
    source   = gst_element_factory_make ("alsasrc", "file-source");
    filesink = gst_element_factory_make ("fakesink", "audio-output");

    if (!pipeline || !source || !filesink) {
      g_printerr ("One element could not be created. Exiting.\n");
      return -1;
    }

    /* make fakesink print a hex dump of the data it receives */
    g_object_set (G_OBJECT (filesink), "dump", TRUE, NULL);

Just the necessary parts. How can I get a buffer with a custom specification? After that I have to use the FFTW library to read the content of this buffer and transform that reading into two others. I presume the content of the buffer is just 0's and 1's, so I need the fast Fourier transform reading this buffer "in real time" and producing new data.

Tks!

-------------------
Guilherme Longo
Dept. Eng. da Computação
Unaerp

Guilherme wrote:
> Taking advantage of this topic: I've been looking into how to save a
> real-time input in a buffer for later treatment.
>
> What I need exactly is to read sound from alsasrc and save it in a
> buffer. I have it done saving to a file, but I didn't find any API to
> get the data into a buffer instead. Can I get some help?
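For what it's worth, one way to get the raw audio data into the application instead of a file is to let a fakesink emit its "handoff" signal and read each buffer in the callback. A rough, untested sketch in pygst (to stay with the Python used earlier in the thread; the caps, rate and element names below are only examples, and numpy is assumed for the FFT):

    # Receive raw audio buffers in the application via fakesink's "handoff"
    # signal and run an FFT on each one with numpy.
    import pygst
    pygst.require("0.10")
    import gst
    import gobject
    import numpy

    def on_handoff(fakesink, buf, pad):
        # gst.Buffer supports the buffer protocol, so numpy can read it directly
        samples = numpy.frombuffer(buf, dtype=numpy.int16)
        spectrum = numpy.fft.rfft(samples)
        print "got %d samples" % len(samples)

    gobject.threads_init()
    pipeline = gst.parse_launch(
        "alsasrc ! audioconvert ! "
        "audio/x-raw-int,rate=44100,channels=1,width=16,depth=16,signed=true ! "
        "fakesink name=sink signal-handoffs=true")
    pipeline.get_by_name("sink").connect("handoff", on_handoff)

    pipeline.set_state(gst.STATE_PLAYING)
    gobject.MainLoop().run()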
> Here is a standalone Python program. It displays a stream (videotestsrc)
> and also allows making a snapshot of the stream (Capture button); the
> resulting image is written to snapshot.jpg.
>
> The snapshot is made via the capture method of the my_gst class. Is that
> the correct way to do that kind of operation, i.e. storing an image from
> the stream in a file?

Actually, my question is: to save a single GStreamer buffer to a file in an asynchronous way, is it better to do this with the add_buffer_probe/remove_buffer_probe methods, or to connect/disconnect a 'handoff' signal? Are there differences? Is one mode safer for the data, or more the GStreamer way?

Nico.
On Mon, 2009-05-04 at 12:00 +0200, Nicolas Bertrand wrote:
> Actually, my question is: to save a single GStreamer buffer to a file in
> an asynchronous way, is it better to do this with the
> add_buffer_probe/remove_buffer_probe methods, or to connect/disconnect a
> 'handoff' signal? Are there differences? Is one mode safer for the data,
> or more the GStreamer way?

Both more or less do the same thing. If you want to do things asynchronously, you could post an application message containing the buffer on the bus in the callback and then handle it later in your application (I didn't check if that fits your use case, just thought I'd mention it).

Cheers
-Tim
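A rough, untested sketch of that idea, written as replacements for the make_capture and on_message methods of the my_gst class posted earlier (here the captured buffer is simply kept on the object and the application message is only used to wake up the main loop, which is a simplification of what Tim describes):

    def make_capture(self, pad, buf):
        # streaming thread: just grab the buffer and post a notification
        pad.remove_buffer_probe(self.capture_probe)
        self.snapshot_buffer = buf
        msg = gst.message_new_application(
            self.pipe, gst.Structure('snapshot-ready'))
        self.pipe.get_bus().post(msg)
        return True

    def on_message(self, bus, message):
        # main loop: do the actual file writing here
        if message.type == gst.MESSAGE_APPLICATION and \
           message.structure.get_name() == 'snapshot-ready':
            open('./snapshot.jpg', 'wb').write(self.snapshot_buffer)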
Hi Nicolas
Just a quick note about your approach: notice that your JPEG-encoding branch is always running, so basically you are permanently converting the video source into JPEG and using the result only infrequently. It could be considered a waste of CPU.

For a similar task, I have chosen to use two different pipelines (one for playing, the other for encoding), and had a solution (see the recent "Image conversion through a pipeline" thread) which I need to update since it recently broke. I will keep the list posted when I update my code.

Olivier

On Sat, 2009-05-02 at 19:07 +0200, Nicolas Bertrand wrote:
> I am writing an application that needs to make a snapshot of an image
> video stream when a button is clicked. The application is written in
> Python and I use pygst.
>
> I would like confirmation of the way to do that: I use the
> add_buffer_probe function on a pad of a bin, and in the callback
> associated with add_buffer_probe I save the buffer to a JPEG file and
> deactivate the probe.
>
> Is this the correct way to handle that kind of feature with Python GStreamer?
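For illustration, the on-demand encoding idea could look roughly like this in pygst (a rough, untested sketch, not Olivier's code, which he says he will post later; it assumes the raw video buffer was grabbed with a pad probe on the display branch with its caps still attached, and that appsrc from gst-plugins-base >= 0.10.22 is available; the names and output file are placeholders):

    # Encode a single raw video buffer on demand in a second, short-lived
    # pipeline instead of running jpegenc permanently.
    def save_snapshot(buf, filename='snapshot.jpg'):
        pipeline = gst.parse_launch(
            'appsrc name=src ! ffmpegcolorspace ! jpegenc ! filesink name=out')
        src = pipeline.get_by_name('src')
        pipeline.get_by_name('out').set_property('location', filename)
        src.set_property('caps', buf.get_caps())   # tell appsrc what it is pushing

        pipeline.set_state(gst.STATE_PLAYING)
        src.emit('push-buffer', buf)               # feed the one captured frame
        src.emit('end-of-stream')

        # block briefly until the file has been written, then tear down
        bus = pipeline.get_bus()
        bus.poll(gst.MESSAGE_EOS | gst.MESSAGE_ERROR, -1)
        pipeline.set_state(gst.STATE_NULL)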