Hello everyone!
I'm very new to GStreamer; I've been experimenting with the framework for about a week. My current goal is to write a radio scheduler application that can handle the audio playback a web radio needs, stream the result with different encoders, and optionally output audio on local audio hardware.
I did some tests on how I can plug different sources into an adder so I can use the pipeline as an audio mixer. After finding GStreamer's pad-blocking capability, I could add and remove audio sources in the pipeline without stopping it: http://stackoverflow.com/questions/3899666/adding-and-removing-audio-sources-to-from-gstreamer-pipeline-on-the-go I want to do the same with the audio outputs; I'd like to add and remove different audio sinks while the pipeline is running.
I've added a tee after the adder so I can split the mixed audio to many outputs. If I fire up the pipeline with an autoaudiosink (with its own queue), playback works fine. But if I add a fakesink to the tee's output on the go (requesting a new src pad and linking it through another queue), one of my audio sources pauses the pipeline with

basesrc gstbasesrc.c:2447:gst_base_src_loop:<Buzzer1> pausing after gst_pad_push() = wrong-state

in the debug output. I've tried putting a block on the adder's source pad while adding the new audio sink, but it didn't help.
Here is the code I'm working with; it's a Java rewrite of the StackOverflow Python script I linked above. I don't know what to try next. If I put the pipeline together with both sinks before the first start, everything works fine.
package test;

import java.io.IOException;
import java.util.logging.Level;
import java.util.logging.Logger;

import org.gstreamer.Element;
import org.gstreamer.ElementFactory;
import org.gstreamer.Gst;
import org.gstreamer.Pad;
import org.gstreamer.Pipeline;
import org.gstreamer.State;

public class Main {

    public static void main(String[] args) {
        Gst.init("RadioBeans", args);

        Pipeline pipe = new Pipeline("Scheduler");

        // adder -> tee -> queue -> autoaudiosink
        Element mixer = ElementFactory.make("adder", "Mixer");
        pipe.add(mixer);
        Element tee = ElementFactory.make("tee", "Splitter");
        pipe.add(tee);
        mixer.link(tee);

        Element audioSink = ElementFactory.make("autoaudiosink", "AudioOutput");
        pipe.add(audioSink);
        Element audioQueue = ElementFactory.make("queue", "AudioQueue");
        pipe.add(audioQueue);
        tee.link(audioQueue);
        audioQueue.link(audioSink);

        // Two test-tone sources feeding the mixer.
        Pad mixerLine1 = mixer.getRequestPad("sink%d");
        Pad mixerLine2 = mixer.getRequestPad("sink%d");

        Element buzzer1 = ElementFactory.make("audiotestsrc", "Buzzer1");
        pipe.add(buzzer1);
        buzzer1.set("freq", 1000);
        Pad buzzer1src = buzzer1.getSrcPads().get(0);
        buzzer1src.link(mixerLine1);

        Element buzzer2 = ElementFactory.make("audiotestsrc", "Buzzer2");
        pipe.add(buzzer2);
        buzzer2.set("freq", 500);
        Pad buzzer2src = buzzer2.getSrcPads().get(0);
        buzzer2src.link(mixerLine2);

        pipe.play();

        try {
            System.in.read();
        } catch (IOException ex) {
            Logger.getLogger(Main.class.getName()).log(Level.SEVERE, null, ex);
        }

        // Add a fakesink branch to the tee while the pipeline is playing.
        Element fakeOutput = ElementFactory.make("fakesink", "FakeOutput");
        pipe.add(fakeOutput);
        Element fakeQueue = ElementFactory.make("queue", "FakeQueue");
        pipe.add(fakeQueue);
        fakeQueue.link(fakeOutput);
        tee.link(fakeQueue);
        pipe.play();
        System.out.println("Added fakesink");

        try {
            System.in.read();
        } catch (IOException ex) {
            Logger.getLogger(Main.class.getName()).log(Level.SEVERE, null, ex);
        }

        // Remove the first source while the pipeline keeps running.
        buzzer1src.setBlocked(true);
        buzzer1.setState(State.NULL);
        buzzer1src.unlink(mixerLine1);
        mixer.releaseRequestPad(mixerLine1);
        System.out.println("Released buzzer1");

        try {
            System.in.read();
        } catch (IOException ex) {
            Logger.getLogger(Main.class.getName()).log(Level.SEVERE, null, ex);
        }

        System.out.println("Stopping.");
        pipe.stop();
    }
}
When you add an element to a bin, you must match its state to that of the bin. I think the proper sequence is: add the new elements to the bin, link them, and only then bring their state up to the bin's current state.

When I add a branch to a tee, I put everything on that branch in its own bin. That way, on addition & removal, I'm just dealing with that one bin.

Matt
Thank you Matt!
I did it as you said and it worked like a charm. :)
I had to enable synchronization on the fakesink if I add it to the tee before playback starts. If I don't, I can't connect an autoaudiosink while the pipeline is running; I get a buffer overflow error from the new sink. I guess fakesink runs in non-realtime mode by default.
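Concretely, that just means setting the sink's sync property before adding it (same element naming as in my snippet above; sync is the standard basesink property, which fakesink leaves off by default):

    Element fakeOutput = ElementFactory.make("fakesink", "FakeOutput");
    // With sync=false nothing paces the pipeline to real time, so a real
    // audio sink added later receives data faster than it can play it and
    // its queue overruns. sync=true makes fakesink throttle to the clock.
    fakeOutput.set("sync", true);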
István