playbin2 with MJPEG streams over http: long delay before playback starts, and buffering delays on each frame


Mark Boots
I'm trying to use GStreamer (0.10.28 and 0.10.35 tested, Linux) to play MJPEG streams delivered over HTTP from an Axis video camera.  The network or the camera tends to be somewhat slow.

If I create the following pipeline, everything works great:

gst-launch-0.10 souphttpsrc location=http://ccd1611-403/mjpg/video.mjpg ! multipartdemux ! jpegdec ! autovideosink

However, the end goal is to use GStreamer inside Qt's QMultimediaKit, which automatically uses a playbin or playbin2 element and doesn't offer any way to construct the pipeline manually.  In this situation:

gst-launch-0.10 playbin2 uri=http://ccd1611-403/mjpg/video.mjpg

I'm experiencing two problems:

1)  Playback doesn't start for about 35 seconds.  During this time, frames are received but I see "Buffering... 0%".  After that delay, the output window appears and all the frames that piled up during this time are displayed very quickly.

[Some people have suggested elsewhere on this list that the multipartdemux or multiqueue might be waiting to find out if there is an audio stream, and then blocking until the audio stream is up (or confirmed non-existent).]


2)  Once the output window is created and playback starts, playback pauses and buffers for about 3s between each set of frames, making the real-time experience basically unusable: it pauses and buffers for 3s, then quickly shows all the frames received during those 3s, then pauses and buffers again, over and over.

I can get around this by setting playbin2's buffer-duration=10000, and then each frame is displayed as soon as it is received.  However, the stream now alternates between playing and paused on every single frame.  [Unfortunately, I have no way of specifying the buffer-duration property when using QMultimediaKit's gst backend.]
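
For reference, that workaround is just the playbin2 launch line from above with the property appended, i.e. something like:

gst-launch-0.10 playbin2 uri=http://ccd1611-403/mjpg/video.mjpg buffer-duration=10000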

Either way, this buffering behavior makes the camera essentially useless as a real-time video monitor.  I've tested the same camera(s) using VLC, and playback starts immediately, without the 3s pauses.

Any suggestions for what could be going on here?  Any suggestions for how to work around it?  If I could set up the pipeline manually like the first example, all would be well.  However, I'm stuck using whatever playbin2 creates because I need to use this inside the multimedia module that comes with Qt/QtMobility.

Thanks a lot!
-Mark

================
Mark Boots

Controls Analyst
Canadian Light Source
University of Saskatchewan

[hidden email]

101 Perimeter Rd
Saskatoon, SK
S7N 0X4
================


Verbose log, with no limit on buffer-duration (shows problems 1 and 2):

gst-launch-0.10 -v --gst-plugin-spew playbin2 uri=http://ccd1611-403/mjpg/video.mjpg
Setting pipeline to PAUSED ...
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0: connection-speed = 0
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0: download = FALSE
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0: uri = "http://ccd1611-403/mjpg/video.mjpg"
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0: use-buffering = FALSE
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0: buffer-duration = -1
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0: buffer-size = -1
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0: source = (GstSoupHTTPSrc) source
Pipeline is PREROLLING ...
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstTypeFindElement:typefindelement0.GstPad:src: caps = multipart/x-mixed-replace
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20/GstTypeFindElement:typefind: force-caps = multipart/x-mixed-replace
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20: sink-caps = multipart/x-mixed-replace
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20/GstTypeFindElement:typefind.GstPad:src: caps = multipart/x-mixed-replace
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstQueue2:queue20.GstPad:sink: caps = multipart/x-mixed-replace
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstQueue2:queue20.GstPad:src: caps = multipart/x-mixed-replace
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20/GstTypeFindElement:typefind.GstPad:sink: caps = multipart/x-mixed-replace
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20.GstGhostPad:sink: caps = multipart/x-mixed-replace
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20.GstGhostPad:sink.GstProxyPad:proxypad0: caps = multipart/x-mixed-replace
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20/GstMultipartDemux:multipartdemux0.GstPad:sink: caps = multipart/x-mixed-replace
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20/GstMultiQueue:multiqueue0.GstPad:sink0: caps = image/jpeg
buffering... 0%

#[35 seconds go by right here]

/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20/GstMultiQueue:multiqueue0: max-size-buffers = 5
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20/GstMultiQueue:multiqueue0: max-size-time = 0
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20/GstMultiQueue:multiqueue0: max-size-bytes = 2097152
/GstPlayBin2:playbin20/GstPlaybin2InputSelector:playbin2inputselector0.GstPlaybin2SelectorPad:sink0: always-ok = FALSE
/GstPlayBin2:playbin20/GstPlaybin2InputSelector:playbin2inputselector0: active-pad = (GstPlaybin2SelectorPad) sink0
/GstPlayBin2:playbin20/GstPlaybin2InputSelector:playbin2inputselector0.GstPlaybin2SelectorPad:sink0: tags = ((GstTagList*) 0xb6500aa8)
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20/GstMultiQueue:multiqueue0.GstPad:src0: caps = image/jpeg
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20/GstJpegDec:jpegdec0.GstPad:sink: caps = image/jpeg
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0.GstGhostPad:src0: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20.GstDecodePad:src0: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20/GstJpegDec:jpegdec0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaybin2InputSelector:playbin2inputselector0.GstPlaybin2SelectorPad:sink0: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0.GstGhostPad:src0.GstProxyPad:proxypad4: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20.GstDecodePad:src0.GstProxyPad:proxypad3: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaybin2InputSelector:playbin2inputselector0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaySink:playsink0/GstBin:vbin/GstQueue:vqueue.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaySink:playsink0/GstBin:vbin.GstGhostPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaySink:playsink0.GstGhostPad:video_raw_sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaySink:playsink0.GstGhostPad:video_raw_sink.GstProxyPad:proxypad5: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaySink:playsink0/GstBin:vbin.GstGhostPad:sink.GstProxyPad:proxypad7: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaySink:playsink0/GstBin:vbin/GstQueue:vqueue.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaySink:playsink0/GstBin:vbin/GstFFMpegCsp:vconv.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaySink:playsink0/GstBin:vbin/GstFFMpegCsp:vconv.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaySink:playsink0/GstBin:vbin/GstVideoScale:vscale.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaySink:playsink0/GstBin:vbin/GstVideoScale:vscale.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaySink:playsink0/GstBin:vbin/GstAutoVideoSink:videosink/GstXvImageSink:videosink-actual-sink-xvimage.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaySink:playsink0/GstBin:vbin/GstAutoVideoSink:videosink.GstGhostPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPlayBin2:playbin20/GstPlaySink:playsink0/GstBin:vbin/GstAutoVideoSink:videosink.GstGhostPad:sink.GstProxyPad:proxypad6: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, framerate=(fraction)0/1
Prerolled, waiting for buffering to finish...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

# [Lots of frames... all piled up during those 30 seconds... shown very quickly here]

Buffering, setting pipeline to PAUSED ...
Prerolled, waiting for buffering to finish...  # [3 seconds go by]
Done buffering, setting pipeline to PLAYING ...  # [All frames received during those 3 seconds now shown quickly]
Buffering, setting pipeline to PAUSED ...
Prerolled, waiting for buffering to finish... # [3 seconds go by]
Done buffering, setting pipeline to PLAYING ... #[All frames received during those 3 seconds now shown quickly]

# [and so on:]

Buffering, setting pipeline to PAUSED ...
Prerolled, waiting for buffering to finish...
Done buffering, setting pipeline to PLAYING ...
Buffering, setting pipeline to PAUSED ...
Prerolled, waiting for buffering to finish...
Done buffering, setting pipeline to PLAYING ...
Buffering, setting pipeline to PAUSED ...
Prerolled, waiting for buffering to finish...
Done buffering, setting pipeline to PLAYING ...
# [...]





RE: playbin2 with MJPEG streams over http: long delay before playback starts, and buffering delays on each frame

dhoyt
> 1)  Playback doesn't start for about 35 seconds.  During this time, frames are received but I see "Buffering... 0%".  After that delay, the output window appears and all the frames that piled up during this time are displayed very quickly.

I've dealt with this exact same problem. The problem is that one of the elements waits to see whether another pad will need to be created, in case another type of stream is muxed into the multipart stream. That is, some cameras intermix audio and video in the same stream, so some parts of the multipart stream are JPEG images and others are audio. The trick is to get the no-more-pads signal emitted as soon as possible. There's an enhancement for this described here: https://bugzilla.gnome.org/show_bug.cgi?id=616686

To use that, you'll need to find when playbin2 adds a uridecodebin and then find when the uridecodebin adds a decodebin and then locate the multipartdemux in the decodebin and set the property.

> 2)  Once the output window is created and playback starts, playback pauses and buffers for about 3s between each set of frames, making the real-time experience basically unusable: it pauses and buffers for 3s, then quickly shows all the frames received during those 3s, then pauses and buffers again, over and over.

I believe setting the single-stream=true property on multipartdemux using the method I described above will resolve this as well.
Re: playbin2 with MJPEG streams over http: long delay before playback starts, and buffering delays on each frame

Mark Boots
Thanks for the suggestion.

> To use that, you'll need to find when playbin2 adds a uridecodebin and then find when the uridecodebin adds a decodebin and then locate the multipartdemux in the decodebin and set the property. [single-stream=true]

Any idea how to do this programmatically, if all I have access to is the playbin2 element?

Thanks!
-Mark


RE: playbin2 with MJPEG streams over http: long delay before playback starts, and buffering delays on each frame

dhoyt
>> To use that, you'll need to find when playbin2 adds a uridecodebin and then find when the uridecodebin adds a decodebin and then locate the multipartdemux in the decodebin and set the property. [single-stream=true]

> Any idea how to do this programmatically, if all I have access to is the playbin2 element?

Take a look at the code available here to get you started: http://code.google.com/p/ossbuild/source/browse/trunk/Main/GStreamer/Source/gstreamer/tools/gst-player.c#2407

Follow the callback into playbin_element_added() on line 1427, then into uridecodebin_element_added() on line 1450, then into decodebin_element_added() on line 1471, and finally into examine_element() on line 506. The property is explicitly set on line 514. It's rather ugly, but it does the trick.
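
Condensed, that hookup looks roughly like this. Treat it as a sketch against the GStreamer 0.10 C API, not the actual gst-player.c code: it just recurses into every bin via the "element-added" signal instead of naming each level, and it checks that the property exists before setting it, since older gst-plugins-good builds won't have it:

#include <gst/gst.h>
#include <string.h>

static void on_element_added (GstBin *bin, GstElement *element, gpointer user_data);

static void
examine_element (GstElement *element)
{
  GstElementFactory *factory = gst_element_get_factory (element);

  /* when a multipartdemux appears, tell it to expect a single stream so it
   * emits no-more-pads right away */
  if (factory != NULL &&
      strcmp (gst_plugin_feature_get_name (GST_PLUGIN_FEATURE (factory)),
              "multipartdemux") == 0) {
    if (g_object_class_find_property (G_OBJECT_GET_CLASS (element), "single-stream"))
      g_object_set (element, "single-stream", TRUE, NULL);
  }

  /* uridecodebin and decodebin2 are bins themselves: watch them too so we
   * also see the elements they add later */
  if (GST_IS_BIN (element))
    g_signal_connect (element, "element-added",
                      G_CALLBACK (on_element_added), NULL);
}

static void
on_element_added (GstBin *bin, GstElement *element, gpointer user_data)
{
  examine_element (element);
}

/* usage, right after creating the pipeline:
 *
 *   GstElement *playbin = gst_element_factory_make ("playbin2", "player");
 *   g_signal_connect (playbin, "element-added",
 *                     G_CALLBACK (on_element_added), NULL);
 */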
RE: playbin2 with MJPEG streams over http: long delay before playback starts, and buffering delays on each frame

Mark Boots
I managed to find the multipartdemux element, but the single-stream property doesn't exist.
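
(The check I'm referring to is essentially the standard GObject property lookup, roughly like this -- "demux" here just stands for the multipartdemux instance I located through the element-added callbacks:)

  /* demux: the multipartdemux found while walking the bins playbin2 creates */
  if (g_object_class_find_property (G_OBJECT_GET_CLASS (demux), "single-stream"))
    g_object_set (demux, "single-stream", TRUE, NULL);   /* never executed here, since the property isn't found */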

If I take out the if(...property exists...) check and simply try to set the property:

(<unknown>:26920): GLib-GObject-WARNING **: g_object_set_valist: object class `GstMultipartDemux' has no property named `single-stream'

I'm using GStreamer version 0.10.35, and gst-plugins-good-0.10.30.  What version was the single-stream property added in?

-Mark



RE: playbin2 with MJPEG streams over http: long delay before playback starts, and buffering delays on each frame

dhoyt
> I'm using GStreamer version 0.10.35, and gst-plugins-good-0.10.30.  What version was the single-stream property added in?

It appears that it was committed before the gst-plugins-good 0.10.30 release, but 0.10.30 was just a minor bump from 0.10.29 with a few cherry-picked patches, so even though the commit predates the 0.10.30 release, it's not included in it. It looks like it will be available in the next major release; for now, you could compile from git.

See http://cgit.freedesktop.org/gstreamer/gst-plugins-good/commit/?id=86f9fa785a7878ba3c85e37cc4d31ac1dd69a03f dated May 25, 2011 and then http://cgit.freedesktop.org/gstreamer/gst-plugins-good/commit/?id=673d519898d18c513c3b5eeecd91f5d3091ddf79 dated June 15, 2011. Look at the source in the tree for the 0.10.30 release and you'll see the property isn't there: http://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/gst/multipart/multipartdemux.c?id=673d519898d18c513c3b5eeecd91f5d3091ddf79 
Re: playbin2 with MJPEG streams over http: long delay before playback starts, and buffering delays on each frame

Mark Boots
Thanks a lot!  I was able to apply your patch to gst-plugins-good-0.10.30, and patch the QtMobility QGStreamerPlayerSession class to set the property.  That fixed it!

This certainly solves the problem of 30s delays before HTTP MJPEG cameras start to stream.  However, like you said, it's an ugly and cumbersome thing to do: you need to connect three signals and follow three callbacks to make it work.  Two questions:

1) Inside a player application that uses playbin2, is there any disadvantage to always setting the "single-stream" property to true? For example, are there any cases where making 'single-stream=true' the default would be undesirable?  Does it break, for example, playing a mpeg4 movie with sound and video from a local file?

2) Should it be considered a bug that, by default, playbin2 and decodebin2 can't play from HTTP MJPEG sources (at least without excessively long delays on playback start, and messed-up buffering)?

-Mark




RE: playbin2 with MJPEG streams over http: long delay before playback starts, and buffering delays on each frame

dhoyt
> Thanks a lot!  I was able to apply your patch to gst-plugins-good-0.10.30, and patch the QtMobility QGStreamerPlayerSession class to set the property.  That fixed it!

Glad to hear it! Congrats!

> 1) Inside a player application that uses playbin2, is there any disadvantage to always setting the "single-stream" property to true? For example, are there any cases where making 'single-stream=true' the default would be undesirable?  Does it break, for example, playing a mpeg4 movie with sound and video from a local file?

It's not likely, but in theory it could, if you had some container with muxed (audio + video) multipart content and tried to play it back -- with "single-stream" on by default, whichever media type was found first in the file would likely be the only one played. Most formats (I'd guess the vast majority) do *not* contain multipart content, so the multipartdemux element would most likely never be created and added in the first place. In my own program I knew and controlled the types of content being produced, so it wasn't a problem to turn that flag on whenever a multipartdemux element was added to the pipeline, since I knew there would never be a source producing muxed audio + video. YMMV.

I think the issue would be changing expected default behavior in the 0.10 series. If it can be avoided, that's a plus -- and I don't think this is considered a major issue worth changing default behavior over. Especially with a known workaround. :/

Now, however, would be a good time to get this changed for the 0.11 work. I'd be in favor of the opposite behavior -- you'd have to explicitly enable support for multiple streams (e.g. a "multiple-streams" property instead of a "single-stream" property). You could write a patch to do this -- it should be fairly straightforward -- and then submit it against the 0.11 series.

> 2) Should it be considered a bug that, by default, playbin2 and decodebin2 can't play from HTTP MJPEG sources (at least without excessively long delays on playback start, and messed-up buffering)?

I pretty much agree with you: I figure most people using GStreamer with HTTP and MJPEG are doing what you and I have done, so in the typical use case the current behavior is problematic. However, there may be other use cases where the existing behavior is the correct one, and at the time it was written it was the most applicable solution. I don't think the GStreamer devs would agree it's a "bug" -- it simply wasn't the behavior you or I expected when we first started using it. But I think it's a reasonable behavior to expect, so it should be defensible (IMO). Others may disagree.

