Implementing support for Apple HTTP Live Streaming...

Stephen Buck
I'm currently implementing a GStreamer plugin to handle video playback using Apple's HTTP Live Streaming protocol (http://en.wikipedia.org/wiki/HTTP_Live_Streaming), which is used by the iPhone and iPad, and need some advice on the best way to proceed. The protocol works roughly like this:

1) A client (GStreamer) opens an HTTP URI that points to an .m3u8 (mime type = application/vnd.apple.mpegurl) file. The m3u8 file contains an ordered list of URIs pointing to MPEG TS files (mime type = video/mpegts), each holding a short fragment of the overall video. Something like:

...
#EXTINF:10, first 10 sec fragment
http://localhost/myvideo/fileSequence0.ts

#EXTINF:10, second 10 sec fragment
http://localhost/myvideo/fileSequence1.ts

#EXTINF:10, third 10 sec fragment
http://localhost/myvideo/fileSequence2.ts

#EXTINF:10, fourth 10 sec fragment
http://localhost/myvideo/fileSequence3.ts



2) The client plays the video by fetching each fragment URI in turn and pushing the data downstream to a pipeline that knows how to demux and decode the MPEG TS content. A typical pipeline might look like this:

souphttpsrc location=http://localhost/myvideo.m3u8 ! livestreamdec ! queue ! mpegtsdemux ! queue ! ffdec_h264 ! queue ! ffmpegcolorspace ! queue ! xvimagesink

or, even simpler by using playbin:

playbin uri=http://localhost/myvideo.m3u8


3) In the case of a fixed-length video, the m3u8 contains URIs for all of the fragments that make up the video. In the case of live video, the m3u8 contains URIs for a relatively small window of the live stream and must be re-read when playback progresses past the last URI in the m3u8.
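
For comparison, a live playlist (per Apple's draft spec) carries an #EXT-X-MEDIA-SEQUENCE tag and omits the #EXT-X-ENDLIST tag that closes a fixed-length playlist. A made-up example of one such window:

#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:2680

#EXTINF:10, live fragment 2680
http://localhost/myvideo/fileSequence2680.ts

#EXTINF:10, live fragment 2681
http://localhost/myvideo/fileSequence2681.ts

#EXTINF:10, live fragment 2682
http://localhost/myvideo/fileSequence2682.ts

Each re-read slides this window forward; the media-sequence number tells the client which fragments it has already played.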


I've got many of the pieces working, but I've had to use a few hacks and I'm certain I have not done it the proper way. My current design is a decoder-like element that extends GstBaseSrc but adds a single sink pad for the m3u8 mime type. It waits for EOS on the sink pad but does not pass it downstream; it then parses the m3u8 and starts a task on the src pad that opens the MPEG TS URIs and pushes them downstream. The main problems are:

1) The URIs of the MPEG TS files can be relative to the URI of the containing m3u8, which I don't seem to be able to get from souphttpsrc (the resolution itself is easy once the base URI is known; see the sketch after this list).

2) I have not figured out how to get souphttpsrc to re-read the contents of its assigned URI.

3) My decoder is using libsoup to get the contents of the mpeg TS files, but this doesn't integrate well with the proxy settings, session, etc. used by souphttpsrc. It would be nice if I could use my upstream souphttpsrc to read these child URIs as well.

4) I'm not really sure whether I should be a source (seems like a bad idea for compatibility with playbin, but the base functionality is nice), a decoder (decodebin seems to look for this in my element class name), a demuxer, or something else.
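
On point 1: once the playlist's own URI is known, libsoup can do the relative-to-absolute resolution itself. A minimal sketch, assuming libsoup 2.x (resolve_fragment_uri is just an illustrative name):

#include <libsoup/soup.h>

/* Resolve a (possibly relative) fragment URI from the playlist
 * against the URI the playlist itself was fetched from. */
static gchar *
resolve_fragment_uri (const gchar *playlist_uri, const gchar *fragment)
{
  SoupURI *base, *full;
  gchar *result;

  base = soup_uri_new (playlist_uri);
  if (base == NULL)
    return NULL;

  full = soup_uri_new_with_base (base, fragment);
  result = full ? soup_uri_to_string (full, FALSE) : NULL;

  if (full)
    soup_uri_free (full);
  soup_uri_free (base);
  return result;
}

/* resolve_fragment_uri ("http://localhost/live/playlist.m3u8",
 *                       "fileSequence0.ts")
 *   -> "http://localhost/live/fileSequence0.ts" */

The hard part is getting playlist_uri in the first place when souphttpsrc sits upstream; the replies below cover a couple of ways around that.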

The advanced form of this protocol supports dynamic bit-rates by pointing the child URIs at other m3u8 files with the format described above. There is one child URI for each bit-rate encoding of the video, and the client can switch between them depending on network QoS. Once the above works, this part should be pretty easy.


I'm fairly new to GStreamer and have made a lot of progress, but I'm having trouble with these issues, so any ideas or advice are welcome.


Re: Implementing support for Apple HTTP Live Streaming...

michael smith-6-3
On Wed, Jun 9, 2010 at 10:30 AM, Stephen Buck <[hidden email]> wrote:
> I'm currently implementing a GStreamer plugin to handle video playback using Apple's HTTP Live Streaming protocol [...]

Here's a rough outline of how I'd go about implementing this:

Implement an element called: applehttplivestreamingsrc.

This will be a bin with a single source pad. Conceptually it's a little
similar to rtspsrc, which is also a bin exposing source pads (though
rtspsrc has multiple).

It'll have a property for the URI to the m3u file (and later other
things you might need to configure).

When it starts up, it'll fetch the m3u file. Probably best done by
internally instantiating an http uri handler element, and getting the
data from that - that way you don't tie it to souphttpsrc.

Then it can instantiate more http uri handlers, to get the actual
MPEGTS data. The first of these will be hooked up to the bin's src pad
(which is a ghost pad).

On receiving EOS from the http source, it'll switch out that source
for the next one, and not let the EOS through. You may also need to do
some mangling of other events, like newsegments; event probes can
probably be used for this.
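
A compressed sketch of that shape; it uses the 1.x pad-probe API for brevity (0.10 has gst_pad_add_event_probe instead), and hls_bin_new, drop_eos and "fragment-src" are just illustrative names. The actual switching to the next fragment's source is left out:

#include <gst/gst.h>

/* Swallow EOS from each fragment so downstream keeps playing; the
 * switch to the next fragment's source element would be triggered
 * from here (left out of this sketch). */
static GstPadProbeReturn
drop_eos (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  if (GST_EVENT_TYPE (GST_PAD_PROBE_INFO_EVENT (info)) == GST_EVENT_EOS)
    return GST_PAD_PROBE_DROP;
  return GST_PAD_PROBE_OK;
}

static GstElement *
hls_bin_new (const gchar *first_fragment_uri)
{
  GstElement *bin, *http;
  GstPad *target;
  GError *err = NULL;

  bin = gst_bin_new ("applehttplivestreamingsrc");

  /* Let GStreamer pick whatever element handles the URI, so the bin
   * is not tied to souphttpsrc. */
  http = gst_element_make_from_uri (GST_URI_SRC, first_fragment_uri,
      "fragment-src", &err);
  if (http == NULL) {
    g_clear_error (&err);
    gst_object_unref (bin);
    return NULL;
  }
  gst_bin_add (GST_BIN (bin), http);

  /* Expose the http element's src pad as the bin's single src pad. */
  target = gst_element_get_static_pad (http, "src");
  gst_element_add_pad (bin, gst_ghost_pad_new ("src", target));

  /* Intercept events leaving the fragment source before they reach
   * the rest of the pipeline. */
  gst_pad_add_probe (target, GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM,
      drop_eos, NULL, NULL);

  gst_object_unref (target);
  return bin;
}

When a fragment's EOS is dropped, the element would remove the finished source, add one for the next URI, re-target the ghost pad (gst_ghost_pad_set_target) and set the new source to PLAYING.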

Later, you might add:
 - pre-fetching the next URI, and buffering (probably using a queue
element), so that at segment boundaries you don't stall while waiting
for the next connection to start up.
 - Re-fetching the m3u8 file when required for live streams.
 - Making your source bin implement the uri handler interface, with
some custom protocol (a sketch of the interface follows below), so you'd
be able to do something like:
gst-launch playbin2 uri=applelivehttp://server/mystream.m3u8
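
The uri handler interface itself is small. A rough sketch of what the bin could implement, with 1.x-style vfunc signatures (0.10's differ slightly); GstHlsSrc and the protocol name are placeholders, and it assumes the instance struct keeps a gchar *uri field:

static GstURIType
gst_hls_src_uri_get_type (GType type)
{
  return GST_URI_SRC;
}

static const gchar * const *
gst_hls_src_uri_get_protocols (GType type)
{
  static const gchar *protocols[] = { "applelivehttp", NULL };
  return protocols;
}

static gchar *
gst_hls_src_uri_get_uri (GstURIHandler *handler)
{
  return g_strdup (GST_HLS_SRC (handler)->uri);
}

static gboolean
gst_hls_src_uri_set_uri (GstURIHandler *handler, const gchar *uri,
    GError **error)
{
  GstHlsSrc *self = GST_HLS_SRC (handler);

  g_free (self->uri);
  self->uri = g_strdup (uri);
  return TRUE;
}

static void
gst_hls_src_uri_handler_init (gpointer g_iface, gpointer iface_data)
{
  GstURIHandlerInterface *iface = g_iface;

  iface->get_type = gst_hls_src_uri_get_type;
  iface->get_protocols = gst_hls_src_uri_get_protocols;
  iface->get_uri = gst_hls_src_uri_get_uri;
  iface->set_uri = gst_hls_src_uri_set_uri;
}

/* hooked up with G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER,
 * gst_hls_src_uri_handler_init) in the type definition */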

Hope that helps a bit - though it doesn't directly answer your
questions, I think this design will work a lot better than what you
have currently.

Mike


Re: Implementing support for Apple HTTP Live Streaming...

Bastien Nocera-2
On Wed, 2010-06-09 at 11:53 -0700, Michael Smith wrote:

> [...]
> Later, you might add:
>  - pre-fetching the next URI, and buffering (probably using a queue
> element), so that at segment boundaries you don't stall while waiting
> for the next connection to start up.
>  - Re-fetching the m3u8 file when required for live streams.

That shouldn't be needed.

>  - Making your source bin implement the uri handler interface, with
> some custom protocol, so you'd be able to do something like:
> gst-launch playbin2 uri=applelivehttp://server/mystream.m3u8

Gross. It already has a specific mime-type, and should be pretty easy to
add a typefinder for (check shared-mime-info for some details).
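
For reference, such a typefinder only needs to peek at the start of the playlist. A rough sketch (1.x-style API; the function and registration names are just illustrative):

#include <string.h>
#include <gst/gst.h>

static void
hls_playlist_type_find (GstTypeFind *tf, gpointer user_data)
{
  const guint8 *data = gst_type_find_peek (tf, 0, 7);

  /* Every extended M3U playlist starts with "#EXTM3U"; a stricter
   * check could also scan for an "#EXT-X-" tag so plain audio
   * playlists aren't matched. */
  if (data != NULL && memcmp (data, "#EXTM3U", 7) == 0)
    gst_type_find_suggest_simple (tf, GST_TYPE_FIND_LIKELY,
        "application/vnd.apple.mpegurl", NULL);
}

/* in plugin_init():
 *   gst_type_find_register (plugin, "hls-playlist", GST_RANK_PRIMARY,
 *       hls_playlist_type_find, "m3u8,m3u", NULL, NULL, NULL);
 */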

I'll just add that Marc-André mentioned on IRC that he was looking at
implementing this as well.

Cheers



Re: Implementing support for Apple HTTP Live Streaming...

Stephen Buck
I have looked at Marc-André's Python player that uses GStreamer for this protocol at http://gitorious.net/hls-player and it's great for understanding the protocol. I just need to turn this into a GStreamer plugin that plays well with playbin so that arbitrary players can make use of it.

It sounds like I should be a source, since I can easily implement a URI property and use libsoup to do all of my HTTP work. That works well when the pipeline can be defined manually; unfortunately, playbin uses souphttpsrc for all HTTP URIs. If I add an optional m3u8 sink pad and act like a decoder, so that playbin will connect souphttpsrc to my element, is there any way I could just ignore the incoming content but use its URI to initialize my own URI? Definitely a hack, but it might give me compatibility with playbin.
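
One way to get at the upstream URI without parsing anything: GStreamer has a URI query, and souphttpsrc should be able to answer it since it implements the URI handler interface (worth verifying on the version you're targeting). A minimal sketch, where sinkpad is the element's m3u8 sink pad:

/* Ask the upstream element (e.g. souphttpsrc) for the URI it is
 * reading, so relative fragment URIs can be resolved against it. */
static gchar *
query_upstream_uri (GstPad *sinkpad)
{
  GstQuery *query = gst_query_new_uri ();
  gchar *uri = NULL;

  if (gst_pad_peer_query (sinkpad, query))
    gst_query_parse_uri (query, &uri);      /* returns a copy, or NULL */

  gst_query_unref (query);
  return uri;
}

That only solves the URI problem, though; fetching the fragments and re-reading a live playlist would still want an HTTP element of your own inside a bin, as Mike outlined.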


Re: Implementing support for Apple HTTP Live Streaming...

Bastien Nocera-2
On Wed, 2010-06-09 at 16:13 -0600, Stephen Buck wrote:
> I have looked at Marc-André's python player that uses GStreamer for
> this protocol at http://gitorious.net/hls-player and it's great for
> understanding the protocol. I just need to turn this into a GStreamer
> plugin that plays well with playbin so that arbitrary players can make
> use of it.

I understand that. And I'd be happy if it worked out-of-the-box in
Totem.

> It sounds like I should be a source since I can easily implement a URI
> property and use libsoup to do all of my HTTP work. This works well
> when the pipeline can be defined manually. Unfortunately, playbin uses
> souphttpsrc for all HTTP URIs. If I add an optional m3u8 sink pad and
> act like a decoder so that playbin will connect souphttpsrc to my
> source is there any way I could just ignore the incoming content but
> use its URI to initialize my URI? Definitely a hack, but it might give
> me compatibility with playbin.

My idea was:
- Add a typefinder for the special m3u8 format used
- Create an element that takes in the m3u8 file and either:
  - redirects, or
  - does the streaming itself

e.g.

-----------      -------------------
| httpsrc |  ->  |     GstBin      |
-----------      |  -------------  |
 set with        |  |  httpsrc  |  |
 m3u8 URL        |  -------------  |
                 -------------------
                    ^ the inner httpsrc would do the
                      actual streaming of the ts videos

It might also be possible to implement this as 2 plugins: one demuxer to
handle the m3u8 format that either redirects (as seen in the bug [1]), or
redirects to a fake URI (as Mike mentioned) handled by an element that
hides an httpsrc within itself and hides things like content size (but
exposes duration, as it would be known from the playlist).
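
For the plain redirect case, the usual GStreamer convention is an element message named "redirect" carrying a "new-location" field, which the application catches on the bus and restarts playback with. Roughly (post_redirect is just an illustrative name):

/* Post a "redirect" element message once the m3u8 has been parsed
 * and the real stream URI is known. */
static void
post_redirect (GstElement *element, const gchar *new_uri)
{
  GstStructure *s = gst_structure_new ("redirect",
      "new-location", G_TYPE_STRING, new_uri, NULL);

  gst_element_post_message (element,
      gst_message_new_element (GST_OBJECT (element), s));
}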

Cheers

[1]: https://bugzilla.gnome.org/show_bug.cgi?id=594035

