playbin2 undecoded video and about-to-finish signal


Krutskikh Ivan
Hi,

I'm developing a video recording/broadcasting server with gstreamer and python.

Right now I'm facing 2 tasks:

- recording video from different sources (RTSP and HTTP) and with different codecs (MJPEG, MPEG-4, H.264, H.265) in a series of MKV files

- broadcasting video from my archive to different clients using HTTP and RTSP.

Before discovering the playbin magic I used a number of template pipelines to capture video from a source and pass it to multifilesink, with ffmpeg to finish and store the files in a convenient way. But if I could somehow feed my URI to playbin and get an undecoded video/audio stream out of it, I could collapse all my templates into a single GStreamer pipeline: playbin uri=rtsp... (magic here) ! mpegtsmux ! multifilesink...  And that would be very convenient.

The second task is more ambitious. I want to feed the recorded video to my clients as if it was a live source. Right now I have a file tree:

/basedir/cam_name/yearmonthday/hour/minutesecond-duration.mp4

Example:

archive-test:/archive/video/multi/160327/21 # ls
0120-00125.mp4  0730-00120.mp4  1335-00120.mp4  1940-00125.mp4  2545-00120.mp4  3150-00120.mp4  3755-00120.mp4  4400-00125.mp4  5005-00125.mp4  5610-00125.mp4
0325-00120.mp4  0930-00120.mp4  1535-00125.mp4  2145-00120.mp4  2745-00125.mp4  3350-00125.mp4  3955-00125.mp4  4605-00120.mp4  5210-00120.mp4  5815-00120.mp4
0525-00125.mp4  1130-00125.mp4  1740-00120.mp4  2345-00120.mp4  2950-00120.mp4  3555-00120.mp4  4200-00120.mp4  4805-00120.mp4  5410-00120.mp4

At some point in the future I receive a request to play video from cam "multi" starting at 21:05 on 160327. I can then find the file that covers that moment, construct a pipeline with playbin and multisocketsink, fast-forward to the desired time and replace the URI upon each about-to-finish signal. The complex part is that I also need an undecoded video stream, since I plan to broadcast it to remote clients without re-encoding.
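
Roughly the kind of handler I have in mind (untested sketch; next_file() is a placeholder for the archive lookup):

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)

def on_about_to_finish(pb):
    # Called shortly before the current URI finishes; setting a new uri
    # here lets playbin continue with the next file without tearing the
    # pipeline down.
    nxt = next_file()  # placeholder: look up the next fragment in the archive
    if nxt:
        pb.set_property('uri', Gst.filename_to_uri(nxt))

playbin = Gst.ElementFactory.make('playbin', None)
playbin.connect('about-to-finish', on_about_to_finish)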

So my questions are:

1) Is this design possible?
2) Where can I find some examples of such pipelines, preferably in python.

Thanks in advance! 

Re: playbin2 undecoded video and about-to-finish signal

Krutskikh Ivan
I would like to bump my question...

Re: playbin2 undecoded video and about-to-finish signal

Tim Müller
In reply to this post by Krutskikh Ivan
On Sun, 2016-03-27 at 22:28 +0300, Krutskikh Ivan wrote:

Hi,

Your subject line mentions 'playbin2' - are you using the outdated and
unmaintained GStreamer 0.10? If so, you should switch to a current
1.x version.

> I'm developing a video recording/broadcasting server with gstreamer
> and python.
>
> Right now I'm facing 2 tasks:
>
> - recording video from different sources (rtsp and http) and with
> different codecs (mjpeg,mpeg4,h264,h265) in a series of mkv files
>
> - broadcasting video from my archive to different clients using http
> and rtsp.
>
> Before discovering the playbin magic I used a number of template
> pipelines to capture video from source, pass it to multifilesink and
> ffmpeg to finish and store them in a convinient way, But if I could
> somehow feed my uri to playbin and get an undecoded video/audio
> stream from it, I would be able to put all my templates to single
> gstreamer pipeline: playbin uri=rtsp... (magic here) ! mpegtsmux !
> multifilesink...  And that would be very convenient.

For what it's worth, there is also decodebin and uridecodebin, which
are used inside playbin but are lower level.

You can also make (uri)decodebin stop autoplugging decoders early (via
the autoplug-* signals), so you can transmux without re-encoding.
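
A rough, untested sketch of that idea for the recording task (the URI
and output path are placeholders; h264parse is there because mpegtsmux
wants byte-stream H.264, and a video-only source is assumed):

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)

pipeline = Gst.parse_launch(
    'uridecodebin name=dec uri=rtsp://camera.example/stream '
    'h264parse name=parse ! mpegtsmux ! filesink location=out.ts')
dec = pipeline.get_by_name('dec')
parse = pipeline.get_by_name('parse')

def on_autoplug_continue(dbin, pad, caps):
    # Returning False tells (uri)decodebin these caps are final, so it
    # exposes the pad instead of plugging a decoder.
    return not caps.get_structure(0).get_name().startswith('video/x-h264')

def on_pad_added(dbin, pad):
    # Link the exposed, still-encoded pad into the remuxing branch.
    sinkpad = parse.get_static_pad('sink')
    if not sinkpad.is_linked():
        pad.link(sinkpad)

dec.connect('autoplug-continue', on_autoplug_continue)
dec.connect('pad-added', on_pad_added)

pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()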


> The second task is more ambitious. I want to feed the recorded video
> to my clients as if it was a live source. Right now I have a file
> tree:
>
> /basedir/cam_name/yearmonthday/hour/minutesecond-duration.mp4
>
> Example:
>
> archive-test:/archive/video/multi/160327/21 # ls
> 0120-00125.mp4  0730-00120.mp4  1335-00120.mp4  1940-00125.mp4  2545-
> 00120.mp4  3150-00120.mp4  3755-00120.mp4  4400-00125.mp4  5005-
> 00125.mp4  5610-00125.mp4
> 0325-00120.mp4  0930-00120.mp4  1535-00125.mp4  2145-00120.mp4  2745-
> 00125.mp4  3350-00125.mp4  3955-00125.mp4  4605-00120.mp4  5210-
> 00120.mp4  5815-00120.mp4
> 0525-00125.mp4  1130-00125.mp4  1740-00120.mp4  2345-00120.mp4  2950-
> 00120.mp4  3555-00120.mp4  4200-00120.mp4  4805-00120.mp4  5410-
> 00120.mp4
>
> At some point in the future I recieve a request to play video from
> cam multi from  21:05 160327. I can then find my started file,
> construct a pipeline with playbin and multisocketsink, fast forward
> to the desired time and replace the uri of the file upon each about-
> to-finish signal. The complex part is that I also need an undecoded
> video stream since I plan to broadcast it to remote client without
> re-encoding.
>
> So my questions are:
>
> 1) Is this design possible?
> 2) Where can I find some examples of such pipelines, preferably in
> python.

It's all possible, but will be a bit fiddly.

You probably want something lower-level than playbin.

In recent GStreamer versions we have splitmuxsink and splitmuxsrc which
you might find helpful in this context.
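
For the recording side, an untested sketch (URI and path are
placeholders) that writes roughly 2-minute MP4 fragments without
re-encoding:

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)

# max-size-time is in nanoseconds; splitmuxsink starts a new fragment
# at the next keyframe once the threshold is reached.
pipeline = Gst.parse_launch(
    'rtspsrc location=rtsp://camera.example/stream ! rtph264depay ! '
    'h264parse ! splitmuxsink muxer=mp4mux '
    'location=/archive/tmp/chunk%05d.mp4 max-size-time=120000000000')
pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()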

What protocols do you want to stream as? The easiest would be to just
use gst-rtsp-server (see gst-rtsp-server/examples for some simple
examples).
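
An untested sketch of serving archived fragments over RTSP without
re-encoding (paths and mount point are made up):

import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GstRtspServer, GLib

Gst.init(None)

server = GstRtspServer.RTSPServer.new()
factory = GstRtspServer.RTSPMediaFactory.new()
# gst-rtsp-server expects the payloader to be called pay0 (pay1, ...).
factory.set_launch('( splitmuxsrc location=/archive/video/multi/160327/21/*.mp4 '
                   '! h264parse ! rtph264pay name=pay0 pt=96 )')
factory.set_shared(True)
server.get_mount_points().add_factory('/archive', factory)
server.attach(None)
GLib.MainLoop().run()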

Cheers
 -Tim


--
Tim Müller, Centricular Ltd - http://www.centricular.com


Re: playbin2 undecoded video and about-to-finish signal

Krutskikh Ivan
Hi!

Thanks for pointing me to splitmuxsink and splitmuxsrc. That would spare me a good chunk of code. =)

The only thing I don't understand about splitmuxsrc is how to specify an explicit file list, or how to modify it in the PLAYING state. By default I have a separate directory for each hour of recording. I can stream the files from one hour with location=/archive/video/cam/day/hour/*.mp4, but what happens when that hour ends? The awkward hack would be to make a temp directory and add symlinks to the video files from the main archive in advance...

Re: playbin2 undecoded video and about-to-finish signal

Thornton, Keith

Hi,

I don't think it was designed for that use-case. It was designed to put the files in one directory with incrementing file names. Why not just move the files to the required directory using a background thread at the end of each hour?

 

Re: playbin2 undecoded video and about-to-finish signal

Tim Müller
In reply to this post by Krutskikh Ivan
On Thu, 2016-03-31 at 12:13 +0300, Krutskikh Ivan wrote:

Hi,

> Thanks for pointing me to splitmuxsink and splitmuxsrc. That would
> spare me a good chunk of code.=) 
>
> The only thing I don't understand about splitmuxsrc is how to specify
> a specific file list or to modify it in playing state. By default I
> have a separate directory for each hour of recording. I can stream
> files from one hour with location=/archive/video/cam/day/hour/*.mp4,
> but what happens when I finish this hour? The awkward hack is to make
> a temp directory and add simlinks to videofiles from main archive to
> it in advance...

splitmuxsink has the "format-location" signal.

splitmuxsrc might need new API for that (e.g. a "locations" property of
type G_TYPE_STRV so you can set an array of file names).
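
An untested sketch of the format-location idea on the recording side,
writing fragments straight into a dated tree like the one in this
thread (the duration suffix can't be known up front, so it is left
out here):

import time
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)

def on_format_location(splitmux, fragment_id):
    # Called when a new fragment is opened; return the full path it
    # should be written to, derived here from the wall-clock time.
    return time.strftime('/archive/video/multi/%y%m%d/%H/%M%S.mp4')

sink = Gst.ElementFactory.make('splitmuxsink', None)
sink.connect('format-location', on_format_location)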

 Cheers
  -Tim

--
Tim Müller, Centricular Ltd - http://www.centricular.com


Re: playbin2 undecoded video and about-to-finish signal

Krutskikh Ivan
In reply to this post by Thornton, Keith
Right now I have this routine implemented:

multifilesink writes files of fixed duration to a temporary dir and I wait for messages on the bus. Upon each new-file message I move the file into my directory tree. So if I understood you right, if I want to broadcast those files later I should either:

1) move those files, or make symlinks to them, into a temporary dir with a wildcard-friendly name like video%d.mp4, or

2) prepare an m3u8 manifest pointing to my files' location, use hlsdemux, and update the m3u8 dynamically.
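
For reference, the bus-message part of the routine above looks roughly like this (untested sketch; the recording pipeline and move_to_archive() are placeholders):

def on_element_message(bus, msg):
    # multifilesink (post-messages=true) posts element messages named
    # "GstMultiFileSink" carrying the filename it wrote to.
    s = msg.get_structure()
    if s is not None and s.get_name() == 'GstMultiFileSink':
        move_to_archive(s.get_string('filename'))  # placeholder for the move

bus = pipeline.get_bus()  # pipeline: the recording pipeline built elsewhere
bus.add_signal_watch()
bus.connect('message::element', on_element_message)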


Re: playbin2 undecoded video and about-to-finish signal

Krutskikh Ivan
All of this would be so much easier if GStreamer had some sort of playlist source element...

Re: playbin2 undecoded video and about-to-finish signal

Tim Müller
On Thu, 2016-03-31 at 12:40 +0300, Krutskikh Ivan wrote:

> All would be so much easier if gstreamer would have some sort of
> playlist src element...

What splitmuxsrc does is more advanced than what playbin does if you
play the files with about-to-finish.

splitmuxsrc will stitch together the file parts *seamlessly*, without any
gaps (even though the audio/video streams are not the exact same length
in practice).

Cheers
 -Tim
 
--
Tim Müller, Centricular Ltd - http://www.centricular.com


Re: playbin2 undecoded video and about-to-finish signal

Krutskikh Ivan
Ok!

I ran a few tests on this setup:

I have a directory containing these files:

0000-00012.mp4  0412-00121.mp4  1014-00120.mp4  1616-00120.mp4  2217-00120.mp4  2818-00122.mp4  3421-00120.mp4  4023-00120.mp4  4624-00122.mp4  5228-00120.mp4  5831-00088.mp4
0011-00121.mp4  0613-00120.mp4  1214-00121.mp4  1816-00120.mp4  2417-00120.mp4  3020-00120.mp4  3621-00122.mp4  4223-00121.mp4  4826-00121.mp4  5428-00122.mp4 
0212-00120.mp4  0813-00121.mp4  1415-00121.mp4  2016-00121.mp4  2617-00121.mp4  3220-00121.mp4  3823-00120.mp4  4424-00120.mp4  5027-00121.mp4  5630-00121.mp4

All of those are H.264 videos in MP4 containers.

I start a GStreamer pipeline with:

gst-launch-1.0 splitmuxsrc location=/archive/test/*.mp4 ! mpegtsmux ! tcpserversink host=0.0.0.0 port=5000

I get:

gstsplitmuxsrc.c(533): gst_splitmux_pad_loop (): /GstPipeline:pipeline0/GstSplitMuxSrc:splitmuxsrc0:
streaming task paused, reason not-negotiated (-4)
ERROR: the pipeline doesn't want to preroll (PREROLL).

However if I do

gst-launch-1.0 splitmuxsrc location=/archive/test/*.mp4 ! mpegpsmux ! tcpserversink host=0.0.0.0 port=5000

I can view my video just fine...

matroskamux streamable=true also fails, but on the client side (VLC player).
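
One guess (not verified): the MP4 fragments carry H.264 in avc form, while mpegtsmux wants byte-stream, so inserting h264parse in between might fix the not-negotiated error:

gst-launch-1.0 splitmuxsrc location=/archive/test/*.mp4 ! h264parse ! mpegtsmux ! tcpserversink host=0.0.0.0 port=5000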

