Use videorate in MJPEG pipeline

Ron McOuat
I would like to use videorate in an MJPEG pipeline: take one feed
from, for example, an Axis camera, then use a tee to split the
image/jpeg stream and deliver it to two different sinks at different
frame rates. For example, deliver image/jpeg at 8 fps to one sink and
at 2 fps to another by placing videorate in the 2 fps tee branch.
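To make the goal concrete, the eventual tee pipeline I have in mind
would look roughly like this (a sketch only: the camera URL and the
fakesink stand-ins are placeholders, and the image/jpeg branch assumes
the patched videorate described further down):

```shell
# Sketch: one camera feed, split with tee; the second branch is
# reduced to 2 fps by videorate. Camera URL and sinks are placeholders.
gst-launch -v gnomevfssrc \
  location="http://camera/axis-cgi/mjpg/video.cgi?fps=8" \
  do-timestamp=true ! multipartdemux \
  ! image/jpeg,width=640,height=480,framerate=8/1 ! tee name=t \
  t. ! queue ! fakesink sync=true \
  t. ! queue ! videorate ! image/jpeg,framerate=2/1 ! fakesink sync=true
```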

To illustrate, I tested and got videorate running correctly in the
following gst-launch command (no tee to two sinks yet, for
simplicity), sending the output to a window so I could observe it:

gst-launch -v gnomevfssrc
location="http://root:pass@192.168.2.90/axis-cgi/mjpg/video.cgi?fps=8&resolution=640x480"
do-timestamp=true ! multipartdemux !
image/jpeg,width=640,height=480,framerate=8/1 ! jpegdec ! videorate !
video/x-raw-yuv,framerate=1/5 ! xvimagesink

where the rate change occurs in the section of the pipeline where the
mime type is video/x-raw-yuv. The pipeline refused to run until
framerate=8/1 was added to the caps entry after multipartdemux.
Watching the X window, the camera's on-screen time updates once every
5 seconds, as requested by the output framerate of videorate.


Knowing the pad templates on videorate allow only video/x-raw-yuv and
video/x-raw-rgb, I altered the source to add image/jpeg as a third
mime type in the source and sink pad templates.

Lines 112 and 119 in gstvideorate.c were changed from

    GST_STATIC_CAPS ("video/x-raw-yuv; video/x-raw-rgb")

to

    GST_STATIC_CAPS ("video/x-raw-yuv; video/x-raw-rgb; image/jpeg")
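For context, the surrounding pad-template code in gstvideorate.c looks
roughly like this after the change (reconstructed from the 0.10
source, so treat the exact spelling as approximate; the src template
is changed identically):

```c
/* Sketch of the modified sink pad template in gstvideorate.c,
 * with image/jpeg added as a third accepted mime type. */
static GstStaticPadTemplate gst_video_rate_sink_template =
GST_STATIC_PAD_TEMPLATE ("sink",
    GST_PAD_SINK,
    GST_PAD_ALWAYS,
    GST_STATIC_CAPS ("video/x-raw-yuv; video/x-raw-rgb; image/jpeg")
    );
```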

My reasoning is that raw video and jpeg images both stream as one full
frame per buffer, and since every buffer contains a full frame, the
videorate component should be just as valid for changing frame rates
on an image/jpeg stream as on the raw video stream types. In the
actual application I don't want to decode and re-encode, for both
performance and image-quality reasons.

After rebuilding, the gst-inspect videorate command shows the
additional mime type in the source and sink pad descriptions (using a
local uninstalled library environment).

So now I built a pipeline to demonstrate whether it works, like this:

gst-launch -v gnomevfssrc
location="http://root:pass@192.168.2.90/axis-cgi/mjpg/video.cgi?fps=8&resolution=640x480"
do-timestamp=true ! multipartdemux !
image/jpeg,width=640,height=480,framerate=8/1 ! videorate !
image/jpeg,framerate=1/5 ! jpegdec ! xvimagesink

where jpegdec ! xvimagesink is not where I actually want to consume
the stream; it is only there for debugging purposes.

The pipeline goes into preroll and sits; if interrupted, I get the
"doesn't want to preroll" message. Output as follows:

[gst-0.10.21] [ronm@localhost 0.10.21]$ gst-launch -v gnomevfssrc
location="http://root:pass@192.168.2.90/axis-cgi/mjpg/video.cgi?fps=8&resolution=640x480"
do-timestamp=true ! multipartdemux !
image/jpeg,width=640,height=480,framerate=8/1 ! videorate !
image/jpeg,framerate=1/5 ! jpegdec ! xvimagesink

Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2: caps = image/jpeg,
width=(int)640, height=(int)480, framerate=(fraction)8/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:src: caps =
image/jpeg, width=(int)640, height=(int)480, framerate=(fraction)8/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:sink: caps =
image/jpeg
/GstPipeline:pipeline0/GstVideoRate:videorate0.GstPad:sink: caps =
image/jpeg, width=(int)640, height=(int)480, framerate=(fraction)8/1
/GstPipeline:pipeline0/GstVideoRate:videorate0.GstPad:src: caps =
image/jpeg, width=(int)640, height=(int)480, framerate=(fraction)1/5
/GstPipeline:pipeline0/GstVideoRate:videorate0.GstPad:sink: caps =
image/jpeg, width=(int)640, height=(int)480, framerate=(fraction)8/1

(wait for minutes here, nothing happening, then type ^C)

^CCaught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
/GstPipeline:pipeline0/GstVideoRate:videorate0.GstPad:src: caps = NULL
/GstPipeline:pipeline0/GstVideoRate:videorate0.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:src: caps = NULL
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstMultipartDemux:multipartdemux0.GstPad:src_0:
caps = NULL
FREEING pipeline ...


Any comments on what I have missed or should do differently? Should I
consider writing a custom element? For this to work long term, the
videorate component would need the additional mime type added to its
pad templates as an accepted enhancement.

Thanks,
Ron

-------------------------------------------------------------------------
This SF.Net email is sponsored by the Moblin Your Move Developer's challenge
Build the coolest Linux based applications with Moblin SDK & win great prizes
Grand prize is a trip for two to an Open Source event anywhere in the world
http://moblin-contest.org/redirect.php?banner_id=100&url=/
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/gstreamer-devel

Re: Use videorate in MJPEG pipeline

Ron McOuat
Figured out why pipeline preroll never completed for the pipeline
below. During preroll, buffers arrive at videorate with no clock,
since the clock is not started until the PLAYING state, so the
videorate element throws away every buffer received on its sink pad on
finding GST_CLOCK_TIME_NONE timestamps. The pipeline therefore cannot
complete the PAUSED-state preroll, because the sink at the end never
gets a video buffer.
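From a read of videorate's chain function, the drop amounts to a check
roughly like the following (a paraphrase, not the exact 0.10.21 code):

```c
/* Paraphrase of the timestamp check in videorate's chain function:
 * a buffer without a valid timestamp is simply dropped, so during
 * preroll (no running clock, no timestamps) nothing ever reaches
 * the downstream sink. */
GstClockTime ts = GST_BUFFER_TIMESTAMP (buffer);
if (!GST_CLOCK_TIME_IS_VALID (ts)) {
  gst_buffer_unref (buffer);   /* discarded, nothing pushed downstream */
  return GST_FLOW_OK;
}
```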

I forced live mode on the gnomevfssrc element, and now the pipeline
plays. I need the videorate element to handle the image/jpeg mime type
in order to split a feed from an Axis camera into two different sinks.
I am using videorate only to lower the frame rate on a tee branch that
doesn't need the full rate. This lowers network bandwidth requirements
by acquiring the camera data once instead of opening two feeds over
the network. I have no need to decode the jpeg frames; the pipeline
below is only to illustrate the problem.

Questions on alternative ways to address this:

1) Enhance videorate
Should I submit an enhancement request, or is this changing the
videorate component for an unintended use? I could write my own
plugin, using videorate as a model. As mentioned in the first posting,
I have added the image/jpeg mime type to the src and sink caps
templates in my local copy of the source. The failure to complete
preroll could probably be fixed by not discarding buffers during
preroll, or by passing one buffer through to satisfy the sink as a
special case for the PAUSED state. Otherwise videorate only works with
live sources, or at a point in the pipeline where a decoder has added
timestamps to the raw video buffers.

2) Use is-live on source to avoid preroll
gnomevfssrc and souphttpsrc do not have an is-live property (as
videotestsrc does, for example) to allow setting the live flag in
GstBaseSrc from gst-launch. It could be added to each, or would it be
better to add it to GstBaseSrc itself, like do-timestamp, so any
source could set it through the base-class property? Maybe there is a
reason for not making the property generally available on sources that
can also read files (too many problems when used for the wrong
purpose)? When getting frames from a network camera over http, the
source is sort of live, because frames are lost while paused, yet it
is able to preroll and supply data, which a live source is generally
not capable of. The do-timestamp option only takes effect once the
pipeline goes to PLAYING.
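As a sketch of option 2, exposing is-live on a source (modeled on
videotestsrc) would boil down to something like this in the property
setter; gst_base_src_set_live() is the existing GstBaseSrc call, while
PROP_IS_LIVE and the is_live field are names I made up for
illustration:

```c
/* Hypothetical is-live property setter for a GstBaseSrc subclass,
 * modeled on videotestsrc. PROP_IS_LIVE and src->is_live are
 * illustrative names, not real gnomevfssrc code. */
case PROP_IS_LIVE:
  src->is_live = g_value_get_boolean (value);
  gst_base_src_set_live (GST_BASE_SRC (src), src->is_live);
  break;
```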

Suggestions?



Ron McOuat wrote:

> I would like to use videorate in a MJPEG pipeline in order to get 1 feed
> from for example an Axis camera [...]
> snip rest of first post.
>  
