capturing mjpeg from ip-camera


capturing mjpeg from ip-camera

Hans Maree
Hello,

I'm trying to create a daemon that captures a video stream from an IP security camera.
The camera provides a JPEG stream with a configurable framerate. I would like to capture this stream and encode it as H.264 or some other popular video format (it doesn't really matter which, as long as the encoding process doesn't generate too much overhead). I also want the captured video to be split into 10-minute-long files.

I tried the following pipeline based on this blog post:

$ gst-launch-0.10 -evt souphttpsrc location='http://192.168.0.178/videostream.cgi?rate=1&user=admin&pwd=123456' do-timestamp=true timeout=5 ! multipartdemux ! jpegdec ! x264enc ! filesink location=test.x264
Setting pipeline to PAUSED ...
GLib-GIO-Message: Using the 'memory' GSettings backend. Your settings will not be saved or shared with other applications.
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstJpegDec:jpegdec0.GstPad:sink: caps = image/jpeg
/GstPipeline:pipeline0/GstJpegDec:jpegdec0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)320, height=(int)240, framerate=(fraction)0/1
Floating point exception

Replacing x264enc with mpeg2enc also gives an error message:

$ gst-launch-0.10 -evt souphttpsrc location='http://192.168.0.178/videostream.cgi?rate=1&user=admin&pwd=123456' do-timestamp=true timeout=5 ! multipartdemux ! jpegdec ! mpeg2enc ! filesink location=test.mpg
Setting pipeline to PAUSED ...
GLib-GIO-Message: Using the 'memory' GSettings backend. Your settings will not be saved or shared with other applications.
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstJpegDec:jpegdec0.GstPad:sink: caps = image/jpeg
/GstPipeline:pipeline0/GstJpegDec:jpegdec0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, width=(int)320, height=(int)240, framerate=(fraction)0/1
ERROR: from element /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2582): gst_base_src_loop (): /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0: streaming task paused, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
/GstPipeline:pipeline0/GstJpegDec:jpegdec0.GstPad:src: caps = NULL
/GstPipeline:pipeline0/GstJpegDec:jpegdec0.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstMultipartDemux:multipartdemux0.GstPad:src_0: caps = NULL
Freeing pipeline ...

I can, however, view the stream using xvimagesink:

$ gst-launch-0.10 -evt souphttpsrc location='http://192.168.0.178/videostream.cgi?rate=1&user=admin&pwd=123456' do-timestamp=true timeout=5 ! multipartdemux ! jpegdec ! xvimagesink

What am I doing wrong here, and how can I fix it?

As a second question, what would be the best way to split the stream into 10-minute files? Do I simply disconnect the filesink and connect a new one every ten minutes, or is there a better way?

Thanks, Hans Maree


Re: capturing mjpeg from ip-camera

Mike Mitchell
Hans,
In both cases you have tried to connect elements in the pipeline that do not match. This is one of the troubles with something as flexible as GStreamer: there is also an infinite number of combinations that do not work.

First, get yourself some visibility. See a previous post of mine that shows how to get a graphic of the pipeline, with caps, during and after construction. This will help you understand what is happening better than log files will.

Try this with playbin2 to see how the autoplugger connects your source to the display; that will help you understand how to convert it. For example, in your first error I think the problem is that the framerate is zero. jpegdec only decodes a single picture; you probably need ffdec_mjpeg instead.

Your second example is similar: you tried to connect a single-frame decoder to a video-stream encoder. These are not compatible.

Finally, some hints: H.264 transcoding is pretty slow and complicated, so start by just storing in an AVI or MP4 container first.
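Putting those hints together (force a framerate in the caps, swap jpegdec for ffdec_mjpeg, and mux into AVI rather than writing raw H.264), an untested sketch might look like the following. The caps values are taken from what jpegdec negotiated in your log (320x240), the URL is from your post; treat it as a starting point, not a known-good pipeline:

```
gst-launch-0.10 -evt souphttpsrc \
    location='http://192.168.0.178/videostream.cgi?rate=1&user=admin&pwd=123456' \
    do-timestamp=true timeout=5 ! multipartdemux \
    ! 'image/jpeg, width=320, height=240, framerate=1/1' \
    ! ffdec_mjpeg ! x264enc ! avimux ! filesink location=test.avi
```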

For making multiple files as output, look into multifilesink.
http://gstreamer-devel.966125.n4.nabble.com/How-to-capture-a-still-image-while-previewing-live-video-td3813993.html
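For what it's worth, multifilesink's default behaviour is to write each incoming buffer to its own file, which is why it suits snapshots better than muxed video. An untested sketch (camera URL as in your post) that saves every JPEG frame from the stream as a numbered image:

```
gst-launch-0.10 souphttpsrc \
    location='http://192.168.0.178/videostream.cgi?rate=1&user=admin&pwd=123456' \
    do-timestamp=true timeout=5 ! multipartdemux \
    ! multifilesink location=frame-%05d.jpg
```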

Experiment one piece at a time and eventually you will probably find a way. Plan on it taking a while; just getting a good feel for all the capabilities and nuances of GStreamer takes days if not weeks. Many nights have been lost to seductive examples that almost do what you want.

Mike Mitchell
http://www.panometric.net


Re: capturing mjpeg from ip-camera

Hans Maree
Thanks for your reaction Mike,

Over the last week I did some more experimentation using your debug wrapper script (which is extremely useful), and I got it to work:
souphttpsrc location='http://192.168.0.178/videostream.cgi?rate=15&user=admin&pwd=123456' do-timestamp=true is_live=true timeout=5 ! multipartdemux ! image/jpeg, width=320, height=240, framerate=1/1 ! ffdec_mjpeg ! jpegenc ! avimux ! filesink location='test.avi'

I found out, however, that the stream is not exactly one frame per second, so I got some drift in the frame timing. So I decided to use a Matroska container instead, since it supports variable frame rates:
souphttpsrc location='http://192.168.0.178/videostream.cgi?rate=15&user=admin&pwd=123456' do-timestamp=true is_live=true timeout=5 ! multipartdemux ! image/jpeg, width=320, height=240, framerate=1/1 ! ffdec_mjpeg ! jpegenc ! matroskamux ! filesink location='output.mkv'

I'm not sure if this solves the problem completely, since I still tell GStreamer to use a framerate of 1 fps, but it does seem to limit the drift to an acceptable level.

I'm not sure multifilesink is what I am looking for, since it seems that it does not create complete video files but simply cuts the stream into several pieces, which makes sense since the matroskamux element is responsible for creating the file structure. I think that for now I will just restart the pipeline every ten minutes, changing the location of the filesink each time. Later I will try for a more elegant solution.
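The bookkeeping for the restart-every-ten-minutes idea (picking a fresh filesink location each cycle) can be kept entirely outside GStreamer. A minimal sketch of just that naming logic, in plain Python with hypothetical names, no GStreamer involved:

```python
from datetime import datetime, timedelta

def segment_location(start, prefix="capture", ext="mkv"):
    """Filename for the segment that begins at `start`."""
    return "%s-%s.%s" % (prefix, start.strftime("%Y%m%d-%H%M%S"), ext)

def segment_starts(first, count, minutes=10):
    """Start times of `count` consecutive fixed-length segments."""
    return [first + timedelta(minutes=minutes * i) for i in range(count)]

# Each cycle: stop the pipeline, set the new location on the filesink,
# then start the pipeline again.
start = datetime(2011, 10, 17, 12, 0, 0)
locations = [segment_location(t) for t in segment_starts(start, 3)]
print(locations)
# -> ['capture-20111017-120000.mkv', 'capture-20111017-121000.mkv',
#     'capture-20111017-122000.mkv']
```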

Hans Maree