Hello,
I'm trying to create a daemon that captures a video stream from an IP security camera. I tried the following pipeline, based on this blog post:
Replacing x264enc with mpeg2enc also gives an error message:
I can, however, view the stream using xvimagesink:
What am I doing wrong here, and how can I fix it? As a second question, what would be the best way to split the stream into 10-minute files? Do I simply disconnect the filesink and connect a new one every ten minutes, or is there a better way?
Thanks,
Hans Maree
Hans,
In both cases you have tried to connect elements in the pipeline that do not match. This is one of the troubles with something as flexible as GStreamer: there is also an infinite number of combinations that do not work.
First, get yourself some visibility. See a previous post of mine that shows how to get a graphic of the pipeline with caps during and after construction. This will help you understand what is happening better than log files.
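If you want to do it by hand, the underlying mechanism is an environment variable plus Graphviz; something like this should work with a reasonably recent 0.10 gst-launch (the /tmp/gst-dots directory is only an example):

  # tell GStreamer where to drop .dot snapshots of the pipeline
  mkdir -p /tmp/gst-dots
  export GST_DEBUG_DUMP_DOT_DIR=/tmp/gst-dots

  # run the pipeline as usual; .dot files are dumped as it changes state
  gst-launch-0.10 <your pipeline here>

  # render one of the dumps to an image
  dot -Tpng /tmp/gst-dots/<dump>.dot > pipeline.png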
Try the stream with playbin2 to see how the autoplugger connects your source to the display; that will help you understand how to convert it. For example, in your first error I think the problem is that the framerate is zero. jpegdec only decodes a single picture; you probably need ffdec_mjpeg instead.
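For the playbin2 test, something along these lines should work, using the camera URL from your pipeline (whether the autoplugger copes with this multipart MJPEG stream depends on the plugins you have installed):

  gst-launch-0.10 playbin2 uri='http://192.168.0.178/videostream.cgi?rate=15&user=admin&pwd=123456'

Then compare the graph it builds with the one from your own pipeline.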
Your second example is similar: you tried to connect a decoder of one frame to an encoder of a video stream, and these are not compatible. Finally, some hints: H264 transcoding is pretty slow and complicated, so start by just storing the stream in an AVI or MP4 container first.
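For example, something along these lines stores the MJPEG frames without transcoding them at all (the width, height and framerate caps are assumptions and have to match what the camera actually sends; if avimux will not accept the jpeg caps directly, a decode/re-encode pair such as ffdec_mjpeg ! jpegenc can be inserted before it):

  gst-launch-0.10 souphttpsrc location='http://192.168.0.178/videostream.cgi?rate=15&user=admin&pwd=123456' do-timestamp=true ! multipartdemux ! image/jpeg,width=320,height=240,framerate=1/1 ! avimux ! filesink location='test.avi'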
For making multiple files as output, look into multifilesink (a minimal sketch is appended at the end of this message):
http://gstreamer-devel.966125.n4.nabble.com/How-to-capture-a-still-image-while-previewing-live-video-td3813993.html

Experiment one piece at a time and eventually you will probably find a way. Plan on it taking a while; just getting a good feel for all the capabilities and nuances of GStreamer takes days, if not weeks. Many nights have been lost to seductive examples that almost do what you want.

Mike Mitchell
http://www.panometric.net
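(The multifilesink sketch mentioned above: by default it writes each incoming buffer, here each JPEG picture coming out of multipartdemux, to its own numbered file; the location pattern is only an example.)

  gst-launch-0.10 souphttpsrc location='http://192.168.0.178/videostream.cgi?rate=15&user=admin&pwd=123456' do-timestamp=true ! multipartdemux ! multifilesink location='frame-%05d.jpg'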
Thanks for your reply, Mike,
over the last week I did some more experimentation using your debug wrapper script (which is extremely useful) and I got it to work:

souphttpsrc location='http://192.168.0.178/videostream.cgi?rate=15&user=admin&pwd=123456' do-timestamp=true is_live=true timeout=5 ! multipartdemux ! image/jpeg, width=320, height=240, framerate=1/1 ! ffdec_mjpeg ! jpegenc ! avimux ! filesink location='test.avi'

I found out, however, that the stream is not exactly one frame per second, so I got some drift in the frame timing. I therefore decided to use a Matroska container instead, since it supports a variable frame rate:

souphttpsrc location='http://192.168.0.178/videostream.cgi?rate=15&user=admin&pwd=123456' do-timestamp=true is_live=true timeout=5 ! multipartdemux ! image/jpeg, width=320, height=240, framerate=1/1 ! ffdec_mjpeg ! jpegenc ! matroskamux ! filesink location='output.mkv'

I'm not sure if this solves the problem completely, since I still tell GStreamer to use a framerate of 1 fps, but it does seem to limit the drift to an acceptable level.

I'm also not sure multifilesink is what I am looking for, since it seems that it does not create complete video files but simply cuts the file into several pieces, which makes sense since the matroskamux element is responsible for creating the file structure. I think that for now I will just restart the pipeline every ten minutes, changing the location of the filesink every time (a rough sketch of such a loop is appended below). Later I will try to find a more elegant solution.

Hans Maree
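(The restart loop mentioned above, as a rough and untested sketch: timeout here is the GNU coreutils command and sends SIGINT after ten minutes, while -e asks gst-launch, if your version has that flag, to push EOS on shutdown so matroskamux can finish the file; the output file name pattern is only an example.)

  while true; do
    timeout -s INT 600 gst-launch-0.10 -e \
      souphttpsrc location='http://192.168.0.178/videostream.cgi?rate=15&user=admin&pwd=123456' do-timestamp=true is-live=true timeout=5 \
      ! multipartdemux ! image/jpeg,width=320,height=240,framerate=1/1 \
      ! ffdec_mjpeg ! jpegenc ! matroskamux \
      ! filesink location="cam-$(date +%Y%m%d-%H%M%S).mkv"
  done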