Hi,
For a local public radio station, I set up a live Ogg Theora/Vorbis stream a few years ago. The current solution is based on a Python script that grabs frames from an Intellinet IP camera, used together with ffmpeg to overlay an image and text messages, GStreamer to add live ALSA audio capture and create the Ogg stream, and Icecast2 to stream it. The problems with this setup are that audio and video are not synchronized, and that the latest version of ffmpeg no longer supports vhooks. Therefore I'm looking for a completely GStreamer-based solution to implement the live stream.

I could not find a plugin to capture the images from the IP camera, so I started implementing my own plugin based on gsttcpclientsrc.c. It is able to capture the JPEG images from the IP camera and send them as buffers to jpegdec (the next element in the pipeline), but for some reason I'm not able to control the framerate: it always captures images at the highest possible speed. After some more reading of the manuals and checking the sources of e.g. videotestsrc and v4lsrc, I added timestamps to the buffers, but that did not help: the stream froze at the first frame while capturing continued at high speed. I also tried forcing a framerate of 1 fps with caps. Images now appear every 3 to 4 seconds, but capturing remains at high speed.

The pipeline I currently use for testing:

gst-launch ipcamsrc host=HOST port=8080 username=USER password=PASS ! \
    jpegdec ! ffmpegcolorspace ! timeoverlay ! ximagesink

From the following debug output I conclude that this pipeline is operating in push mode:

0:00:00.910443729 ...:<ximagesink0> Trying pull mode first
0:00:00.910471532 ...:<ximagesink0> pull mode disabled
0:00:00.910498350 ...:<ximagesink0> Falling back to push mode

A current development snapshot is available via http://www.omroepvenray.nl/gst/ipcam.tar.gz

Any idea what might be wrong/missing here?
Best regards,
Roland Hermans

_______________________________________________
gstreamer-devel mailing list
https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
On Sun, 2009-12-27 at 17:44 +0100, Roland Hermans wrote:
Hi Roland,

> I could not find a plugin to capture the images from the IP
> camera and I started implementing my own plugin based on
> gsttcpclientsrc.c.

It's not the kind of cam that just works with e.g. souphttpsrc location=... ! multipartdemux ! ... ?

> It is able to capture the JPEG images from the IP camera and send them as
> buffers to jpegdec (the next element in the pipeline) but for some reason
> I'm not able to control the framerate. It is always capturing images at the
> highest possible speed.

I think that's normal for a lot of these cameras.

> After some more reading of the manuals and checking sources of e.g.
> videotestsrc and v4lsrc I added timestamps to the buffers but that did not
> help: the stream was freezing at the first frame but capturing is still done
> at high speed. I also tried forcing a framerate 1 fps with caps. Images now
> appear between 3 and 4 seconds, but capturing remains at high speed.

Your source should put timestamps on buffers. If it derives from GstBaseSrc it might be enough to set the do-timestamp property to TRUE. You can then put a videorate element into the pipeline followed by a capsfilter with a framerate, which will drop/duplicate frames and fix up the timestamps a bit so that you get a perfectly time-stamped stream that's easy to feed into encoders/muxers.

> The pipeline I currently use for testing:
>
> gst-launch ipcamsrc host=HOST port=8080 username=USER password=PASS !
> jpegdec ! ffmpegcolorspace ! timeoverlay ! ximagesink

Maybe try something like:

ipcamsrc do-timestamp=true ... ! videorate ! image/jpeg,framerate=\(fraction\)25/1 ! \
    jpegdec ! ffmpegcolorspace ! timeoverlay ! ffmpegcolorspace ! ximagesink sync=false

(IIRC videorate supports image/jpeg only in recent versions of gst-plugins-base; if that's a problem you can move the videorate behind the jpegdec.)
Cheers
-Tim
Hi Tim,
>> I could not find a plugin to capture the images from the IP
>> camera and I started implementing my own plugin based on
>> gsttcpclientsrc.c.
>
> It's not the kind of cam that just works with e.g. souphttpsrc
> location=... ! multipartdemux ! ... ?

No, unfortunately it doesn't use HTTP but has a custom protocol to authenticate and request images.

>> I'm not able to control the framerate. It is always capturing images at
>> the highest possible speed.
>
> I think that's normal for a lot of these cameras.

Sorry if I wasn't clear. It's not the camera that's sending images too frequently but GStreamer that keeps calling _create() on my ipcamsrc element. The camera will only send an image after receiving an image request. I send this image request from within the _create() call, receive the JPEG image data and put it in a buffer.

If I debug the same pipeline with e.g. videotestsrc, I see that videotestsrc generates new frames at the requested frame rate (as it should). I would like to have the same behavior for my ipcamsrc, i.e. _create() is called only when a frame is needed.

> Your source should put timestamps on buffers. If it derives from
> GstBaseSrc it might be enough to set the do-timestamp property to TRUE.
> You can then put a videorate element into the pipeline followed by a
> capsfilter with a framerate, which will drop/duplicate frames and fix up
> the timestamps a bit so that you get a perfectly time-stamped stream
> that's easy to feed into encoders/muxers.

I removed all my own (well, copied from v4lsrc) time stamping code and enabled do-timestamp, but without luck.

>> The pipeline I currently use for testing:
>>
>> gst-launch ipcamsrc host=HOST port=8080 username=USER password=PASS !
>> jpegdec ! ffmpegcolorspace ! timeoverlay ! ximagesink
>
> Maybe try something like:
>
> ipcamsrc do-timestamp=true ... ! videorate ! image/jpeg,framerate=\(fraction\)25/1 !
> jpegdec ! ffmpegcolorspace ! timeoverlay ! ffmpegcolorspace !
> ximagesink sync=false

Unfortunately this doesn't help either. But to check my understanding of how GStreamer works: the src pad of videorate should be emitting frames at exactly the specified framerate, but does it also try to negotiate this framerate on the sink pad of the videorate element?

I also noticed that gst_buffer_set_caps() is called. Is it necessary/important that the correct caps are set on the buffer? In other words, what is important for the framerate (timestamp, duration, caps, ...)?

> (IIRC videorate supports image/jpeg only in recent versions of
> gst-plugins-base; if that's a problem you can move the videorate behind
> the jpegdec.)

I'm using gstreamer-plugins-base 0.10.25.1 and thus should be OK here.

Best regards,
Roland

BTW: see http://www.omroepvenray.nl/gst/gstipcamsrc.c for the latest snapshot
On Mon, 2009-12-28 at 09:41 +0100, Roland Hermans wrote:
> Sorry if I wasn't clear. It's not the camera that's sending images too
> frequently but GStreamer that keeps calling _create() on my
> ipcamsrc element. The camera will only send an image after receiving an
> image request. I send this image request from within the _create() call,
> receive the JPEG image data and put it in a buffer.
>
> If I debug the same pipeline with e.g. videotestsrc, I see that
> videotestsrc generates new frames at the requested frame rate (as it
> should). I would like to have the same behavior for my ipcamsrc, i.e.
> _create() is called only when a frame is needed.

videotestsrc will create buffers as fast as possible (see e.g. videotestsrc ! fakesink); it will however put timestamps on them so that when the sink syncs against the clock it will display 25 frames per second, or whatever the framerate is. This will then throttle the source so that it ends up creating 25 frames per second in the end (cp. videotestsrc ! ximagesink and videotestsrc ! queue ! ximagesink, for example).

> (...) But to check my understanding of how GStreamer works: the src pad
> of videorate should be emitting frames at exactly the specified
> framerate, but does it also try to negotiate this framerate on the
> sink pad of the videorate element?

Sort of. First of all, the framerate field in the caps really does not matter that much. It's used for negotiation, but what matters in the end is the timestamps on buffers (and the initial newsegment event). videorate will re-timestamp/duplicate/drop buffers to match the desired downstream framerate. It *should* advertise the framerate configured downstream via caps for upstream negotiation, but that will only be used by the source if the source looks for it. And in any case it won't affect/throttle the source *directly* in any way unless the source configures hardware with the framerate or so.

Usually upstream produces buffers as fast as possible and only the sink(s) sync against the clock, and this indirectly throttles upstream (once the internal/external queues are full and block, so that upstream only gets scheduled again when a buffer is processed downstream and taken out of the queue etc.).

Alternatively, you could try and make your source a live source that syncs against the clock (and then not make the sinks sync against the clock with sync=false, or implement latency queries properly), and schedule buffer production yourself based on the clock, but I'm not sure that's easier or even more desirable (depends a bit on how it currently works in practice, I guess).

> I also noticed that gst_buffer_set_caps() is called. Is it
> necessary/important that the correct caps are set on the buffer?

Yes, you need to set caps on at least the first buffer. Ideally on all buffers.

Cheers
-Tim
>> Sorry if I wasn't clear. It's not the camera that's sending images too
>> frequently but GStreamer that keeps calling _create() on my
>> ipcamsrc element. The camera will only send an image after receiving an
>> image request. I send this image request from within the _create() call,
>> receive the JPEG image data and put it in a buffer.
>>
>> If I debug the same pipeline with e.g. videotestsrc, I see that
>> videotestsrc generates new frames at the requested frame rate (as it
>> should). I would like to have the same behavior for my ipcamsrc, i.e.
>> _create() is called only when a frame is needed.
>
> videotestsrc will create buffers as fast as possible (see e.g.
> videotestsrc ! fakesink), it will however put timestamps on it so that
> when the sink syncs against the clock it will display 25 frames per
> second or whatever the framerate is. This will then throttle the source
> so that it ends up creating 25 frames per second in the end. (cp.
> videotestsrc ! ximagesink and videotestsrc ! queue ! ximagesink for
> example).
>
>> (...) But to check my understanding of how GStreamer works: the src pad
>> of videorate should be emitting frames at exactly the specified
>> framerate, but does it also try to negotiate this framerate on the
>> sink pad of the videorate element?
>
> Sort of. First of all, the framerate field in the caps really does not
> matter that much. It's used for negotiation, but what matters in the end
> is the timestamps on buffers (and the initial newsegment event).
> videorate will re-timestamp/duplicate/drop buffers to match the desired
> downstream framerate. It *should* advertise the framerate configured
> downstream via caps for upstream negotiation, but that will only be used
> by the source if the source looks for it. And in any case it won't
> affect/throttle the source *directly* in any way unless the source
> configures hardware with the framerate or so. Usually upstream produces
> buffers as fast as possible and only the sink(s) sync against the clock,
> and this indirectly throttles upstream (once the internal/external
> queues are full and block, so that upstream only gets scheduled again
> when a buffer is processed downstream and taken out of the queue etc.).
> Alternatively, you could try and make your source a live source that
> syncs against the clock (and then not make the sinks sync against the
> clock with sync=false, or implement latency queries properly), and
> schedule buffer production yourself based on the clock, but I'm not sure
> that's easier or even more desirable (depends a bit on how it currently
> works in practice I guess).
>
>> I also noticed that gst_buffer_set_caps() is called. Is it
>> necessary/important that the correct caps are set on the buffer?
>
> Yes, you need to set caps on at least the first buffer. Ideally on all
> buffers.

Thanks for your clarifications.

I decided to just reimplement the element - this time based on multifilesrc - and now the framerate issue is solved :-). However the code still needs some cleanup, as it was a quick hack to get things working.

Once the code is cleaned up it might be useful for other GStreamer users as well, even though the target audience is probably pretty limited as it depends on specific hardware (an Intellinet IP camera). Would you suggest filing a bug report containing a patch, or is the plugin most likely too specific to be of general use?
Best regards,
Roland
Hi,
It would be nice to have your source. I'm still working on something similar: I'm trying to use a Prosilica GigE camera with GStreamer.

Thanks,
Johannes

On Mon, 2009-12-28 at 18:52 +0100, Roland Hermans wrote:
> >> Sorry if I wasn't clear. It's not the camera that's sending images too
> >> frequently but GStreamer that keeps calling _create() on my
> >> ipcamsrc element. The camera will only send an image after receiving an
> >> image request. I send this image request from within the _create() call,
> >> receive the JPEG image data and put it in a buffer.
> >>
> >> If I debug the same pipeline with e.g. videotestsrc, I see that
> >> videotestsrc generates new frames at the requested frame rate (as it
> >> should). I would like to have the same behavior for my ipcamsrc, i.e.
> >> _create() is called only when a frame is needed.
> >
> > videotestsrc will create buffers as fast as possible (see e.g.
> > videotestsrc ! fakesink), it will however put timestamps on it so that
> > when the sink syncs against the clock it will display 25 frames per
> > second or whatever the framerate is. This will then throttle the source
> > so that it ends up creating 25 frames per second in the end. (cp.
> > videotestsrc ! ximagesink and videotestsrc ! queue ! ximagesink for
> > example).
> >
> >> (...) But to check my understanding of how GStreamer works: the src pad
> >> of videorate should be emitting frames at exactly the specified
> >> framerate, but does it also try to negotiate this framerate on the
> >> sink pad of the videorate element?
> >
> > Sort of. First of all, the framerate field in the caps really does not
> > matter that much. It's used for negotiation, but what matters in the end
> > is the timestamps on buffers (and the initial newsegment event).
> > videorate will re-timestamp/duplicate/drop buffers to match the desired
> > downstream framerate. It *should* advertise the framerate configured
> > downstream via caps for upstream negotiation, but that will only be used
> > by the source if the source looks for it. And in any case it won't
> > affect/throttle the source *directly* in any way unless the source
> > configures hardware with the framerate or so. Usually upstream produces
> > buffers as fast as possible and only the sink(s) sync against the clock,
> > and this indirectly throttles upstream (once the internal/external
> > queues are full and block, so that upstream only gets scheduled again
> > when a buffer is processed downstream and taken out of the queue etc.).
> > Alternatively, you could try and make your source a live source that
> > syncs against the clock (and then not make the sinks sync against the
> > clock with sync=false, or implement latency queries properly), and
> > schedule buffer production yourself based on the clock, but I'm not sure
> > that's easier or even more desirable (depends a bit on how it currently
> > works in practice I guess).
> >
> >> I also noticed that gst_buffer_set_caps() is called. Is it
> >> necessary/important that the correct caps are set on the buffer?
> >
> > Yes, you need to set caps on at least the first buffer. Ideally on all
> > buffers.
>
> Thanks for your clarifications.
>
> I decided to just reimplement the element - this time based on
> multifilesrc - and now the framerate issue is solved :-). However the code
> still needs some cleanup, as it was a quick hack to get things working.
>
> Once the code is cleaned up it might be useful for other GStreamer users
> as well, even though the target audience is probably pretty limited as it
> depends on specific hardware (an Intellinet IP camera). Would you suggest
> filing a bug report containing a patch, or is the plugin most likely too
> specific to be of general use?
>
> Best regards,
> Roland