I'm trying to create an application that takes in arbitrary video and sends it out through multiple outputs, and that automatically reconnects any video source that fails, for example when a file ends or the network drops. My test application currently uses three outputs (a local display window, an RTSP server that creates RTP outputs on request, and a file the video is recorded to) and one of three inputs: a videotestsrc, a uridecodebin, or a hand-built chain of elements for MPEG-2 (see issue 3 below). I've run across several issues using the latest OSSBuild GStreamer from SVN (version 0.10.35, SVN revision 1012), working in Managed C++:
1) Setting the whole pipeline to READY and then back to PLAYING to reconnect the source on error or EOS only seems to work once, and even that is somewhat buggy. For example, pulling buffers from the appsink I use for the local display returns null between frames instead of blocking until the next frame is ready, as it does on the first run. Subsequent reconnect attempts appear to produce a good pipeline (my application has a button that dumps the current pipeline to a .dot file), but nothing happens at all in the PLAYING state, at least as far as I can tell from the output. Is there a way to insert probes at arbitrary points in the pipeline to see where the data flow is stalling? Also, is there a good place to upload .dot or .png renderings of the current pipeline? I don't want to fill everyone's inbox with images when only a few people will be interested.

2) Setting the whole pipeline to READY and then to PLAYING also disconnects all remote connections and restarts the file recording. I would like these to stay open across a reconnect, and possibly display a message during the downtime. I've tried resetting only a subset of the pipeline (usually just the uridecodebin), but that doesn't seem to work at all: as above, the pipeline appears good and playing when I dump it, but there's no output on the local display, the recording file, or the remote connections.

3) MPEG-2 doesn't decode with decodebin2 (it always outputs a green screen); I have to build the pipeline manually with ffdemux_mpegts and ffdec_mpeg2video. Even MPEG-2 files generated by GStreamer itself (using gst-launch videotestsrc ! ffenc_mpeg2video ! mpegtsmux ! filesink location="test.mpg") won't play through decodebin2, yet they play fine in other players such as VLC. I've already implemented a workaround: a flag I set manually before building the pipeline. Is there a way to detect the video type automatically before assembling the pipeline?
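On the probe question in issue 1: in the 0.10 API a buffer probe can be attached to any pad to log passing buffers, which helps locate where data flow stalls. A minimal C sketch under that assumption; the element and pad names here are placeholders for whatever part of your pipeline you want to watch:

```c
#include <gst/gst.h>

/* Buffer probe callback: fires once for every buffer crossing the pad.
 * Returning TRUE lets the buffer continue; FALSE would drop it. */
static gboolean
probe_cb (GstPad *pad, GstBuffer *buf, gpointer user_data)
{
  g_print ("%s:%s  %u bytes, ts %" GST_TIME_FORMAT "\n",
           GST_DEBUG_PAD_NAME (pad), GST_BUFFER_SIZE (buf),
           GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buf)));
  return TRUE;
}

/* Attach a probe to the src pad of a named element in the pipeline.
 * "element_name" is whatever name you gave the element at creation. */
static void
add_probe (GstElement *pipeline, const gchar *element_name)
{
  GstElement *elem = gst_bin_get_by_name (GST_BIN (pipeline), element_name);
  GstPad *pad = gst_element_get_static_pad (elem, "src");

  gst_pad_add_buffer_probe (pad, G_CALLBACK (probe_cb), NULL);

  gst_object_unref (pad);
  gst_object_unref (elem);
}
```

Sprinkling probes like this before and after each tee and queue should show the last pad that still sees buffers after a reconnect, which narrows down where the clog is.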
It may also be worth mentioning that in an older version (0.10.22, I think) MPEG-2 did display through decodebin2, but it played back too fast.

4) While getting the RTP output working, I found that inserting elements I would not expect to affect anything (such as a tee and a queue) between an rtpmp4vpay and a gstrtpbin causes the sink peer of the gstrtpbin send_rtp_src_%d pad to get stuck in the READY-to-PAUSED transition. I'm not sure whether that's a bug or intended behavior. In the pipeline graph, the links from the tee to the queue and from the queue to the gstrtpbin have caps of ANY instead of the application/x-rtp caps I would have expected.

Thanks for taking the time to read this; I hope to see some good tips on what to do.
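On detecting the stream type before assembling a pipeline (issue 3): one approach in the 0.10 API is a short throwaway pipeline around the typefind element, whose "have-type" signal reports the detected caps before any decoding happens. A sketch under that assumption; filesrc and fakesink here stand in for whatever source you actually use, and the file path is taken from the command line:

```c
#include <gst/gst.h>

/* "have-type" fires as soon as typefind recognizes the stream,
 * handing us the detected caps and a confidence value. */
static void
have_type_cb (GstElement *typefind, guint probability, GstCaps *caps,
              gpointer user_data)
{
  gchar *desc = gst_caps_to_string (caps);
  g_print ("detected: %s (probability %u)\n", desc, probability);
  /* e.g. if desc starts with "video/mpegts", set the flag that selects
   * the manual ffdemux_mpegts ! ffdec_mpeg2video branch. */
  g_free (desc);
}

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GstElement *pipe = gst_parse_launch (
      "filesrc name=src ! typefind name=tf ! fakesink", NULL);
  GstElement *src = gst_bin_get_by_name (GST_BIN (pipe), "src");
  GstElement *tf  = gst_bin_get_by_name (GST_BIN (pipe), "tf");

  g_object_set (src, "location", argv[1], NULL);
  g_signal_connect (tf, "have-type", G_CALLBACK (have_type_cb), NULL);

  /* PAUSED is enough for typefinding; nothing is rendered. */
  gst_element_set_state (pipe, GST_STATE_PAUSED);
  gst_element_get_state (pipe, NULL, NULL, GST_CLOCK_TIME_NONE);

  gst_element_set_state (pipe, GST_STATE_NULL);
  gst_object_unref (src);
  gst_object_unref (tf);
  gst_object_unref (pipe);
  return 0;
}
```

Once the caps string is known, the real pipeline can be built with the appropriate branch instead of relying on a manually set flag.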