I am attempting to seek to a new position in an MPEG video while streaming it to a udpsink. I used basic-tutorial-13.c as a starting point. I am viewing the stream in the VLC media player as a network stream ("udp://@:5000", caching 10 ms).
When I submit the seek event, I get many errors of the form shown below. Playback hangs for a while and may eventually resume; if it does, it resumes at some offset past the requested seek point. If I instead just use playbin, the seek works as expected and immediately. I have a similar problem when performing a playback rate change (using gst_element_query_position to get the current position).

Is there something I can do to adjust the PTS values so they are valid after submitting the seek? Is there a way to eliminate the hang and delay when performing the seek?

This is my pipeline configuration for the udpsink:

    data.pipeline = gst_parse_launch ("filesrc location=test2.mpeg ! decodebin ! identity single-segment=true ! avenc_mpeg2video ! mpegtsmux ! udpsink host=\"127.0.0.1\" port=5000", NULL);

This is my pipeline configuration with playbin:

    data.pipeline = gst_parse_launch ("playbin uri=file:///C:/gstream/test2.mpeg", NULL);

This is my seek event (rate = 1.0):

    seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME,
        GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_SKIP,
        GST_SEEK_TYPE_SET, position, GST_SEEK_TYPE_NONE, 0);
    gst_element_send_event (data->pipeline, seek_event);

GStreamer diagnostic output after the seek is submitted:

    0:00:16.312514219 9348 00000000035D2A40 ERROR libav :0:: Invalid pts (0) <= last (378)
    0:00:16.313970803 9348 00000000035D2A40 ERROR libav gstavvidenc.c:706:gst_ffmpegvidenc_handle_frame:<avenc_mpeg2video0> avenc_mpeg2video: failed to encode buffer
    ...
    [and ending with]
    0:00:17.612929299 9348 00000000035D2A40 ERROR libav gstavvidenc.c:706:gst_ffmpegvidenc_handle_frame:<avenc_mpeg2video0> avenc_mpeg2video: failed to encode buffer
    0:00:17.615227075 9348 00000000035D2A40 ERROR libav :0:: Invalid pts (378) <= last (378)
    0:00:17.615642325 9348 00000000035D2A40 ERROR libav gstavvidenc.c:706:gst_ffmpegvidenc_handle_frame:<avenc_mpeg2video0> avenc_mpeg2video: failed to encode buffer
    0:00:17.621328818 9348 00000000035D2A40 WARN basesink gstbasesink.c:3382:gst_base_sink_chain_unlocked:<udpsink0> warning: Internal data flow problem.
    0:00:17.621889898 9348 00000000035D2A40 WARN basesink gstbasesink.c:3382:gst_base_sink_chain_unlocked:<udpsink0> warning: Received buffer without a new-segment. Assuming timestamps start from 0.
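For reference, the rate-change path mentioned above looks roughly like the following in the basic-tutorial-13.c style; this is only a sketch, and the `new_rate` variable and the positive-rate-only handling are placeholders following that tutorial rather than the actual application code:

    /* Query the current position, then re-issue a seek from there at the new rate
     * (positive rates only, following the tutorial's pattern). */
    gint64 position;
    if (gst_element_query_position (data->pipeline, GST_FORMAT_TIME, &position)) {
      GstEvent *seek_event = gst_event_new_seek (new_rate, GST_FORMAT_TIME,
          GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
          GST_SEEK_TYPE_SET, position, GST_SEEK_TYPE_NONE, 0);
      gst_element_send_event (data->pipeline, seek_event);
    }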
Hi,
I have been working to integrate GStreamer into my application for a few months now, but I am still having the problems described previously with changing speed and seeking while playing a video to a udpsink.

My application supports playing back a video file to a local window and/or to a UDP stream. When playing the video to a local window, I can change the playback speed and seek forwards and backwards in the file, and playback responds immediately. However, if playback is sent to the pipeline with a udpsink, any attempt to change speed or seek forwards or backwards results in a long pause (greater than 30 seconds), after which playback continues at a point past where the seek was specified.

Any insight into what is causing playback to a udpsink to hang when performing a seek would be greatly appreciated. Is it necessary to perform some sort of reset or offset before making a seek? I tried pausing before performing the seek, but that did not seem to help.

My pipelines, seek event, and the GStreamer diagnostic output are the same as in my original post above.
On Tue, 2016-09-27 at 12:54 -0700, doubledw wrote:
Hi,

At first glance it looks like the encoder is not handling seeks / flushing properly.

I have something else for you to try which might work better for you: create two separate pipelines in the same application - one that encodes/streams and one that decodes/seeks/etc.

In the playback pipeline you use intervideosink as the video sink, and in the encoding pipeline you use intervideosrc as the source element.

The video data from the intervideosink will be sent to the intervideosrc, but both are decoupled, so if you pause the playback pipeline, the streaming pipeline will just keep repeating and streaming the last frame (or go black after a while, depending on what you set the timeout to). If you seek on the playback pipeline, the streaming part will repeat the last frame until the seek is complete and then output frames from the new position (but with monotonically increasing timestamps, so the encoder will never know there was a discontinuity). If you play back at half speed or double speed, the streaming part will encode that at the playback speed and stream it out normally.

For the decoding pipeline you can just use a playbin element and set the "video-sink" property to an intervideosink (GstElement *).

On the streaming pipeline you want something like:

    intervideosrc ! video/x-raw,framerate=25/1 ! videoconvert ! avenc_mpeg2video ! mpegtsmux ! udpsink

If you want audio as well, there's also interaudiosink/src which work the same way.

Good luck!

Cheers
-Tim

--
Tim Müller, Centricular Ltd - http://www.centricular.com

Join us at the GStreamer Conference!
10-11 October 2016 in Berlin, Germany
http://gstreamer.freedesktop.org/conference/
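A minimal sketch of the wiring described above; the channel name "V1", the file URI, the host/port, and the seek target are illustrative placeholders, not values from the original posts. The key point is that seeks and rate changes go only to the playback pipeline:

    /* Playback pipeline: playbin whose video sink is replaced by an intervideosink */
    GstElement *playback = gst_parse_launch ("playbin uri=file:///C:/gstream/test2.mpeg", NULL);
    GstElement *ivsink = gst_element_factory_make ("intervideosink", NULL);
    g_object_set (ivsink, "channel", "V1", NULL);
    g_object_set (playback, "video-sink", ivsink, NULL);

    /* Streaming pipeline: reads raw frames from the same channel, encodes and streams them */
    GstElement *streaming = gst_parse_launch (
        "intervideosrc channel=V1 ! video/x-raw,framerate=25/1 ! videoconvert ! "
        "avenc_mpeg2video ! mpegtsmux ! udpsink host=127.0.0.1 port=5000", NULL);

    gst_element_set_state (streaming, GST_STATE_PLAYING);
    gst_element_set_state (playback, GST_STATE_PLAYING);

    /* Seeks and rate changes are sent to the playback pipeline only; the streaming
     * pipeline keeps producing continuous timestamps, so the encoder never sees a gap. */
    gst_element_seek_simple (playback, GST_FORMAT_TIME,
        GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, 10 * GST_SECOND);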
Tim,
Your alternative suggestion looks like it might nicely fit my needs. I am already supporting playback to a local window in conjunction with the udpsink, just in two independent pipelines. I assume that if I want to modify my existing pipeline that plays to a local window to also support the intervideosink, then I will have to tee the pipeline.

I will give this coupling approach a try.

Thanks a lot,
Doug

--
Doug Wood | Principal Software Engineer
VT MÄK | 150 Cambridge Park Drive, Third Floor, Cambridge, MA 02140
T: +1.407.359.2725 / +1.617.876.8085
[hidden email] | www.mak.com
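One possible shape for that tee, sketched here on the assumption that the local-window pipeline is built around playbin (the element choices, the "V1" channel, and the `playback` variable are illustrative only): the video path is split so one branch renders locally and the other feeds the inter channel.

    /* Build a tee'd video sink and hand it to playbin via its "video-sink" property.
     * One branch renders to a local window, the other feeds the inter channel. */
    GError *err = NULL;
    GstElement *sinkbin = gst_parse_bin_from_description (
        "tee name=t "
        "t. ! queue ! videoconvert ! autovideosink "
        "t. ! queue ! intervideosink channel=V1",
        TRUE /* ghost the unlinked tee sink pad so playbin can link to the bin */, &err);
    if (err == NULL)
      g_object_set (playback, "video-sink", sinkbin, NULL);
    else
      g_printerr ("Failed to build tee'd sink: %s\n", err->message);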
In reply to this post by Tim Müller
I modified my application to use the intervideosink and intervideosrc, but I am getting a crash when the playbin pipeline is started. I reproduced the problem in the test application included below. With debugging at level 4, the diagnostic output showed the assertion failure below.

Any suggestions on what I am doing wrong or where the problem is? I am using release 1.8.0.

    0:00:00.232899237 14216 00000000041C1140 INFO GST_STATES gstelement.c:2277:_priv_gst_element_state_changed:<playbin0> notifying about state-changed PAUSED to PLAYING (VOID_PENDING pending)
    0:00:00.232934304 14216 0000000002B04000 INFO GST_BUS gstbus.c:565:gst_bus_timed_pop_filtered:<bus1> we got woken up, recheck for message
    0:00:00.247985542 14216 00000000033EE440 INFO GST_EVENT gstevent.c:679:gst_event_new_caps: creating caps event video/x-raw, format=(string)I420, width=(int)1442, height=(int)1011, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)25/1
    0:00:00.248142062 14216 00000000033EE440 INFO basetransform gstbasetransform.c:1367:gst_base_transform_setcaps:<capsfilter0> reuse caps
    0:00:00.248173709 14216 00000000033EE440 INFO GST_EVENT gstevent.c:679:gst_event_new_caps: creating caps event video/x-raw, format=(string)I420, width=(int)1442, height=(int)1011, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)25/1
    0:00:00.248360165 14216 00000000033EE440 INFO basetransform gstbasetransform.c:1367:gst_base_transform_setcaps:<videoconvert0> reuse caps
    0:00:00.248417043 14216 00000000033EE440 INFO GST_EVENT gstevent.c:679:gst_event_new_caps: creating caps event video/x-raw, format=(string)I420, width=(int)1442, height=(int)1011, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)25/1
    0:00:00.249147902 14216 00000000033EE440 INFO libav :0:: Assertion v>0 && v<=(1 ? 32 : 16) failed at libavutil/mem.c:233

    This application has requested the Runtime to terminate it in an unusual way.
    Please contact the application's support team for more information.

Test application:

    #include <gst/gst.h>

    /* Build a pipeline from a gst-launch style description, reporting any error. */
    GstElement* buildPipeline(char* pipelineDescription)
    {
      GError* error = NULL;
      GstElement* pipeline = gst_parse_launch(pipelineDescription, &error);

      if (error != NULL)
      {
        g_printerr("Error: %s\n", error->message);
        return NULL;
      }

      if (pipeline == NULL)
      {
        g_printerr("Error creating pipeline\n");
        return NULL;
      }

      return pipeline;
    }

    gint main (gint argc, gchar *argv[])
    {
      GstStateChangeReturn retSink;
      GstStateChangeReturn retSrc;

      /* init GStreamer */
      gst_init (&argc, &argv);

      /* Playback pipeline (playbin) and streaming pipeline (intervideosrc ! ... ! udpsink) */
      char* sinkPipelineDescription = "playbin uri=file:///C:/gstreamer/1.0/x86_64/bin/captureMakLand.mpeg";
      char* srcPipelineDescription = "intervideosrc channel=V1 ! video/x-raw,framerate=25/1 ! videoconvert ! avenc_mpeg2video ! mpegtsmux ! udpsink host=127.0.0.255 port=5000";

      /* setup pipelines */
      GstElement* mySinkPipeline = buildPipeline(sinkPipelineDescription);
      GstElement* mySrcPipeline = buildPipeline(srcPipelineDescription);

      if (mySinkPipeline && mySrcPipeline)
      {
        /* Replace playbin's video sink with an intervideosink on channel V1 */
        GstElement* interVideoSink = gst_element_factory_make("intervideosink", "interVideoSink");

        g_object_set (GST_OBJECT (interVideoSink), "channel", "V1", NULL);
        g_object_set (GST_OBJECT (mySinkPipeline), "video-sink", interVideoSink, NULL);

        retSrc = gst_element_set_state (mySrcPipeline, GST_STATE_PLAYING);
        retSink = gst_element_set_state (mySinkPipeline, GST_STATE_PLAYING);
        if (retSink != GST_STATE_CHANGE_FAILURE && retSrc != GST_STATE_CHANGE_FAILURE)
        {
          /* Wait until error or EOS on the playback pipeline */
          GstBus *bus = gst_element_get_bus (mySinkPipeline);
          GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
              (GstMessageType) (GST_MESSAGE_ERROR | GST_MESSAGE_EOS));

          /* Parse message */
          if (msg != NULL)
          {
            GError *err;
            gchar *debug_info;

            switch (GST_MESSAGE_TYPE (msg))
            {
              case GST_MESSAGE_ERROR:
                gst_message_parse_error (msg, &err, &debug_info);
                g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
                g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
                g_clear_error (&err);
                g_free (debug_info);
                break;
              case GST_MESSAGE_EOS:
                g_print ("End-Of-Stream reached.\n");
                break;
              default:
                /* We should not reach here because we only asked for ERRORs and EOS */
                g_printerr ("Unexpected message received.\n");
                break;
            }
            gst_message_unref (msg);
          }
          gst_object_unref (bus);
        }
      }

      /* clean up both pipelines */
      if (mySrcPipeline)
      {
        gst_element_set_state (mySrcPipeline, GST_STATE_NULL);
        gst_object_unref (GST_OBJECT (mySrcPipeline));
      }
      if (mySinkPipeline)
      {
        gst_element_set_state (mySinkPipeline, GST_STATE_NULL);
        gst_object_unref (GST_OBJECT (mySinkPipeline));
      }

      return 0;
    }

--
Doug Wood | Principal Software Engineer
VT MÄK | 150 Cambridge Park Drive, Third Floor, Cambridge, MA 02140
T: +1.407.359.2725 / +1.617.876.8085
[hidden email] | www.mak.com
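As an aside, the test program above should build against the GStreamer 1.x development files; assuming pkg-config can locate them, a typical compile line (the source file name here is only illustrative) would be:

    gcc interseek-test.c -o interseek-test $(pkg-config --cflags --libs gstreamer-1.0)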
In reply to this post by Tim Müller
Tim,
FYI: Upgrading to GStreamer 1.9.9 resolved the crash that I was seeing with the intervideosrc/sink (I was previously using 1.8.0). The test application is working now, and I was also able to get my target application working. I can now seek forwards and backwards and change speed during video file playback while it is streaming over UDP, and view the continuous stream using VLC (or GStreamer).

Thanks a lot for your suggestion. It was invaluable!

Doug

--
Doug Wood | Principal Software Engineer
VT MÄK | 150 Cambridge Park Drive, Third Floor, Cambridge, MA 02140
T: +1.407.359.2725 / +1.617.876.8085
[hidden email] | www.mak.com
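Since the fix here was a runtime upgrade, it can be worth confirming which GStreamer core a process actually loads when several installs coexist. A small sketch using the version API:

    #include <gst/gst.h>

    int main (int argc, char *argv[])
    {
      gst_init (&argc, &argv);

      /* gst_version_string() reports the core library version actually in use */
      gchar *version = gst_version_string ();
      g_print ("Runtime GStreamer: %s\n", version);
      g_free (version);

      return 0;
    }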