Hey guys!
I have two pipelines that I want to connect: one for displaying video and one for streaming on the network. These two pipelines should be independent from each other, meaning: if the network pipeline crashes, the other pipeline should keep on running no matter what. Therefore I separated the two pipelines and used an appsink in the first pipeline and an appsrc in the second to connect them:

    audiotestsrc ! audioconvert ! appsink
    appsrc ! queue ! rtpL16pay ! ...

The appsink and appsrc are connected via static callbacks, and the data is pushed from the appsink to the appsrc:

    static GstFlowReturn
    onNewBuffer (GstAppSink *appsink, gpointer user_data)
    {
      GstFlowReturn rv = GST_FLOW_ERROR;
      GstAppSrc *appsrc = (GstAppSrc *) user_data;

      /* Pull the buffer produced by the first pipeline... */
      GstBuffer *buf = gst_app_sink_pull_buffer (appsink);
      /* ...and hand it over to the second pipeline. */
      rv = gst_app_src_push_buffer (appsrc, buf);
      return rv;
    }

This works, but if I use the second pipeline in the RTSP server and connect with a client, I get "timing out collisions" error messages from the rtpsession.c implementation. If I put everything into one pipeline, it works.

I experimented with

    gst_element_get_base_time (...);
    gst_element_set_start_time (pipeline, GST_CLOCK_TIME_NONE);
    gst_element_set_base_time (pipeline, time);

to synchronize the timestamps and got it to work with some video pipelines, but I can't get it to work with this audio example.

What would be the right way to synchronize these two pipelines?

Thanks,
mat

_______________________________________________
gstreamer-devel mailing list
[hidden email]
http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
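For reference, here is a minimal sketch of how such a callback can be attached to the appsink (GStreamer 0.10 API, to match the `gst_app_sink_pull_buffer` call above; the `link_pipelines` helper and its variable names are my own illustration, not code from the thread):

```c
#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <gst/app/gstappsrc.h>

/* Pull a buffer from pipeline 1's appsink and push it into
 * pipeline 2's appsrc — the same pattern as the callback above.
 * gst_app_src_push_buffer() takes ownership of the buffer. */
static GstFlowReturn
on_new_buffer (GstAppSink *appsink, gpointer user_data)
{
  GstAppSrc *appsrc = GST_APP_SRC (user_data);
  GstBuffer *buf = gst_app_sink_pull_buffer (appsink);

  if (buf == NULL)
    return GST_FLOW_ERROR;
  return gst_app_src_push_buffer (appsrc, buf);
}

static void
link_pipelines (GstElement *appsink, GstElement *appsrc)
{
  /* appsink only fires "new-buffer" when emit-signals is enabled. */
  g_object_set (appsink, "emit-signals", TRUE, NULL);
  g_signal_connect (appsink, "new-buffer",
                    G_CALLBACK (on_new_buffer), appsrc);
}
```

An alternative to the signal is `gst_app_sink_set_callbacks()`, which avoids the per-buffer signal-emission overhead.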
Hi,
I found the solution to my problem, so here it is:

1. Call gst_element_set_start_time (pipeline, GST_CLOCK_TIME_NONE); on the second pipeline.
2. In the case of the audiotestsrc, its is-live property has to be set to TRUE.

Cheers,
mat

-----Original Message-----
From: Matthias Dodt
Sent: 19 October 2011 16:00
To: 'Discussion of the development of and with GStreamer'
Subject: Connecting two independent pipelines - sync timestamps
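The two fixes described above might look like this in code (again a sketch against the GStreamer 0.10 API; the function and variable names are illustrative):

```c
#include <gst/gst.h>

/* Sketch: apply the two fixes.
 * 1. Disable start-time tracking on the second pipeline, so it does
 *    not reset the running time when it goes to PLAYING.
 * 2. Make the test source live, so buffers are timestamped against
 *    the running clock. */
static void
apply_fixes (GstElement *pipeline2, GstElement *audiotestsrc)
{
  /* Fix 1: on the second (appsrc/RTSP) pipeline. */
  gst_element_set_start_time (pipeline2, GST_CLOCK_TIME_NONE);

  /* Fix 2: on the audiotestsrc in the first pipeline. */
  g_object_set (audiotestsrc, "is-live", TRUE, NULL);
}
```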