Hi All,
I am working on one of our products, which runs Android 4.4 (KitKat). On it we have an application called the SPICE client, which is used to capture audio+video data streamed over the network from a SPICE server installed on Ubuntu 16.04 Server.

We are using the GTK-based SPICE client code, in which the GStreamer 1.0 framework is used.

We are facing an audio sync issue: when audio+video streaming is started from the SPICE server side, we receive the video data perfectly, but the audio data is not in sync with the video frames and is dropped for some initial duration. So it seems the audio data is out of sync, or lags behind the video data, at that time.

Please find below the code snippet of the GStreamer pipeline we have configured and used in our SPICE client Android code:

    #ifdef WITH_GST1AUDIO
        g_strdup_printf("audio/x-raw,format=\"S16LE\",channels=%d,rate=%d,"
                        "layout=interleaved", channels, frequency);
    #else
        g_strdup_printf("audio/x-raw-int,channels=%d,rate=%d,signed=(boolean)true,"
                        "width=16,depth=16,endianness=1234", channels, frequency);
    #endif
        gchar *pipeline = g_strdup(g_getenv("SPICE_GST_AUDIOSINK"));
        if (pipeline == NULL)
            pipeline = g_strdup_printf("appsrc is-live=1 do-timestamp=1 format=time min-latency=0 "
                                       "caps=\"%s\" name=\"appsrc\" ! audioconvert ! audioresample ! "
                                       "autoaudiosink name=\"audiosink\"", audio_caps);

Also, we sometimes get the warning message below in the GStreamer logs on Android:

gstaudiobasesink.c:1807:gst_audio_base_sink_get_alignment:<audiosink-actual-sink-opensles> Unexpected discontinuity in audio timestamps of -0:00:00.076145124, resyncing

Please let me know if anyone has any idea or clue on how to debug or solve this type of issue.

--
Regards,
Ritesh Prajapati,
System Level Solutions (India) Pvt. Ltd.

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
See reply inline.
Le jeudi 07 juillet 2016 à 14:12 +0530, Ritesh Prajapati a écrit :
> [...]
> Also, we are getting below warning message sometimes from Android
> Gstreamer Studio.
>
> gstaudiobasesink.c:1807:gst_audio_base_sink_get_alignment:<audiosink-actual-sink-opensles>
> Unexpected discontinuity in audio timestamps of -0:00:00.076145124, resyncing

As audio may arrive in small bursts, what can happen is that the timestamp distance between buffers becomes too small: the timestamp + duration of the current buffer may end up after the next timestamp. In that case the audio sink will resync. To prevent this, you should timestamp the buffers yourself: pick an initial timestamp and add each buffer's duration to it. You then monitor the calculated timestamp against the current time; if it drifts beyond a certain threshold (generally 40 ms), you resync (and set the discontinuity flag).

In an ideal world, the streaming protocol would provide timing information that lets you correlate the video and audio frames in time. That would serve as the basis for creating timestamps and ensuring perfect A/V sync. I don't know SPICE too well, but that may not be part of the protocol.

Nicolas
Hi Nicolas,
Thanks for the reply.

When I get too many audio discontinuity timestamp messages, the audio data lags behind the video data and goes out of sync for that duration. So if I set the threshold timestamp to 40 ms, will my sync problem be solved or reduced?

Please let me know if any other configuration is required in the appsrc pipeline to solve this problem.

Regards,
Ritesh Prajapati,
System Level Solutions (India) Pvt. Ltd.

On Thursday 07 July 2016 06:45 PM, Nicolas Dufresne wrote:
> [...]
Hi Nicolas or others,
Does anyone have any idea how to solve the issue below?

Regards,
Ritesh Prajapati,
System Level Solutions (India) Pvt. Ltd.

On Friday 08 July 2016 09:25 AM, Ritesh Prajapati wrote:
> [...]
Le vendredi 08 juillet 2016 à 09:25 +0530, Ritesh Prajapati a écrit :
> Hi Nicolas,
>
> Thanks for Reply.
>
> When I am getting too much audio discontinuity timestamps messages at
> that time Audio data is lagged compare to video data and goes out of
> sync for that duration. So, if I set threshold timestamp to 40ms then
> will my sync problem be solved or reduced?

A 40 ms threshold is the default on most audio sinks. The sink provides a clock and compares the timestamps you supply against that clock, so it will resync whenever the timestamps you supply and the number of samples mismatch.

I probably don't have enough information to help you better right now. But I think what is important is that your timestamps remain as representative as possible of when the audio was produced. Did you find anything in SPICE to help you with that?

> [...]

Nicolas
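For reference, the threshold Nicolas mentions is exposed as a property on sinks derived from GstAudioBaseSink ("alignment-threshold", default 40 ms, alongside "discont-wait"). A configuration-style sketch follows; the "pipeline" variable is assumed to be the parsed GstPipeline, "audiosink" matches the name in the pipeline string above, and note that autoaudiosink is a bin, so on some versions the properties may need to be set on its actual child sink (e.g. the OpenSL ES sink) rather than the bin itself:

```c
/* Sketch: relax the audio sink's resync behaviour. Assumes "pipeline"
 * is the parsed GstPipeline and that the properties reach the real
 * GstAudioBaseSink-derived child of autoaudiosink. */
GstElement *sink = gst_bin_get_by_name (GST_BIN (pipeline), "audiosink");
if (sink != NULL) {
  g_object_set (sink,
                "alignment-threshold", (guint64) (80 * GST_MSECOND),
                "discont-wait",        (guint64) GST_SECOND,
                NULL);
  gst_object_unref (sink);
}
```

Raising the threshold only hides small discontinuities; it does not fix timestamps that genuinely drift, so it is a mitigation rather than a cure.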
Hi Nicolas,
I have checked the aSPICE client as well as the aSPICE server configurations for anything specific to the audio+video sync issue, but I could not find anything yet.

Regards,
Ritesh Prajapati,
System Level Solutions (India) Pvt. Ltd.

On Saturday 09 July 2016 07:12 PM, Nicolas Dufresne wrote:
> [...]