Hi. I want to stream audio data from a microphone and from an Ogg audio file into my app. From the microphone I use this pipeline:
directsoundsrc device="{1b70206d-3bb5-4246-a4b3-6aeddb8f1264}"
! audioconvert
! audio/x-raw, format=(string)F32LE, rate=(int)48000, channels=(int)2, layout=(string)interleaved
! appsink name=flowsink sync=true
From the file I use this pipeline:
filesrc location=sound_file.ogg ! oggdemux ! vorbisdec !
audioconvert ! audio/x-raw, format=(string)F32LE, channels=(int)2, layout=(string)interleaved !
audioresample ! audio/x-raw, format=(string)F32LE, rate=(int)48000, channels=(int)2, layout=(string)interleaved !
appsink name=flowsink sync=true
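For completeness, here is a minimal sketch of how such a pipeline can be built programmatically and the appsink retrieved by name (illustrative only, not my exact code; gst_init() is assumed to have been called already):

#include <gst/gst.h>

// Sketch: parse the file pipeline from the description above and fetch
// the appsink element by its name ("flowsink").
static GstElement* build_file_pipeline(const char* path, GstElement** appsink_out)
{
    gchar* desc = g_strdup_printf(
        "filesrc location=%s ! oggdemux ! vorbisdec ! "
        "audioconvert ! audio/x-raw,format=F32LE,channels=2,layout=interleaved ! "
        "audioresample ! audio/x-raw,format=F32LE,rate=48000,channels=2,layout=interleaved ! "
        "appsink name=flowsink sync=true",
        path);

    GError* err = nullptr;
    GstElement* pipeline = gst_parse_launch(desc, &err);
    g_free(desc);
    if (!pipeline)
    {
        g_printerr("pipeline parse error: %s\n", err ? err->message : "unknown");
        g_clear_error(&err);
        return nullptr;
    }

    // gst_bin_get_by_name returns a new reference; the caller keeps it
    // (this becomes m_appsink) and unrefs it on teardown.
    *appsink_out = gst_bin_get_by_name(GST_BIN(pipeline), "flowsink");

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    return pipeline;
}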
This is how I read a buffer from the stream:
GstSample* gs = gst_app_sink_pull_sample(GST_APP_SINK(m_appsink));
if (gs)
{
    // The buffer stays owned by the sample; no lifetime transfer.
    GstBuffer* gb = gst_sample_get_buffer(gs);
    GstMapInfo gm;
    if (!gb)
    {
        gst_sample_unref(gs);
        return false;
    }
    if (!gst_buffer_map(gb, &gm, GST_MAP_READ))
    {
        gst_sample_unref(gs);
        return false;
    }
    std::cout << "buffer size " << gm.size << std::endl;
    gst_buffer_unmap(gb, &gm);
    gst_sample_unref(gs);
}
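In case it is relevant, the negotiated caps and per-buffer timing can also be checked on each pulled sample. A small diagnostic sketch (not part of my original code; it assumes the same GStreamer headers plus <iostream>):

#include <gst/gst.h>
#include <iostream>

// Diagnostic sketch: dump the negotiated caps and the size/timing of one
// sample pulled from the appsink.
static void dump_sample_info(GstSample* gs)
{
    GstCaps* caps = gst_sample_get_caps(gs);
    if (caps)
    {
        gchar* s = gst_caps_to_string(caps);
        std::cout << "caps: " << s << std::endl;
        g_free(s); // gst_caps_to_string returns a newly allocated string
    }

    GstBuffer* buf = gst_sample_get_buffer(gs);
    if (buf)
    {
        std::cout << "bytes: " << gst_buffer_get_size(buf)
                  << "  pts(ns): " << GST_BUFFER_PTS(buf)
                  << "  duration(ns): " << GST_BUFFER_DURATION(buf)
                  << std::endl;
    }
}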
When I read buffers from the microphone stream everything works as expected: I get
100 buffers per second, each 3840 bytes (3840 bytes / 2 channels / 4 bytes per sample = 480 frames per buffer, and 480 frames x 100 buffers per second = 48000 Hz).
But when I read from the .ogg file I get a strange buffer size, usually 2048 bytes and sometimes less (2048 bytes / 2 channels / 4 bytes per sample = 256 frames per buffer),
at about 200 buffers per second.
Why does it work like that?