I am new to GStreamer and am having trouble feeding frames from a custom appsrc into an H.264 encoder and out over RTP. My pipeline:
appsrc name=mysrc ! videorate ! ffmpegcolorspace ! videoscale method=1
! video/x-raw-yuv,width=320,height=240,format=(fourcc)UYVY,framerate=\(fraction\)15/1
! TIVidenc1 codecName=h264enc engineName=codecServer ! rtph264pay pt=96
! udpsink host=192.168.1.112 port=5000
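For context, I build the pipeline from that string and fetch the appsrc by name, roughly like this (a simplified reconstruction; the exact setup, including error handling, is in the pastebin linked below):

/* Simplified sketch of my pipeline setup (full version in the pastebin). */
GError *error = NULL;
app->pipeline = gst_parse_launch(
    "appsrc name=mysrc ! videorate ! ffmpegcolorspace ! videoscale method=1 "
    "! video/x-raw-yuv,width=320,height=240,format=(fourcc)UYVY,framerate=(fraction)15/1 "
    "! TIVidenc1 codecName=h264enc engineName=codecServer ! rtph264pay pt=96 "
    "! udpsink host=192.168.1.112 port=5000", &error);

app->appsrc = gst_bin_get_by_name(GST_BIN(app->pipeline), "mysrc");
gst_element_set_state(app->pipeline, GST_STATE_PLAYING);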
The appsrc read_data function contains this:
static gboolean white = FALSE;
gdouble ms = g_timer_elapsed(app->timer, NULL);

if (ms > 1.0/15.0) {
    GstBuffer* buffer = gst_buffer_new();
    GST_INFO("Pushing new data into pipe, elapsed seconds=%f.", ms);
    memset(GST_BUFFER_DATA(buffer), white ? 0xff : 0x0, GST_BUFFER_SIZE(buffer));
    GST_BUFFER_SIZE(buffer) = 320 * 240 * 3 * sizeof(guchar);
    white = !white;
    g_signal_emit_by_name(app->appsrc, "push-buffer", buffer, &ret);
    gst_buffer_unref(buffer);
}
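read_data itself is driven by the appsrc need-data/enough-data signals (that is where the stop_feed in the log below comes from). The wiring is roughly this (simplified; struct and field names like app->sourceid are just how I sketch it here, the real code is in the pastebin):

/* Called when appsrc wants data: start pushing frames from an idle source. */
static void start_feed(GstElement *source, guint size, App *app) {
    if (app->sourceid == 0)
        app->sourceid = g_idle_add((GSourceFunc) read_data, app);
}

/* Called when appsrc's internal queue is full: stop pushing frames. */
static void stop_feed(GstElement *pipeline, App *app) {
    GST_INFO("Entering stop_feed(pipeline=%p, app=%p).", pipeline, app);
    if (app->sourceid != 0) {
        g_source_remove(app->sourceid);
        app->sourceid = 0;
    }
}

/* During setup, after creating the pipeline: */
g_signal_connect(app->appsrc, "need-data", G_CALLBACK(start_feed), app);
g_signal_connect(app->appsrc, "enough-data", G_CALLBACK(stop_feed), app);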
The pipeline runs for a few frames and then stops with:
rtsptest: Framecopy.c:108: _execute: Assertion `Buffer_getUserPtr(hSrcBuf)' failed.
When I increase the debug level, I see the following relevant lines:
default main.c:38:read_data: Pushing new data into pipe, elapsed seconds=0.066681.
appsrc gstappsrc.c:1456:gst_app_src_push_buffer_full:<mysrc> queue filled (230400 >= 200000)
default main.c:71:stop_feed: Entering stop_feed(pipeline=0xec0c0, app=0x11f9c).
It seems the trouble is in the appsrc buffering configuration; I just don't know what to set and where. The same pipeline with videotestsrc instead of appsrc works, encodes and streams perfectly.
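The 200000 in the queue-filled message matches appsrc's default max-bytes, so from the appsrc documentation I suspect the relevant knobs are max-bytes and block, but I don't know whether something like this is the right fix or just hides the real problem (untested guess on my part):

/* Guess: enlarge appsrc's internal queue and/or make push-buffer block
 * instead of failing when the queue is full. */
g_object_set(G_OBJECT(app->appsrc),
             "max-bytes", (guint64) 500000,  /* default is 200000 */
             "block", TRUE,                  /* block pushes when the queue is full */
             NULL);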
The full current code is available at pastebin.