Gstreamer AppSink & AppSrc with Threads

Gstreamer AppSink & AppSrc with Threads

Matias Hernandez Arellano

(Sorry for my English.)

I have an application with two pipelines: one ending in an appsink and one starting with an appsrc.

The appsink pipeline captures frames from the webcam.
I then pull frames from that pipeline, do some image processing on them (with OpenCV), and push the result into the appsrc pipeline, which handles the streaming part.

The problem is that I lose a lot of data (on the command line I get GStreamer messages telling me that many buffers are being dropped) and the application doesn't behave normally.

So I'm wondering: would it be possible to do this with threads?

One to grab the images from the webcam,
one to process them,
and finally one to handle the streaming part?

Here is some of the code:

//part of the main
//The pipelines
sprintf(entrada_str,"qtkitvideosrc ! video/x-raw-yuv,width=640,height=480 ! ffmpegcolorspace "
                      "! video/x-raw-rgb,format=RGB3,width=640,height=480 ! appsink name=\"%s\"",entrada_name);

sprintf(salida_str,"appsrc name=\"%s\" ! queue ! videoparse format=14 width=640 height=480 ! videorate ! videoscale ! ffmpegcolorspace "
                  "! video/x-raw-rgb,width=640,height=480 ! glimagesink",salida_name);

pipe_entrada = gst_parse_launch(entrada_str,&error);
pipe_salida = gst_parse_launch(salida_str,&error);


// ... and the appsink callback
GstBuffer *buffer = gst_app_sink_pull_buffer(
    (GstAppSink *) gst_bin_get_by_name(GST_BIN(pipe_entrada), entrada_name));
uchar *img_data = (uchar *) GST_BUFFER_DATA(buffer);

// Wrap the raw buffer data in an OpenCV image header (no copy is made)
IplImage *img = cvCreateImageHeader(cvSize(640, 480), IPL_DEPTH_8U, 3);
cvSetData(img, img_data, 640 * 3);
/* Do some image processing */

/* Push the image to appsrc */
memcpy(GST_BUFFER_DATA(buffer), img_data, GST_BUFFER_SIZE(buffer));
gst_app_src_push_buffer(GST_APP_SRC(gst_bin_get_by_name(GST_BIN(pipe_salida), salida_name)), buffer);



Thanks in advance

Matías Hernandez Arellano
Software/Project Engineer at VisionLabs S.A
CDA Archlinux-CL
www.msdark.archlinux.cl





Re: Gstreamer AppSink & AppSrc with Threads

Tim-Philipp Müller
On Mon, 2011-04-04 at 22:32 -0400, Matias Hernandez Arellano wrote:

> (Sorry for my English.)
>
> I have an application with two pipelines: one ending in an appsink and one starting with an appsrc.
>
> The appsink pipeline captures frames from the webcam.
> I then pull frames from that pipeline, do some image processing on them (with OpenCV), and push the result into the appsrc pipeline, which handles the streaming part.
>
> The problem is that I lose a lot of data (on the command line I get GStreamer messages telling me that many buffers are being dropped) and the application doesn't behave normally.
>
> So I'm wondering: would it be possible to do this with threads?

I don't think more threads will help in this case. The problem is more
likely that the timing and timing information is screwed up in the
second pipeline (e.g. appsrc not reporting latency, not operating in
TIME format etc.).
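
(For illustration only, a minimal sketch of what configuring the appsrc timing could look like, reusing pipe_salida and salida_name from the original post; the property names come from the 0.10 appsrc/basesrc API and the latency values are just placeholders.)

#include <gst/app/gstappsrc.h>

/* Sketch: make the appsrc operate in TIME format, mark it live and
 * let it timestamp buffers as they are pushed. */
GstElement *src = gst_bin_get_by_name (GST_BIN (pipe_salida), salida_name);

g_object_set (src,
    "format", GST_FORMAT_TIME,   /* buffers are handled in TIME format */
    "is-live", TRUE,             /* data comes in live from the webcam */
    "do-timestamp", TRUE,        /* stamp buffers on gst_app_src_push_buffer() */
    NULL);

/* Optionally report some latency (example value: one frame at 30 fps). */
gst_app_src_set_latency (GST_APP_SRC (src), 0, GST_SECOND / 30);

gst_object_unref (src);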

The easiest way to fix this is to use glimagesink sync=false (which
should be fine in this case, since the source syncs against the clock
already anyway).
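
(Applied to the output pipeline from the original post, that would just be the same launch string with sync=false added on the sink, as sketched here.)

sprintf(salida_str,
        "appsrc name=\"%s\" ! queue ! videoparse format=14 width=640 height=480 "
        "! videorate ! videoscale ! ffmpegcolorspace "
        "! video/x-raw-rgb,width=640,height=480 ! glimagesink sync=false",
        salida_name);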

An even better approach would be to write a little plugin that does the
opencv stuff for you, then you only need one pipeline. It shouldn't be
that hard, and your plugin can be within your application binary
('static plugin'). Look at the existing opencv plugins in -bad for
inspiration.
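
(To illustrate the 'static plugin' idea, a rough sketch against the 0.10 API. The element "myopencvfilter" and its GST_TYPE_MY_OPENCV_FILTER type are hypothetical placeholders for an element you would still have to write, e.g. as a GstBaseTransform subclass like the opencv elements in -bad.)

/* Hypothetical plugin_init: registers an element type (not shown here)
 * that would do the OpenCV processing inside the pipeline. */
static gboolean
plugin_init (GstPlugin * plugin)
{
  return gst_element_register (plugin, "myopencvfilter",
      GST_RANK_NONE, GST_TYPE_MY_OPENCV_FILTER);
}

/* Call once before building the pipeline; the plugin then lives inside
 * the application binary, no external .so needed. */
gst_plugin_register_static (GST_VERSION_MAJOR, GST_VERSION_MINOR,
    "myopencvfilter", "OpenCV processing used by this app",
    plugin_init, "0.1", "LGPL", "myapp", "myapp", "http://example.org");

With that in place, a single pipeline along the lines of
"qtkitvideosrc ! ... ! myopencvfilter ! ... ! glimagesink"
replaces the appsink/appsrc pair.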

Cheers
 -Tim

