Use of GstAppSrc for video streaming


Use of GstAppSrc for video streaming

Lauraire
Hi,

I am facing difficulties with the use of GstAppSrc: my idea is to feed the pipeline with buffers I have already extracted from a streaming flow (I know the data, _avPacket.data, and its size, _avPacket.size).

I've tried to follow the API documentation and examples to implement my application, but I still get a green display... What am I doing wrong?

1 - I have successfully tested the pipeline from the command line.
2 - The buffer's size varies on each push: I don't know whether I need to take this into account.
3 - I defined the GstAppSrc's caps, but I am still not sure about them. The stream is MPEG-2 video in an MPEG-TS container.
4 - Could it be an incompatible pixel format?
_________________________________________

Here is the relevant part of my code:

    if( _videoCodec == CODEC_ID_MPEG2VIDEO ||
        _videoCodec == CODEC_ID_H264)
    {
        GMainLoop *loop;

        GstElement *pipeline, *appsrc, *demuxer, *decoder, *postprocess, *videosink;
        loop = g_main_loop_new (NULL, FALSE);

        /* Create gstreamer elements */
        pipeline    = gst_pipeline_new ("pipeline");
        appsrc      = gst_element_factory_make ("appsrc", "app-src");
        decoder     = gst_element_factory_make ("vdpaumpegdec", "vdpau-decoder");
        demuxer     = gst_element_factory_make ("mpegtsdemux", "mpeg-demux");
        postprocess = gst_element_factory_make ("vdpauvideopostprocess", "vdpau-video-post-process");
        videosink   = gst_element_factory_make ("vdpausink", "vdpau-sink");

        /* set the capabilities of the appsrc element */
        GstCaps *caps = gst_caps_new_simple ("video/mpeg",
                                "width", G_TYPE_INT, 720,
                                "height", G_TYPE_INT, 576,
                                "framerate", GST_TYPE_FRACTION, 25, 1,
                                "bpp", G_TYPE_INT, 16,
                                "depth", G_TYPE_INT, 16,
                                "endianness", G_TYPE_INT, G_BYTE_ORDER,
                                NULL);

        gst_app_src_set_caps (GST_APP_SRC (appsrc), caps);
        gst_caps_unref (caps);

        /* we add all elements into the pipeline */
        gst_bin_add_many (GST_BIN (pipeline), appsrc, demuxer, decoder,
                          postprocess, videosink, NULL);

        /* we link the elements together */
        gst_element_link (appsrc, demuxer);
        gst_element_link_many (decoder, postprocess, videosink, NULL);
        g_signal_connect (demuxer, "pad-added", G_CALLBACK (on_pad_added), decoder);

        /* play */
        gst_element_set_state (pipeline, GST_STATE_PLAYING);

        /* create the buffer: copy the packet data, because appsrc takes
           ownership of the pushed buffer while _avPacket is reused/freed */
        GstBuffer *buffer = gst_buffer_new_and_alloc (_avPacket.size);
        memcpy (GST_BUFFER_DATA (buffer), _avPacket.data, _avPacket.size);
        printf("BUFFER_SIZE = %d \n", _avPacket.size);

        /* push the buffer to pipeline via appsrc */
        GstFlowReturn gstFlowReturn = gst_app_src_push_buffer(GST_APP_SRC(appsrc), buffer);

        /* and loop... */
        g_main_loop_run (loop);

        /* clean up */
        gst_element_set_state (pipeline, GST_STATE_NULL);
        gst_object_unref (GST_OBJECT (pipeline));
        g_main_loop_unref (loop);
    }

________________________________________________

Thanks for your help!
Re: Use of GstAppSrc for video streaming

Lauraire
Any help...? I am still working on this application and cannot get any further.

What I have tried so far:

- 1/ Implement the application from a raw format, so that the pipeline becomes:

    appsrc -> autovideosink

The result is a display with a very bad frame rate (around 8 fps instead of the 25 sent by the server) and an error message: "Error: internal data flow error".

- 2/ Define the GstCaps as:

                caps = gst_caps_new_simple ("video/mpegts",
                                            ....);
No result.

Help!
Re: Use of GstAppSrc for video streaming

wally_bkg
In reply to this post by Lauraire
I don't have an answer for you, but your messages are listed as "pending" on Nabble, meaning nobody on the mailing list (where the real experts are) sees them.

Your Nabble and/or gstreamer-devel setup is not correct.

It'll be interesting to see if my Nabble setup still works or not. It stopped working after the list moved from SourceForge to freedesktop.



Lauraire wrote:
(see original message above)