how to use desktop nvidia GPU gstreamer encoding {omxh264enc}

how to use desktop nvidia GPU gstreamer encoding {omxh264enc}

devil coder
Hi Experts,

I have GStreamer GPU encoding working on a Jetson TX2, but now I am
trying to do desktop GPU encoding with GStreamer.

However, "omxh264enc" does not appear in the elements listed by
gst-inspect-1.0.exe.

I want to do hardware encoding on the desktop GPU using GStreamer.

Kindly point me in the right direction, or share a sample command line
or program for desktop GPU encoding.
Note: I am trying this on a laptop GPU.

Regards,
newbaby




Re: how to use desktop nvidia GPU gstreamer encoding {omxh264enc}

Matthew Waters
On 16/05/18 16:38, aasim wrote:

> Hi Experts,
>
> I have GStreamer GPU encoding working on a Jetson TX2, but now I am
> trying to do desktop GPU encoding with GStreamer.
>
> However, "omxh264enc" does not appear in the elements listed by
> gst-inspect-1.0.exe.
>
> I want to do hardware encoding on the desktop GPU using GStreamer.
>
> Kindly point me in the right direction, or share a sample command line
> or program for desktop GPU encoding.
> Note: I am trying this on a laptop GPU.
You would want to try out the nvenc elements available in
https://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/sys/nvenc.

It requires downloading NVIDIA's Video Codec SDK and setting up the
necessary include/library paths manually.
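
Untested, but once the plugin is built, a minimal command line along
these lines should exercise the hardware H.264 encoder (the element
there is named nvh264enc; supported caps and properties are best
confirmed with gst-inspect-1.0):

gst-launch-1.0 videotestsrc num-buffers=300 ! video/x-raw,format=I420,width=1280,height=720 ! nvh264enc ! h264parse ! mpegtsmux ! filesink location=out.ts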

Cheers
-Matt

> Regards,
> newbaby


Re: how to use desktop nvidia GPU gstreamer encoding {omxh264enc}

devil coder
Hi Matt,
Thanks for the quick reply.
I have one more quick question: I will install the suggested SDK and CUDA
libraries and set the necessary paths (as I did for the Jetson TX2).
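
Once the SDK and paths are in place and the plugin builds, I assume a
quick check such as the following should list its encoder elements
(assuming the plugin registers under the name nvenc, as in
gst-plugins-bad/sys/nvenc):

gst-inspect-1.0.exe nvenc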


I am using GStreamer API code; is this the correct approach?

I still have to modify the code to read from a buffer and to add audio
encoding support.

Below is my code:




#define _CRT_SECURE_NO_WARNINGS 1
//#pragma warning(disable:4996)
#include <gst/gst.h>
#include <gst/audio/audio.h>
#include <gst/app/gstappsrc.h>
#include <gst/base/gstpushsrc.h>
#include <gst/app/gstappsink.h>
#include <gst/video/video.h>
#include <gst/video/gstvideometa.h>
#include <gst/video/video-overlay-composition.h>

#include <string.h>
#include <stdio.h>

#include <fstream>
#include <iostream>

using namespace std;

GstElement *SinkBuff;
char *out_file_path;
FILE *out_file;

// Reference command line used earlier:
// gst-launch-1.0.exe -v filesrc location=Transformers1080p.yuv blocksize=4147200 !
//   videoconvert ! video/x-raw,format=I420,width=1920,height=1080,framerate=60/1 !
//   openh264enc ! mpegtsmux ! filesink location=final.ts

// Pipeline used on the Jetson TX2 (audio + video muxed to FLV):
// sprintf(launch_string_,
//     "alsasrc device=plughw:0 do-timestamp=true format=3 is-live=true ! "
//     "queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 min-threshold-time=0 ! audioconvert ! "
//     "queue ! audio/x-raw, format=S16LE, rate=44100, channels=2 ! "
//     "queue ! voaacenc ! "            // lamemp3enc / voaacenc
//     "queue ! mux. "
//     "appsrc name=Transformers1080p.yuv do-timestamp=true format=3 ! "
//     "queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 min-threshold-time=0 ! "
//     "video/x-h264, stream-format=byte-stream, width=1920, height=1080 ! "
//     "queue ! h264parse ! "
//     "queue ! flvmux name=mux ! "     // rtph264pay / mpegtsmux / flvmux
//     "queue ! filesink location=test_file.flv ");

// Earlier variant of the pipeline built below:
// " filesrc location=Transformers1080p.yuv blocksize=4147200 !"
// " videoparse width=1920 height=1080 framerate=60/1 !"
// " videoconvert ! video/x-raw,format=I420,width=1920,height=1080,framerate=60/1 ! "
// " x264enc bitrate=8000 !"
// " queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 min-threshold-time=0 !"
// " video/x-h264, stream-format=byte-stream ! mpegtsmux ! appsink name=sink");



static gboolean bus_call(GstBus     *bus, GstMessage *msg, gpointer    data)
{
        GMainLoop *loop = (GMainLoop *)data;

        switch (GST_MESSAGE_TYPE(msg))
        {
        case GST_MESSAGE_EOS:
                g_print("End of stream\n");
                g_main_loop_quit(loop);
                break;

        case GST_MESSAGE_ERROR:
        {
                gchar  *debug;
                GError *error;

                gst_message_parse_error(msg, &error, &debug);
                g_free(debug);

                g_printerr("Error: %s\n", error->message);
                g_error_free(error);

                g_main_loop_quit(loop);
                break;
        }
        default:
                break;
        }
        return TRUE;
}

/* called when the appsink notifies us that there is a new buffer ready for
 * processing */
GstFlowReturn on_new_sample_from_sink(GstElement *elt, void *ptr)
{
        GstBuffer *buffer;
        GstMapInfo map = { 0 };
        GstSample *sample;
        static int packindex = 0;

        printf("\n data packet [%d]\n", packindex++);
        /* pull the sample from appsink and append its data to the output file */
        g_signal_emit_by_name(SinkBuff, "pull-sample", &sample);
        if (sample)
        {
                buffer = gst_sample_get_buffer(sample);
                gst_buffer_map(buffer, &map, GST_MAP_READ);

                fwrite(map.data, 1, map.size, out_file);

                gst_buffer_unmap(buffer, &map);
                gst_sample_unref(sample);
        }
        return GST_FLOW_OK;
}


int main(int   argc, char *argv[])
{
        GMainLoop *loop;
        int width, height;

        GstElement *pipeline;
        GError *error = NULL;
        GstBus *bus;
        char pipeline_desc[1024];
        out_file = fopen("output.ts", "wb");


        /* Initialisation */
        gst_init(&argc, &argv);

        // Create gstreamer loop
        loop = g_main_loop_new(NULL, FALSE);

        sprintf(
                pipeline_desc,
                " filesrc location=Transformers1080p.yuv blocksize=4147200 !"
                " videoparse width=1920 height=1080 framerate=60/1 !"
                " videoconvert ! video/x-raw,format=I420,width=1920,height=1080,framerate=60/1 ! "
                " x264enc bitrate=8000 !"
                " queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 min-threshold-time=0 !"
                " mpegtsmux ! appsink name=sink");


        printf("pipeline: %s\n", pipeline_desc);

        /* Create gstreamer elements */
        pipeline = gst_parse_launch(pipeline_desc, &error);

        /* TODO: Handle recoverable errors. */

        if (!pipeline) {
                g_printerr("Pipeline could not be created. Exiting.\n");
                return -1;
        }

        /* get sink */
        SinkBuff = gst_bin_get_by_name(GST_BIN(pipeline), "sink");
        g_object_set(G_OBJECT(SinkBuff), "emit-signals", TRUE, "sync", FALSE, NULL);
        g_signal_connect(SinkBuff, "new-sample",
                         G_CALLBACK(on_new_sample_from_sink), NULL);


        /* Set up the pipeline */
        /* we add a message handler */
        bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
        gst_bus_add_watch(bus, bus_call, loop);
        gst_object_unref(bus);

        /* Set the pipeline to "playing" state*/
        g_print("Now playing: Transformers1080p.yuv \n");
        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        /* Iterate */
        g_print("Running...\n");
        g_main_loop_run(loop);

        /* Out of the main loop, clean up nicely */
        g_print("Returned, stopping playback\n");
        gst_element_set_state(pipeline, GST_STATE_NULL);

        g_print("Deleting pipeline\n");
        gst_object_unref(GST_OBJECT(pipeline));
        fclose(out_file);
        g_main_loop_unref(loop);


        return 0;
}
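
Once the nvenc plugin is in place, I assume the software x264enc stage in
the pipeline above can be swapped for the hardware encoder. An untested
sketch of the modified pipeline string (element name nvh264enc as built
from gst-plugins-bad/sys/nvenc; its properties should be confirmed with
gst-inspect-1.0 before use):

        /* Untested sketch: assumes the nvenc plugin built and registered
         * the nvh264enc element; check its properties with gst-inspect-1.0. */
        sprintf(
                pipeline_desc,
                " filesrc location=Transformers1080p.yuv blocksize=4147200 !"
                " videoparse width=1920 height=1080 framerate=60/1 !"
                " videoconvert ! video/x-raw,format=I420,width=1920,height=1080,framerate=60/1 ! "
                " nvh264enc !"
                " h264parse ! mpegtsmux ! appsink name=sink");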

 


   


