Hello everybody,
I use GStreamer Editing Services for simple operations such as cutting parts of an audio-video clip. I noticed that if I insert a high-resolution video (3840 x 2160) and play it, the memory used by the application grows to 1.5 GB and the CPU rises to 80-95%, and it stays there for as long as playback continues. Is there any way to reduce the CPU/memory usage to a reasonable level? I use GStreamer 1.16.1 — do you think the newer versions are better optimized?

The code I use is not much different from this:

#include <stdlib.h>
#include <ges/ges.h>

int main(int argc, char **argv)
{
  GESPipeline *pipeline;
  GESTimeline *timeline;
  GESTrack *tracka, *trackv;
  GESLayer *layer;
  GMainLoop *mainloop;

  gst_init(&argc, &argv);
  ges_init(); /* Initialize the GStreamer Editing Services */

  timeline = ges_timeline_new(); /* This is our main GESTimeline */
  tracka = GES_TRACK(ges_audio_track_new());
  trackv = GES_TRACK(ges_video_track_new());
  layer = ges_layer_new(); /* We are only going to be doing one layer of clips */

  /* Add the tracks and the layer to the timeline */
  if (!ges_timeline_add_layer(timeline, layer))
    return -1;
  if (!ges_timeline_add_track(timeline, tracka))
    return -1;
  if (!ges_timeline_add_track(timeline, trackv))
    return -1;

  /* Here we've finished initializing our timeline, we're
   * ready to start using it... by solely working with the layer! */
  gchar *uri1 = gst_filename_to_uri("C:\\Users\\neluc\\Documents\\KSV1\\Roger Federer.mp4", NULL);
  GESUriClip *src1 = ges_uri_clip_new(uri1);
  g_assert(src1);
  g_free(uri1);
  ges_layer_add_clip(layer, (GESClip *) src1);

  pipeline = ges_pipeline_new();
  if (!ges_pipeline_set_timeline(pipeline, timeline)) /* Add the timeline to that pipeline */
    return -1;

  /* Set the pipeline to playing and run the main loop */
  gst_element_set_state(GST_ELEMENT(pipeline), GST_STATE_PLAYING);
  mainloop = g_main_loop_new(NULL, FALSE);
  g_main_loop_run(mainloop);
  return 0;
}

You can download the video I tested with from this address:
www.videosurgeon.net/Roger_Federer.mp4 <https://www.videosurgeon.net/Roger_Federer.mp4>

--
Sent from: http://gstreamer-devel.966125.n4.nabble.com/

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
You might try upgrading to GStreamer 1.18 (the MSVC build). A lot of work went into getting support for hardware encoding/decoding on Windows. FYI, I don't think these changes are supported in MinGW builds, so you can't use MSYS2 if you currently are. Otherwise, I've got nothing.
Thanks for the reply.
I will try with the latest version then. I'm using MSVC.
Hello everybody,
I did some tests with the latest version of GStreamer:

- If I use playbin for a UHD video:

    pipeline = gst_parse_launch("playbin uri=\"file:///C:/Users/neluc/RogerFederer.mp4\"", NULL);
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

  it clearly does the video decoding with hardware acceleration; the processor sits at 2-3%.

- If I play the same video but inserted into the timeline (see the code in a previous message), the video decoding appears to be done in software; the processor sits at 65%.

Does anyone know whether GStreamer Editing Services uses accelerated video decoding? Is there any way to get accelerated video decoding when playing video from the timeline?

Thanks,
Nelu