Hi GstDev Community
I'm trying to write the following pipeline in C++ code:

```
gst-launch-1.0.exe rtspsrc location=rtsp://localhost:554/test ! rtph265depay ! avdec_h265 ! autovideosink
```

On the other side is a Live555-based RTSP server that works with this pipeline and with VLC.

First, why do I try this in code? I want to replace the autovideosink with an appsink and render the image into a texture.

Unfortunately I cannot link the rtspsrc with the rest of the pipeline. I found out pretty fast that there is no src pad on the rtspsrc to link against; it has to be created dynamically (tutorial source: <https://gstreamer.freedesktop.org/documentation/tutorials/basic/dynamic-pipelines.html>). The problem now is that the padAddedHandler I registered on the "pad-added" signal never gets called... I even found a similar problem, but I still cannot get it running: <http://gstreamer-devel.966125.n4.nabble.com/No-quot-pad-added-quot-signal-in-rtspsrc-td4670967.html>

The whole thing will be a native DLL which gets called via P/Invoke from C#; that's the reason why I use an init()/run() pair of methods.
Code:

cpp file:

```cpp
#include "RTSPClient.h"

RTSPClient::RTSPClient() { }

RTSPClient::~RTSPClient() { close(); }

bool RTSPClient::init()
{
    const gchar *nano_str;
    guint major, minor, micro, nano;

    gst_init(NULL, NULL);

    // - version info
    gst_version(&major, &minor, &micro, &nano);
    if (nano == 1)
        nano_str = "(CVS)";
    else if (nano == 2)
        nano_str = "(Prerelease)";
    else
        nano_str = "";
    cout << "This program is linked against GStreamer "
         << major << "." << minor << "." << micro << " " << nano_str << endl;

    source = gst_element_factory_make("rtspsrc", "source");
    if (!source) { TRACE("failed to create the rtsp source element"); return false; }
    g_object_set(G_OBJECT(source), "location", "rtsp://localhost:554/test", NULL);
    TRACE("src location set to rtsp://localhost:554/test");

    rtpdepay = gst_element_factory_make("rtph265depay", "depay");
    if (!rtpdepay) { TRACE("failed to create the rtpdepay element"); return false; }

    decoder = gst_element_factory_make("avdec_h265", "decoder");
    if (!decoder) { TRACE("failed to create the h265 decoder element"); return false; }

    appsink = gst_element_factory_make("appsink", "sink");
    if (!appsink) { TRACE("failed to create the appsink element"); return false; }

    pipeline = gst_pipeline_new("decoding-pipeline");
    if (!pipeline) { TRACE("failed to create the decoding pipeline element"); return false; }
    TRACE("all libs were successfully loaded");

    gst_bin_add_many(GST_BIN(pipeline), source, rtpdepay, decoder, appsink, NULL);
    if (!gst_element_link_many(rtpdepay, decoder, appsink, NULL)) {
        TRACE("element linking failed");
        return false;
    }

    // source callbacks
    if (g_signal_connect(source, "pad-added", G_CALLBACK(RTSPClient::padAddedHandler), rtpdepay) > 0) {
        TRACE("pad-added callback installed");
    } else {
        TRACE("pad-added callback failed");
        return false;
    }
    if (g_signal_connect(source, "on-sdp", G_CALLBACK(RTSPClient::onSDPHandler), this)) {
        TRACE("on-sdp callback installed");
    } else {
        TRACE("on-sdp callback failed");
        return false;
    }
    if (g_signal_connect(source, "new-manager", G_CALLBACK(RTSPClient::newManagerHandler), NULL)) {
        TRACE("new manager callback installed");
    } else {
        TRACE("new manager callback failed");
        return false;
    }

    // sink callbacks
    sinkCallback = new GstAppSinkCallbacks;
    sinkCallback->new_sample = RTSPClient::onNewSampleHandler;
    // + register buffer-destroyed callback --> last param
    gst_app_sink_set_callbacks((GstAppSink*)appsink, sinkCallback, this, NULL);
    TRACE("registered appsink onNewSample callback");

    gst_element_set_state(pipeline, GST_STATE_READY);

    gstreamerLoop = g_main_loop_new(NULL, FALSE);
    if (!gstreamerLoop) { TRACE("could not create gstreamer loop"); return false; }
    return true;
}

void RTSPClient::newManagerHandler(GstElement *rtspsrc, GstElement *manager, void *udata)
{
    TRACE("new manager created");
}

void RTSPClient::padAddedHandler(GstElement *src, GstPad *new_pad, void *rtpdepay)
{
    TRACE("Pad Added");
    GstPad *sink_pad = gst_element_get_static_pad((GstElement*)rtpdepay, "sink");
    GstPadLinkReturn ret;
    GstCaps *new_pad_caps = NULL;
    GstStructure *new_pad_struct = NULL;
    const gchar *new_pad_type = NULL;

    g_print("Received new pad '%s' from '%s':\n", GST_PAD_NAME(new_pad), GST_ELEMENT_NAME(src));

    /* If our depayloader is already linked, we have nothing to do here */
    if (gst_pad_is_linked(sink_pad)) {
        g_print("We are already linked. Ignoring.\n");
        goto exit;
    }

    /* Check the new pad's type. Note: rtspsrc pads carry application/x-rtp
     * caps, not raw media, so the "audio/x-raw" check copied from the
     * dynamic-pipelines tutorial would always reject them. */
    new_pad_caps = gst_pad_get_current_caps(new_pad);
    new_pad_struct = gst_caps_get_structure(new_pad_caps, 0);
    new_pad_type = gst_structure_get_name(new_pad_struct);
    if (!g_str_has_prefix(new_pad_type, "application/x-rtp")) {
        g_print("It has type '%s' which is not RTP. Ignoring.\n", new_pad_type);
        goto exit;
    }

    /* Attempt the link */
    ret = gst_pad_link(new_pad, sink_pad);
    if (GST_PAD_LINK_FAILED(ret)) {
        g_print("Type is '%s' but link failed.\n", new_pad_type);
    } else {
        g_print("Link succeeded (type '%s').\n", new_pad_type);
    }

exit:
    /* Unreference the new pad's caps, if we got them */
    if (new_pad_caps != NULL)
        gst_caps_unref(new_pad_caps);
    /* Unreference the sink pad */
    gst_object_unref(sink_pad);
}

void RTSPClient::onSDPHandler(GstElement *src, GstSDPMessage *sdp, void *instance)
{
    TRACE("SDP received");
    GstPad *src_pad = gst_element_get_static_pad(src, "src");
    if (!src_pad) {
        TRACE("there is no static src pad");
        return;
    }
    TRACE("TODO connect here??");
    gst_object_unref(src_pad);
}

GstFlowReturn RTSPClient::onNewSampleHandler(GstAppSink *aSink, void *instance)
{
    TRACE("New data arrived @appsink");

    /* Note: the original code had an early `return GST_FLOW_OK;` here,
     * which made everything below dead code. */
    auto sample = gst_app_sink_pull_sample(aSink);
    if (sample) {
        // make all fields global static --> better performance
        GstBuffer *buffer;
        GstCaps *caps;
        GstStructure *s;
        gboolean res;
        int width;
        int height;
        GstMapInfo map;

        /* Get the snapshot buffer format now. We set the caps on the appsink
         * so that it can only be an RGB buffer. The only thing we have not
         * specified on the caps is the height, which is dependent on the
         * pixel-aspect-ratio of the source material. */
        caps = gst_sample_get_caps(sample);
        if (!caps) {
            TRACE("could not get snapshot format\n");
            gst_sample_unref(sample);
            return GST_FLOW_ERROR;
        }
        s = gst_caps_get_structure(caps, 0);

        /* we need to get the final caps on the buffer to get the size */
        res = gst_structure_get_int(s, "width", &width);
        res |= gst_structure_get_int(s, "height", &height);
        if (!res) {
            TRACE("could not get snapshot dimension\n");
            gst_sample_unref(sample);
            return GST_FLOW_ERROR;
        }
        ((RTSPClient*)instance)->setWidth(width);
        ((RTSPClient*)instance)->setHeight(height);

        /* create pixmap from buffer and save; gstreamer video buffers have a
         * stride that is rounded up to the nearest multiple of 4 */
        buffer = gst_sample_get_buffer(sample);
        gst_buffer_map(buffer, &map, GST_MAP_READ);
        /* pixbuf = gdk_pixbuf_new_from_data(map.data, GDK_COLORSPACE_RGB, FALSE, 8,
               width, height, GST_ROUND_UP_4(width * 3), NULL, NULL); */
        /* save the pixbuf */
        // gdk_pixbuf_save(pixbuf, "snapshot.png", "png", &error, NULL);
        gst_buffer_unmap(buffer, &map);
        gst_sample_unref(sample);
    } else {
        g_print("could not make snapshot\n");
    }
    // + set dimensions
    // + call fillImgBuffer here --> buffer needs to be set
    return GST_FLOW_OK;
}

void RTSPClient::setWidth(int width)
{
    w = width;
    TRACE("set width to:");
    TRACE(w);
}

void RTSPClient::setHeight(int height)
{
    h = height;
    TRACE("set height to:");
    TRACE(h);
}

void RTSPClient::close()
{
    delete sinkCallback;
}

void RTSPClient::run()
{
    TRACE("start pipeline");
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    // this is a blocking call
    g_main_loop_run(gstreamerLoop);
    gst_element_set_state(pipeline, GST_STATE_NULL);
}
```

h file:

```cpp
#pragma once
#include "header.h"
#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <gst/rtsp/rtsp.h>

class RTSPClient {
    GstElement *source;
    GstElement *rtpdepay;
    GstElement *decoder;
    GstElement *appsink;
    GstElement *pipeline;
    GMainLoop  *gstreamerLoop;
    GstAppSinkCallbacks *sinkCallback;
    int w;
    int h;

    void close();
    static void newManagerHandler(GstElement *rtspsrc, GstElement *manager, void *udata);
    static void padAddedHandler(GstElement *src, GstPad *new_pad, void *rtpdepay);
    static void onSDPHandler(GstElement *src, GstSDPMessage *sdp, void *instance);
    static GstFlowReturn onNewSampleHandler(GstAppSink *aSink, void *instance);

public:
    RTSPClient();
    ~RTSPClient();
    bool init();
    void run();
    void setDimPtr(int *w, int *h);
    void fillImgBuff(byte *buff);
    void setWidth(int width);
    void setHeight(int height);
};
```

P.S. I know the code still contains a lot of leaks, and g_free is never called once ^^

--
Sent from: http://gstreamer-devel.966125.n4.nabble.com/
_______________________________________________
gstreamer-devel mailing list
gstreamer-devel@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
I had to change my message, the code was removed...?
In reply to this post by fischra2
Hi fischra,
RTSP stream decoding is slightly complicated: it involves SSRC management and jitter correction. The best option would be to use uridecodebin. This autoplugging element does everything for you. Alternatively, use a uridecodebin pipeline dot diagram as a reference for your application.
Hi Vinod
Thanks for the hint! I just tested this pipeline with gst-launch:

```
gst-launch-1.0.exe uridecodebin uri=rtsp://localhost:554/test ! videoconvert ! autovideosink
```

Unfortunately I get wrong colors there: I see the picture, but the background color of everything is green. I guess the color information of the YUV image is wrong; it looks like both the U and V channels are completely 0x00.

If I try to give it the right decoder explicitly, I get an error:

```
gst-launch-1.0.exe uridecodebin uri=rtsp://localhost:554/test ! avdec_h265 ! autovideosink

WARNING: from element /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0: Delayed linking failed.
Additional debug info:
./grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0:
failed delayed linking some pad of GstURIDecodeBin named uridecodebin0 to some pad of avdec_h265 named avdec_h265-0
Redistribute latency...
```

Guess this makes sense, since uridecodebin already outputs decoded raw video and cannot feed a second decoder. :)

I'm still not sure why my first idea doesn't work. Like I said before, this pipeline works perfectly:

```
gst-launch-1.0.exe rtspsrc location=rtsp://localhost:554/test ! rtph265depay ! avdec_h265 ! autovideosink
```

This should also work in code, right? I'm sure it is a small detail I have not found yet :) Anyone having an idea?
Found the reason! I had not loaded all the needed libs... :/
These are also needed:

- libgstcoreelements.dll
- libgstudp.dll
- libgsttcp.dll
- libgstrtpmanager.dll

--> with all their dependencies ;)