I have a USB camera on an embedded device. I want to first save the video from the camera to a file, and later probably display the video and/or stream it via RTSP from a C/C++ application. I have a working terminal command to save a given number of frames:

```
gst-launch-1.0 v4l2src device=/dev/video1 num-buffers=900 ! image/jpeg, width=1920, height=1080, io-mode=4 ! imxvpudec ! imxvpuenc_mjpeg ! avimux ! filesink location=/some/path.avi
```

I would like to be able to save a video of undefined length (until some user input stops it). I have tried several approaches. Of course I could simply use a syscall, but the problems with that are probably not necessary to list; I will not take that route. The next simplest approach (yet probably problematic if I want to do something else with the stream in the future) is:

```c
void pipelineVideoStart() {
    if (!gst_is_initialized()) {
        qWarning() << "initializing GST";
        setenv("GST_DEBUG", ("*:" + std::to_string(3)).c_str(), 1);
        gst_init(nullptr, nullptr);
    }
    GstElement *pipeline;
    GstBus *bus;
    GstMessage *msg;
    std::string command = "v4l2src device=/dev/video1 ! image/jpeg, width=1920, height=1080, io-mode=4 ! imxvpudec ! imxvpuenc_mjpeg ! avimux ! filesink location = /some/file.avi";
    pipeline = gst_parse_launch(command.c_str(), NULL);

    /* Start playing */
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Wait until error or EOS */
    bus = gst_element_get_bus(pipeline);
    msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
        GstMessageType(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));

    /* Free resources */
    if (msg != NULL)
        gst_message_unref(msg);
    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return;
}
```

Problems I find with this approach:
1) The video is created, but with wrong metadata (0 seconds length, even though it is playable). Because... it's never successfully finished, is it? How do I stop the recording?
2) Zero control.
I also expect problems with any additional sink (appsink, RTSP streaming).

In my second approach I tried to build the pipeline myself:

```c
static GstElement *pipeline;

void gstreamerUsbCamera::selfPipelineVideo() {
    if (!gst_is_initialized()) {
        setenv("GST_DEBUG", ("*:" + std::to_string(3)).c_str(), 1);
        gst_init(nullptr, nullptr);
    }
    GstCaps *caps;
    GstStateChangeReturn ret;
    GstElement *source, *muxer, *sink;

    source = gst_element_factory_make("v4l2src", "source");
    g_object_set(source, "device", "/dev/video1", NULL);
    muxer = gst_element_factory_make("avimux", "avimux");
    sink = gst_element_factory_make("filesink", "sink");
    g_object_set(sink, "location", "/mnt/ssd/someTest.avi", NULL);

    pipeline = gst_pipeline_new("pipeline_src");
    if (!pipeline || !source || !muxer || !sink) {
        g_printerr("Not all elements could be created.\n");
        return;
    }

    caps = gst_caps_new_simple("image/jpeg",
        "width", G_TYPE_INT, 1920,
        "height", G_TYPE_INT, 1080,
        "io-mode", G_TYPE_INT, 4,
        "framerate", GST_TYPE_FRACTION, 30, 1,
        "pixel-aspect-ratio", GST_TYPE_FRACTION, 1, 1,
        "interlace-mode", G_TYPE_STRING, "progresive",
        NULL);
    GstPadTemplate *template1 = gst_element_class_get_pad_template(
        GST_ELEMENT_GET_CLASS(source), "src_%u");
    GstPad *pad = gst_element_request_pad(src, template1, "pad", caps);
    gst_caps_unref(caps);

    gst_bin_add_many(GST_BIN(pipeline), source, muxer, sink, NULL);
    if (gst_element_link_many(source, muxer, sink, NULL) != TRUE) {
        g_printerr("Elements could not be linked.\n");
        gst_object_unref(pipeline);
        return;
    }

    ret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
    if (ret == GST_STATE_CHANGE_FAILURE) {
        g_printerr("Unable to set the pipeline to the playing state.\n");
        gst_object_unref(pipeline);
        return;
    }
    // Start playing
    recording = true;
    return;
}

int endVideoPipeline(void) {
    GstMessage *message = gst_message_new_eos(nullptr);
    gst_bus_post(pipeline->bus, message);

    /* Free resources */
    if (message != NULL)
        gst_message_unref(message);
    gst_element_set_state(pipeline, GST_STATE_PAUSED);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 1;

    gst_app_src_end_of_stream(GST_APP_SRC(mGstData.appsrc));
    usleep(500000); // Important
    gst_element_set_state(mGstData.pipeline_src, GST_STATE_NULL);
    gst_object_unref(mGstData.pipeline_src);
    recording = false;
    return 0;
}
```

Problems with this: 1) wrong metadata (wrong length in VLC) and wrong format (the caps negotiation fails; I get 3000x4000@10fps instead of HD@30fps).

I also tried (at some point) to use the code from https://gist.github.com/crearo/1dc01b93b2b513e0000f183144c61b20 with some tweaks (commented out the displaying, since that is not what I want now, and changed the startRecording function to:

```c
void gstreamerUsbCamera::startRecordingS() {
    g_print("startRecording\n");
    GstPad *sinkpad;
    GstPadTemplate *templ, *temp2;
    GstCaps *caps;

    templ = gst_element_class_get_pad_template(GST_ELEMENT_GET_CLASS(tee), "src_%u");
    teepad = gst_element_request_pad(tee, templ, NULL, NULL);
    queue_record = gst_element_factory_make("queue", "queue_record");
    encoder = gst_element_factory_make("imxvpuenc", NULL);
    muxer = gst_element_factory_make("avimux", NULL);
    filesink = gst_element_factory_make("filesink", NULL);
    g_object_set(filesink, "location", "output.avi", NULL);

    caps = gst_caps_new_simple("image/jpeg",
        "width", G_TYPE_INT, 1920,
        "height", G_TYPE_INT, 1080,
        "io-mode", G_TYPE_INT, 4,
        NULL);
    temp2 = gst_element_class_get_pad_template(GST_ELEMENT_GET_CLASS(src), "src_%u");
    GstPad *pad = gst_element_request_pad(src, templ, NULL, caps);

    gst_bin_add_many(GST_BIN(pipeline), queue_record, encoder, muxer, filesink, NULL);
    gst_element_link_many(queue_record, encoder, muxer, filesink, NULL);
    gst_element_sync_state_with_parent(queue_record);
    gst_element_sync_state_with_parent(encoder);
    gst_element_sync_state_with_parent(muxer);
    gst_element_sync_state_with_parent(filesink);

    sinkpad = gst_element_get_static_pad(queue_record, "sink");
    gst_pad_link(teepad, sinkpad);
    gst_object_unref(sinkpad);
    isRecording = true;
}
```

Here I get an empty (0 B) file at the output. I think what I need is some tweak of the last two approaches. Can anyone point me to my mistakes and how to fix them? Thank you.

--
Sent from: http://gstreamer-devel.966125.n4.nabble.com/
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
It doesn't look like you're stopping the pipeline correctly, so the data is never written to the file. In your first attempt you wait for an EOS, but you never actually send one. In the second, you send an EOS but then immediately stop the pipeline, so the EOS never goes anywhere.

To stop a pipeline, you need to:
- Send an EOS event to the pipeline
- Wait for it to travel through the pipeline
- Set the pipeline state to READY or NULL
- Proceed with anything else

basic_eos.cpp <http://gstreamer-devel.966125.n4.nabble.com/file/t379531/basic_eos.cpp>

Check out this example code. It creates the pipeline, runs it, and then schedules an EOS event to be sent. When the cb_message function receives the EOS message, the EOS has reached the end of the pipeline, so it is safe to stop the pipeline. The example uses gst_parse_launch to create the pipeline, but you can switch to the manual method as needed. I used g_timeout_add_seconds() to schedule the EOS, but you can just as easily catch Ctrl+C or other keyboard events to send an EOS.