Hi,
In this scenario, a live stream from an IP camera is displayed on an HDMI TV using an embedded system. OS = Linux - CPU = i.MX6 ARM - GStreamer = 0.10.36. The problem is that the decoder is dropping buffers because of a timestamping problem, so the video is not smooth: what plays is a broken sequence of frames, as if only the I-frames were being displayed and everything else dropped.
I'm using ffmpeg to demux the stream, which gives me an ffmpeg packet; then I memcpy the packet data into a GST_BUFFER. Appsrc is used to inject the buffer data into the GStreamer pipeline. Here is my pipeline:
app-source | typefinder | vpu-decoder | mfw_v4lsink-sink

This is the need_data signal callback:

    {
        av_read_frame (fctx, &packet);

        buffer = gst_buffer_new_and_alloc (packet.size);
        memcpy (GST_BUFFER_MALLOCDATA (buffer), packet.data, packet.size);
        /* note: no timestamp is set on the buffer before it is pushed */
        ret = gst_app_src_push_buffer (ctx->appsrc, buffer);

        av_free_packet (&packet);
        return TRUE;
    }

Debug messages:

0:00:16.732445169  1660 0x3465baf0 INFO  vpudec vpudec.c:1512:gst_vpudec_chain: Got no disp information!!
0:00:16.732541335  1660 0x3465baf0 INFO  vpudec vpudec.c:1551:gst_vpudec_chain: Got not enough input message!!
0:00:16.762768835  1660 0x3465baf0 WARN  basesink gstbasesink.c:2875:gst_base_sink_is_too_late:<video-sink> warning: A lot of buffers are being dropped.
0:00:16.762866002  1660 0x3465baf0 WARN  basesink gstbasesink.c:2875:gst_base_sink_is_too_late:<video-sink> warning: There may be a timestamping problem, or this computer is too slow.

My question is: how do I extract the PTS and pass this information to the decoder?

Thanks,
Tarek
On Wed, 2012-12-12 at 12:05 +0000, Tarek El-Sherbiny wrote:
> Hi,
> [...]
> My question is how do I extract the PTS and pass this information to
> the decoder?

I don't know much about the elements involved, but: try setting the "do-timestamp" property on appsrc to TRUE. Maybe also set "format" to GST_FORMAT_TIME, "is-live" to TRUE, and "max-latency" to something suitable.

You can set "sync" to FALSE on the sink to test whether that's likely to be the problem (that's not an actual solution, though, just for diagnostics).

Cheers
-Tim
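P.S. Something along these lines, untested and off the top of my head (appsrc 0.10 property names):

    /* configure appsrc to timestamp incoming buffers itself */
    g_object_set (G_OBJECT (appsrc),
        "is-live", TRUE,
        "do-timestamp", TRUE,
        "format", GST_FORMAT_TIME,
        NULL);

If you would rather carry the stream's own PTS through instead, you could stamp each buffer yourself before pushing it, roughly like this (assumes `stream` is the AVStream the packet came from and that packet.pts is valid, i.e. not AV_NOPTS_VALUE):

    /* rescale the ffmpeg pts (in stream->time_base units) to nanoseconds */
    AVRational ns_base = { 1, 1000000000 };   /* GStreamer time is in ns */
    GST_BUFFER_TIMESTAMP (buffer) =
        av_rescale_q (packet.pts, stream->time_base, ns_base);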
The do-timestamp property definitely helped. I'm not sure what max-latency should be, so I left it unset.
One more question: should I set "block" to FALSE?

Thanks for your help
In reply to this post by Tarek El-Sherbiny
Hello,
I'm having the same issue. What kind of source do you use? Is it udpsrc? I use rtspsrc for the camera, and it has no do-timestamp property. I'm looking forward to your solution.

Thanks
Maciek
Hi Maciek,
I'm using FFMPEG to connect to the camera and demux the packets, then I inject the packets into the GStreamer decoder using the appsrc element. It's a weird method and I don't recommend anyone use it; I was forced to.
I had to use the ffmpeg demuxer because the stream is in a proprietary format, and I had to use GStreamer because the silicon provider only ships the accelerated decoder as a GStreamer plug-in! Hope that helps.
Thanks,
Tarek
Hello Tarek,
Well, Freescale also provides a codec pack with drivers for their VPU, which are then used in the GStreamer plugin. But GStreamer is OK to use, I think. My problem is that when I record video from the RTSP camera, no frames are dropped; but when I use the VPU, it drops a lot of frames due to timestamp problems, so I get a lot of artefacts on playback, and I don't know why, because the timestamps coming out of rtph264depay are set. Do you use the ffmpeg provided with GStreamer? Do you have to set proper caps on the src pad of your appsrc? Maybe you can share some sample code showing how you use ffmpeg to grab data from the camera? This issue is driving me crazy.

Thanks
Maciek
Hi Maciek,
Have you checked Freescale's forum? You might find some useful info there.
As for me, I'm not using the ffmpeg from GStreamer; I'm using a locally modified version from libav.org. To connect to a camera I simply call:
    av_open_input_file (&fctx, filename, NULL, 0, NULL);
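For context, the whole grab loop is roughly this (a from-memory sketch against the old libav API in my tree; error handling omitted, and `push_to_appsrc` is a hypothetical helper that wraps the gst_app_src_push_buffer code I posted earlier):

    #include <libavformat/avformat.h>

    AVFormatContext *fctx = NULL;
    AVPacket packet;
    int video_index = -1;
    int i;

    av_register_all ();

    /* open the camera URL (our proprietary demuxer is patched in locally) */
    av_open_input_file (&fctx, filename, NULL, 0, NULL);
    av_find_stream_info (fctx);

    /* find the video stream */
    for (i = 0; i < fctx->nb_streams; i++)
      if (fctx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO)
        video_index = i;

    /* read packets and hand the video ones to appsrc */
    while (av_read_frame (fctx, &packet) >= 0) {
      if (packet.stream_index == video_index)
        push_to_appsrc (&packet);
      av_free_packet (&packet);
    }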
I tried to use the VPU libs directly, but there were so many errors, like delays and disconnections; GStreamer is more reliable for some reason. The main GStreamer config:
    g_object_set (G_OBJECT (ctx->appsrc),
        "is-live", TRUE,
        "block", FALSE,
        "do-timestamp", TRUE,
        "format", GST_FORMAT_TIME,
        NULL);
That works with mfw_v4lsink, but it is not really what I want. I still need to figure out how to use mfw_isink to utilise its scaling and overlay features. If you have more info on this, that would be great.
Thanks,
Tarek
Hey,
I use mfw_isink to play back video in a GTK window. Scaling is no problem because it's done internally: I just link mfw_vpudecoder with mfw_isink and the size is adjusted to the GTK window. Here is the part of the pipeline bus handler that attaches mfw_isink to the GtkWidget:

    case GST_MESSAGE_ELEMENT:
    {
        g_print ("Message: %s from: %s\n", message_name, name_guard.get ());
        /* the video sink asks for a window handle to render into */
        if (gst_structure_has_name (msg->structure, "prepare-xwindow-id")) {
            if (window != 0) {
                GdkWindow *w = gtk_widget_get_window ((GtkWidget *) window);
                gst_x_overlay_set_window_handle (
                    GST_X_OVERLAY (GST_MESSAGE_SRC (msg)),
                    GDK_WINDOW_XID (w));
            }
        }
        break;
    }

Regards
Maciek
I didn't get the pipeline; can you please post it again?
Also, are you using i.MX5 or i.MX6? I understand that the drivers are not the same.

Thanks,
Tarek
In reply to this post by bamboosso
Hey again,
Could you run a test for me: a pipeline with the rtspsrc element for your camera? If you see a lot of artefacts on screen, it means it is not only my problem :>

    export VSALPHA=1    # needed to see the video on the framebuffer
    gst-launch rtspsrc location=rtsp://yourcamera ! gstrtpjitterbuffer ! rtph264depay ! h264parse ! mfw_vpudecoder ! mfw_isink

Thanks
Maciek
In reply to this post by Tarek El-Sherbiny
I use the i.MX53, but I don't think they made very relevant changes in how the VPU works; the drivers will be different, though.
My pipeline is:

    gst-launch rtspsrc buffer-mode=0 latency=100 location=rtsp://admin:4321@192.168.0.140:554/profile5/media.smp ! gstrtpjitterbuffer drop-on-latency=false latency=200 do-lost=true ! rtph264depay ! legacyh264parse access-unit=true output-format=0 ! mfw_vpudecoder profiling=true ! mfw_isink max-lateness=-1

Maciek
In reply to this post by bamboosso
Hey,
I don't have rtspsrc installed on my target. I will try to install it and give it a go. Do you know which package provides rtspsrc?
T
gstreamer-plugins-good
I have 0.10.31.

Maciek
Hey,
I have installed the RTSP plugin, but my camera is not RTSP!
So I don't think rtspsrc will take that. One suggestion: can you try mfw_v4lsink instead of mfw_isink? According to : it performs better.

T
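P.S. I mean your earlier test line with just the sink swapped, something like (untested on my side):

    gst-launch rtspsrc location=rtsp://yourcamera ! gstrtpjitterbuffer ! rtph264depay ! h264parse ! mfw_vpudecoder ! mfw_v4lsink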
Hey,
It doesn't matter. The problem is between mfw_vpudecoder and rtspsrc. When I use:

    rtspsrc -> file                                  OK
    file -> vpu (with a file recorded previously)    OK
    rtspsrc -> vpu                                   BAD......

Well, it's even more weird:

    rtspsrc -> vpu -> videoscale -> filesink                 PERFECT
    rtspsrc -> vpu -> videoscale -> mfw_isink/mfw_v4lsink    EVEN WORSE

And it complains about a timestamping problem... rtspsrc is not based on the base source class, so it has no do-timestamp. I hope the solution you use will help, and I hope libav will be able to read RTSP without problems. I'm trying to compile libav, but after compilation I get undefined references to av_open_input_file, avformat_open_input, and so on. What ./configure flags do you use to compile?

Maciek
Hey,
First of all, thanks very much Tarek; you helped me a lot. Unfortunately, this solution with data frames injected into the pipeline from ffmpeg does not work for me. I think there is some issue with communication between the VPU and GPU when frames from the camera don't come in at a specific framerate. My camera is not sending them at exactly 30 fps; actually it's 33.33333 fps, and sometimes one frame takes 16 ms, sometimes 30. I've tried to buffer frames and send them at exactly 30 fps, like when playing from a file, but I've failed for now. I'll have to look at how timestamps are parsed in mp4mux and maybe do the same. Anyway, your solution with appsrc gave me an idea for a better recording branch in my pipeline.

Regards
Maciek
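P.S. For reference, this is roughly what I was trying before pushing each buffer into appsrc: stamping a fixed 30 fps timeline onto the buffers (untested sketch; `frame_count` is just an illustrative counter kept in my source context):

    /* force a constant 30 fps timeline regardless of camera jitter */
    GST_BUFFER_TIMESTAMP (buffer) =
        gst_util_uint64_scale (frame_count, GST_SECOND, 30);
    GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (1, GST_SECOND, 30);
    frame_count++;

    ret = gst_app_src_push_buffer (appsrc, buffer);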
In reply to this post by Tarek El-Sherbiny
Hi,
The Freescale GStreamer decoder elements are known to have atrociously bad timestamp management (read: the correlation between the timestamps provided to the decoder and the timestamps coming out is completely nuts). There's not much you can do until Freescale (or someone else) fixes that.

The issue could be mitigated by basing those plugins on the GstVideoDecoder base class (which has proper timestamp handling) ... and hoping the VPU firmware no longer has issues regarding frame corruption.

Another option is using the gst-ffmpeg decoder plugins (but then you won't have hardware acceleration).

Edward

On Wed, 2012-12-12 at 12:05 +0000, Tarek El-Sherbiny wrote:
> [...]
> My question is how do I extract the PTS and pass this information to
> the decoder?
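P.S. For the software-decoder route, something along the lines of Maciek's test pipeline with ffdec_h264 swapped in should do (untested sketch; ffdec_h264 comes from gst-ffmpeg):

    gst-launch rtspsrc location=rtsp://yourcamera ! gstrtpjitterbuffer ! rtph264depay ! h264parse ! ffdec_h264 ! ffmpegcolorspace ! autovideosink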