Serial port interface on frame grabber

Re: AW: AW: AW: Serial port interface on frame grabber

sk_gst
Hi James,

What you suggest is really a good idea. I will try using strace and
configuring the block size to see whether there is any improvement.
However, I did not understand the suggestion below:

"Split the problem.  Temporarily exclude the serial link by emulating
it between two GStreamer pipelines.  Locate or write a serial port
emulator which feeds a byte at a time and pauses for a very short time
in order to achieve 115200 baud throughput.  Perhaps the Ubuntu
package trickle could be used for ideas.

Measure the timing of arrival of serial data, and reproduce that same
timing in the emulator. "


Could you please explain it from a broader perspective? What do you mean
by "emulating it between two GStreamer pipelines"?

My thesis just got more complicated.  :(

Regards



--
Sent from: http://gstreamer-devel.966125.n4.nabble.com/
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: AW: AW: AW: Serial port interface on frame grabber

James Cameron
G'day vk_gst,

Sorry, I was a bit terse with my explanation.

Let me put it another way.

Using a system perspective, the video source, the aircraft GStreamer
pipeline, the wireless transmission, and the ground GStreamer pipeline
together form a system.

When you can substitute alternate parts into a system, you can
discover behaviours of other parts, or test them more fully.

So far you have tested with the system in roughly the expected
arrangement of parts, and with a USB serial adapter in place of the
wireless link.  Your results of those tests show that there is
something about the wireless transmission that causes GStreamer to act
outside your expectations.

But there is always doubt; could this be something that GStreamer
could do better?  Could you configure GStreamer better?  What is
actually happening to cause the problem?

So my suggestion was to set aside the wireless transmission, just for
the purpose of testing all the other parts together.

By emulating the wireless transmission, as a GStreamer element or a
program using named pipes or pseudo-terminal devices, you can explore
the behaviour of the rest of the system under different conditions.

Imagine a program which creates two pseudo-terminal devices, then
reads bytes from one, and writes the bytes to the other.  You could
then write a GStreamer pipeline that writes to the first device, and
write another GStreamer pipeline that reads from the second device.
It should work identically to a GStreamer pipeline over UDP.

Then extend the program to track the number of bytes over time, and
delay writing bytes by enough time to yield an effective 115200 baud
throughput.

Then extend the program still further to randomly inject the kind of
latencies that are observed on the wireless link.
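A minimal sketch of such an emulator in Python, standard library only (the
function names and the one-byte default buffer size are illustrative, not a
finished tool):

```python
import os
import time
import tty

def pacing_delay(nbytes, baud=115200):
    # 8N1 framing: each byte costs 10 bit times (start + 8 data + stop).
    return nbytes * 10.0 / baud

def make_link():
    # Two pseudo-terminal pairs; one GStreamer pipeline writes to the
    # first slave device, the other reads from the second slave device.
    master_a, slave_a = os.openpty()
    master_b, slave_b = os.openpty()
    for fd in (slave_a, slave_b):
        tty.setraw(fd)  # raw mode: no echo or line discipline
    return master_a, slave_a, master_b, slave_b

def relay_once(master_in, master_out, bufsize=1):
    # Read up to bufsize bytes from one side, pause to emulate the
    # serial throughput, then forward them to the other side.
    data = os.read(master_in, bufsize)
    time.sleep(pacing_delay(len(data)))
    os.write(master_out, data)
    return len(data)
```

The pipelines would then use filesink on os.ttyname(slave_a) and filesrc on
os.ttyname(slave_b), with the relay looping in between; injecting random
extra sleeps into relay_once would reproduce the latencies observed on the
wireless link.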

Is that any clearer?

--
James Cameron
http://quozl.netrek.org/

Re: AW: AW: AW: Serial port interface on frame grabber

sk_gst
Hi James,

Thanks for the explanation. Now it's much clearer to me.
This would take some time on my part to test and get back to you.
Meanwhile, I have also begun investigating the use of LTE to see if that is
a better solution.
I will keep you updated on my results.

Regards.




Re: AW: AW: AW: Serial port interface on frame grabber

sk_gst
In reply to this post by James Cameron
Hi James,

1. I managed to dig a bit deeper and tune the following elements: filesrc,
filesink, imxvpuenc_h264, avdec_h264. With this tuning, the USB-serial
adapter (cable), and a baud rate of 19200, I managed to stream the video
from the imx6 board to the PC. However, the latency is very high, approx.
5-6 seconds at 19200 baud. On the receiving side, the video frames look
stuck and the transition between frames is quite slow.
The latency dropped considerably when I configured the baud rate to 1 MBaud.

2. I still have to try this out on the custom modem we have here. The
maximum baud rate I can configure on the custom modem is around 1 MBaud;
however, the effective bandwidth available for video will be around
25 Kbps. So I do not think the video latency will be any better than at
19200 baud. Do you have any suggestions to speed this up?

3. I am posting the commands below which I used to get the video working
over the serial port. Maybe you can have a look and suggest whether I can
fine-tune them further.

imx6:
gst-launch-1.0 -v videotestsrc pattern=18 ! video/x-raw,width=100,height=50
! imxvpuenc_h264 bitrate=5 ! h264parse ! filesink location=/dev/ttyUSB0
blocksize=1024 max-bitrate=19000 sync=false

PC:
gst-launch-1.0 -v filesrc location=/dev/ttyUSB1 blocksize=1024 ! $CAPS !
h264parse ! avdec_h264 lowres=2 skip-frame=0 ! autovideosink sync=false


4. I was also wondering if I could add a buffer on the receiving side, so
that the video is smoother. I completely understand that buffering up video
data will give me old video frames rather than real-time video frames, but
I would still like to try it out. Does GStreamer offer any element, like a
pseudo-RAM, where I can buffer up the frames and play them after some
initial delay, so that the resulting video is smooth?






Re: AW: AW: AW: Serial port interface on frame grabber

Ian Davidson
I would try putting a queue between h264parse and avdec_h264.

A queue causes GStreamer to create a new task (thread).  Assuming that your
PC has at least a dual-core CPU, that will allow it to decode the video (on
one core) at the same time as it parses the next bit (on another core) - so
you should see the picture sooner.

Ian


On 04/07/18 12:23, vk_gst wrote:

>
> PC:
> gst-launch-1.0 -v filesrc location=/dev/ttyUSB1 blocksize=1024 ! $CAPS !
> h264parse ! avdec_h264 lowres=2 skip-frame=0 ! autovideosink sync=false
>
>
> 4. I was also wondering if I could add a buffer on the receiving side, so
> that the video is more smooth. Although I completely understand, that the
> buffering up video data will result me in getting old video frames and not
> the real time video frames. But I would just want to try out if its possible
> to do that.  Does gstreamer allow any elements like a pseudo RAM where I can
> buffer up the frames and play it after some initial delay, so that the
> resulting video is smooth?
>
>
>
>
>


Re: AW: AW: AW: Serial port interface on frame grabber

James Cameron
In reply to this post by sk_gst
G'day vk_gst,

Your test over 19200 baud wire link may have been affected by the data
exceeding the bitrate limits at the filesink imposed by max-bitrate
and the serial port.  This would cause blocks of data to be dropped,
and the decoder would hesitate.

To test for this, run the pipeline for a minute to a file, and then
measure the size, and calculate the average bitrate demand.  e.g.

timeout --signal=SIGINT 60s \
    gst-launch-1.0 -ev videotestsrc pattern=18 ! \
    video/x-raw,width=100,height=50 ! \
    imxvpuenc_h264 bitrate=5 ! h264parse ! \
    filesink location=60s.h264 && \
        echo $(( $(stat --format=%s 60s.h264) / 60 * 8))

Your test at 19200 baud will also suffer from serial transmission
latency; 1024 bytes with a start and stop bit each will take 10240 bit
times, which is 533 ms.  You'll only need ten blocks buffered to hit
the five seconds you observed.
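As a quick sanity check, the same arithmetic in plain Python (numbers taken
from the paragraph above):

```python
def block_latency_s(block_bytes, baud):
    # 8N1 framing: 10 bit times per byte on the wire.
    return block_bytes * 10.0 / baud

# One 1024-byte filesink block at 19200 baud:
per_block = block_latency_s(1024, 19200)  # about 0.533 s

# Buffered blocks needed to account for the observed ~5 s latency:
blocks_for_5s = 5.0 / per_block  # about 9.4 blocks
```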

You may try to find the number of blocks per frame.  You might do that
by adding a videorate element after the videotestsrc, and varying the
max-rate while measuring the average bitrate demand as above.

At this stage, I doubt a receive-side buffer will help, because I'm
fairly sure the data doesn't fit into the available bandwidth.  But if
you want to try, use a queue element with non-default properties, as
the defaults are minimal compared to your transmission rate.

I'm curious as to whether the filesink property buffer-mode may make a
difference for you.  Also look at buffer-size, which defaults to 65536;
that could be enough buffering to give you that five seconds.

You may find it useful to read the GStreamer source for the filesink
element.

--
James Cameron
http://quozl.netrek.org/

Re: AW: AW: AW: Serial port interface on frame grabber

sk_gst
Hello,

I think I was a bit unclear in my previous post, and I also made a mistake
with respect to the available bandwidth.

1. At 1 MBaud, the latency was < 1 ms and the video ran smoothly.
2. The practical available bandwidth for the custom modem is 250 Kbps, and
not 25 Kbps as I mentioned earlier.
3. I performed the test to measure the average bitrate demand as suggested
by James, and I am confused/surprised by the results I have obtained. The
results are as follows:

Resolution      Bitrate
100 x 50        ~120.3K
320 x 240       ~93.7K
640 x 480       ~52.7K
1280 x 960      ~32.3K

I assumed that the lower the resolution, the lower the bandwidth required,
but these results paint a totally different picture. Observing them, it
seems that the encoder is more efficient at resolutions that are multiples
of 320 x 240. Does anyone have an explanation for this?

4. I still have to find out about the blocks per frame and the filesink
element, which I shall do soon.

5. So far I have been testing this concept with the command-line tools. I
would like to implement it in a C/C++ program. What would be a good
starting point?

6. I also want to access the individual frames that GStreamer receives on
the receiving side (PC). This is because I want to know the last received
frame in case the wireless link breaks down, and to manipulate the video
frame with some other data. Is this possible with GStreamer?










Re: AW: AW: AW: Serial port interface on frame grabber

sk_gst
In reply to this post by James Cameron
Correction to my last post:

latency < 1 s (not < 1 ms)




Re: AW: AW: AW: Serial port interface on frame grabber

James Cameron
In reply to this post by sk_gst
G'day vk_gst,

You asked for speculation on why you saw average bitrate demand that
was the inverse of resolution, instead of proportional to the size of
the frames.

I've never written one, but encoders work in mysterious ways.  One way
may be to divide the frame into squares and encode each square.  The
algorithm may have a preferred size for these squares, and may use
only part of a square when the video dimensions are unusual for the
algorithm.  Other encoders use macroblocks, and the same kinds of
non-linearities may occur.

Try different encoders to see if there is one better suited to your
limited bandwidth and latency requirements.

Be sure you restricted the framerate during the average bitrate demand
tests; otherwise you may be encoding smaller resolutions at a faster
framerate, which would give that kind of result.  I've not tested
recently, but the videotestsrc element may offer a higher framerate for
lower resolutions.  Add -v to your gst-launch command to find out what
was negotiated.

My main reason for asking for the average bitrate demand was to ensure
it would fit within the serial link bandwidth you have available.  At
115200 baud, only one of the resolutions you tested would not fit,
possibly giving you latency from a queue outside GStreamer.

Moving to a C program should be relatively straightforward; call
gst_parse_launch, give elements names if you need to use them, and
call gst_bin_get_by_name to reach into them.

You might also try a Python program for fast prototyping, if that's a
language you're familiar with.

Accessing the frames from within a program takes some form of sink
element; look at appsink, or for GLib or GTK+ based programs try
gdkpixbufsink.  I've used the latter quite a bit myself, in Python.
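As a hedged sketch, and assuming PyGObject and the GStreamer Python
bindings are installed, the appsink route could look like this (the
pipeline string and the element name "frames" are illustrative; the caps
filter from your earlier pipeline is omitted here):

```python
# Receiver pipeline description with a named appsink.
RECEIVER_DESC = (
    "filesrc location=/dev/ttyUSB1 blocksize=1024 ! h264parse ! "
    "avdec_h264 ! videoconvert ! "
    "appsink name=frames emit-signals=true max-buffers=2 drop=true"
)

def build_receiver(desc=RECEIVER_DESC):
    # gst_parse_launch / gst_bin_get_by_name, via the Python bindings.
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst
    Gst.init(None)
    pipeline = Gst.parse_launch(desc)
    appsink = pipeline.get_by_name("frames")
    return Gst, pipeline, appsink

def pull_frame(Gst, appsink):
    # Pull one decoded frame as bytes; returns None at end-of-stream.
    sample = appsink.emit("pull-sample")
    if sample is None:
        return None
    buf = sample.get_buffer()
    ok, info = buf.map(Gst.MapFlags.READ)
    if not ok:
        return None
    try:
        return bytes(info.data)
    finally:
        buf.unmap(info)
```

Keeping the last frame returned by pull_frame around would also give you
the "last received frame" you asked about earlier.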

Hope that helps.

--
James Cameron
http://quozl.netrek.org/

Re: AW: AW: AW: Serial port interface on frame grabber

sk_gst
Hi James,

I will dig deeper into the H.264 codec and try to understand the behaviour.
It seems interesting and I will post my analysis here!
As far as the test is concerned, all resolutions were tested at 10 fps.
I still have to test this setup with live video from a camera, and see
whether the bitrate shows similar behaviour.

Python is definitely a viable option, and I'll have a look.
I was hoping to access the frames and overlay them with IMU data captured
from a separate device connected to the imx6. However, the bigger issue now
seems to be synchronizing the IMU data with the video frames.




Re: AW: AW: AW: Serial port interface on frame grabber

sk_gst

Hi James,

In case the video on the receiving side is stuck, e.g. because video data
packets are lost, but the receiver still gets the current IMU data, I want
to take the last received frame (the one stuck on the screen) and move it
left/right, up/down, or rotate it based on the IMU data received. Is this
possible? Do you have any ideas/suggestions on how to realise this system?




Re: AW: AW: AW: Serial port interface on frame grabber

James Cameron
G'day vk_gst,

No, I don't have any clear idea how to assemble that system, but you
could look at these elements:

- videorate, can be placed at the receiver after the source, and you
  can read the in, out, and duplicate frame count properties in your
  program to detect a stall,

- rotate, can rotate a picture by an arbitrary angle, and you may
  change the angle property through program control,

- the various overlay elements can place the IMU data over the
  picture.

Off-hand, I don't know an element that can be used to translate the
picture in X or Y dimensions.  You might look through the elements.
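A rough sketch of the stall-detection side in Python ("out" and "duplicate"
are real videorate counter properties; the ratio heuristic and the
staleness threshold are my assumptions):

```python
def stall_ratio(videorate):
    # videorate exposes running counters as readable properties:
    # "in", "out", and "duplicate" (frames it had to repeat).
    out = videorate.get_property("out")
    dup = videorate.get_property("duplicate")
    return (dup / out) if out else 0.0

class LastFrame:
    """Remember the most recent decoded frame so it can still be rotated
    or shifted from IMU data after the link stalls."""
    def __init__(self):
        self.data = None
        self.timestamp = None

    def update(self, data, timestamp):
        self.data, self.timestamp = data, timestamp

    def get(self, now, stale_after=0.5):
        # Returns (frame, is_stale); stale_after is an assumed threshold
        # in seconds, beyond which the link is presumed broken.
        if self.data is None:
            return None, True
        return self.data, (now - self.timestamp) > stale_after
```

The videorate element would be fetched by name from the pipeline and polled
periodically; a rising stall_ratio suggests the link has dropped out.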

On Fri, Jul 13, 2018 at 04:41:47AM -0700, vk_gst wrote:

>
> Hi James,
>
> In case the video on the receiving side is stuck, for eg. the video data
> packets are lost, but still the receiver  gets the current IMU data, then I
> want to use the last received frame or the one stuck on the screen, and move
> it left/right, up/down, rotate based on the IMU data received. Is this
> possible? Do you have an idea/suggestions on how to realise this system?
>
>
>

--
James Cameron
http://quozl.netrek.org/

Re: AW: AW: AW: Serial port interface on frame grabber

sk_gst
Hi James,

Thanks for the update. I am tackling this solution from the source side, so
the first part is synchronizing the IMU and video frame data.

The IMU device (running at 60 Hz) gives me a trigger that I can use to
drive my pipeline at 30 Hz. This means I have two sets of IMU data for each
frame at any given point in time. I was considering triggering the pipeline
at 30 Hz, so that the IMU and video data are synchronized.

For example, consider this pipeline:
gst-launch-1.0 -v v4l2src device=/dev/video1 !
video/x-raw,width=100,height=50,framerate=30/1 ! filesink location=xyz.avi

How could I trigger this based on the external signal? The easiest approach
I can think of is writing a C program that polls for the signal and plays
the pipeline, but I feel this is not the right way to do it.

Can you tell me whether GStreamer provides any elements that can help me do
this?

 

Alternatively, I could synchronize by using the timestamp of each frame,
attaching that timestamp to the IMU data on the imx6 side. Later, at the
receiver, I would match the timestamps again to relate each frame to its
IMU data. Would that be a better option?
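The timestamp-matching option could be as simple as a nearest-neighbour
lookup (plain Python sketch; the function name is mine, and it assumes the
IMU timestamps are kept sorted):

```python
import bisect

def nearest_imu(imu_times, imu_samples, frame_ts):
    # IMU at 60 Hz, frames at 30 Hz: about two IMU samples per frame.
    # imu_times must be sorted ascending, one entry per sample.
    i = bisect.bisect_left(imu_times, frame_ts)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(imu_times)]
    if not candidates:
        raise ValueError("no IMU samples recorded")
    # Pick the sample whose timestamp is closest to the frame timestamp.
    best = min(candidates, key=lambda j: abs(imu_times[j] - frame_ts))
    return imu_samples[best]
```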

 




Re: AW: AW: AW: Serial port interface on frame grabber

sk_gst
Correction:


gst-launch-1.0 -v imxv4l2videosrc device=/dev/video1 !
video/x-raw,width=100,height=50,framerate=30/1 ! filesink location=<xyz>




Re: AW: AW: AW: Serial port interface on frame grabber

James Cameron
In reply to this post by sk_gst
G'day vk_gst,

I'm finding this complex, as I've not needed to go to such depth with
GStreamer.  Maybe someone else would have some ideas.

To paraphrase, you have application data arriving at 60 frames per
second and you want to send that along the pipeline with the
corresponding video frame.  Also, ideally, you'd like to have the
frames from the v4l2src captured at about the same time as the
application data; synchronisation.

GStreamer is open source, so you can make whatever changes you like in
a custom system, but your maintenance burden will be lowered if you
can achieve your goals without changing GStreamer.

Several approximate solutions spring to mind, but I don't know which
will have the lowest initial cost or the lowest eventual cost.  I've
never done any of these myself; I'm speculating.  Here are two:

1.  a custom Gst.Clock instance that is used to drive the entire
pipeline, and thus set the v4l2src capture times, with the clock in turn
driven by the arriving IMU data ... but I don't know if v4l2src can
be driven like this,

2.  a custom v4l2src element that correlates the frames to IMU data,
and emits the IMU data timestamped on another source pad, which you
can later mux back into the stream.

Hope that helps.  Look through the API for Gst.Element, Gst.Bin,
Gst.Clock, Gst.Message, and Gst.Bus.

On Thu, Jul 19, 2018 at 04:44:49AM -0500, vk_gst wrote:

> Hi james,
>
> Thanks for the update. I am tackling this solution from the source, so the
> first part is synchronizing the IMU and video frame data.
>
> The IMU(runs at 60Hz) device gives me a trigger that I can use to trigger my
> pipeline at 30Hz.
> So this means I have two sets of IMU data for each frame at a give point of
> time.  
>  I was considering if I can trigger the pipeline at 30Hz, so that the IMU
> and video data is synchronized.
>
>  For eg: consider this pipeline
> gst-launch-1.0 -v v4l2src device=/dev/video1 !
> video/x-raw,width=100,height=50,framerate=30/1 ! filesink location=xyz.avi
>
> How could I trigger this based on the external signal. The easiest I can
> think is of writing a C program, that polls for the signal and plays the
> pipeline. But I feel this is not the right way to do it?
>
> Can you suggest me if the Gstreamer provides any elements, that can help me
> do this?
>
> Else, the other way of synchronizing is to use the timestamp of each frame,
> and allot the timestamp to the IMU data at the imx6 side. Later at receiver,
> I need to match the timestamp again to relate each frame to IMU. Would that
> be a better option?

--
James Cameron
http://quozl.netrek.org/

Re: AW: AW: AW: Serial port interface on frame grabber

sk_gst
Hi James,

Thanks for the update.
I definitely want to use the existing GStreamer elements, rather than
writing my own and introducing more bugs.
The IMU device which I am supposed to use provides a trigger signal that
gives me the latest sample of IMU data. So my idea is now to fire this
trigger at every frame capture, so that I have the IMU data for that
particular frame with minimum delay. However, I have two questions:

1. I found the API ' static gboolean gst_imx_v4l2src_start(GstBaseSrc
*src) ' that reads from the camera device. I am thinking of exporting a
trigger from this function every time it returns successfully. However, I
do not know whether this call happens for every frame; there is not much
description provided. Is there another way to trigger the IMU after every
frame is read? Generally, a camera provides a VSYNC signal that marks the
reading of a complete frame, but the camera I am using (OV5640) does not
have such a trigger.

2. For sending the IMU data with the frames, should I be using appsrc?
Would the IMU data also be readable at the receiver side? This is because I
want the video data and the IMU data separately at the receiver.

Regards





Re: AW: AW: AW: Serial port interface on frame grabber

James Cameron
Sorry, I don't know.  Perhaps someone else does.

On Mon, Jul 23, 2018 at 04:41:59AM -0500, vk_gst wrote:

> Hi James,
>
> Thanks for the update.
> I definitely, want to use the existing GStrreamer elements, rather than
> writing my own and introducing more bugs.  
> The IMU device which I am supposed to use, provides a trigger signal that
> provides me the latest sample of IMU data. So my idea is now to trigger this
> signal at every frame capture, and then I have the IMU data for that
> particular frame, with a minimum delay. However, I have 2 questions here:
>
> 1. I could find any API ' static gboolean gst_imx_v4l2src_start(GstBaseSrc
> *src) ' that reads from the camera device. I am thinking of exporting a
> trigger from this API for every success the API returns. However, I do not
> know if this call happens every frame; there is not much description
> provided. Is there any other way, that I can trigger the IMU after every
> frame is read? Generally, the camera provides a VSnc signal that marks the
> reading of complete frame, but the camera I am using (OV5640) does not have
> such a trigger.  
>
> 2. For sending the IMU data with the frames, should I be using AppSrc ?
> Would the IMU data also be readable at the receiver side when I read the
> AppSrc? This is because I want the video data and IMU data separately at the
> receiver.  
>
> Regards
>
>
>
>

--
James Cameron
http://quozl.netrek.org/