How do I get gst-inspect properties for v4l2src device other than /dev/video0?


How do I get gst-inspect properties for v4l2src device other than /dev/video0?

Kulecz, Walter (JSC-SK)[WYLE INTEG. SCI. & ENG.]
I need two capture cards in my application.  In the process of playing around with things, I've currently got three installed.  All work in a simple pipeline if I set the device property with C code using:

g_object_set(G_OBJECT(source), "device", "/dev/video2", NULL);

but the capture devices are different enough that I'm stumbling around with the caps, as only unfiltered caps work for all three cards.  The point of the exercise is to figure out whether I can use our current assortment of PCI capture cards and add in some USB ones I've found.

I'd prefer to deal with only a single buffer type by forcing an input format common to all the capture devices, if possible.

So how can I get gst-inspect to list the properties of the other capture devices?
------------------------------------------------------------------------------
Download Intel® Parallel Studio Eval
Try the new software tools for yourself. Speed compiling, find bugs
proactively, and fine-tune applications for parallel performance.
See why Intel Parallel Studio got high marks during beta.
http://p.sf.net/sfu/intel-sw-dev
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
Re: How do I get gst-inspect properties for v4l2src device other than /dev/video0?

Kapil Agrawal
Are your capture cards supported by v4l2 drivers?  You can use v4l2src only if they are.

Assuming they are supported, you need to figure out where your card is installed.  For my Blackmagic capture card it's /dev/blackmagic/card0 (although there is no v4l2 support, so we wrote our own src element for it).

So if your card is not supported by v4l2, either check whether any other src element supports it or write one of your own :)

Best
Kapil

On Wed, Apr 14, 2010 at 4:22 AM, Kulecz, Walter (JSC-SK)[WYLE INTEG. SCI. & ENG.] <[hidden email]> wrote:
> So how can I get gst-inspect to list the properties of the other capture devices?



--
http://www.linkedin.com/in/kapilagrawal

Re: How do I get gst-inspect properties for v4l2src device other than /dev/video0?

Stefan Sauer
Kulecz, Walter (JSC-SK)[WYLE INTEG. SCI. & ENG.] wrote:

> So how can I get gst-inspect to list the properties of the other capture devices?
Objects always have the same properties; the only optional parts are the
implemented interfaces.  gst-inspect shows information from the registry,
it does not show information from a running instance.

If you just want to know about your v4l2 devices, use v4l-info to list
the capabilities.

Stefan




Re: How do I get gst-inspect properties for v4l2src device other than /dev/video0?

Kulecz, Walter (JSC-SK)[WYLE INTEG. SCI. & ENG.]
>> So how can I get gst-inspect to list the properties of the other capture devices?
>
> If you just want to know about your v4l2 devices, use v4l-info to list
> the capabilities.
>
> Stefan



Thanks.  v4l-info seems to be part of the xawtv package; at least I had it after installing the xawtv and xawtv-tools packages on Ubuntu 10.04 Beta.

That clarifies things a bit.  It seems the Hauppauge 950Q USB device supports only the 16-bit UYVY format at a 720x480 frame size (it fails at lower frame sizes), while the SAA713x supports only 704x480 frames (or lower).  Looks like I need to use card-specific input caps and crop the 640x480 analysis frame out of the stream.  The cameras are "genlocked", but my next question is whether the video buffer timestamps would be adequate to maintain time alignment between the two streams if I use an extra pipeline stage to convert the 16-bit UYVY to 8-bit gray.

Are there any docs on how the various video buffer formats are organized in the buffers I get with appsink, or am I reduced to stumbling through the GStreamer source tree to find something that might show me?  I see several ways to proceed; it's not at all clear which would be more efficient in either throughput or development time.

If the video buffer timestamps can be counted on to realign the final analyzed results between the two capture streams, there would be no need to worry about the latencies of USB vs. PCI capture devices or differing numbers of elements in the two pipelines.  In version one, external hardware overlays an IRIG timecode on the frames before capture; eliminating the need for this would be a big win in the version 2 design.
Re: How do I get gst-inspect properties for v4l2src device other than /dev/video0?

Stefan Sauer
Kulecz, Walter (JSC-SK)[WYLE INTEG. SCI. & ENG.] wrote:

> Are there any docs on how the various video buffer formants are organized in the video buffers I get with app-sink, or am I reduced to stumbling through the gstreamer source tree to find something that might show me?  I see several ways to proceed, its not clear at all which would be more efficient in either throughput or development time.
>  
The best place is the documentation for GstVideoFormat, in
gst-plugins-base/gst-libs/gst/video/.  We take patches for more detailed
docs :)
> If the video buffer timestamps can be counted on to realign the final analyzed results between the two capture streams, there would be no need to worry about latencies of USB vs PCI capture devices or differing numbers of elements in the two pipelines.
Camera sources are live sources.  That means GST_BUFFER_TIMESTAMP(buf) is
a sample of the pipeline clock at the time the frame was captured
(clock time minus base time).  That's why it is a good idea to use operating-
system mechanisms to run those capture threads at higher priority or
even under realtime scheduling (avoiding jitter in the timestamps).

Stefan

>   What we do in version one is external hardware overlays an IRIG timecode on the frames before capture.  Eliminating the need for this would be a big win in the version 2 design.


Re: How do I get gst-inspect properties for v4l2src device other than /dev/video0?

Kulecz, Walter (JSC-SK)[WYLE INTEG. SCI. & ENG.]
> Camera sources are live-sources. That means GST_BUFFER_TIMESTAMP(buf) is
> a clock sample of the time when the frame was captured
> (clocktime-basetime). Thats why it is a good idea to use operating
> system mechanisms to run those capture threads at higher priority or
> even under realtime scheduling (avoiding jitter in the timestamps).
>
> Stefan

Thanks for the reply.  This sounds like what I wanted to hear!
My version one code runs on the RT kernel from Ubuntu Studio.

Any clues about how to read the buffer timestamps after I've pulled
them in with appsink would be greatly appreciated.  It's not clear to me
what the nested macro definition does:

#define GST_BUFFER_TIMESTAMP(buf) (GST_BUFFER_CAST(buf)->timestamp)

If buf is a pointer to a GstBuffer, does this turn into the value of the buffer's timestamp,
or a pointer to it?

I haven't stumbled on the definition of  the GST_BUFFER_CAST macro yet :(

Also, what is the use/purpose of the duration value stored in the buffer struct definition?

--wally.
------------------------------------------------------------------------------
Download Intel&#174; Parallel Studio Eval
Try the new software tools for yourself. Speed compiling, find bugs
proactively, and fine-tune applications for parallel performance.
See why Intel Parallel Studio got high marks during beta.
http://p.sf.net/sfu/intel-sw-dev
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
Reply | Threaded
Open this post in threaded view
|

Re: How do I get gst-inspect properties for v4l2src device other than /dev/video0?

Edward Hervey
On Fri, 2010-04-16 at 14:53 -0500, Kulecz, Walter (JSC-SK)[WYLE INTEG.
SCI. & ENG.] wrote:

> Any clues about how to read the buffer time stamps after I've pulled
> them in with appsink would be greatly appreciated.   Its not clear to me
> what the nested macro definition does:
>
> #define GST_BUFFER_TIMESTAMP(buf) (GST_BUFFER_CAST(buf)->timestamp)
>
> If buf is a pointer to a GstBuffer does this turn into the value of the buffer's timestamp,
> or a pointer to it?
>
> I haven't stumbled on the definition of  the GST_BUFFER_CAST macro yet :(

  ==> Option 1 : Just try using GST_BUFFER_TIMESTAMP
  ==> Option 2 : Google for usages of GST_BUFFER_TIMESTAMP
  ==> Option 3 : grep for usage of GST_BUFFER_TIMESTAMP in existing code
  ==> Option 4 : Use common sense
  ==> Option 5 : Consider career change

   No, you don't need to reply, I'll say it before you: our
documentation sucks [1]

>
> --wally.
[1] For blind people


Re: How do I get gst-inspect properties for v4l2src device other than /dev/video0?

Short, Jonathan

Is there a preferred method for setting the thread priorities of
sources, given that different sources may have different internal
threading structures and threading is generally abstracted by GStreamer?

For example, I'd like to set the priority of rtspsrc as suggested above.

Thanks,

Jonathan


Re: How do I get gst-inspect properties for v4l2src device other than /dev/video0?

Stefan Sauer
On 19.04.2010 19:20, Short, Jonathan wrote:

> Is there a preferred method for setting the thread priorities of
> sources, given that different sources may have different internal
> threading structure and threading is generally abstracted by gstreamer?
>
> For example, I'd like to set the priority of rtspsrc as suggested above.

Have a look at the examples under gstreamer/tests/examples/streams/.  It is
unfortunately platform-specific; GLib's GThread abstraction does not cover this
area (well).

Stefan

>
> Thanks,
>
> Jonathan
>


Re: How do I get gst-inspect properties for v4l2src device other than /dev/video0?

Short, Jonathan
> -----Original Message-----
> From: Stefan Kost [mailto:[hidden email]]
>
> Have a look at the examples under gstreamer/tests/examples/streams/. It is
> unfortunately platform specific. Glibs gthread abstraction does not cover
> this area (well).
>
> Stefan


Thanks greatly for the info.  I had a look at the examples and can see
how to set thread priority now.  The one part I'm not clear on is how to
ensure that I am only setting the priority of the capture thread and not
others.  For instance, our pipeline has quite a few queues in it.  It's
not clear to me from the examples how the rtspsrc thread would be set at
a high priority while the consumer threads further down the pipeline
wouldn't.  Also, given the thread-pool nature of GStreamer, is there
any assurance that the higher-priority thread won't later be used for
something else?

Thanks again,

Jonathan


