decode byte-stream video without gmainloop


decode byte-stream video without gmainloop

Karl Lattimer
I’m trying to integrate GStreamer with RPiPlay (https://github.com/FD-/RPiPlay) for generic Linux support (https://github.com/FD-/RPiPlay/issues/24). Looking at the GStreamer documentation, it seems difficult to find a method for writing a buffer of received bytes into a pipeline. 

Specifically I need to take the method call 

void video_renderer_render_buffer(video_renderer_t *renderer, raop_ntp_t *ntp, unsigned char* data, int data_len, uint64_t pts, int type)

and push that data into a GStreamer pipeline. As far as I understand, the raop_ntp_t struct contains information regarding time synchronisation, and pts provides an accurate timestamp. In my tests with GStreamer so far, I’ve taken the buffer, dumped it to disk as an MP4 file, then used a simple playbin pipeline to open the file and play it. 

I suppose what I’m looking for here are pointers to the appropriate documentation for pushing a block of data into a decode pipeline, preferably without going as far as writing a GStreamer plugin, as that wouldn’t sit well with the rest of the project. 

I’d also like to know whether it’s possible to get a playbin to dump the pipeline it’s using. There’s quite a lot of information in the verbose output, but I can’t seem to spot a pipeline that I could use on the command line in place of playbin; I think that would help me reach the final goal. 

Advice, guidance, links, examples appreciated. 


Regards,
 K


_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: decode byte-stream video without gmainloop

Nicolas Dufresne-5
On Sunday, 2 February 2020 at 22:10 +0000, Karl Lattimer wrote:

> I’m trying to integrate gstreamer with RpiPlay (https://github.com/FD-/RPiPlay) for generic linux support (https://github.com/FD-/RPiPlay/issues/24) and looking at the gstreamer documentation it seems difficult to find a method to write a buffer of received bytes to a pipeline.
>
> Specifically I need to take the method call
>
> void video_renderer_render_buffer(video_renderer_t *renderer,
> raop_ntp_t *ntp, unsigned char* data, int data_len, uint64_t pts, int
> type)
>
> and push that data into a gstreamer pipeline. As far as I understand
> the raop_ntp_t struct contains information regarding time
> synchronisation and pts provides an accurate timestamp. In my tests
> with gstreamer so far I’ve taken the buffer, dumped it to disk as an
> MP4 file, then used a simple playbin pipeline to open the file and
> play it.

Have you looked at the appsrc element? Note that GStreamer processing
is asynchronous, so you may have to copy the data, or make sure that
your wrapped buffer is fully consumed before returning.

>
> I suppose what I’m looking for here are pointers to the appropriate
> documentation for pushing a block of data into a decode pipeline,
> preferably without going as far as writing a gstreamer plugin as that
> wouldn’t sit well with the rest of the project.

https://gstreamer.freedesktop.org/documentation/tutorials/basic/short-cutting-the-pipeline.html?gi-language=c

>
> I’d also like to know if it’s possible to get a playbin to dump the
> pipeline that it’s using out somehow, there’s quite a lot of
> information in the verbose output, but I can’t seem to spot a
> pipeline which I could use on the command line in place of playbin, I
> think that would help me in the final goal.

When you are using gst-launch-1.0, you can dump the pipeline into DOT
files by setting the environment variable GST_DEBUG_DUMP_DOT_DIR to a
target directory.

To get the same thing in your application, you have to call:

  GST_DEBUG_BIN_TO_DOT_FILE(bin, GST_DEBUG_GRAPH_SHOW_ALL, file_name);

This needs to be called wherever in your code you would like to inspect
the pipeline.

>
> Advice, guidance, links, examples appreciated.
>
> —
>
> Regards,
>  K
>


Re: decode byte-stream video without gmainloop

Karl Lattimer

> > and push that data into a gstreamer pipeline. As far as I understand
> > the raop_ntp_t struct contains information regarding time
> > synchronisation and pts provides an accurate timestamp. In my tests
> > with gstreamer so far I’ve taken the buffer, dumped it to disk as an
> > MP4 file, then used a simple playbin pipeline to open the file and
> > play it.
>
> Have you looked at the appsrc element ? Note that GStreamer processing
> is asynchronous, so you may have to copy the pointer, or make sure that
> your wrapped buffer is consumed before returning.

It’s my understanding that for that to work I’d need a GMainLoop in order to call back the appropriate functions for need-data and enough-data, as well as to be able to emit the push-data signal, which seems to be where the data is inserted into the pipeline. 

Is this correct? I don’t think I can shove a GMainLoop into RPiPlay; it’s pretty much dependent on the data stream and has its own mainloop for that purpose. 



> > I’d also like to know if it’s possible to get a playbin to dump the
> > pipeline that it’s using out somehow, there’s quite a lot of
> > information in the verbose output, but I can’t seem to spot a
> > pipeline which I could use on the command line in place of playbin, I
> > think that would help me in the final goal.
>
> When you are using gst-launch-1.0 you can dump the pipeline into DOT
> files using the env GST_DEBUG_DUMP_DOT_DIR=
>
> To get the same thing in your application, you have to call:
>
>   GST_DEBUG_BIN_TO_DOT_FILE(bin, GST_DEBUG_GRAPH_SHOW_ALL, file_name);

perfect thanks! 

K,


Re: decode byte-stream video without gmainloop

Matthew Waters
On 3/2/20 6:24 pm, Karl Lattimer wrote:

> > > and push that data into a gstreamer pipeline. As far as I understand
> > > the raop_ntp_t struct contains information regarding time
> > > synchronisation and pts provides an accurate timestamp. In my tests
> > > with gstreamer so far I’ve taken the buffer, dumped it to disk as an
> > > MP4 file, then used a simple playbin pipeline to open the file and
> > > play it.
> >
> > Have you looked at the appsrc element ? Note that GStreamer processing
> > is asynchronous, so you may have to copy the pointer, or make sure that
> > your wrapped buffer is consumed before returning.
>
> It’s my understanding that for that to work I’d need a gmainloop in order to callback the appropriate functions for need data and have enough data, as well as being able to emit the signal push-data which seems to be where the data is inserted into the pipeline.
>
> Is this correct? I don’t think I can shove a gmainloop into RPiPlay it’s pretty much dependent on the data stream and has it’s own mainloop for that purpose.

You don't need a mainloop running to emit or retrieve those signals on appsrc.  Nothing in GStreamer requires any kind of mainloop to be running, except for platform or plugin specifics, which are few and far between (e.g. macOS video output).

Cheers
-Matt




Re: decode byte-stream video without gmainloop

Karl Lattimer
I seem to be getting *somewhere*, but I’m having an issue with creating a new GstBuffer. 

This seems strange; I’ve tried both 

    buffer = gst_buffer_new_and_alloc(data_len);

and

    buffer = gst_buffer_new ();
    memory = gst_allocator_alloc (NULL, data_len, NULL);
    gst_buffer_insert_memory (buffer, -1, memory);

Both methods cause a segfault… I’m not sure why; I’m only requesting a memory allocation, so this should be OK, right? 

The backtrace isn’t entirely revealing: 

#0  0x00007ffff7e6c07d in gst_allocator_alloc () at /usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0

Regards,
 K


On 3 Feb 2020, at 08:00, Matthew Waters <[hidden email]> wrote:


Re: decode byte-stream video without gmainloop

Matthew Waters
1. You should install debug symbols for GStreamer to get better backtraces.
2. Are you calling gst_init() at the beginning of your program?

Cheers
-Matt

On 3/2/20 9:59 pm, Karl Lattimer wrote: