appsrc to filesink


appsrc to filesink

acolubri
Hello,

I wrote a very small sample program to demonstrate what I'm trying to
achieve with appsrc: to create a video file from buffers (generated by
the application itself) that are injected into the pipeline. This is the
actual pipeline I'm using in the example:

appsrc name=testsource caps=test_caps ! ffmpegcolorspace !
video/x-raw-yuv, width=80, height=60, bpp=32, depth=24 ! queue !
videorate ! video/x-raw-yuv, framerate=30/1 ! xvidenc ! queue ! avimux !
queue ! filesink location=test.avi

where test_caps =
"video/x-raw-rgb,width=80,height=60,bpp=32,depth=24,framerate=30/1"

I have tried different things, but irrespective of what I do, the
resulting avi file doesn't contain any playable video (and it always has
a size of 800 bytes).

Perhaps I'm making a very basic conceptual mistake, any help will be
greatly appreciated! I'm attaching the sample code and the resulting avi
file.

Thanks in advance,
Andres


#include <string.h>

#include <gst/gst.h>

#include <gst/app/gstappsrc.h>
#include <gst/app/gstappbuffer.h>

/* these are the caps we are going to pass through the appsrc */
const gchar *video_caps =
    "video/x-raw-rgb,width=80,height=60,bpp=32,depth=24,framerate=30/1";

typedef struct
{
  GMainLoop *loop;
  GstElement *sink;
} ProgramData;

/* used to inject a new buffer into the pipeline */
static void
inject_new_buffer_into_pipeline (ProgramData * data)
{
  guint size;
  gpointer raw_buffer;
  GstBuffer *app_buffer;
  GstElement *source;

  size = 80 * 60 * 4; /* 80x60 pixels, 4 bytes per pixel (32 bpp) */
  g_print ("Pushing a buffer of size %d\n", size);
 
  // Allocating the memory for the frame buffer.
  raw_buffer = g_malloc0 (size);
 
  app_buffer = gst_app_buffer_new (raw_buffer, size, g_free, raw_buffer);

  /* newer basesrc will set caps for us automatically but it does not really
   * hurt to set them on the buffer again */
  gst_buffer_set_caps (app_buffer, gst_caps_from_string (video_caps));

  /* get the source and push the new buffer */
  source = gst_bin_get_by_name (GST_BIN (data->sink), "testsource");
  gst_app_src_push_buffer (GST_APP_SRC (source), app_buffer);
  gst_object_unref (source);
}

/* called when we get a GstMessage from the sink pipeline; when we get EOS, we
 * exit the mainloop and this testapp. */
static gboolean
on_sink_message (GstBus * bus, GstMessage * message, ProgramData * data)
{
  switch (GST_MESSAGE_TYPE (message)) {
    case GST_MESSAGE_EOS:
      g_print ("Finished playback\n");
      g_main_loop_quit (data->loop);
      break;
    case GST_MESSAGE_ERROR:
      g_print ("Received error\n");
      g_main_loop_quit (data->loop);
      break;
    default:
      break;
  }
  return TRUE;
}

int
main (int argc, char *argv[])
{
  ProgramData *data = NULL;
  gchar *string = NULL;
  GstBus *bus = NULL;
  GstElement *testsource = NULL;
  int i;

  gst_init (&argc, &argv);

  data = g_new0 (ProgramData, 1);

  data->loop = g_main_loop_new (NULL, FALSE);

  /* setting up sink pipeline, we push video data into this pipeline that will
   * then be recorded to an avi file. */
  string =
      g_strdup_printf ("appsrc name=testsource caps=\"%s\" ! ffmpegcolorspace ! video/x-raw-yuv, width=80, height=60, bpp=32, depth=24 ! queue ! videorate ! video/x-raw-yuv, framerate=30/1 ! xvidenc ! queue ! avimux ! queue ! filesink location=test.avi",
      video_caps);
  data->sink = gst_parse_launch (string, NULL);
  g_free (string);

  if (data->sink == NULL) {
    g_print ("Bad sink\n");
    return -1;
  }

  testsource = gst_bin_get_by_name (GST_BIN (data->sink), "testsource");
  /* configure for time-based format */
  g_object_set (testsource, "format", GST_FORMAT_TIME, NULL);
  /* uncomment the next line to block when appsrc has buffered enough */
  /* g_object_set (testsource, "block", TRUE, NULL); */
  gst_object_unref (testsource);

  bus = gst_element_get_bus (data->sink);
  gst_bus_add_watch (bus, (GstBusFunc) on_sink_message, data);
  gst_object_unref (bus);

  /* launching things */
  gst_element_set_state (data->sink, GST_STATE_PLAYING);

  g_print ("Let's run!\n");
  /* Injecting 300 frames into the pipeline, which should generate a 10-second
   * avi file. */
  for (i = 0; i < 300; i++) inject_new_buffer_into_pipeline(data);
  g_print ("Going out\n");

  gst_element_set_state (data->sink, GST_STATE_NULL);

  gst_object_unref (data->sink);
  g_main_loop_unref (data->loop);
  g_free (data);

  return 0;
}

------------------------------------------------------------------------------
This SF.net email is sponsored by:
SourcForge Community
SourceForge wants to tell your story.
http://p.sf.net/sfu/sf-spreadtheword
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/gstreamer-devel


Re: appsrc to filesink

Julien PUYDT
Andres Colubri wrote:

> I wrote a very small sample program to demonstrate what I'm trying to
> achieve with appsrc: to create a video file from buffers (generated by
> the application itself) that are injected into the pipeline. This is the
> actual pipeline I'm using in the example:
>
> appsrc name=testsource caps=test_caps ! ffmpegcolorspace !
> video/x-raw-yuv, width=80, height=60, bpp=32, depth=24 ! queue !
> videorate ! video/x-raw-yuv, framerate=30/1 ! xvidenc ! queue ! avimux !
> queue ! filesink location=test.avi
>
> where test_caps =
> "video/x-raw-rgb,width=80,height=60,bpp=32,depth=24,framerate=30/1"
>
> I have tried different things, but irrespective of what I do, the
> resulting avi file doesn't contain any playable video (and it always has
> a size of 800 bytes).
>
> Perhaps I'm making a very basic conceptual mistake, any help will be
> greatly appreciated! I'm attaching the sample code and the resulting avi
> file.

I have struggled with appsrc/appsink too (and it's still not working
reliably enough); you may have a look at the result here:
http://svn.gnome.org/viewvc/ekiga/trunk/lib/engine/components/gstreamer/

(The video input mostly works; the audio input doesn't seem to, and the
audio output works for sound events but not in a call.)

Hope that helps,

Snark on #gstreamer and #ekiga


Re: appsrc to filesink

Tim-Philipp Müller-2
In reply to this post by acolubri
On Sun, 2009-02-01 at 22:06 -0800, Andres Colubri wrote:

Hi,

> appsrc name=testsource caps=test_caps ! ffmpegcolorspace !
> video/x-raw-yuv, width=80, height=60, bpp=32, depth=24 ! queue !

The bpp/depth fields don't really make sense here. YUV caps usually look
like:

 video/x-raw-yuv,format=(fourcc)I420,width=80,height=60

with bits per pixel etc. implied by the pixel layout/format used. See:

http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/section-types-definitions.html#table-video-types

for more info. (You shouldn't *need* to specify a format here though,
ffmpegcolorspace should negotiate to one supported by
downstream/xvidenc).

> videorate ! video/x-raw-yuv, framerate=30/1 ! xvidenc ! queue ! avimux !
> queue ! filesink location=test.avi
>
> where test_caps =
> "video/x-raw-rgb,width=80,height=60,bpp=32,depth=24,framerate=30/1"

These RGB caps are incomplete. You need to also specify:
 - red_mask
 - green_mask
 - blue_mask
 - endianness

ffmpegcolorspace should reject these caps with a NOT_NEGOTIATED flow
return. Not sure what appsrc does with that, but I'd guess it would post
an error message on the pipeline's bus.

Cheers
 -Tim




Re: appsrc to filesink

acolubri
In reply to this post by Julien PUYDT

> I have struggled with appsrc/appsink too (and it's still not working
> reliably enough) ; you may have a look at the result here :
> http://svn.gnome.org/viewvc/ekiga/trunk/lib/engine/components/gstreamer/
>
> (the video input mostly works ; the audio input doesn't seem to, and the
> audio output works for sound events but not in a call).
>
> Hope that helps,
>
> Snark on #gstreamer and #ekiga
>
>  
Thanks for pointing to your code. I looked at the gst-audiooutput.cpp
file, and although you are sending audio instead of video, the logic
seems to be the same as in my example.


Re: appsrc to filesink

acolubri
In reply to this post by Tim-Philipp Müller-2
Many thanks for the comments. I corrected both caps (the one used for
ffmpegcolorspace and the one in appsrc). This is how they look now:

video/x-raw-yuv,format=(fourcc)I420,width=80,height=60 (for
ffmpegcolorspace)
video/x-raw-rgb,width=80,height=60,bpp=32,endianness=4321,depth=24,red_mask=65280,green_mask=16711680,blue_mask=-16777216,framerate=30/1
(for appsrc)

I also set the is-live property of appsrc to true.

The program runs OK and gives no errors. However, the resulting avi file
is completely empty (0 bytes). I also tried ffmpegcolorspace without
caps, and appsrc without is-live=true, but got the same result.

Do you suggest any other change/test in the code to try to locate where
the problem is?

I attached the latest version of the example, just in case anyone wants
to take a look.

Again, thanks a lot.

> Hi,
>  
>> appsrc name=testsource caps=test_caps ! ffmpegcolorspace !
>> video/x-raw-yuv, width=80, height=60, bpp=32, depth=24 ! queue !
>>    
>
> The bpp/depth fields don't really make sense here. YUV caps usually look
> like:
>
>  video/x-raw-yuv,format=(fourcc)I420,width=80,height=60
>
> with bits per pixel etc. implied by the pixel layout/format used. See:
>
> http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/section-types-definitions.html#table-video-types
>
> for more info. (You shouldn't *need* to specify a format here though,
> ffmpegcolorspace should negotiate to one supported by
> downstream/xvidenc).
>
>  
>> videorate ! video/x-raw-yuv, framerate=30/1 ! xvidenc ! queue ! avimux !
>> queue ! filesink location=test.avi
>>
>> where test_caps =
>> "video/x-raw-rgb,width=80,height=60,bpp=32,depth=24,framerate=30/1"
>>    
>
> These RGB caps are incomplete. You need to also specify:
>  - red_mask
>  - green_mask
>  - blue_mask
>  - endianness
>
> ffmpegcolorspace should reject these caps with a NOT_NEGOTIATED flow
> return. Not sure what appsrc does with that, but I'd guess it would post
> an error message on the pipeline's bus.
>
> Cheers
>  -Tim
>
>  


#include <string.h>

#include <gst/gst.h>

#include <gst/app/gstappsrc.h>
#include <gst/app/gstappbuffer.h>

/* these are the caps we are going to pass through the appsrc */
const gchar *video_caps =
    "video/x-raw-rgb,width=80,height=60,bpp=32,endianness=4321,depth=24,red_mask=65280,green_mask=16711680,blue_mask=-16777216,framerate=30/1";

typedef struct
{
  GMainLoop *loop;
  GstElement *sink;
} ProgramData;

/* used to inject a new buffer into the pipeline */
static void
inject_new_buffer_into_pipeline (ProgramData * data)
{
  guint size;
  gpointer raw_buffer;
  GstBuffer *app_buffer;
  GstElement *source;

  size = 80 * 60 * 4; // 80x60 pixels, 32 bpp.
  g_print ("Pushing a buffer of size %d\n", size);
 
  // Allocating the memory for the buffer.
  raw_buffer = g_malloc0 (size);
 
  app_buffer = gst_app_buffer_new (raw_buffer, size, g_free, raw_buffer);

  /* newer basesrc will set caps for us automatically but it does not really
   * hurt to set them on the buffer again */
  gst_buffer_set_caps (app_buffer, gst_caps_from_string (video_caps));

  /* get the source and push the new buffer */
  source = gst_bin_get_by_name (GST_BIN (data->sink), "testsource");
  gst_app_src_push_buffer (GST_APP_SRC (source), app_buffer);
  gst_object_unref (source);
}

/* called when we get a GstMessage from the sink pipeline; when we get EOS, we
 * exit the mainloop and this testapp. */
static gboolean
on_sink_message (GstBus * bus, GstMessage * message, ProgramData * data)
{
  switch (GST_MESSAGE_TYPE (message)) {
    case GST_MESSAGE_EOS:
      g_print ("Finished playback\n");
      g_main_loop_quit (data->loop);
      break;
    case GST_MESSAGE_ERROR:
      g_print ("Received error\n");
      g_main_loop_quit (data->loop);
      break;
    default:
      break;
  }
  return TRUE;
}

int
main (int argc, char *argv[])
{
  ProgramData *data = NULL;
  gchar *string = NULL;
  GstBus *bus = NULL;
  GstElement *testsource = NULL;
  int i;

  gst_init (&argc, &argv);

  data = g_new0 (ProgramData, 1);

  data->loop = g_main_loop_new (NULL, FALSE);

  /* setting up sink pipeline, we push video data into this pipeline that will
   * then be recorded to an avi file. */
  string =
      g_strdup_printf ("appsrc is-live=true name=testsource caps=\"%s\" ! ffmpegcolorspace ! video/x-raw-yuv,format=(fourcc)I420,width=80,height=60 ! queue ! videorate ! video/x-raw-yuv,framerate=30/1 ! xvidenc ! queue ! avimux ! queue ! filesink location=test.avi",
      video_caps);
  data->sink = gst_parse_launch (string, NULL);
  g_free (string);

  if (data->sink == NULL) {
    g_print ("Bad sink\n");
    return -1;
  }

  testsource = gst_bin_get_by_name (GST_BIN (data->sink), "testsource");
  /* configure for time-based format */
  g_object_set (testsource, "format", GST_FORMAT_TIME, NULL);
  /* uncomment the next line to block when appsrc has buffered enough */
  /* g_object_set (testsource, "block", TRUE, NULL); */
  gst_object_unref (testsource);

  bus = gst_element_get_bus (data->sink);
  gst_bus_add_watch (bus, (GstBusFunc) on_sink_message, data);
  gst_object_unref (bus);

  /* launching things */
  gst_element_set_state (data->sink, GST_STATE_PLAYING);

  g_print ("Let's run!\n");
  /* Injecting 300 frames into the pipeline, which should generate a 10-second
   * avi file. */
  for (i = 0; i < 300; i++) inject_new_buffer_into_pipeline(data);
  g_print ("Going out\n");

  gst_element_set_state (data->sink, GST_STATE_NULL);

  gst_object_unref (data->sink);
  g_main_loop_unref (data->loop);
  g_free (data);

  return 0;
}


Re: appsrc to filesink

acolubri
I've been trying a few more things. Basically what I do is first set the
pipeline to playing, then push the buffers through appsrc, and finally
run the main application loop:

int
main (int argc, char *argv[])
{
...
gst_element_set_state (data->sink, GST_STATE_PLAYING);

for (i = 0; i < 30; i++) inject_new_buffer_into_pipeline(data);
gst_app_src_end_of_stream (GST_APP_SRC (testsource));

gst_element_set_state (data->sink, GST_STATE_PLAYING);

g_main_loop_run (data->loop);
...
}

I'm assuming that by running the main loop, the buffers in the appsrc
queue will go through the rest of the pipeline and finally be encoded
and written to disk. Is this correct?

The inject_new_buffer_into_pipeline() function just pushes the buffers:

static void
inject_new_buffer_into_pipeline (ProgramData * data)
{
  guint size;
  gpointer raw_buffer;
  GstBuffer *app_buffer;
  GstElement *source;

  size = 80 * 60 * 4; // 80x60 pixels, 32 bpp.
  g_print ("Pushing a buffer of size %d\n", size);
 
  // Allocating the memory for the buffer.
  raw_buffer = g_malloc0 (size);
 
  app_buffer = gst_app_buffer_new (raw_buffer, size, g_free, raw_buffer);

  /* newer basesrc will set caps for us automatically but it does not really
   * hurt to set them on the buffer again */
  gst_buffer_set_caps (app_buffer, gst_caps_from_string (video_caps));

  /* get the source and push the new buffer */
  source = gst_bin_get_by_name (GST_BIN (data->sink), "testsource");
  gst_app_src_push_buffer (GST_APP_SRC (source), app_buffer);
  gst_object_unref (source);
}

I'm not setting any duration or timestamp for the pushed buffers, so
could this be a problem as well?

Anyway, the program runs and doesn't give any errors, even though the
resulting avi file is invalid. The complete code of the example is attached.

> Many thanks for the comments. I corrected both caps (the one used for
> ffmpegcolorspace and the one in appsrc). This is how they look like now:
>
> video/x-raw-yuv,format=(fourcc)I420,width=80,height=60 (for
> ffmpegcolorspace)
> video/x-raw-rgb,width=80,height=60,bpp=32,endianness=4321,depth=24,red_mask=65280,green_mask=16711680,blue_mask=-16777216,framerate=30/1
> (for appsrc)
>
> I also set the is-live property of appsrc to true.
>
> The program runs ok and gives no error. However, the resulting avi
> file is completely empty (0 bytes). I also tried ffmpegcolorspace
> without caps, and appsrc without is-live=true, but the same result.
>
> Do you suggest any other change/test in the code to try to locate
> where the problem is?
>
> I attached the latest version of the example, just in case anyone
> wants to take a look.
>
> Again, thanks a lot.

#include <string.h>

#include <gst/gst.h>

#include <gst/app/gstappsrc.h>
#include <gst/app/gstappbuffer.h>

/* these are the caps we are going to pass through the appsrc */
const gchar *video_caps =
    "video/x-raw-rgb,width=80,height=60,bpp=32,endianness=4321,depth=24,red_mask=65280,green_mask=16711680,blue_mask=-16777216,framerate=30/1";

typedef struct
{
  GMainLoop *loop;
  GstElement *sink;
} ProgramData;

/* used to inject a new buffer into the pipeline */
static void
inject_new_buffer_into_pipeline (ProgramData * data)
{
  guint size;
  gpointer raw_buffer;
  GstBuffer *app_buffer;
  GstElement *source;

  size = 80 * 60 * 4; // 80x60 pixels, 32 bpp.
  g_print ("Pushing a buffer of size %d\n", size);
 
  // Allocating the memory for the buffer.
  raw_buffer = g_malloc0 (size);
 
  app_buffer = gst_app_buffer_new (raw_buffer, size, g_free, raw_buffer);

  /* newer basesrc will set caps for us automatically but it does not really
   * hurt to set them on the buffer again */
  gst_buffer_set_caps (app_buffer, gst_caps_from_string (video_caps));

  /* get the source and push the new buffer */
  source = gst_bin_get_by_name (GST_BIN (data->sink), "testsource");
  gst_app_src_push_buffer (GST_APP_SRC (source), app_buffer);
  gst_object_unref (source);
}

/* called when we get a GstMessage from the sink pipeline; when we get EOS, we
 * exit the mainloop and this testapp. */
static gboolean
on_sink_message (GstBus * bus, GstMessage * message, ProgramData * data)
{
  GstState state, pending;
  GstElement *source;

  switch (GST_MESSAGE_TYPE (message)) {
    case GST_MESSAGE_EOS:
      g_print ("Received End of Stream message\n");
      g_main_loop_quit (data->loop);
      break;
    case GST_MESSAGE_ERROR:
      g_print ("Received error\n");
      g_main_loop_quit (data->loop);
      break;
    case GST_MESSAGE_STATE_CHANGED:
      source = gst_bin_get_by_name (GST_BIN (data->sink), "testsource");
      gst_element_get_state (source, &state, &pending, GST_CLOCK_TIME_NONE);
      g_print ("State changed from %i to %i\n", state, pending);
      gst_object_unref (source);
      break;
    default:
      break;
  }
  return TRUE;
}

int
main (int argc, char *argv[])
{
  ProgramData *data = NULL;
  gchar *string = NULL;
  GstBus *bus = NULL;
  GstElement *testsource = NULL;
  int i;

  gst_init (&argc, &argv);

  data = g_new0 (ProgramData, 1);

  data->loop = g_main_loop_new (NULL, FALSE);

  /* setting up sink pipeline, we push video data into this pipeline that will
   * then be recorded to an avi file. */
  string =
      g_strdup_printf ("appsrc is-live=true name=testsource caps=\"%s\" ! ffmpegcolorspace ! video/x-raw-yuv,format=(fourcc)I420,width=80,height=60 ! queue ! videorate ! video/x-raw-yuv,framerate=30/1 ! xvidenc ! queue ! avimux ! queue ! filesink location=test.avi",
      video_caps);
  data->sink = gst_parse_launch (string, NULL);
  g_free (string);

  if (data->sink == NULL) {
    g_print ("Bad sink\n");
    return -1;
  }

  testsource = gst_bin_get_by_name (GST_BIN (data->sink), "testsource");
  /* configure for time-based format */
  g_object_set (testsource, "format", GST_FORMAT_TIME, NULL);
  /* uncomment the next line to block when appsrc has buffered enough */
  /* g_object_set (testsource, "block", TRUE, NULL); */

  bus = gst_element_get_bus (data->sink);
  gst_bus_add_watch (bus, (GstBusFunc) on_sink_message, data);
  gst_object_unref (bus);

  /* launching things */
  gst_element_set_state (data->sink, GST_STATE_PLAYING);

  /* Injecting 30 frames into the pipeline, which should generate a 1-second
   * avi file at the current framerate. */
  g_print ("Injecting buffers...\n");
  for (i = 0; i < 30; i++) inject_new_buffer_into_pipeline(data);
  gst_app_src_end_of_stream (GST_APP_SRC (testsource));
  g_print ("Done.\n");

  gst_element_set_state (data->sink, GST_STATE_PLAYING);

  g_print ("Creating movie...\n");
  g_main_loop_run (data->loop);
  g_print ("Done.\n");

  gst_element_set_state (data->sink, GST_STATE_NULL);

  gst_object_unref (testsource);
  gst_object_unref (data->sink);
  g_main_loop_unref (data->loop);
  g_free (data);

  return 0;
}


Re: appsrc to filesink

acolubri
In reply to this post by Tim-Philipp Müller-2
I finally managed to implement a pipeline that takes buffers from
appsrc and saves them into a video file. I've attached the test program
for anyone who is interested.

Tim-Philipp Müller wrote:

> On Sun, 2009-02-01 at 22:06 -0800, Andres Colubri wrote:
>
> Hi,
>
>  
>> appsrc name=testsource caps=test_caps ! ffmpegcolorspace !
>> video/x-raw-yuv, width=80, height=60, bpp=32, depth=24 ! queue !
>>    
>
> The bpp/depth fields don't really make sense here. YUV caps usually look
> like:
>
>  video/x-raw-yuv,format=(fourcc)I420,width=80,height=60
>
> with bits per pixel etc. implied by the pixel layout/format used. See:
>
> http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/section-types-definitions.html#table-video-types
>
> for more info. (You shouldn't *need* to specify a format here though,
> ffmpegcolorspace should negotiate to one supported by
> downstream/xvidenc).
>
>  
>> videorate ! video/x-raw-yuv, framerate=30/1 ! xvidenc ! queue ! avimux !
>> queue ! filesink location=test.avi
>>
>> where test_caps =
>> "video/x-raw-rgb,width=80,height=60,bpp=32,depth=24,framerate=30/1"
>>    
>
> These RGB caps are incomplete. You need to also specify:
>  - red_mask
>  - green_mask
>  - blue_mask
>  - endianness
>
> ffmpegcolorspace should reject these caps with a NOT_NEGOTIATED flow
> return. Not sure what appsrc does with that, but I'd guess it would post
> an error message on the pipeline's bus.
>
> Cheers
>  -Tim

#include <string.h>

#include <gst/gst.h>

#include <gst/app/gstappsrc.h>
#include <gst/app/gstappbuffer.h>

/*
 * An example application using appsrc in push mode to create a video file
 * from buffers we push into the pipeline.
 */

/* Video resolution: 80 x 60 pixels, 32 bpp (4 bytes per pixel) = 19200 bytes per frame */
#define BUFFER_SIZE  19200

/* 30000 frames = 1000 seconds of video, since we are going to save at 30 fps (see the video_caps below) */
#define TOTAL_FRAMES 30000

/* number of frames appsrc may queue before the enough-data signal fires */
#define QUEUED_FRAMES 30

/* these are the caps we are going to pass through the appsrc */
const gchar *video_caps =
    "video/x-raw-rgb,width=80,height=60,bpp=32,endianness=4321,depth=24,red_mask=65280,green_mask=16711680,blue_mask=-16777216,framerate=30/1";

typedef struct
{
  GMainLoop *loop;
  GstElement *pipeline;
  GstElement *source;
  guint source_id;
  guint num_frame;
} AppData;

/* This method is called by the idle GSource in the mainloop. We feed one buffer
 * of BUFFER_SIZE bytes into appsrc.
 * The idle handler is added to the mainloop when appsrc requests us to start
 * sending data (need-data signal) and is removed when appsrc has enough data
 * (enough-data signal).
 */
static gboolean
push_buffer (AppData * app)
{
  gpointer raw_buffer;
  GstBuffer *app_buffer;
  GstFlowReturn ret;

  app->num_frame++;

  if (app->num_frame >= TOTAL_FRAMES) {
    /* we are EOS, send end-of-stream and remove the source */
    g_signal_emit_by_name (app->source, "end-of-stream", &ret);
    return FALSE;
  }
 
  /* Allocating the memory for the buffer */
  raw_buffer = g_malloc0 (BUFFER_SIZE);
 
  app_buffer = gst_app_buffer_new (raw_buffer, BUFFER_SIZE, g_free, raw_buffer);

  /* newer basesrc will set caps for us automatically but it does not really
   * hurt to set them on the buffer again */
  gst_buffer_set_caps (app_buffer, gst_caps_from_string (video_caps));

  /* Setting the correct timestamp for the buffer is very important, otherwise the
   * resulting video file won't be created correctly */
  GST_BUFFER_TIMESTAMP(app_buffer) = (GstClockTime)((app->num_frame / 30.0) * 1e9);

  /* push new buffer */
  g_signal_emit_by_name (app->source, "push-buffer", app_buffer, &ret);
  gst_buffer_unref (app_buffer);

  if (ret != GST_FLOW_OK) {
    /* some error, stop sending data */
    return FALSE;
  }

  return TRUE;
}

/* This signal callback is called when appsrc needs data, we add an idle handler
 * to the mainloop to start pushing data into the appsrc */
static void
start_feed (GstElement * pipeline, guint size, AppData * app)
{
  if (app->source_id == 0) {
    g_print ("start feeding at frame %i\n", app->num_frame);
    app->source_id = g_idle_add ((GSourceFunc) push_buffer, app);
  }
}

/* This callback is called when appsrc has enough data and we can stop sending.
 * We remove the idle handler from the mainloop */
static void
stop_feed (GstElement * pipeline, AppData * app)
{
  if (app->source_id != 0) {
    g_print ("stop feeding at frame %i\n", app->num_frame);
    g_source_remove (app->source_id);
    app->source_id = 0;
  }
}

/* called when we get a GstMessage from the pipeline; when we get EOS, we
 * exit the mainloop and this testapp. */
static gboolean
on_pipeline_message (GstBus * bus, GstMessage * message, AppData * app)
{
  GstState state, pending;

  switch (GST_MESSAGE_TYPE (message)) {
    case GST_MESSAGE_EOS:
      g_print ("Received End of Stream message\n");
      g_main_loop_quit (app->loop);
      break;
    case GST_MESSAGE_ERROR:
      g_print ("Received error\n");
      g_main_loop_quit (app->loop);
      break;
    case GST_MESSAGE_STATE_CHANGED:
      gst_element_get_state (app->source, &state, &pending, GST_CLOCK_TIME_NONE);
      /* g_print ("State changed from %i to %i\n", state, pending); */
      break;
    default:
      break;
  }
  return TRUE;
}

int
main (int argc, char *argv[])
{
  AppData *app = NULL;
  gchar *string = NULL;
  GstBus *bus = NULL;
  GstElement *appsrc = NULL;

  gst_init (&argc, &argv);

  app = g_new0 (AppData, 1);

  app->loop = g_main_loop_new (NULL, FALSE);

  /* setting up the pipeline, we push video data into this pipeline that will
   * then be recorded to an avi file, encoded with the H.264 codec */
  string =
      g_strdup_printf ("appsrc is-live=true name=source caps=\"%s\" ! ffmpegcolorspace ! video/x-raw-yuv,format=(fourcc)I420,width=80,height=60 ! queue ! videorate ! video/x-raw-yuv,framerate=30/1 ! h264enc ! queue ! avimux ! queue ! filesink location=test.avi",
      video_caps);
  app->pipeline = gst_parse_launch (string, NULL);
  g_free (string);

  if (app->pipeline == NULL) {
    g_print ("Bad pipeline\n");
    return -1;
  }

  appsrc = gst_bin_get_by_name (GST_BIN (app->pipeline), "source");
  /* configure for time-based format */
  g_object_set (appsrc, "format", GST_FORMAT_TIME, NULL);
  /* set the maximum number of bytes queued; the default is 200000 */
  gst_app_src_set_max_bytes((GstAppSrc *)appsrc, QUEUED_FRAMES * BUFFER_SIZE);
  /* uncomment the next line to block when appsrc has buffered enough */
  /* g_object_set (appsrc, "block", TRUE, NULL); */
  app->source = appsrc;

  /* add watch for messages */
  bus = gst_element_get_bus (app->pipeline);
  gst_bus_add_watch (bus, (GstBusFunc) on_pipeline_message, app);
  gst_object_unref (bus);

  /* configure the appsrc, we will push data into the appsrc from the
   * mainloop */
  g_signal_connect (app->source, "need-data", G_CALLBACK (start_feed), app);
  g_signal_connect (app->source, "enough-data", G_CALLBACK (stop_feed), app);

  /* go to playing and wait in a mainloop */
  gst_element_set_state (app->pipeline, GST_STATE_PLAYING);

  /* this mainloop is stopped when we receive an error or EOS */
  g_print ("Creating movie...\n");
  g_main_loop_run (app->loop);
  g_print ("Done.\n");

  gst_app_src_end_of_stream (GST_APP_SRC (app->source));

  gst_element_set_state (app->pipeline, GST_STATE_NULL);

  /* Cleaning up */
  gst_object_unref (app->source);
  gst_object_unref (app->pipeline);
  g_main_loop_unref (app->loop);
  g_free (app);

  return 0;
}


Re: appsrc to filesink

Nostalgia
CONTENTS DELETED
The author has deleted this message.

Re: appsrc to filesink

Nicolas Dufresne-5


On 1 March 2018 at 23:57, "Nostalgia" <[hidden email]> wrote:
Hi @acolubri,

I am interested in using appsrc to push data from an application into a
gstreamer pipeline, for my university final project. So far I haven't been
told which application I should use to produce the multimedia data
(probably yuv video) to be pushed into the pipeline (for processing and
encoding). But as a first step I need to understand the concept of
gstreamer's pipeline and make an example using appsrc. I don't yet
understand appsrc very well, so I found your code and tried it, but I
didn't understand where the data that gets pushed into the pipeline comes
from. Where is the data pushed from when executing your code?

Thanks for helping me understand the concept of using appsrc.
I hope that you (@acolubri) or someone else can help me.

Best is to look at the appsrc C API; since the GObject interface is difficult
to use directly, appsrc is one of the rare plugins that comes with a C
library. To build against this API, add gstreamer-app-1.0 to your pkg-config
call. The reference documentation is there:


The main data entry point is gst_app_src_push_buffer(). Note that this will place the buffer in a queue to be picked up by the src streaming thread (in the paused or playing state). Make sure to configure your streaming format (time/bytes) and the caps.




Regards,


