Why is GstBuffer not writable in the _fill method of GstPushSrc?


Why is GstBuffer not writable in the _fill method of GstPushSrc?

Ben Rush
When implementing a GstPushSrc-derived class, I set the fill method to my own function. That function is passed a GstBuffer that, presumably, it is supposed to fill. However, if I call

GstMapInfo info;
gst_buffer_map(buffer, &info, GST_MAP_WRITE);

the data field of the info struct is NULL. I can't copy data to a null pointer. I have been able to use GstVideoFrame and gst_video_frame_map successfully and write data to the video frame, so why can't I write directly to the buffer that's passed in?

As it turns out, I'm reading data from a camera in YUV420 and don't want to incur any penalty for copying buffers around. I want to copy the frame data directly into the buffer.
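
(For reference, a minimal fill() sketch that checks the return value of gst_buffer_map() instead of assuming info.data is valid; the element and function names are illustrative only, not from the code in question:)

static GstFlowReturn my_src_fill(GstPushSrc* psrc, GstBuffer* buffer)
{
  GstMapInfo info;

  /* gst_buffer_map() can fail; never dereference info.data unchecked */
  if (!gst_buffer_map(buffer, &info, GST_MAP_WRITE)) {
    GST_ERROR_OBJECT(psrc, "could not map buffer for writing");
    return GST_FLOW_ERROR;
  }

  /* write up to info.size bytes of frame data into info.data here */
  memset(info.data, 0, info.size);

  gst_buffer_unmap(buffer, &info);
  return GST_FLOW_OK;
}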


Re: Why is GstBuffer not writable in the _fill method of GstPushSrc?

Ben Rush
Just to be clear, I'm seeing examples of others doing what I want to do, that is, writing directly to the GstBuffer from a GstPushSrc. Examples include:

https://github.com/GStreamer/gst-plugins-bad/blob/master/gst/frei0r/gstfrei0rsrc.c  (between lines 96 to 106)
https://gitlab.collabora.com/gkiagia/gst-plugins-bad/blob/8cdfb13658a069cf8c45a3265bf865849d3dc8e9/ext/neon/gstneonhttpsrc.c  (between lines 955 to 995)

Among others. So I don't feel like what I'm doing is strange or out of the ordinary. It's just that the map call fails (returns FALSE) and gives me a NULL data pointer when I try to map the buffer with GST_MAP_WRITE. It succeeds when I map it with GST_MAP_READ, but that makes no sense when I'm trying to WRITE to the buffer.
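
(A quick diagnostic, not a fix: before mapping, the buffer's writability and the flags of its memories can be logged, which usually shows why a GST_MAP_WRITE map is refused. Sketch only:)

  guint i, n = gst_buffer_n_memory(buffer);

  GST_INFO("buffer writable: %d", gst_buffer_is_writable(buffer));
  for (i = 0; i < n; i++) {
    GstMemory* mem = gst_buffer_peek_memory(buffer, i);
    GST_INFO("memory %u: readonly=%d not-mappable=%d", i,
        GST_MEMORY_IS_READONLY(mem), GST_MEMORY_IS_NOT_MAPPABLE(mem));
  }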
  


Re: Why is GstBuffer not writable in the _fill method of GstPushSrc?

Sebastian Dröge
On Sun, 2019-04-28 at 10:23 -0500, Ben Rush wrote:

> Just to be clear, I'm seeing examples of others doing what I want to
> do. That is, write directly to the GstBuffer as a GstPushSrc.
> Examples include:
>
> https://github.com/GStreamer/gst-plugins-bad/blob/master/gst/frei0r/gstfrei0rsrc.c  (between lines 96 to
> 106)
> https://gitlab.collabora.com/gkiagia/gst-plugins-bad/blob/8cdfb13658a069cf8c45a3265bf865849d3dc8e9/ext/neon/gstneonhttpsrc.c  (between lines 955 to
> 995)
>
> Among others. So, I don't feel like what I'm doing is strange or out
> of the ordinary. It's just that the map function call fails (returns
> false) and gives me a null buffer to write to when I try to open it
> with GST_MAP_WRITE. It succeeds when I try to open it with
> GST_MAP_READ, but that makes no sense when I'm trying to WRITE to the
> buffer.
Can you provide a standalone testcase for this so we have an idea what
your code is doing exactly and a way to reproduce it?

As you say yourself, it's rather useless to have a non-writable buffer
inside fill() and that definitely shouldn't happen.

--
Sebastian Dröge, Centricular Ltd · https://www.centricular.com



Re: Why is GstBuffer not writable in the _fill method of GstPushSrc?

Ben Rush
Yes, and I believe I have narrowed it down. I should be able to provide a self-contained example, and possibly even a cause, sometime today.

Thank you for your response. 


Re: Why is GstBuffer not writable in the _fill method of GstPushSrc?

Ben Rush
Sebastian, 

So, I'm not sure precisely what's going on, but I think it might - maybe - have something to do with the fact that I'm using the MFX (Intel Media SDK) GStreamer modules, and that the pieces aren't playing well together. Let me explain. I've got an example application whose only job is to encode a static image of the scientist Richard Feynman to disk as an MP4 file (500 frames). I use it to tinker with and explore the various parts of the GStreamer pipeline.

I have found that if I set the output caps to NV12, the gst_buffer_map() call fails: 

static GstStaticPadTemplate gst_feynman_template = GST_STATIC_PAD_TEMPLATE("src",
    GST_PAD_SRC,
    GST_PAD_ALWAYS,
    GST_STATIC_CAPS(GST_VIDEO_CAPS_MAKE("{NV12}"))
);

If, on the other hand, I change this to AYUV, it works: 

static GstStaticPadTemplate gst_feynman_template = GST_STATIC_PAD_TEMPLATE("src",
    GST_PAD_SRC,
    GST_PAD_ALWAYS,
    GST_STATIC_CAPS(GST_VIDEO_CAPS_MAKE("{AYUV}"))
);

It doesn't matter what I actually send, because the failure happens before I send anything at all: on the first invocation of fill, the gst_buffer_map call fails as described in my previous emails if I specify NV12. As I write this email, things are starting to become clearer. I presume that during the handshake between my source DLL and the downstream gstmfx.dll (and the H.264 encoder therein) something fails to initialize properly, which in turn causes the GStreamer pipeline to fail to initialize the GstBuffer for me properly. Specifically, either the gstmfxenc_h264.c encoder is failing to handle NV12 properly, or my asking it to handle that format fails on its side and it falls back to some standard GStreamer allocator/handler (I don't know the terminology well, as I'm still learning).
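
(One way to see what is being negotiated, as a sketch only: a source can override GstBaseSrc::decide_allocation and log whatever pool downstream proposed in the ALLOCATION query. The function name follows the element below but is otherwise hypothetical.)

static gboolean gst_feynman_decide_allocation(GstBaseSrc* bsrc, GstQuery* query)
{
  guint i, n = gst_query_get_n_allocation_pools(query);

  for (i = 0; i < n; i++) {
    GstBufferPool* pool = NULL;
    guint size, min, max;

    gst_query_parse_nth_allocation_pool(query, i, &pool, &size, &min, &max);
    GST_INFO_OBJECT(bsrc, "proposed pool %u: %s (buffer size %u)", i,
        pool ? GST_OBJECT_NAME(pool) : "(none)", size);
    if (pool)
      gst_object_unref(pool);
  }

  /* chain up so the default decision still happens */
  return GST_BASE_SRC_CLASS(parent_class)->decide_allocation(bsrc, query);
}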

I know that the gstmfx encoder suite advertises that it supports these formats: 

# define GST_MFX_SUPPORTED_INPUT_FORMATS \
    "{ NV12, YV12, I420, YUY2, P010_10LE, BGRA, BGRx }"

What tipped me off was inspecting the GstBuffer object in memory and seeing that when I specify NV12, GstBuffer->Pool->Object->name is "mfxvideobufferpool0", whereas when I specify AYUV the name is "videobufferpool0".
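
(The same check can be done at runtime inside fill(), roughly like this; buffer->pool and mem->allocator are public fields of GstBuffer and GstMemory, and psrc is the fill() argument:)

  GstMemory* mem = gst_buffer_peek_memory(buffer, 0);

  GST_INFO_OBJECT(psrc, "pool: %s, allocator: %s",
      buffer->pool ? GST_OBJECT_NAME(buffer->pool) : "(no pool)",
      mem->allocator ? GST_OBJECT_NAME(mem->allocator) : "(default)");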

So, does this appear to be a bug on the Intel media encoder side of things, or am I still not using something correctly? If it is a bug on Intel's side, do you have any advice on how to track it down? They've been pretty unhelpful so far regarding various other issues I've encountered with their stack.

Here is the full source to my Feynman encoder: 

#include <gst/gst.h>
#include <gst/base/gstpushsrc.h>
#include <gst/video/gstvideometa.h>
#include <gst/video/gstvideopool.h>

#include <Windows.h>
#include <stdio.h>

#define PACKAGE "gst_feynman"

/* Definition of structure storing data for this element. */
typedef struct _GstFeynman
{
  GstPushSrc element;
  GstPad* sourcepad;

  /* running time and frames for current caps */
  GstClockTime running_time;            /* total running time */
  gint64 n_frames;                      /* total frames sent */
  gboolean reverse;

  /* previous caps running time and frames */
  GstClockTime accum_rtime;             /* accumulated running_time */
  gint64 accum_frames;                  /* accumulated frames */

  GstVideoInfo info;                    /* protected by the object or stream lock */

  /* private */
  /* FIXME 2.0: Change type to GstClockTime */
  gint64 timestamp_offset;              /* base offset */

  gpointer* lines;

  guint n_lines;
  gint offset;
} GstFeynman;

/* Standard definition defining a class for this element. */
typedef struct _GstFeynmanClass
{
  GstPushSrcClass parent_class;
} GstFeynmanClass;


/* Standard macros for defining types for this element.  */
#define GST_FEYNMAN_TYPE (gst_feynman_get_type())
#define GST_FEYNMAN(obj) \
  (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_FEYNMAN_TYPE,GstFeynman))
#define GST_FEYNMAN_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_CAST((klass),GST_FEYNMAN_TYPE,GstFeynmanClass))
#define GST_IS_FEYNMAN(obj) \
  (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_FEYNMAN_TYPE))
#define GST_IS_FEYNMAN_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_TYPE((klass),GST_FEYNMAN_TYPE))

/* Standard function returning type information. */
GType gst_feynman_get_type(void);

#define gst_feynman_parent_class parent_class
G_DEFINE_TYPE(GstFeynman, gst_feynman, GST_TYPE_PUSH_SRC);

#define VTS_VIDEO_CAPS GST_VIDEO_CAPS_MAKE (GST_VIDEO_FORMATS_ALL) 


static GstStaticPadTemplate gst_feynman_template = GST_STATIC_PAD_TEMPLATE("src",
    GST_PAD_SRC,
    GST_PAD_ALWAYS,
    GST_STATIC_CAPS(GST_VIDEO_CAPS_MAKE("{AYUV}"))
);

static void gst_feynman_set_property(GObject* object, guint prop_id,
    const GValue* value, GParamSpec* pspec);
static void gst_feynman_get_property(GObject* object, guint prop_id,
    GValue* value, GParamSpec* pspec);
static GstFlowReturn gst_feynman_src_fill(GstPushSrc* psrc, GstBuffer* buffer);
static gboolean gst_feynman_start(GstBaseSrc* basesrc);
static gboolean gst_feynman_stop(GstBaseSrc* basesrc);
static gboolean gst_feynman_set_caps(GstBaseSrc* bsrc, GstCaps* caps);
static gboolean gst_feynman_is_seekable(GstBaseSrc* src);
static GstCaps* gst_feynman_src_fixate(GstBaseSrc* bsrc, GstCaps* caps);

/* initialize the myfilter's class */
static void gst_feynman_class_init(GstFeynmanClass* klass)
{
  // get the "object class".
  GObjectClass* gobject_class = (GObjectClass*)klass;
  GstElementClass* gstelement_class = (GstElementClass*)klass;
  GstBaseSrcClass* gstbasesrc_class = (GstBaseSrcClass*)klass;
  GstPushSrcClass* gstpushsrc_class = (GstPushSrcClass*)klass;

  // set the getters and setters for properties.
  gobject_class->set_property = gst_feynman_set_property;
  gobject_class->get_property = gst_feynman_get_property;

  gst_element_class_set_static_metadata(gstelement_class,
      "Video test source", "Source/Video",
      "Creates a test video stream", "David A. Schleef <[hidden email]>");

  // assign the pad to this class.
  gst_element_class_add_static_pad_template(gstelement_class,
      &gst_feynman_template);

  gstbasesrc_class->fixate = gst_feynman_src_fixate;
  gstbasesrc_class->is_seekable = gst_feynman_is_seekable;
  gstbasesrc_class->set_caps = gst_feynman_set_caps;
  gstbasesrc_class->start = gst_feynman_start;
  gstbasesrc_class->stop = gst_feynman_stop;
  gstpushsrc_class->fill = gst_feynman_src_fill;
}

static GstCaps* gst_feynman_src_fixate(GstBaseSrc* bsrc, GstCaps* caps)
{
  GstFeynman* src = GST_FEYNMAN(bsrc);
  GstStructure* structure;

  caps = gst_caps_make_writable(caps);
  structure = gst_caps_get_structure(caps, 0);

  gst_structure_fixate_field_nearest_int(structure, "width", 320);
  gst_structure_fixate_field_nearest_int(structure, "height", 240);

  if (gst_structure_has_field(structure, "framerate"))
    gst_structure_fixate_field_nearest_fraction(structure, "framerate", 30, 1);
  else
    gst_structure_set(structure, "framerate", GST_TYPE_FRACTION, 30, 1, NULL);

  if (gst_structure_has_field(structure, "pixel-aspect-ratio"))
    gst_structure_fixate_field_nearest_fraction(structure,
        "pixel-aspect-ratio", 1, 1);
  else
    gst_structure_set(structure, "pixel-aspect-ratio", GST_TYPE_FRACTION, 1, 1,
        NULL);

  if (gst_structure_has_field(structure, "colorimetry"))
    gst_structure_fixate_field_string(structure, "colorimetry", "bt601");
  if (gst_structure_has_field(structure, "chroma-site"))
    gst_structure_fixate_field_string(structure, "chroma-site", "mpeg2");

  if (gst_structure_has_field(structure, "interlace-mode"))
    gst_structure_fixate_field_string(structure, "interlace-mode",
        "progressive");
  else
    gst_structure_set(structure, "interlace-mode", G_TYPE_STRING,
        "progressive", NULL);

  if (gst_structure_has_field(structure, "multiview-mode"))
    gst_structure_fixate_field_string(structure, "multiview-mode",
        gst_video_multiview_mode_to_caps_string(GST_VIDEO_MULTIVIEW_MODE_MONO));
  else
    gst_structure_set(structure, "multiview-mode", G_TYPE_STRING,
        gst_video_multiview_mode_to_caps_string(GST_VIDEO_MULTIVIEW_MODE_MONO),
        NULL);

  caps = GST_BASE_SRC_CLASS(parent_class)->fixate(bsrc, caps);

  return caps;
}


static gboolean gst_feynman_is_seekable(GstBaseSrc* psrc)
{
  /* we're not seekable */
  return FALSE;
}

static gboolean gst_feynman_set_caps(GstBaseSrc* bsrc, GstCaps* caps)
{
  const GstStructure* structure;
  GstFeynman* feynman;
  GstVideoInfo info;
  guint i;
  guint n_lines = 1;
  gint offset = 0;

  feynman = GST_FEYNMAN(bsrc);

  structure = gst_caps_get_structure(caps, 0);

  GST_OBJECT_LOCK(feynman);

  gst_video_info_from_caps(&info, caps);

  //info.width = 320;
  //info.height = 240;
  //info.fps_n = 30;
  //info.fps_d = 1;

  feynman->lines = (gpointer*)g_malloc(sizeof(gpointer) * n_lines);
  for (i = 0; i < n_lines; i++)
    feynman->lines[i] = g_malloc((info.width + 16) * 8);
  feynman->n_lines = n_lines;
  feynman->offset = offset;

  /* looks ok here */
  feynman->info = info;

  GST_OBJECT_UNLOCK(feynman);

  return TRUE;
}

static gboolean gst_feynman_stop(GstBaseSrc* basesrc)
{
  return TRUE;
}

static gboolean gst_feynman_start(GstBaseSrc* basesrc)
{
  GstFeynman* src = GST_FEYNMAN(basesrc);

  GST_OBJECT_LOCK(src);
  src->running_time = 0;
  src->n_frames = 0;
  src->accum_frames = 0;
  src->accum_rtime = 0;

  gst_video_info_init(&src->info);
  GST_OBJECT_UNLOCK(src);

  return TRUE;
}

static void gst_feynman_init(GstFeynman* src)
{
  OutputDebugStringA("Feynman: gst_feynman_init called.");

  gst_base_src_set_format(GST_BASE_SRC(src), GST_FORMAT_TIME);
  gst_base_src_set_live(GST_BASE_SRC(src), 1);

  return;
}

static gboolean plugin_init(GstPlugin* plugin)
{
  OutputDebugStringA("Feynman: plugin_init called.");

  return gst_element_register(plugin, "gstfeynman",
      GST_RANK_NONE, GST_FEYNMAN_TYPE);
}

static gboolean gst_feynman_query(GstBaseSrc* bsrc, GstQuery* query)
{
  return GST_BASE_SRC_CLASS(parent_class)->query(bsrc, query);
}

static void gst_feynman_set_property(GObject* object, guint prop_id,
    const GValue* value, GParamSpec* pspec)
{
  return;
}

static void gst_feynman_get_property(GObject* object, guint prop_id,
    GValue* value, GParamSpec* pspec)
{
  return;
}


#define CLIP(X) ( (X) > 255 ? 255 : (X) < 0 ? 0 : X)

// RGB -> YUV
#define RGB2Y(R, G, B) CLIP(( (  66 * (R) + 129 * (G) +  25 * (B) + 128) >> 8) +  16)
#define RGB2U(R, G, B) CLIP(( ( -38 * (R) -  74 * (G) + 112 * (B) + 128) >> 8) + 128)
#define RGB2V(R, G, B) CLIP(( ( 112 * (R) -  94 * (G) -  18 * (B) + 128) >> 8) + 128)

static GstFlowReturn gst_feynman_src_fill(GstPushSrc* psrc, GstBuffer* buffer)
{
  GstFeynman* src;
  GstClockTime next_time;
  GstVideoFrame frame;
  gconstpointer pal;
  gsize palsize;

  src = GST_FEYNMAN(psrc);

  printf(".");

  if (G_UNLIKELY(GST_VIDEO_INFO_FORMAT(&src->info) ==
      GST_VIDEO_FORMAT_UNKNOWN))
    goto not_negotiated;

  /* 0 framerate and we are at the second frame, eos */
  if (G_UNLIKELY(src->info.fps_n == 0 && src->n_frames == 1))
    goto eos;

  if (G_UNLIKELY(src->n_frames == -1)) {
    /* EOS for reverse playback */
    goto eos;
  }

  GST_LOG_OBJECT(src,
      "creating buffer from pool for frame %" G_GINT64_FORMAT, src->n_frames);

  GST_BUFFER_PTS(buffer) =
      src->accum_rtime + src->timestamp_offset + src->running_time;
  GST_BUFFER_DTS(buffer) = GST_CLOCK_TIME_NONE;

  gst_object_sync_values(GST_OBJECT(psrc), GST_BUFFER_PTS(buffer));

  ///////////////////////// SHOW FEYNMAN 1 ///////////////////////////////
  //if (!gst_video_frame_map(&frame, &src->info, buffer, GST_MAP_WRITE))
  //  goto invalid_frame;

  //FILE* f = fopen("feynman.dat", "rb+");
  //int* line = (int*)malloc(frame.info.width * 4);
  //unsigned char lineBuffer[320 * 3];
  //for (int j = 0; j < frame.info.height; j++)
  //{
  //  fread(lineBuffer, 1, sizeof(lineBuffer), f);

  //  for (int i = 0; i < frame.info.width; i++)
  //  {
  //    int b = lineBuffer[i * 3];
  //    int g = lineBuffer[i * 3 + 1];
  //    int r = lineBuffer[i * 3 + 2];

  //    int Y = r * .299000 + g * .587000 + b * .114000;
  //    int U = r * -.168736 + g * -.331264 + b * .500000 + 128;
  //    int V = r * .500000 + g * -.418688 + b * -.081312 + 128;

  //    int value = (0 << 0) | (Y << 8) | (U << 16) | ((guint32) V << 24);

  //    line[i] = value;
  //  }

  //  frame.info.finfo->pack_func(frame.info.finfo,
  //      GST_VIDEO_PACK_FLAG_NONE, line, 0, frame.data, frame.info.stride,
  //      frame.info.chroma_site, j, frame.info.width);
  //}
  //free(line);
  //fclose(f);

  //if ((pal = gst_video_format_get_palette(GST_VIDEO_FRAME_FORMAT(&frame),
  //    &palsize))) {
  //  memcpy(GST_VIDEO_FRAME_PLANE_DATA(&frame, 1), pal, palsize);
  //}

  //gst_video_frame_unmap(&frame);
  /////////////////////////////////////////////////////////////////////////

  //////////////////////////// SHOW FEYNMAN 2 //////////////////////////////
  FILE* f = fopen("feynman.yuv", "rb+");
  fseek(f, 0, SEEK_END);
  int length = ftell(f);
  fseek(f, 0, SEEK_SET);
  GstMapInfo info;
  gst_buffer_map(buffer, &info, GST_MAP_WRITE);
  char* buff = (char*)malloc(length);
  fread(buff, 1, length, f);
  memcpy(info.data, buff, length);
  gst_buffer_unmap(buffer, &info);
  fclose(f);
  ///////////////////////////////////////////////////////////////////////

  // the data becomes planar when it is sent downstream.
  //GstMapInfo info;
  //gst_buffer_map(buffer, &info, GST_MAP_READ);
  //FILE* fout = fopen("feynman.yuv", "wb+");
  //fwrite(info.data, 1, info.size, fout);
  //fclose(fout);
  //gst_buffer_unmap(buffer, &info);

  GST_DEBUG_OBJECT(src, "Timestamp: %" GST_TIME_FORMAT " = accumulated %"
      GST_TIME_FORMAT " + offset: %"
      GST_TIME_FORMAT " + running time: %" GST_TIME_FORMAT,
      GST_TIME_ARGS(GST_BUFFER_PTS(buffer)), GST_TIME_ARGS(src->accum_rtime),
      GST_TIME_ARGS(src->timestamp_offset), GST_TIME_ARGS(src->running_time));

  GST_BUFFER_OFFSET(buffer) = src->accum_frames + src->n_frames;
  if (src->reverse) {
    src->n_frames--;
  }
  else {
    src->n_frames++;
  }
  GST_BUFFER_OFFSET_END(buffer) = GST_BUFFER_OFFSET(buffer) + 1;
  if (src->info.fps_n) {
    next_time = gst_util_uint64_scale(src->n_frames,
        src->info.fps_d * GST_SECOND, src->info.fps_n);
    if (src->reverse) {
      GST_BUFFER_DURATION(buffer) = src->running_time - next_time;
    }
    else {
      GST_BUFFER_DURATION(buffer) = next_time - src->running_time;
    }
  }
  else {
    next_time = src->timestamp_offset;
    /* NONE means forever */
    GST_BUFFER_DURATION(buffer) = GST_CLOCK_TIME_NONE;
  }

  src->running_time = next_time;

  return GST_FLOW_OK;

not_negotiated:
  {
    return GST_FLOW_NOT_NEGOTIATED;
  }
eos:
  {
    GST_DEBUG_OBJECT(src, "eos: 0 framerate, frame %d", (gint)src->n_frames);
    return GST_FLOW_EOS;
  }
invalid_frame:
  {
    GST_DEBUG_OBJECT(src, "invalid frame");
    return GST_FLOW_OK;
  }
}

GST_PLUGIN_DEFINE(
GST_VERSION_MAJOR,
GST_VERSION_MINOR,
feynman,
"feynman",
plugin_init,
".1",
"LGPL",
"gst_feynman",
)



Re: Why is GstBuffer not writable in the _fill method of GstPushSrc?

Sebastian Dröge
On Mon, 2019-04-29 at 10:46 -0500, Ben Rush wrote:

> Sebastian,
>
> So, I'm not sure precisely what's going on, but I think it might -
> maybe - have something to do with the fact that I'm using the MFX
> (Intel Media SDK) GStreamer modules, and that perhaps things aren't
> playing well together. Let me explain. I've got an example
> application that, all it does, is encode a static image of the
> scientist Richard Feynman to disk as an MP4 file (500 frames). I use
> it to kind of tinker and explore the various parts of the GStreamer
> pipeline.
>
> I have found that if I set the output caps to NV12, the
> gst_buffer_map() call fails:
>
> static GstStaticPadTemplate gst_feynman_template =
> GST_STATIC_PAD_TEMPLATE("src",
> GST_PAD_SRC,
> GST_PAD_ALWAYS,
> GST_STATIC_CAPS(GST_VIDEO_CAPS_MAKE("{NV12}"))
> );
>
> If, on the other hand, I change this to AYUV, it works:
>
> static GstStaticPadTemplate gst_feynman_template =
> GST_STATIC_PAD_TEMPLATE("src",
> GST_PAD_SRC,
> GST_PAD_ALWAYS,
> GST_STATIC_CAPS(GST_VIDEO_CAPS_MAKE("{AYUV}"))
> );
>
> It doesn't matter what I actually send as this failure happens before
> I send anything at all. During the first invocation to fill the
> buffer, the gst_buffer_map call fails as I've indicated in previous
> emails if I specify NV12. As I'm writing this email out certain
> things are starting to becoming clearer, and I presume what's
> happening is that per the handshake between my source DLL and the
> downstream gstmfx.dll (and the x264 encoder therein) something is
> failing to initialize properly, thereby causing the GStreamer
> pipeline to fail to also initialize the GstBuffer for me properly.
> Specifically, the gstmfxenc_h264.c encoder is failing to handle NV12
> properly (or me asking it to handle this format is failing on its
> side and therefore falling back to some standard GStreamer
> allocator/handler (I don't know the terminology well as I'm still
> learning)).
>
> I know that the gstmfx encoder suite advertises that it supports
> these formats:
>
> # define GST_MFX_SUPPORTED_INPUT_FORMATS \
>     "{ NV12, YV12, I420, YUY2, P010_10LE, BGRA, BGRx }"
>
> What tipped me off was inspecting the GstBuffer object in-memory, and
> seeing when I specified NV12, the GstBuffer->Pool->Object->name was
> "mfxvideobufferpool0", whereas if I specify AYUV, the name is
> "videobufferpool0".
>
> So, does this appear to be a bug in the intel media encoder side of
> things? Or am I still not using something correctly? If it is a bug
> in the Intel side of things, do you have any advise on how to track
> it down? They've been pretty unhelpful so far regarding various other
> issues I've encountered using their stuff.
Does it work correctly if you use AYUV?

I don't see anything specifically wrong in your code, especially not in
the fill() function. What you say sounds like the MFX plugin is
providing you with a buffer pool that provides buffers that can't be
write-mapped. That would be a bug in the MFX plugin.

Which MFX plugin are you using? Does it work better if you use
something else instead?


In any case, instead of just providing the code in-line in a mail it
would be good if you could provide a git repository with everything
needed to compile and run it so that it's easier to check what exactly
is happening.

That said, from what you describe it sounds like a bug in the MFX
plugin.
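
(As a stop-gap until the plugin is fixed, a source could refuse the proposed pool by overriding decide_allocation and dropping the pools from the ALLOCATION query, so that the base class falls back to plain system-memory buffers. A rough, untested sketch, reusing the element name from the code above:)

static gboolean gst_feynman_decide_allocation(GstBaseSrc* bsrc, GstQuery* query)
{
  /* drop every pool downstream proposed; the base class then falls back
   * to allocating ordinary system-memory buffers */
  while (gst_query_get_n_allocation_pools(query) > 0)
    gst_query_remove_nth_allocation_pool(query, 0);

  return GST_BASE_SRC_CLASS(parent_class)->decide_allocation(bsrc, query);
}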

--
Sebastian Dröge, Centricular Ltd · https://www.centricular.com


Re: Why is GstBuffer not writable in the _fill method of GstPushSrc?

Ben Rush
Sebastian, 

I apologize for not getting back to you sooner. I will try putting up some sample code for you in the near future. In the meantime, I have opened the following issue on the Intel Media SDK GStreamer GitHub repo that may be of interest to you: https://github.com/intel/gstreamer-media-SDK/issues/173

Essentially I'm using their H.264 encoder. I don't know exactly how it works, but when I request AYUV I get a writable buffer and everything works; the GstBuffer appears to be writable and allocated by the GStreamer pipeline itself. When I request NV12, which that encoder supports as an input format from upstream elements, the Intel Media pipeline allocates the memory for me, and that memory is video memory. Now, one could argue that memory isn't writable from CPU code because it lives on the video device, but I know from Intel's OpenCL drivers that I can get writable memory buffers, since the GPU shares a memory space with the CPU (the whole point of the Intel on-chip GPU). So it should be possible. Why the code above doesn't appear to work with it, I'm unsure.

BTW: I've also opened another issue that some GStreamer users might hit when using the Intel Media SDK on Windows: https://github.com/intel/gstreamer-media-SDK/issues/169. I might try fixing it myself (it may be as simple as a preprocessor guard that keeps that code from compiling on Windows).

Anyway. Keeping you up to date.  


Re: Why is GstBuffer not writable in the _fill method of GstPushSrc?

Sebastian Dröge
Hi Ben,

On Fri, 2019-05-03 at 12:29 -0500, Ben Rush wrote:

>
> I apologize for not getting back to you sooner. I will try putting up
> some sample code for you in the near future. However, I have opened
> up the following issue on the Intel Media SDK gstreamer github repo
> that my be of interest to you:
> https://github.com/intel/gstreamer-media-SDK/issues/173.
>
> Essentially I'm using their x264 encoder. I don't know how it's
> working exactly, but when I request AYUV, I'm able to get a writable
> buffer (and yes, everything works). It looks as though the GstBuffer
> is writable, but allocated by the gstreamer pipeline itself. When I
> request NV12, which is supported by the x264 plugin as a media type
> from upstream elements, the Intel Media pipeline allocations memory
> for me. This is video memory. Now, it's possible that one could make
> the argument the memory isn't writeable from CPU code because it's on
> the video device, but I know in Intel's OpenCL drivers I'm able to
> get writable memory buffers since the GPU has a shared memory space
> with the CPU (the whole point of the Intel on-chip GPU). So, it
> should be possible. Why the code above appears to not work with it,
> I'm unsure.
That's nonetheless a bug in the MediaSDK plugin. It should not
unconditionally give you a useless buffer pool that requires special
mechanisms to do something with the buffers.

As an alternative you might want to try the MediaSDK plugin that is
part of the GStreamer plugin sets:
  https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/tree/master/sys/msdk

It shouldn't have the bug with the buffer pool and also is known to
work fine on Windows and Linux.

--
Sebastian Dröge, Centricular Ltd · https://www.centricular.com



Re: Why is GstBuffer not writable in the _fill method of GstPushSrc?

Ben Rush
Sebastian, 

Thanks for the link! I'll check it out. Part of what I'm doing is just digging into the code base in an attempt to understand what's happening, so even though it's a bit of a pain, the struggle is educational and will likely pay off heavily as I start to build more advanced stuff on top of GStreamer.

That said, I've got a question that maybe you could answer. A while ago I mentioned that I had discovered the "pack_func" function and used it within the context of gst_video_frame_map. I started tinkering with gst_video_frame_map as a way to solve this and found that when I pin down the Intel video memory and map it with that function, I'm able to access it directly in write mode. Everything works. I discovered this by walking back up through the code base and seeing what gst_video_frame_map does (well, down the stack a way): https://github.com/jojva/gst-plugins-base/blob/a8353866857b05b3f447d2702194bb7bf1ec9e23/gst-libs/gst/video/gstvideometa.c#L268. That call to meta->map actually calls into the function I was trying to reach in the Intel Media SDK, which maps the buffer for write access: https://github.com/intel/gstreamer-media-SDK/blob/d4edf72ee2975186f11befbb24ea19d404b394fb/gst/mfx/gstmfxvideomemory.c#L118.

My question is: what is the difference between gst_video_frame_map and gst_buffer_map, such that one calls into these "meta" functions and the other does not? They appear to be two separate ways of mapping the memory, and indeed have dramatically different effects in my context, but my understanding is that they should both be doing about the same thing.

Ultimately I wound up basically doing something like this: 

if (!gst_video_frame_map(&frame, &src->info, buffer, GST_MAP_WRITE))
  goto invalid_frame;
...
int numberOfPlanes = GST_VIDEO_FRAME_N_PLANES(&frame);
guint8* pixels = (guint8*)GST_VIDEO_FRAME_PLANE_DATA(&frame, 0);
...
memcpy(pixels, buff, length);
...
gst_video_frame_unmap(&frame);

And was able to write onto the video memory just fine. 
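
(One caveat on the memcpy above: it only works when the source data happens to be packed with exactly the strides of the mapped frame. A stride-aware copy for NV12 would look roughly like this, where buff and its tightly-packed layout are assumed from the snippet above:)

  guint8* srcp = (guint8*)buff;
  gint width = GST_VIDEO_FRAME_WIDTH(&frame);
  gint plane, row;

  for (plane = 0; plane < GST_VIDEO_FRAME_N_PLANES(&frame); plane++) {
    guint8* dest = (guint8*)GST_VIDEO_FRAME_PLANE_DATA(&frame, plane);
    gint stride = GST_VIDEO_FRAME_PLANE_STRIDE(&frame, plane);
    gint rows = GST_VIDEO_FRAME_COMP_HEIGHT(&frame, plane);  /* NV12: h, then h/2 */
    gint rowbytes = width;  /* NV12: both planes carry 'width' bytes per row */

    for (row = 0; row < rows; row++) {
      memcpy(dest + row * stride, srcp, rowbytes);
      srcp += rowbytes;
    }
  }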



Re: Why is GstBuffer not writable in the _fill method of GstPushSrc?

Sebastian Dröge
On Mon, 2019-05-06 at 10:20 -0500, Ben Rush wrote:
> My question is what's the difference between the gst_video_frame_map
> and the gst_buffer_map in that one is calling into these "meta"
> functions whereas the other is not. They appear to be two separate
> ways of mapping the memory, and indeed have dramatically different
> effects when used in my context, but it's my understanding they
> should both be doing about the same thing.

They should be the same thing, but the GstVideoFrame variant might be
more optimized. But by default, both should behave the same or
otherwise that's a bug.

If a special allocator is negotiated between your element and the
MediaSDK elements (or a special caps feature or ...) then it would be
valid to only allow using the GstVideoFrame API (or even a completely
different API) as that could be part of the contract of that
negotiation result. But this must be opt-in.
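
(A concrete way to see the difference in practice: gst_video_frame_map() goes through the GstVideoMeta's map/unmap functions when the buffer carries one, which is how custom video memory such as the mfx surfaces gets mapped, while a plain gst_buffer_map() maps the underlying GstMemory directly. A small check, sketch only:)

  GstVideoMeta* vmeta = gst_buffer_get_video_meta(buffer);

  if (vmeta != NULL) {
    /* the pool/allocator attached its own mapping;
     * gst_video_frame_map() will use vmeta->map / vmeta->unmap */
    GST_INFO("GstVideoMeta present: %ux%u, %u planes",
        vmeta->width, vmeta->height, vmeta->n_planes);
  } else {
    /* plain system memory; both mapping paths behave the same */
    GST_INFO("no GstVideoMeta on this buffer");
  }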

--
Sebastian Dröge, Centricular Ltd · https://www.centricular.com



Re: Why is GstBuffer not writable in the _fill method of GstPushSrc?

Ben Rush
Got it. Well, since I'm tinkering in this area, I might try pushing that information back to Intel. 

Thanks again for all your help. I appreciate your expertise and the fact you're taking the time to educate me. 
