how to learn from .yuv files and encode with h.264?


how to learn from .yuv files and encode with h.264?

rafael.lmsousa
Hi all

I'm implementing a tool using GStreamer to do the following:
  • Read a .YUV file from disk;
  • While the file is being read, it is encoded by the encoder;
  • At the same time, the encoder sends the video packets to the receiver over RTP;
  • At the receiver, the video is stored.
I found some code that does almost this, but it doesn't read from a .YUV file and doesn't store the video at the receiver either. I made the modifications you already told me about, but the following error occurred:

Error: Encode x264 frame failed.

The function that I think causes the trouble is the following; my modifications are marked below:

static GstElement* construct_sender_pipeline(void){

  GstElement *pipeline, *filesrc, *time, *gstrtpbin, *venc, *rtpenc;
  GstElement *udpsink_rtp, *udpsink_rtcp, *udpsrc_rtcp, *identity;
  GstCaps *caps;
  GstPad  *pad;
  gboolean err;
  GstPadLinkReturn res;
  GstBuffer *buffer;
  gint size;
  ...

  //Video source initialization
  filesrc = gst_element_factory_make("filesrc", "my_filesource");
  if (!filesrc){
    g_printerr("Failed to create filesrc");
    return 0;
  }
  g_object_set (G_OBJECT (filesrc), "location", vsrc, NULL);

  ...

  //Create video encoder
  venc = gst_element_factory_make("x264enc", "video-encoder");
  if (!venc) {
    g_printerr("Failed to create %s\n", "x264enc");
    return 0;
  }
  //kbits/sec --> bits/sec for H.264 encoder
  bitrate *= 1024;
  g_object_set(G_OBJECT (venc), "bitrate", bitrate, NULL);
  //bitrate is not controllable

  //Choose RTP encoder according to video codec
  rtpencoder = g_strdup(select_rtp_encoder("x264enc"));

  ...

  //Possible problem
  size = 352*288*(3/2);   /* note: 3/2 is integer division and evaluates to 1 */
  buffer = gst_buffer_try_new_and_alloc (size);
  if (buffer == NULL){
    g_printerr("failed to allocate memory\n");
  }
  //Possible problem
  gst_buffer_set_caps (buffer, caps);
  //Set up the video encoding parameters
  caps = gst_caps_new_simple ("video/x-raw-yuv",
      "format", GST_TYPE_FOURCC, GST_MAKE_FOURCC ('I', '4', '2', '0'),
      "width",  G_TYPE_INT, 352,
      "height", G_TYPE_INT, 288,
      "framerate", GST_TYPE_FRACTION, 25, 1, NULL);
  if (!caps) {
    g_printerr("Failed to create caps\n");
    return 0;
  }
  err = gst_element_link_filtered(filesrc, time, caps);
  gst_caps_unref(caps);
  gst_buffer_unref (buffer);
  ....

  return pipeline;
}
 

I suppressed part of the function, but I attached the whole code. Is there something wrong or missing in this function? How can I make it work the way I want?

Please, I need help.

Thanks for the previous answers, and thanks in advance.


Re: how to learn from .yuv files and encode with h.264?

Marco Ballesio

Hi,

It appears the sources you attached to the message did not make it through moderation.. anyway, a few comments below.

..snip..

> //Possible problem
> size = 352*288*(3/2);
> buffer = gst_buffer_try_new_and_alloc (size);

Is there a particular reason for allocating this from the application? The pipeline usually handles buffer allocation/release automagically.


> if (buffer==NULL){
>   g_printerr("failed to allocate memory\n");
> }
> //Possible problem
> gst_buffer_set_caps (buffer,caps);
> //Set up the video encoding parameters
> caps = gst_caps_new_simple ("video/x-raw-yuv",
>     "format", GST_TYPE_FOURCC, GST_MAKE_FOURCC ('I', '4', '2', '0'),
>     "width",  G_TYPE_INT, 352,
>     "height", G_TYPE_INT, 288,
>     "framerate", GST_TYPE_FRACTION, 25, 1, NULL);
> if ( !caps ) {
>   g_printerr("Failed to create caps\n");
>   return 0;
> }
>  err = gst_element_link_filtered(filesrc, time, caps);

What is the 'time' element? It's possible the caps are not propagated to the encoder if it is not directly connected to the filesrc.

Moreover, I didn't see where you're setting the blocksize property of your filesrc to "size".
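
Just as a sketch (assuming "size" holds one full I420 frame, i.e. width*height*3/2 bytes, and GStreamer 0.10, where blocksize is a gulong), something like:

  /* make filesrc push exactly one raw I420 frame per buffer */
  size = 352 * 288 * 3 / 2;   /* 152064 bytes for CIF; note that 3/2 alone is 1 */
  g_object_set (G_OBJECT (filesrc), "blocksize", (gulong) size, NULL);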

..Snip..

> There is something wrong or missing in this function? How can I make
> what I want to work?

See my comments above ;)

P.S. Maybe you could attach (or better, copy to pastebin) the output you get when setting GST_DEBUG to 2 in the shell before executing your binary.
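
For example (the binary name and the redirection are just placeholders):

  GST_DEBUG=2 ./your_sender 2> gst_debug.log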

Regards



Re: how to learn from .yuv files and encode with h.264?

Marco Ballesio
Hi,

It appears the app you sent is actually more complex than what you need.. btw, some functions are not accessible from the code snippet you sent.

I found some spare time to write a minimalistic pair of send/receive applications, the first one reading from a YUV file generated with:

gst-launch -v videotestsrc num-buffers=2000 ! "video/x-raw-yuv, width=320, height=240, format=(fourcc)I420" ! filesink location=test.yuv

and streaming to an address specified on the command line. The second app opens a connection and renders all the H.264 frames it receives. Hopefully it will give you an idea of how to get your app working.
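
Just to give an idea of the structure, here is a minimal sketch (not the attached send.c; the file name, resolution, destination and element properties are assumptions) of how such a sender could be built with gst_parse_launch() on GStreamer 0.10:

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline;
  GMainLoop *loop;
  GError *error = NULL;

  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* blocksize = width * height * 3 / 2 so every buffer is one I420 frame;
   * do-timestamp stamps the buffers as they are read from the file */
  pipeline = gst_parse_launch (
      "filesrc location=test.yuv blocksize=115200 do-timestamp=true ! "
      "video/x-raw-yuv,format=(fourcc)I420,width=320,height=240,framerate=25/1 ! "
      "x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000",
      &error);
  if (pipeline == NULL) {
    g_printerr ("Parse error: %s\n", error->message);
    return -1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (loop);          /* run until interrupted */

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}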

P.S. Added the gst-devel mailing list back to the loop.
P.P.S. Hopefully the attachment will make its way through moderation.

Regards

On Fri, Oct 29, 2010 at 4:57 PM, Rafael Sousa <[hidden email]> wrote:
Hi Gibro,

As I said, I'm new to this list and I don't know exactly how to fix the issues in my code, so I'll send you my whole code. If you could take a look, I'd be very grateful for your help.

regards



Attachment: simplestream.tgz (2K)

Re: how to learn from .yuv files and encode with h.264?

Marco Ballesio
Hmm.. the md5 of the file at http://gstreamer-devel.966125.n4.nabble.com/how-to-learn-from-yuv-files-and-encode-with-h-264-td3017365.html#a3017365 is different. Attaching the sources directly.

Regards


Attachments: Makefile (142 bytes), recv.c (3K), send.c (4K)

Re: how to learn from .yuv files and encode with h.264?

rafael.lmsousa
Hi Marco

Thank you so much for the help, it worked very well here... But what I want is to store the video on disk at the receiver. I tried to modify the example you sent me, but it didn't save the video. I modified the following in recv.c, using the foreman_cif.yuv video sample (352x288):


#define PIPELINE_FORMAT "\
gstrtpbin name=rtpbin \
udpsrc caps=\"application/x-rtp,media=(string)video,clock-rate=90000,encoding-name=(string)H264\" \
port=5000 ! rtpbin.recv_rtp_sink_0 \
rtpbin. ! rtph264depay ! ffdec_h264 ! \" video/x-raw-yuv, width=352, height=288, format=(fourcc)I420 \" ! filesink location=test_rcv.yuv ! \
udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
rtpbin.send_rtcp_src_0 ! udpsink port=5005 sync=false async=false"


It returned an error:

Error: internal error in data stream.

I'm very grateful for your help.

regards




Re: how to learn from .yuv files and encode with h.264?

Marco Ballesio
Hi,

On Mon, Nov 1, 2010 at 5:56 PM, Rafael Sousa <[hidden email]> wrote:
Marco,

The modification that you proposed, using mp4mux, didn't work. The same error occurred. I'm using the YUV format because, after the transmission of the video, I intend to calculate the PSNR between the source YUV and the received YUV for academic analysis. But the mp4 format also helps.

The attached recv.c should do the trick. I just wonder why, at this point, you can't simply use gst-launch...
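
For instance, an (untested) equivalent straight from the shell, with 0.10 element names, could look like:

gst-launch -v gstrtpbin name=rtpbin \
  udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" \
  port=5000 ! rtpbin.recv_rtp_sink_0 \
  rtpbin. ! rtph264depay ! ffdec_h264 ! filesink location=test_rcv.yuv \
  udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
  rtpbin.send_rtcp_src_0 ! udpsink port=5005 sync=false async=false

(Note that there is no "!" after the filesink: a new branch simply starts with the next element.)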

Regards
 

Sorry for the bother, but I researched a lot before appealing to this list.

Thanks for the help,
best regards.


On Mon, Nov 1, 2010 at 4:44 AM, Marco Ballesio <[hidden email]> wrote:
Hi,

On Mon, Nov 1, 2010 at 2:46 AM, Rafael Sousa <[hidden email]> wrote:
..snip..

#define PIPELINE_FORMAT "\
gstrtpbin name=rtpbin \
udpsrc caps=\"application/x-rtp,media=(string)video,clock-rate=90000,encoding-name=(string)H264\" \
port=5000 ! rtpbin.recv_rtp_sink_0 \
rtpbin. ! rtph264depay ! ffdec_h264 ! \" video/x-raw-yuv, width=352, height=288, format=(fourcc)I420 \" ! filesink location=test_rcv.yuv ! \

You don't need the capsfilter between the decoder and the filesink.. everything will work automagically ;).

Btw, unless you're in love with raw YUV, I suggest using a container format.. in that case you don't even need to decode, so you could save some CPU at recording time. Try this pipeline:


gstrtpbin name=rtpbin \
udpsrc caps=\"application/x-rtp,media=(string)video,clock-rate=90000,encoding-name=(string)H264\" \
port=5000 ! rtpbin.recv_rtp_sink_0 \
rtpbin. ! rtph264depay ! mp4mux ! filesink location=test_rcv.mp4 ! \

etc. etc.

NOTE: you'll need to add a way to send EOS to the muxer element (SIGHUP signal handler/read the manual ;) ), but setting the mp4mux property "faststart" to true could make the file playable even when manually stopping the pipeline with ctrl+c (disclaimer: I've not tried it).
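
A rough sketch of such a handler (my own, untested; it assumes a global "pipeline" pointer to the receiving pipeline):

#include <gst/gst.h>
#include <signal.h>

static GstElement *pipeline;            /* set when the pipeline is created */

/* Pushing EOS gives mp4mux the chance to finalise the file (write the moov
 * atom) before the pipeline is torn down. */
static void
handle_sighup (int signum)
{
  gst_element_send_event (pipeline, gst_event_new_eos ());
}

/* in main(), after building the pipeline:
 *   signal (SIGHUP, handle_sighup);
 * then wait for GST_MESSAGE_EOS on the bus before setting the pipeline to
 * GST_STATE_NULL and unreffing it. */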

Regards
 
udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
rtpbin.send_rtcp_src_0 ! udpsink port=5005 sync=false async=false"



Attachment: recv.c (3K)

Re: how to learn from .yuv files and encode with h.264?

rafael.lmsousa
gst-launch works fine for me. I was using the C API because it was the first example close to what I wanted that I found, but the command-line way works just fine too.

I'll try your solution. Thanks very much for your attention.



Re: how to learn from .yuv files and encode with h.264?

rafael.lmsousa
Hi Marco

Thanks again; you solved my problem, and the code worked beautifully.

regards
