Gstreamer-1.0 Mp4 video RTSP streaming - appsrc

Gstreamer-1.0 Mp4 video RTSP streaming - appsrc

Shaf
My requirement is to stream mp4 video [audio is optional] via RTSP. I have
used code shared on the forum to test the RTSP video streaming.

Reference:
http://gstreamer-devel.966125.n4.nabble.com/Continuously-streaming-a-video-file-code-review-td4671364.html


Source Code:
test_rpsp_mp4.c

1. The pipeline seems to be correct, and the bus callback and need-data
functions are invoked.

2. The caps string is retrieved from the video file using the
"gst-discoverer-1.0 xx.mp4 -v" command. [Is this the proper way to fetch
the caps?]

3. Using this command on the client, I tried to play the video stream:
gst-launch-1.0 playbin uri="rtsp://19x.16x.12x.xx:554/test"

4. The command line shows the server is active and the video is streaming,
but no video output is displayed on the client. need_data and bus_callback
are called when needed, and I can see that buffers are pushed to appsrc.

5. The .mp4 file has both audio and video, but the constructed pipeline has
only a video src. I was able to use the same pipeline and stream the video
without appsrc.

I suspect the issue is in the caps filters, but I am not sure how to build the caps manually. The problem occurs only when trying to use appsrc to stream the video.

Can you please point out the issue in the code below? I searched the forum but found nothing on this particular issue. I am using appsrc to continuously stream the video by adding a bus watch for the EOS.

I also tried playing the stream with VLC: no video is displayed, but I can see the seek bar moving as time elapses [which suggests the video is streamed but not displayed due to wrong caps??]


Re: Gstreamer-1.0 Mp4 video RTSP streaming - appsrc

Shaf
Can somebody help me with this? I have been struggling to get this working
for many days now.



Thanks
Shaf



--
Sent from: http://gstreamer-devel.966125.n4.nabble.com/
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: Gstreamer-1.0 Mp4 video RTSP streaming - appsrc

Stephenwei
Referencing your URL:
#############################################################################
GstElement *playbin = gst_element_factory_make ("playbin", "play");
    app->videosink = gst_element_factory_make ("appsink", "video_sink");
    g_object_set (G_OBJECT (app->videosink), "emit-signals", FALSE,
                  "sync", TRUE, NULL);
    g_object_set (G_OBJECT (playbin), "video-sink", app->videosink, NULL);
    gst_app_sink_set_drop (GST_APP_SINK (app->videosink), TRUE);

############################################################################
playbin will demux and decode your location file. According to the code, it
sends the RAW data to appsink.

I guess your mp4.c filters the stream because of the caps below:

const gchar *videocaps =  "video/x-h264, stream-format=(string)avc,
alignment=(string)au, level=(string)2.1, profile=(string)high,
codec_data=(buffer)01640015ffe1001967640015acd941e08fea1000000300100000030320f162d96001000668ebe3cb22c0,
width=(int)480, height=(int)270, framerate=(fraction)25/1,
pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive,
chroma-format=(string)4:2:0, bit-depth-luma=(uint)8,
bit-depth-chroma=(uint)8, parsed=(boolean)true";





Re: Gstreamer-1.0 Mp4 video RTSP streaming - appsrc

Shaf
Hi Stephen,

OK, I understand what you mean:
1. I should send H.264 data to appsrc, right?
2. If so, what pipeline should I use? Can I add caps for the playbin [the
same caps used for h264parse]?


Using "gst-inspect-1.0 playbin" gives me no information about the src/sink
caps of playbin. How did you determine it is raw video?


Note :

I managed to stream the video using the below pipeline:
gst_rtsp_media_factory_set_launch(factory, "( appsrc name=mysrc ! videoconvert ! jpegenc ! rtpjpegpay name=pay0 pt=96 )");

and using the caps below [videoconvert uses video/x-raw format as src]:
const gchar *videocaps_mp4 = "video/x-raw, format=(string)I420, width=(int)480, height=(int)270, interlace-mode=(string)progressive";


But my intention is to convert the .mp4 to h264 format and then play it.
What is the issue with the pipeline below? Why can't I use h264parse with
appsrc?

const gchar *pipe_line_string = "appsrc name=mysrc is-live=true max-bytes=0 do-timestamp=true min-latency=0 ! queue ! h264parse name=parse ! queue ! rtph264pay name=pay0 pt=96 timestamp-offset=0";







Re: Gstreamer-1.0 Mp4 video RTSP streaming - appsrc

Stephenwei
Hi Shaf,

I'm sorry, I'm not sure I understand what you want to do.

playbin will automatically link the demuxer, decoder, videoconvert and
videosink (audio, too).

According to your reference code, you used playbin to decompress a file,
passed the raw data through appsink/appsrc, and then encoded it to JPEG.

If you just want to go from the mp4 container to raw h264,
why don't you reference the original test-mp4.c?

Have a nice day!





Re: Gstreamer-1.0 Mp4 video RTSP streaming - appsrc

Shaf
Hi Stephen,

I have already tried test-mp4.c and it works perfectly. But now I am trying
to make an mp4 streaming server that can continuously play a video, or a
different video, when the pipeline encounters EOS.

So my aim is to:
1. Play the .mp4 file using appsink; with a bus callback I will detect EOS
and change the source file [this way I can run a video continuously or
change the file src location].
2. Use appsrc, dynamically connected to the appsink, to pull the buffers and
stream the output via RTSP.
[This will be useful for RTSP file streaming.]

I could do the above using a playbin for the appsink and then detect EOS via
the bus-callback signal.

Now I am trying to see if I can replace the playbin by constructing the
appsrc pipeline dynamically. The difficulty I face here is converting the
below pipeline by linking the elements separately:

Playbin [appsink]:
GstElement *playbin = gst_parse_launch("filesrc location=C:/videos/dolby.mp4 ! qtdemux name=mdemux ! h264parse ! video/x-h264,stream-format=byte-stream ! appsink name=video_sink emit-signals=FALSE sync=TRUE drop=TRUE num-buffers=2000", NULL);

I expect to create the above pipeline something like this :
        /* Create the elements */
        data.source = gst_element_factory_make("filesrc", "source");
        data.qtdemuxer = gst_element_factory_make("qtdemux", "mdemux");
        data.h264parser = gst_element_factory_make("h264parse", "h264_parse");
        data.sink = gst_element_factory_make("appsink", "video_sink");

I do not have much idea of how to link qtdemux to appsink. Going through the
documentation [gst-inspect-1.0 qtdemux], I learned that qtdemux has
sometimes pads, which need to be linked from a pad-added callback.

I have not seen any examples of linking qtdemux to appsink, so I am trying
to understand.

If you have any suggestion please let me know. Thanks for your support !









Re: Gstreamer-1.0 Mp4 video RTSP streaming - appsrc

Shaf
Hi Stephen,

I am trying to change the file source in the below pipeline when the
currently running pipeline reaches EOS.


****** Pipeline ************

GError *error = NULL;
GstElement *playbin = gst_parse_launch(
        "filesrc name=media_source ! "
        "qtdemux name=mdemux ! h264parse ! video/x-h264,stream-format=byte-stream ! "
        "appsink name=video_sink emit-signals=FALSE sync=TRUE drop=TRUE num-buffers=2000 ",
        &error);

if (!playbin) {
        g_print("Parse error: %s\n", error->message);
        exit(1);
}

filesrc = gst_bin_get_by_name(GST_BIN(playbin), "media_source");
g_object_set(filesrc, "location", argv[1], NULL);
g_object_unref(filesrc);

***************** bus callback  -- seeking to beginning of file is working

gboolean bus_callback(GstBus *bus, GstMessage *msg, gpointer data) {
        GstElement *pipeline = GST_ELEMENT(data);
        switch (GST_MESSAGE_TYPE(msg)) {
        case GST_MESSAGE_EOS:
                if (!gst_element_seek(pipeline,
                        1.0, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH,
                        GST_SEEK_TYPE_SET, 1000000000,
                        GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE)) {
                        g_message("Seek failed!");
                }
                break;
        default:
                break;
        }
        return TRUE;
}


****** But changing the file source in the bus callback is not working. The
video gets stuck after playing once.

Here is how the file source is changed in the bus callback:

gboolean bus_callback(GstBus *bus, GstMessage *msg, gpointer data)
{
        App *pipeline_data = (App*)data;
        GstElement *pipeline = GST_ELEMENT(pipeline_data->playbin);

        switch (GST_MESSAGE_TYPE(msg)) {
        case GST_MESSAGE_EOS:
                gst_element_set_state(pipeline, GST_STATE_NULL);
                // get the filesrc bin using name "media_source"
                g_object_set(gst_bin_get_by_name(GST_BIN(pipeline), "media_source"),
                             "location", "C:\\Videos\\test.mp4", NULL);
                // set the pipeline to playing
                gst_element_set_state(pipeline, GST_STATE_PLAYING);
                break;
        default:
                break;
        }
        return TRUE;
}

(The pipeline state is reported as ASYNC when printing the state.)

Is this possible?

Note:
I was able to change the video source using the below pipeline (but I need
to use qtdemux, as in the above pipeline):

        // Playbin, setup and configuration
        GstElement *playbin = gst_element_factory_make("playbin", "play");

        app->videosink = gst_element_factory_make("appsink", "video_sink");
        g_object_set(G_OBJECT(app->videosink), "emit-signals", FALSE, "sync", TRUE, NULL);
        g_object_set(G_OBJECT(playbin), "video-sink", app->videosink, NULL);
        gst_app_sink_set_drop(GST_APP_SINK(app->videosink), TRUE);
        gst_app_sink_set_max_buffers(GST_APP_SINK(app->videosink), 1);
        bus = gst_pipeline_get_bus(GST_PIPELINE(playbin));
        gst_bus_add_watch(bus, bus_callbackz, playbin);
        g_object_set(G_OBJECT(playbin), "uri", mp4_file_src, NULL);
        gst_element_set_state(playbin, GST_STATE_PLAYING);





Re: Gstreamer-1.0 Mp4 video RTSP streaming - appsrc

Shaf
Hi Stephen,

I was able to use the qtdemux "pad-added" signal to link the filesrc to h264parse.
Now the RTSP video is streaming properly.
The RTSP streaming works as follows:
   1. The appsink pipeline is created:
          filesrc -> qtdemux -> h264parse -> appsink
   2. The RTSP appsrc pipeline is created:
          appsrc name=mysrc ! decodebin ! x264enc byte-stream=true tune=zerolatency ! rtph264pay name=pay0 pt=96
   3. The need-data signal of appsrc pushes the byte samples from the appsink.

Now I also need to add audio to this pipeline.
So I modified the appsink pipeline to:
          filesrc -> qtdemux -> queue -> h264parse -> appsink
                     qtdemux -> queue -> aacparse -> appsink

I am able to link the audio and video pads inside the "pad-added" signal call-back of qtdemux.

1. How can I push the samples from the appsink to the RTSP appsrc?
2. Do I need two appsrc elements (for audio and video) in the same pipeline?
3. How can I sync the audio and video?


