Continuously streaming a video file code review


Continuously streaming a video file code review

Russell Willis
Hi All,

I'm hoping someone can help me.  I've been asked to write a small program to continuously stream a video file, i.e. loop the file, over RTSP.  I'm using gst-rtsp-server, but having never used GStreamer before I have no idea whether I've approached this correctly.  The solution I have works, but I would be grateful if someone could give me an indication of whether I've taken the right design decisions or not.
I've mainly used bits and pieces from the example files found within the GStreamer and gst-rtsp-server source and documentation.
Basically the first pipeline, using filesrc, reads in a file and uses the EOS bus message to loop back to the beginning of the file.  The RTSP server pulls a sample from the videosink of the first pipeline and pushes it out across the network.  All the code I use is below; there is no error checking and it's not efficient.  I want to make sure I'm on the right track before investing any more time.

#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <gst/app/gstappsrc.h>
#include <gst/rtsp-server/rtsp-server.h>
 
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
 
typedef struct _App App;
struct _App
{
    GstElement *videosink;
};
App s_app;
 
typedef struct {
    App *glblapp;
    GstClockTime timestamp;
} Context;
 
const gchar *videocaps =
    "video/x-raw, format=(string)I420, width=(int)720, height=(int)480, "
    "pixel-aspect-ratio=(fraction)10/11, interlace-mode=(string)progressive, "
    "colorimetry=(string)bt601, framerate=(fraction)5000/167";
 
// RTSP server signal and event handler
static void
need_data (GstElement *appsrc, guint unused, Context *ctx)
{
    GstFlowReturn ret;
    GstSample *sample = gst_app_sink_pull_sample (GST_APP_SINK(ctx->glblapp->videosink));
    if (sample != NULL) {
        GstBuffer *buffer = gst_sample_get_buffer(sample);
        GST_BUFFER_PTS(buffer) = ctx->timestamp;
        GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 25);
        ctx->timestamp += GST_BUFFER_DURATION (buffer);
        /* The "push-buffer" action signal takes its own ref on the buffer,
         * so the sample (which owns the buffer) must stay alive until after
         * the push; unreffing it earlier would be a use-after-free */
        g_signal_emit_by_name(appsrc, "push-buffer", buffer, &ret);
        gst_sample_unref (sample);
    }
}
 
static void
media_configure (GstRTSPMediaFactory *factory, GstRTSPMedia *media, App *app)
{
    Context *ctx;
    GstElement *pipeline;
    GstElement *appsrc;
    pipeline = gst_rtsp_media_get_element(media);
    appsrc = gst_bin_get_by_name_recurse_up (GST_BIN (pipeline), "mysrc");
    gst_rtsp_media_set_reusable(media, TRUE);
    gst_util_set_object_arg (G_OBJECT (appsrc), "format", "time");
    g_object_set (G_OBJECT (appsrc), "caps", gst_caps_from_string(videocaps), NULL);
    g_object_set(G_OBJECT(appsrc), "max-bytes",
        gst_app_src_get_max_bytes(GST_APP_SRC(appsrc)), NULL);
    ctx = g_new0 (Context, 1);
    ctx->glblapp = app;
    ctx->timestamp = 0;
    g_signal_connect (appsrc, "need-data", (GCallback) need_data, ctx);
}
 
// Bus message handler
gboolean
bus_callback(GstBus *bus, GstMessage *msg, gpointer data)
{
    GstElement *pipeline = GST_ELEMENT(data);
    switch (GST_MESSAGE_TYPE(msg)) {
    case GST_MESSAGE_EOS:
        /* Loop: flush-seek back to the start of the file */
        if (!gst_element_seek(pipeline,
            1.0, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH,
            GST_SEEK_TYPE_SET, 0,
            GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE)) {
            g_message("Seek failed!");
        }
        break;
    default:
        break;
    }
    return TRUE;
}
 
gint
main (gint argc, gchar *argv[])
{
    App *app = &s_app;
    GstBus *bus;
    GstRTSPServer *server;
    GstRTSPMediaFactory *factory;
    GstRTSPMountPoints *mountpoints;
 
    gst_init (&argc, &argv);
    GMainLoop *loop = g_main_loop_new (NULL, FALSE);
 
    // Playbin, setup and configuration
    GstElement *playbin = gst_element_factory_make ("playbin", "play");
    app->videosink = gst_element_factory_make ("appsink", "video_sink");
    g_object_set (G_OBJECT (app->videosink), "emit-signals", FALSE, "sync", TRUE, NULL);
    g_object_set (G_OBJECT (playbin), "video-sink", app->videosink, NULL);
    gst_app_sink_set_drop(GST_APP_SINK (app->videosink), TRUE);
    gst_app_sink_set_max_buffers(GST_APP_SINK (app->videosink), 1);   
    bus = gst_pipeline_get_bus (GST_PIPELINE (playbin));
    gst_bus_add_watch (bus, bus_callback, playbin);
    g_object_set (G_OBJECT (playbin), "uri", "file:///home/user/Videos/TestVideo.mp4", NULL);
    gst_element_set_state (playbin, GST_STATE_PLAYING);
 
    // RTSP server, setup and configuration
    server = gst_rtsp_server_new();
    mountpoints = gst_rtsp_server_get_mount_points(server);
    factory = gst_rtsp_media_factory_new();
    gst_rtsp_media_factory_set_shared(factory, TRUE);
    gst_rtsp_media_factory_set_launch (factory,
        "( appsrc name=mysrc ! videoconvert ! jpegenc ! rtpjpegpay name=pay0 pt=96 )");
    g_signal_connect (factory, "media-configure", (GCallback) media_configure, app);
    gst_rtsp_mount_points_add_factory (mountpoints, "/test", factory);
    g_object_unref(mountpoints);
    gst_rtsp_server_attach (server, NULL);
    g_print("RTSP Server started...\n");
    g_main_loop_run (loop);
 
    // Clean up
    gst_element_set_state (playbin, GST_STATE_NULL);
    gst_object_unref (bus);
    return 0;
}


Many thanks for your help.

-RW

_______________________________________________
gstreamer-devel mailing list
[hidden email]
http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: Continuously streaming a video file code review

Tim Müller
On Fri, 2015-03-27 at 14:18 +0000, Russell Willis wrote:

Hi,

> I'm hoping someone can help me.  I've been asked to write a small
> program to continuously stream a video file, or loop the file, over
> rtsp.  I'm using the gstrtspserver, but having never used gstreamer
> before I've no idea if I have approached this correctly.  The solution
> I have works, but I would be grateful if someone could give me an
> indication if I've taken the right design decisions or not.
>
> I've mainly used bits and pieces from the example files found within
> the gstreamer and gstrtspserver source and documentation.
>
> Basically the first pipeline, using filesrc, reads in a file and uses
> the EOS bus message to loop to the beginning of the file.  The
> rtspserver pulls a sample from the videosink of the first pipeline and
> pushes it out across the network.  All code I use is below, there is
> no error checking and its not efficient, I want to make sure I'm on
> the right track before investing any more time.
> (snip code)

That looks like a reasonable enough approach at first glance. It all
depends on your exact requirements of course.

You could simplify things by using the intervideosink / intervideosrc
elements instead, then you don't have to take care of moving the buffers
from sink to src yourself (you may end up encoding more or fewer frames
though because the intervideosrc creates/duplicates frames to a fixed
framerate which may not match the incoming framerate if you don't match
them up yourself; which has advantages and disadvantages).
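
For illustration, the two launch descriptions might then look roughly like this (a sketch only; the channel name and file path are placeholders, and I haven't tested it):

```
# Producer pipeline, replacing the playbin + appsink:
uridecodebin uri=file:///path/to/TestVideo.mp4 ! videoconvert ! intervideosink channel=loop

# RTSP factory launch line, replacing the appsrc:
( intervideosrc channel=loop ! videoconvert ! jpegenc ! rtpjpegpay name=pay0 pt=96 )
```

The intervideosrc pulls frames from the shared channel at its own rate, so the appsink/appsrc plumbing and the manual timestamping in need_data() go away.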

 Cheers
  -Tim

--
Tim Müller, Centricular Ltd - http://www.centricular.com


Re: Continuously streaming a video file code review

Shaf
In reply to this post by Russell Willis
Hi Russell,

I have a similar requirement, and I tried your source code to stream an mp4
video continuously over RTSP, but no video is displayed.

1. The pipeline seems to be correct, and the bus callback and need-data
functions are invoked:

const gchar *pipe_line_string = "appsrc name=mysrc is-live=true max-bytes=0 "
    "do-timestamp=true min-latency=0 ! queue ! h264parse name=parse ! queue ! "
    "rtph264pay name=pay0 pt=96 timestamp-offset=0";

2. The caps string was retrieved from the video file using the
"gst-discoverer-1.0 xx.mp4 -v" command.
    [Is this the proper way to fetch the caps?]

const gchar *videocaps = "video/x-h264, stream-format=(string)avc, "
    "alignment=(string)au, level=(string)2.1, profile=(string)high, "
    "codec_data=(buffer)01640015ffe1001967640015acd941e08fea1000000300100000030320f162d96001000668ebe3cb22c0, "
    "width=(int)480, height=(int)270, framerate=(fraction)25/1, "
    "pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, "
    "chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, "
    "bit-depth-chroma=(uint)8, parsed=(boolean)true";

3. On the client I tried to play the stream with:
gst-launch-1.0 playbin uri="rtsp://19x.16x.12x.xx:554/test"

4. The command line shows the server is active and the video is streaming,
but no video output is displayed. need_data and bus_callback are called when
needed, and I can see the buffer being pushed to appsrc.

5. The .mp4 file has both audio and video, but the constructed pipeline has
only a video src. I was able to use the same pipeline and stream the video
without appsrc.

I suspect the issue is in the caps filters, but I'm not sure how to build
them manually. The problem occurs only when trying to use appsrc to stream
the video. Can you please suggest what the issue in the code below is?

I am using appsrc to continuously stream the video by adding a bus watch for
the EOS. I also tried using VLC to play the stream: no video is displayed,
but I can see the seek bar moving as time elapses [which suggests the video
is streamed but not displayed due to wrong caps??].
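
Regarding the caps question above: instead of hand-writing the string from
gst-discoverer output, one option (an untested sketch, reusing the appsink
and appsrc from Russell's code) is to copy the caps straight off the first
sample pulled in need_data(), so they always match what the parser actually
produced:

```c
/* Sketch: in need_data(), after pulling the sample and before the
 * first push. gst_sample_get_caps() does not take an extra ref,
 * so the returned caps must not be unreffed here. */
GstCaps *caps = gst_sample_get_caps (sample);
if (caps != NULL)
    gst_app_src_set_caps (GST_APP_SRC (appsrc), caps);
```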

Many thanks

Shaf


