iOS and gstreamer, or trouble with glimagesink


iOS and gstreamer, or trouble with glimagesink

alex_malishev
Hi everybody! Sorry for my English.
I'm developing a video streaming app for Android and iOS. I want to show video coming from a media server on an Apple device running iOS 10. I built a pipeline which has udpsrc as the source element and glimagesink as the sink element. Between them, of course, there are other elements which decode the stream from H.264. The pipeline works great; I have already tested it on an Android device. But on my iOS device the video doesn't show, even though I can see in tcpdump on my server that the packets arrive at the iOS device. I think it's because of glimagesink.
- Could anyone tell me how I should use glimagesink properly?
Also, on Android devices I can see the video, but when I try to build and launch another pipeline with glimagesink, the video doesn't show; I can still hear the sound, which means that the device receives packets.
So that also means the previous window wasn't finalized properly; what should I do about this?
P.S. I should also say that I initialize the GStreamer library when my app starts, so I give the app context to GStreamer only once.
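To give an idea of the pipeline's shape, it is roughly like this (a simplified sketch; the port, caps and decoder below are placeholders, not my exact values):

GError *error = NULL;
/* Simplified sketch of the receiving pipeline; the real caps and decoder differ */
GstElement *pipeline = gst_parse_launch(
    "udpsrc port=5000 caps=\"application/x-rtp,media=video,encoding-name=H264,payload=96\" "
    "! rtph264depay ! h264parse ! avdec_h264 ! glimagesink name=vsink",
    &error);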

Re: iOS and gstreamer, or trouble with glimagesink

Sebastian Dröge-3
On Sat, 2016-10-22 at 07:28 -0700, alex_malishev wrote:

> Hi everybody! Sorry for my English.
> I'm developing a video streaming app for Android and iOS. I want to show
> video coming from a media server on an Apple device running iOS 10. I
> built a pipeline which has udpsrc as the source element and glimagesink
> as the sink element. Between them, of course, there are other elements
> which decode the stream from H.264. The pipeline works great; I have
> already tested it on an Android device. But on my iOS device the video
> doesn't show, even though I can see in tcpdump on my server that the
> packets arrive at the iOS device. I think it's because of glimagesink.
> - Could anyone tell me how I should use glimagesink properly?
How do you know it's because of glimagesink? Does it work with a
different pipeline with glimagesink, e.g. when using videotestsrc?

Always try to reproduce the problem with the simplest pipeline
possible; simplify the pipeline.

If you can reproduce it with videotestsrc, please share your testcase
code so we can take a look.
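
For example, something as small as this is usually enough (just a sketch; window_handle stands for whatever view/surface handle your platform code already has):

/* Minimal glimagesink testcase: render a test pattern into the app's window */
GError *error = NULL;
GstElement *pipeline = gst_parse_launch ("videotestsrc ! glimagesink name=video", &error);
GstElement *sink = gst_bin_get_by_name (GST_BIN (pipeline), "video");
gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (sink), (guintptr) window_handle);
gst_object_unref (sink);
gst_element_set_state (pipeline, GST_STATE_PLAYING);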

> Also, on Android devices I can see the video, but when I try to build
> and launch another pipeline with glimagesink, the video doesn't show;
> I can still hear the sound, which means that the device receives packets.

Try to make as simple a testcase as possible that reproduces this
problem and share it. How, and which, window ids are you setting on the
two different glimagesinks?

> So that also means the previous window wasn't finalized properly; what
> should I do about this?

glimagesink will stop using the window once you set the pipeline to the NULL state.
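
In other words, roughly this order when switching (a sketch; old_pipeline, new_pipeline and the sink name "vsink" are only illustrative):

/* Let the old pipeline give up the window before the new one takes it over */
gst_element_set_state (old_pipeline, GST_STATE_NULL);
gst_object_unref (old_pipeline);

GstElement *vsink = gst_bin_get_by_name (GST_BIN (new_pipeline), "vsink");
gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (vsink), (guintptr) native_window);
gst_object_unref (vsink);
gst_element_set_state (new_pipeline, GST_STATE_PLAYING);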

--
Sebastian Dröge, Centricular Ltd · http://www.centricular.com

Re[2]: iOS and gstreamer, or trouble with glimagesink

alex_malishev
Thanks for your reply, Sebastian. 
iOS: Yes, I have already tried to simplify my pipeline. This simple pipeline works:
 gst_parse_launch("videotestsrc ! glimagesink", &error);
gst_element_set_state(data->pipeline, GST_STATE_READY);
This is how I initialize the surface:
video_sink = gst_bin_get_by_name(GST_BIN(pipeline), "video");
if (!video_sink) {
    NSLog(@"Could not retrieve video sink");
    return;
}
gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(video_sink), (guintptr) (id) ui_video_view);
gst_object_unref(video_sink);
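
For reference, under ARC I assume the same cast would need an explicit bridge, something like the following (my assumption, not something I have verified on this setup):

gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(video_sink),
                                    (guintptr)(__bridge void *) ui_video_view);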

Where ui_video_view is an instance of this class:
#import "EaglUIView.h"

#import <QuartzCore/QuartzCore.h>

@implementation EaglUIView

+ (Class) layerClass
{
    // return [CEAGLLayer class];
    return [CAEAGLLayer class];
}

@end
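
For completeness, ui_video_view is created in the view controller roughly like this (simplified; the frame and the parent view are just placeholders):

// Simplified sketch: creating the GL-backed view and attaching it to the view hierarchy
ui_video_view = [[EaglUIView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:ui_video_view];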

Android: On my Android device I set the pipeline to the READY state, then I call native_camera_surface_init(), which I wrote myself, and pass it my surface from the Android code. In this function I do the following:
static void gst_native_camera_surface_init(JNIEnv *env, jobject thiz, jobject surface) {
    CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
    if (!data) return;
    if (data->native_window_camera) {
        GST_DEBUG ("Releasing previous native window %p", data->native_window_camera);
        ANativeWindow_release (data->native_window_camera);
    }
    gst_element_set_state (data->pipeline, GST_STATE_READY);
    data->native_window_camera = ANativeWindow_fromSurface (env, surface);

    glimagesink = gst_bin_get_by_name(GST_BIN(data->pipeline), "vsink");
    __android_log_print(ANDROID_LOG_INFO, "gst", "pipeline is %p", data->pipeline);
    if (!data->pipeline) {
        __android_log_print(ANDROID_LOG_INFO, "gst", "pipeline is null");
    }
    if (!glimagesink) {
        __android_log_print(ANDROID_LOG_INFO, "gst", "sink is null");
    }
    __android_log_print(ANDROID_LOG_INFO, "gst", "try to set new window");
    if (data->pipeline) {
        gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY(glimagesink), (guintptr) data->native_window_camera);
        gst_object_unref (glimagesink);
        __android_log_print(ANDROID_LOG_INFO, "gst", "set new window");
    }
    gst_element_set_state (data->pipeline, GST_STATE_NULL);
    check_initialization_complete (data);
}

When I want to use another pipeline, I finalize this pipeline by setting it to the NULL state:
gst_element_set_state(data->pipeline, GST_STATE_NULL);

Then I finalize the window by calling this method:
static void gst_native_camera_surface_finalize(JNIEnv *env, jobject thiz) {
    CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
    if (!data) return;
    GST_DEBUG ("Releasing Native Window %p", data->native_window_camera);

    if (data->pipeline) {
        __android_log_print(ANDROID_LOG_INFO, "gst", "deleting window");
        gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (glimagesink), (guintptr) NULL);
        gst_element_set_state (data->pipeline, GST_STATE_READY);
    }
    __android_log_print(ANDROID_LOG_INFO, "gst", "release window");
    ANativeWindow_release (data->native_window_camera);
    __android_log_print(ANDROID_LOG_INFO, "gst", "done releasing window");
    data->native_window = NULL;
    __android_log_print(ANDROID_LOG_INFO, "gst", "native window = null");
    data->initialized = FALSE;
}

My problem occurs even if I try to reuse the previous pipeline.
