# iOS tutorial 3: Video

## Goal

![screenshot]

Except for [](sdk-basic-tutorial-toolkit-integration.md),
which embedded a video window in a GTK application, all tutorials so far
have relied on GStreamer video sinks to create a window to display their
contents. The video sink on iOS is not capable of creating its own
window, so a drawing surface always needs to be provided. This tutorial
shows:

- How to allocate a drawing surface in the Xcode Interface Builder and
  pass it to GStreamer

## Introduction

Since iOS does not provide a windowing system, a GStreamer video sink
cannot create pop-up windows as it would do on a Desktop platform.
Fortunately, the `VideoOverlay` interface allows providing video sinks with
an already created window onto which they can draw, as we have seen
in [](sdk-basic-tutorial-toolkit-integration.md).

In this tutorial, a `UIView` widget (actually, a subclass of it) is
placed on the main storyboard. In the `viewDidLoad` method of the
`ViewController`, we pass a pointer to this `UIView` to the instance of
the `GStreamerBackend`, so it can tell the video sink where to draw.
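
The handoff is only a couple of calls. Stripped to its essence (using the
names that appear later in this tutorial), it amounts to this:

```
/* In ViewController.m (viewDidLoad): hand the UIView over to the backend */
gst_backend = [[GStreamerBackend alloc] init:self videoView:video_view];

/* In GStreamerBackend.m (app_function): hand the UIView over to the video sink */
gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(video_sink),
    (guintptr) (id) ui_video_view);
```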
## The User Interface

The storyboard from the previous tutorial is expanded: a `UIView` is
added over the toolbar and pinned to all sides so it takes up all
available space (`video_container_view` outlet). Inside it, another
`UIView` is added (`video_view` outlet) which contains the actual video,
centered in its parent, and with a size that adapts to the media size
(through the `video_width_constraint` and `video_height_constraint`
outlets):

**ViewController.h**

```
#import <UIKit/UIKit.h>
#import "GStreamerBackendDelegate.h"

@interface ViewController : UIViewController <GStreamerBackendDelegate> {
    IBOutlet UILabel *message_label;
    IBOutlet UIBarButtonItem *play_button;
    IBOutlet UIBarButtonItem *pause_button;
    IBOutlet UIView *video_view;
    IBOutlet UIView *video_container_view;
    IBOutlet NSLayoutConstraint *video_width_constraint;
    IBOutlet NSLayoutConstraint *video_height_constraint;
}

-(IBAction) play:(id)sender;
-(IBAction) pause:(id)sender;

/* From GStreamerBackendDelegate */
-(void) gstreamerInitialized;
-(void) gstreamerSetUIMessage:(NSString *)message;

@end
```
## The View Controller

The `ViewController` class manages the UI, instantiates
the `GStreamerBackend` and also performs some UI-related tasks on its
behalf:

**ViewController.m**

```
#import "ViewController.h"
#import "GStreamerBackend.h"
#import <UIKit/UIKit.h>

@interface ViewController () {
    GStreamerBackend *gst_backend;
    int media_width;
    int media_height;
}

@end

@implementation ViewController

/*
 * Methods from UIViewController
 */

- (void)viewDidLoad
{
    [super viewDidLoad];

    play_button.enabled = FALSE;
    pause_button.enabled = FALSE;

    /* Make these constant for now, later tutorials will change them */
    media_width = 320;
    media_height = 240;

    gst_backend = [[GStreamerBackend alloc] init:self videoView:video_view];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

/* Called when the Play button is pressed */
-(IBAction) play:(id)sender
{
    [gst_backend play];
}

/* Called when the Pause button is pressed */
-(IBAction) pause:(id)sender
{
    [gst_backend pause];
}

- (void)viewDidLayoutSubviews
{
    CGFloat view_width = video_container_view.bounds.size.width;
    CGFloat view_height = video_container_view.bounds.size.height;

    CGFloat correct_height = view_width * media_height / media_width;
    CGFloat correct_width = view_height * media_width / media_height;

    if (correct_height < view_height) {
        video_height_constraint.constant = correct_height;
        video_width_constraint.constant = view_width;
    } else {
        video_width_constraint.constant = correct_width;
        video_height_constraint.constant = view_height;
    }
}

/*
 * Methods from GstreamerBackendDelegate
 */

-(void) gstreamerInitialized
{
    dispatch_async(dispatch_get_main_queue(), ^{
        play_button.enabled = TRUE;
        pause_button.enabled = TRUE;
        message_label.text = @"Ready";
    });
}

-(void) gstreamerSetUIMessage:(NSString *)message
{
    dispatch_async(dispatch_get_main_queue(), ^{
        message_label.text = message;
    });
}

@end
```
We expand the class to remember the width and height of the media we are
currently playing:

```
@interface ViewController () {
    GStreamerBackend *gst_backend;
    int media_width;
    int media_height;
}
```

In later tutorials this data is retrieved from the GStreamer pipeline,
but in this tutorial, for simplicity’s sake, the width and height of the
media are constant and initialized in `viewDidLoad`:

```
- (void)viewDidLoad
{
    [super viewDidLoad];

    play_button.enabled = FALSE;
    pause_button.enabled = FALSE;

    /* Make these constant for now, later tutorials will change them */
    media_width = 320;
    media_height = 240;

    gst_backend = [[GStreamerBackend alloc] init:self videoView:video_view];
}
```
As shown below, the `GStreamerBackend` constructor has also been
expanded to accept another parameter: the `UIView *` where the video
sink should draw.

The rest of the `ViewController` code is the same as in the previous
tutorial, except for the code that adapts the `video_view` size to the
media size, respecting its aspect ratio:

```
- (void)viewDidLayoutSubviews
{
    CGFloat view_width = video_container_view.bounds.size.width;
    CGFloat view_height = video_container_view.bounds.size.height;

    CGFloat correct_height = view_width * media_height / media_width;
    CGFloat correct_width = view_height * media_width / media_height;

    if (correct_height < view_height) {
        video_height_constraint.constant = correct_height;
        video_width_constraint.constant = view_width;
    } else {
        video_width_constraint.constant = correct_width;
        video_height_constraint.constant = view_height;
    }
}
```

The `viewDidLayoutSubviews` method is called every time the main view
size has changed (for example, due to a device orientation change) and
the entire layout has been recalculated. At this point, we can access
the `bounds` property of the `video_container_view` to retrieve its new
size and change the `video_view` size accordingly.

The simple algorithm above maximizes either the width or the height of
the `video_view`, while changing the other axis so the aspect ratio of
the media is preserved. The goal is to provide the GStreamer video sink
with a surface of the correct proportions, so it does not need to add
black borders (*letterboxing*), which is a waste of processing power.
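
For example, with the constant 320x240 (4:3) media used here and a
hypothetical 600x800-point container in portrait orientation,
`correct_height` is 600 * 240 / 320 = 450, which fits, so the
`video_view` ends up at 600x450. If the device is rotated so that the
container becomes, say, 800x320, `correct_height` (600) no longer fits,
and `correct_width` = 320 * 320 / 240 ≈ 427 is used instead, giving a
427x320 view.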

The final size is reported to the layout engine by changing the
`constant` field in the width and height constraints of the
`video_view`. These constraints have been created in the storyboard and
are accessible to the `ViewController` through IBOutlets, as is usually
done with other widgets.
## The GStreamer Backend

The `GStreamerBackend` class performs all GStreamer-related tasks and
offers a simplified interface to the application, which does not need to
deal with all the GStreamer details. When it needs to perform any UI
action, it does so through a delegate, which is expected to adhere to
the `GStreamerBackendDelegate` protocol:
2016-05-16 14:30:34 +00:00
|
|
|
|
|
|
|
|
|
**GStreamerBackend.m**

```
#import "GStreamerBackend.h"

#include <gst/gst.h>
#include <gst/video/video.h>

GST_DEBUG_CATEGORY_STATIC (debug_category);
#define GST_CAT_DEFAULT debug_category

@interface GStreamerBackend()
-(void)setUIMessage:(gchar*) message;
-(void)app_function;
-(void)check_initialization_complete;
@end

@implementation GStreamerBackend {
    id ui_delegate;        /* Class that we use to interact with the user interface */
    GstElement *pipeline;  /* The running pipeline */
    GstElement *video_sink;/* The video sink element which receives VideoOverlay commands */
    GMainContext *context; /* GLib context used to run the main loop */
    GMainLoop *main_loop;  /* GLib main loop */
    gboolean initialized;  /* To avoid informing the UI multiple times about the initialization */
    UIView *ui_video_view; /* UIView that holds the video */
}

/*
 * Interface methods
 */

-(id) init:(id) uiDelegate videoView:(UIView *)video_view
{
    if (self = [super init])
    {
        self->ui_delegate = uiDelegate;
        self->ui_video_view = video_view;

        GST_DEBUG_CATEGORY_INIT (debug_category, "tutorial-3", 0, "iOS tutorial 3");
        gst_debug_set_threshold_for_name("tutorial-3", GST_LEVEL_DEBUG);

        /* Start the bus monitoring task */
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self app_function];
        });
    }

    return self;
}

-(void) dealloc
{
    if (pipeline) {
        GST_DEBUG("Setting the pipeline to NULL");
        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        pipeline = NULL;
    }
}

-(void) play
{
    if(gst_element_set_state(pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
        [self setUIMessage:"Failed to set pipeline to playing"];
    }
}

-(void) pause
{
    if(gst_element_set_state(pipeline, GST_STATE_PAUSED) == GST_STATE_CHANGE_FAILURE) {
        [self setUIMessage:"Failed to set pipeline to paused"];
    }
}

/*
 * Private methods
 */

/* Change the message on the UI through the UI delegate */
-(void)setUIMessage:(gchar*) message
{
    NSString *string = [NSString stringWithUTF8String:message];
    if(ui_delegate && [ui_delegate respondsToSelector:@selector(gstreamerSetUIMessage:)])
    {
        [ui_delegate gstreamerSetUIMessage:string];
    }
}

/* Retrieve errors from the bus and show them on the UI */
static void error_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self)
{
    GError *err;
    gchar *debug_info;
    gchar *message_string;

    gst_message_parse_error (msg, &err, &debug_info);
    message_string = g_strdup_printf ("Error received from element %s: %s", GST_OBJECT_NAME (msg->src), err->message);
    g_clear_error (&err);
    g_free (debug_info);
    [self setUIMessage:message_string];
    g_free (message_string);
    gst_element_set_state (self->pipeline, GST_STATE_NULL);
}

/* Notify UI about pipeline state changes */
static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self)
{
    GstState old_state, new_state, pending_state;
    gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
    /* Only pay attention to messages coming from the pipeline, not its children */
    if (GST_MESSAGE_SRC (msg) == GST_OBJECT (self->pipeline)) {
        gchar *message = g_strdup_printf("State changed to %s", gst_element_state_get_name(new_state));
        [self setUIMessage:message];
        g_free (message);
    }
}

/* Check if all conditions are met to report GStreamer as initialized.
 * These conditions will change depending on the application */
-(void) check_initialization_complete
{
    if (!initialized && main_loop) {
        GST_DEBUG ("Initialization complete, notifying application.");
        if (ui_delegate && [ui_delegate respondsToSelector:@selector(gstreamerInitialized)])
        {
            [ui_delegate gstreamerInitialized];
        }
        initialized = TRUE;
    }
}

/* Main method for the bus monitoring code */
-(void) app_function
{
    GstBus *bus;
    GSource *bus_source;
    GError *error = NULL;

    GST_DEBUG ("Creating pipeline");

    /* Create our own GLib Main Context and make it the default one */
    context = g_main_context_new ();
    g_main_context_push_thread_default(context);

    /* Build pipeline */
    pipeline = gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error);
    if (error) {
        gchar *message = g_strdup_printf("Unable to build pipeline: %s", error->message);
        g_clear_error (&error);
        [self setUIMessage:message];
        g_free (message);
        return;
    }

    /* Set the pipeline to READY, so it can already accept a window handle */
    gst_element_set_state(pipeline, GST_STATE_READY);

    video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_VIDEO_OVERLAY);
    if (!video_sink) {
        GST_ERROR ("Could not retrieve video sink");
        return;
    }
    gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(video_sink), (guintptr) (id) ui_video_view);

    /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
    bus = gst_element_get_bus (pipeline);
    bus_source = gst_bus_create_watch (bus);
    g_source_set_callback (bus_source, (GSourceFunc) gst_bus_async_signal_func, NULL, NULL);
    g_source_attach (bus_source, context);
    g_source_unref (bus_source);
    g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, (__bridge void *)self);
    g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, (__bridge void *)self);
    gst_object_unref (bus);

    /* Create a GLib Main Loop and set it to run */
    GST_DEBUG ("Entering main loop...");
    main_loop = g_main_loop_new (context, FALSE);
    [self check_initialization_complete];
    g_main_loop_run (main_loop);
    GST_DEBUG ("Exited main loop");
    g_main_loop_unref (main_loop);
    main_loop = NULL;

    /* Free resources */
    g_main_context_pop_thread_default(context);
    g_main_context_unref (context);
    gst_element_set_state (pipeline, GST_STATE_NULL);
    gst_object_unref (pipeline);

    return;
}

@end
```
The main differences from the previous tutorial are related to the
handling of the `VideoOverlay` interface:

```
@implementation GStreamerBackend {
    id ui_delegate;        /* Class that we use to interact with the user interface */
    GstElement *pipeline;  /* The running pipeline */
    GstElement *video_sink;/* The video sink element which receives VideoOverlay commands */
    GMainContext *context; /* GLib context used to run the main loop */
    GMainLoop *main_loop;  /* GLib main loop */
    gboolean initialized;  /* To avoid informing the UI multiple times about the initialization */
    UIView *ui_video_view; /* UIView that holds the video */
}
```

The class is expanded to keep track of the video sink element in the
pipeline and the `UIView *` onto which rendering is to occur.

```
-(id) init:(id) uiDelegate videoView:(UIView *)video_view
{
    if (self = [super init])
    {
        self->ui_delegate = uiDelegate;
        self->ui_video_view = video_view;

        GST_DEBUG_CATEGORY_INIT (debug_category, "tutorial-3", 0, "iOS tutorial 3");
        gst_debug_set_threshold_for_name("tutorial-3", GST_LEVEL_DEBUG);

        /* Start the bus monitoring task */
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self app_function];
        });
    }

    return self;
}
```

The constructor accepts the `UIView *` as a new parameter, which, at
this point, is simply remembered in `ui_video_view`.

```
/* Build pipeline */
pipeline = gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error);
```

Then, in the `app_function`, the pipeline is constructed. This time we
build a video pipeline using a simple `videotestsrc` element with a
`warptv` to add some spice. The video sink is `autovideosink`, which
chooses the appropriate sink for the platform (currently,
`glimagesink` is the only option for iOS).
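
If you want to preview what this pipeline produces before running it on
a device, the same description can be fed to
`gst-launch-1.0 videotestsrc ! warptv ! videoconvert ! autovideosink`
on a desktop machine, assuming the `warptv` element (from
gst-plugins-good) is available there.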

```
/* Set the pipeline to READY, so it can already accept a window handle */
gst_element_set_state(pipeline, GST_STATE_READY);

video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_VIDEO_OVERLAY);
if (!video_sink) {
    GST_ERROR ("Could not retrieve video sink");
    return;
}
gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(video_sink), (guintptr) (id) ui_video_view);
```

Once the pipeline is built, we set it to READY. In this state, dataflow
has not started yet, but the caps of adjacent elements have been
verified to be compatible and their pads have been linked. Also, the
`autovideosink` has already instantiated the actual video sink, so we can
ask for it immediately.

The `gst_bin_get_by_interface()` method will examine the whole pipeline
and return a pointer to an element which supports the requested
interface. We are asking for the `VideoOverlay` interface, explained in
[](sdk-basic-tutorial-toolkit-integration.md),
which controls how to perform rendering into foreign (non-GStreamer)
windows. The internal video sink instantiated by `autovideosink` is the
only element in this pipeline implementing it, so it will be returned.

Once we have the video sink, we inform it of the `UIView` to use for
rendering, through the `gst_video_overlay_set_window_handle()` method.
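
Note that on iOS there is no separate native window handle: what is
passed to the sink here is simply the pointer to the `UIView` itself,
cast to a `guintptr`, which the iOS video sink knows how to interpret.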
## EaglUIView

One last detail remains. In order for `glimagesink` to be able to draw
on the
[`UIView`](http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UIView_Class/UIView/UIView.html),
the
[`Layer`](http://developer.apple.com/library/ios/#documentation/GraphicsImaging/Reference/CALayer_class/Introduction/Introduction.html#//apple_ref/occ/cl/CALayer) associated
with this view must be of the
[`CAEAGLLayer`](http://developer.apple.com/library/ios/#documentation/QuartzCore/Reference/CAEAGLLayer_Class/CAEGLLayer/CAEGLLayer.html#//apple_ref/occ/cl/CAEAGLLayer) class.
To this end, we create the `EaglUIView` class, derived from
`UIView` and overriding the `layerClass` method:

**EaglUIView.m**

```
#import "EaglUIVIew.h"

#import <QuartzCore/QuartzCore.h>

@implementation EaglUIView

+ (Class) layerClass
{
    return [CAEAGLLayer class];
}

@end
```

When creating storyboards, bear in mind that the `UIView` which should
contain the video must have `EaglUIView` as its custom class. This is
easy to set up from the Xcode Interface Builder. Take a look at the
tutorial storyboard to see how to achieve this.
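
If you build the view hierarchy in code rather than in a storyboard, the
equivalent is simply to instantiate the subclass yourself. A hypothetical
sketch (not part of the tutorial sources, reusing the
`video_container_view` outlet from above) could look like this:

```
EaglUIView *video_view = [[EaglUIView alloc] initWithFrame:CGRectZero];
video_view.translatesAutoresizingMaskIntoConstraints = NO;
[video_container_view addSubview:video_view];
/* ...then create the width/height constraints and hand video_view to the
 * GStreamerBackend, exactly as done in viewDidLoad above. */
```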
And this is it. Using GStreamer to output video in an iOS application
is as simple as that.

## Conclusion

This tutorial has shown:

- How to display video on iOS using a `UIView` and
  the `VideoOverlay` interface.
- How to report the media size to the iOS layout engine through
  runtime manipulation of width and height constraints.

The following tutorial plays an actual clip and adds a few more controls
to this tutorial in order to build a simple media player.

It has been a pleasure having you here, and see you soon!

[screenshot]: images/sdk-ios-tutorial-video-screenshot.png