Updated iOS tutorials

This commit is contained in:
Olivier Crête 2016-06-17 15:32:33 -04:00
parent 1912801c9c
commit 1c35f99e7a
16 changed files with 310 additions and 364 deletions

TODO.md

@ -4,11 +4,6 @@ This is just a simple TODO list to follow progress of the port from
gstreamer.com content to hotdoc
Pages to review:
- sdk-ios-tutorials.md
- sdk-ios-tutorial-link-against-gstreamer.md
- sdk-ios-tutorial-a-running-pipeline.md
- sdk-ios-tutorial-video.md
- sdk-ios-tutorial-a-basic-media-player.md
- [installing]
- sdk-installing-for-ios-development.md
- sdk-installing-on-linux.md
@ -24,6 +19,7 @@ Screenshots:
- Create new ones with the official GStreamer logo and not saying "0.10.36". Affected:
- Android tutorial 1
- Android tutorial 2
- iOS tutorial 1
- Fix filenames of all attachments to make sense
Code:
@ -57,6 +53,11 @@ Reviewed pages:
- sdk-android-tutorial-video.md
- sdk-android-tutorial-a-complete-media-player.md
- sdk-android-tutorial-media-player.md
- sdk-ios-tutorials.md
- sdk-ios-tutorial-link-against-gstreamer.md
- sdk-ios-tutorial-a-running-pipeline.md
- sdk-ios-tutorial-video.md
- sdk-ios-tutorial-a-basic-media-player.md
- sdk-playback-tutorials.md
- sdk-playback-tutorial-playbin-usage.md
- sdk-playback-tutorial-subtitle-management.md

@ -1,46 +1,47 @@
# iOS tutorial 4: A basic media player
## Goal
![screenshot]
Enough testing with synthetic images and audio tones! This tutorial
finally plays actual media, streamed directly from the Internet, in your
iOS device. It shows:
- How to keep the User Interface regularly updated with the current
playback position and duration
- How to implement a [Time
Slider](http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UISlider_Class/Reference/Reference.html)
- How to report the media size to adapt the display surface
It also uses the knowledge gathered in the [](sdk-basic-tutorials.md) regarding:
- How to use `playbin` to play any kind of media
- How to handle network resilience problems
## Introduction
From the previous tutorials, we already have almost all necessary
pieces to build a media player. The most complex part is assembling a
pipeline which retrieves, decodes and displays the media, but we
already know that the `playbin` element can take care of all that for
us. We only need to replace the manual pipeline we used in
[](sdk-ios-tutorial-video.md) with a single-element `playbin` pipeline
and we are good to go!
However, we can do better than that. We will add a [Time
Slider](http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UISlider_Class/Reference/Reference.html),
with a moving thumb that will advance as our current position in the
media advances. We will also allow the user to drag the thumb, to jump
(or *seek*) to a different position.
And finally, we will make the video surface adapt to the media size, so
the video sink is not forced to draw black borders around the clip.
This also allows the iOS layout to adapt more nicely to the actual
media content. You can still force the video surface to have a specific
size if you really want to.
## The User Interface
The User Interface from the previous tutorial is expanded again. A
`UISlider` has been added to the toolbar, to keep track of the current
@ -83,15 +84,15 @@ duration.
```
Note how we register callbacks for some of the Actions the
[UISlider](http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UISlider_Class/Reference/Reference.html) generates.
Also note that the class has been renamed from `ViewController` to
`VideoViewController`, since the next tutorial adds another
`ViewController` and we will need to differentiate.
## The Video View Controller
The `ViewController` class manages the UI, instantiates
the `GStreamerBackend` and also performs some UI-related tasks on its
behalf:
@ -300,14 +301,14 @@
### Supporting arbitrary media URIs
The `GStreamerBackend` provides the `setUri()` method so we can
indicate the URI of the media to play. Since `playbin` will be taking
care of retrieving the media, we can use local or remote URIs
indistinctly (`file://` or `http://`, for example). From the UI code,
though, we want to keep track of whether the file is local or remote,
because we will not offer the same functionalities. We keep track of
this in the `is_local_media` variable, which is set when the URI is set,
in the `gstreamerInitialized` method:
```
-(void) gstreamerInitialized
@ -327,7 +328,7 @@ Reporting media size
Every time the size of the media changes (which could happen mid-stream,
for some kind of streams), or when it is first detected,
`GStreamerBackend` calls our `mediaSizeChanged()` callback:
```
-(void) mediaSizeChanged:(NSInteger)width height:(NSInteger)height
@ -343,18 +344,16 @@ for some kind of streams), or when it is first detected,
```
Here we simply store the new size and ask the layout to be recalculated.
As we have already seen in [](sdk-ios-tutorial-a-running-pipeline.md),
methods which change the UI must be called from the main thread, and we
are now in a callback from some GStreamer internal thread. Hence, the
usage
of `dispatch_async()`.
### Refreshing the Time Slider
[](sdk-basic-tutorial-toolkit-integration.md) has
already shown how to implement a Seek Bar (or [Time
Slider](http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UISlider_Class/Reference/Reference.html)
in this tutorial) using the GTK+ toolkit. The implementation on iOS is
very similar.
@ -363,8 +362,8 @@ The Seek Bar accomplishes two functions: First, it moves on its own to
reflect the current playback position in the media. Second, it can be
dragged by the user to seek to a different position.
To realize the first function, `GStreamerBackend` will periodically
call our `setCurrentPosition` method so we can update the position of
the thumb in the Seek Bar. Again we do so from the UI thread, using
`dispatch_async()`.
@ -383,15 +382,15 @@ the thumb in the Seek Bar. Again we do so from the UI thread, using
```
Also note that if the user is currently dragging the slider (the
`dragging_slider` variable is explained below) we ignore
`setCurrentPosition` calls from `GStreamerBackend`, as they would
interfere with the user's actions.
To the left of the Seek Bar (refer to the screenshot at the top of this
page), there is
a [TextField](https://developer.apple.com/library/ios/#documentation/UIKit/Reference/UITextField_Class/Reference/UITextField.html) widget
which we will use to display the current position and duration in
"`HH:mm:ss / HH:mm:ss`" textual format. The `updateTimeWidget` method
takes care of it, and must be called every time the Seek Bar is
updated:
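The body of `updateTimeWidget` is elided from this diff, but the formatting it performs can be sketched in plain C (the function name and signature below are illustrative, not the tutorial's actual Objective-C code):

```c
#include <stdio.h>
#include <stddef.h>

/* Illustrative helper: render position and duration (both in
 * milliseconds) as "HH:mm:ss / HH:mm:ss", the format shown in
 * the TextField next to the Seek Bar. */
static void format_time_widget(long pos_ms, long dur_ms, char *out, size_t len)
{
    long ps = pos_ms / 1000, ds = dur_ms / 1000;  /* whole seconds */
    snprintf(out, len, "%02ld:%02ld:%02ld / %02ld:%02ld:%02ld",
             ps / 3600, (ps / 60) % 60, ps % 60,
             ds / 3600, (ds / 60) % 60, ds % 60);
}
```

For example, a position of 83000 ms into a one-hour clip renders as `00:01:23 / 01:00:00`.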
@ -429,7 +428,7 @@ updated:
### Seeking with the Seek Bar
To perform the second function of the Seek Bar (allowing the user to
seek by dragging the thumb), we register some callbacks through IBAction
outlets. Refer to the storyboard in this tutorial's project to see which
outlets are connected. We will be notified when the user starts dragging
@ -444,11 +443,11 @@ the Slider.
}
```
`sliderTouchDown` is called when the user starts dragging. Here we pause
the pipeline because if the user is searching for a particular scene, we
do not want it to keep moving. We also mark that a drag operation is in
progress in the
`dragging_slider` variable.
```
/* Called when the time slider position has changed, either because the user dragged it or
@ -462,10 +461,10 @@ progress in the
}
```
`sliderValueChanged` is called every time the Slider's thumb moves, be
it because the user dragged it, or because we changed its value from the
program. We discard the latter case using the
`dragging_slider` variable.
As the comment says, if this is a local media, we allow scrub seeking,
that is, we jump to the indicated position as soon as the thumb moves.
@ -486,24 +485,21 @@ widget.
}
```
Finally, `sliderTouchUp` is called when the thumb is released. We
perform the seek operation if the file was non-local, restore the
pipeline to the desired playing state and end the dragging operation by
setting `dragging_slider` to NO.
This concludes the User Interface part of this tutorial. Let's review
now the `GStreamerBackend` class that allows this to work.
## The GStreamer Backend
The `GStreamerBackend` class performs all GStreamer-related tasks and
offers a simplified interface to the application, which does not need to
deal with all the GStreamer details. When it needs to perform any UI
action, it does so through a delegate, which is expected to adhere to
the `GStreamerBackendDelegate` protocol.
**GStreamerBackend.m**
@ -511,7 +507,6 @@ this view is collapsed by default. Click here to expand…
#import "GStreamerBackend.h"
#include <gst/gst.h>
#include <gst/video/video.h>
GST_DEBUG_CATEGORY_STATIC (debug_category);
@ -530,7 +525,7 @@ GST_DEBUG_CATEGORY_STATIC (debug_category);
@implementation GStreamerBackend {
id ui_delegate; /* Class that we use to interact with the user interface */
GstElement *pipeline; /* The running pipeline */
GstElement *video_sink; /* The video sink element which receives VideoOverlay commands */
GMainContext *context; /* GLib context used to run the main loop */
GMainLoop *main_loop; /* GLib main loop */
gboolean initialized; /* To avoid informing the UI multiple times about the initialization */
@ -630,7 +625,6 @@ GST_DEBUG_CATEGORY_STATIC (debug_category);
/* If we have pipeline and it is running, query the current position and clip duration and inform
* the application */
static gboolean refresh_ui (GStreamerBackend *self) {
gint64 position;
/* We do not want to update anything unless we have a working pipeline in the PAUSED or PLAYING state */
@ -639,10 +633,10 @@ static gboolean refresh_ui (GStreamerBackend *self) {
/* If we didn't know it yet, query the stream duration */
if (!GST_CLOCK_TIME_IS_VALID (self->duration)) {
gst_element_query_duration (self->pipeline, GST_FORMAT_TIME, &self->duration);
}
if (gst_element_query_position (self->pipeline, GST_FORMAT_TIME, &position)) {
/* The UI expects these values in milliseconds, and GStreamer provides nanoseconds */
[self setCurrentUIPosition:position / GST_MSECOND duration:self->duration / GST_MSECOND];
}
@ -756,9 +750,7 @@ static void check_media_size (GStreamerBackend *self) {
GstElement *video_sink;
GstPad *video_sink_pad;
GstCaps *caps;
GstVideoInfo info;
/* Retrieve the Caps at the entrance of the video sink */
g_object_get (self->pipeline, "video-sink", &video_sink, NULL);
@ -767,18 +759,15 @@ static void check_media_size (GStreamerBackend *self) {
if (!video_sink) return;
video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
caps = gst_pad_get_current_caps (video_sink_pad);
if (gst_video_info_from_caps(&info, caps)) {
info.width = info.width * info.par_n / info.par_d;
GST_DEBUG ("Media size is %dx%d, notifying application", info.width, info.height);
if (self->ui_delegate && [self->ui_delegate respondsToSelector:@selector(mediaSizeChanged:height:)])
{
[self->ui_delegate mediaSizeChanged:info.width height:info.height];
}
}
@ -851,12 +840,12 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
/* Set the pipeline to READY, so it can already accept a window handle */
gst_element_set_state(pipeline, GST_STATE_READY);
video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_VIDEO_OVERLAY);
if (!video_sink) {
GST_ERROR ("Could not retrieve video sink");
return;
}
gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(video_sink), (guintptr) (id) ui_video_view);
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
bus = gst_element_get_bus (pipeline);
@ -905,7 +894,7 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
### Supporting arbitrary media URIs
The UI code will call `setUri` whenever it wants to change the playing
URI (in this tutorial the URI never changes, but it does in the next
one):
@ -918,11 +907,11 @@ one):
}
```
We first need to obtain a plain `char *` from within the `NSString *` we
get, using the `UTF8String` method.
`playbin`'s URI is exposed as a common GObject property, so we simply
set it with `g_object_set()`.
### Reporting media size
@ -930,7 +919,7 @@ Some codecs allow the media size (width and height of the video) to
change during playback. For simplicity, this tutorial assumes that they
do not. Therefore, in the READY to PAUSED state change, once the Caps of
the decoded media are known, we inspect them
in `check_media_size()`:
```
/* Retrieve the video sink's Caps and tell the application about the media size */
@ -938,9 +927,7 @@ static void check_media_size (GStreamerBackend *self) {
GstElement *video_sink;
GstPad *video_sink_pad;
GstCaps *caps;
GstVideoInfo info;
/* Retrieve the Caps at the entrance of the video sink */
g_object_get (self->pipeline, "video-sink", &video_sink, NULL);
@ -949,18 +936,15 @@ static void check_media_size (GStreamerBackend *self) {
if (!video_sink) return;
video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
caps = gst_pad_get_current_caps (video_sink_pad);
if (gst_video_info_from_caps(&info, caps)) {
info.width = info.width * info.par_n / info.par_d;
GST_DEBUG ("Media size is %dx%d, notifying application", info.width, info.height);
if (self->ui_delegate && [self->ui_delegate respondsToSelector:@selector(mediaSizeChanged:height:)])
{
[self->ui_delegate mediaSizeChanged:info.width height:info.height];
}
}
@ -971,19 +955,19 @@ static void check_media_size (GStreamerBackend *self) {
```
We first retrieve the video sink element from the pipeline, using
the `video-sink` property of `playbin`, and then its sink Pad. The
negotiated Caps of this Pad, which we recover using
`gst_pad_get_current_caps()`, are the Caps of the decoded media.
The helper function `gst_video_info_from_caps()` turns the Caps into a
manageable `GstVideoInfo` structure, which we pass to the application
through its `mediaSizeChanged` callback.
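The width correction applied in `check_media_size()` is a single integer scaling; as a standalone C sketch (function name and sample values are illustrative):

```c
/* Scale the coded width by the pixel aspect ratio (par_n/par_d) so the
 * UI can lay out the view assuming square pixels, mirroring the
 * info.width * info.par_n / info.par_d line in check_media_size(). */
static int display_width(int coded_width, int par_n, int par_d)
{
    return coded_width * par_n / par_d;  /* integer math, as in the tutorial */
}
```

For instance, a 704-pixel-wide stream with a 16:11 pixel aspect ratio would be displayed 1024 pixels wide.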
### Refreshing the Seek Bar
To keep the UI updated, a GLib timer is installed in
the `app_function` that fires 4 times per second (or every 250ms),
right before entering the main loop:
```
@ -1001,7 +985,6 @@ method:
/* If we have pipeline and it is running, query the current position and clip duration and inform
* the application */
static gboolean refresh_ui (GStreamerBackend *self) {
gint64 position;
/* We do not want to update anything unless we have a working pipeline in the PAUSED or PLAYING state */
@ -1010,10 +993,10 @@ static gboolean refresh_ui (GStreamerBackend *self) {
/* If we didn't know it yet, query the stream duration */
if (!GST_CLOCK_TIME_IS_VALID (self->duration)) {
gst_element_query_duration (self->pipeline, GST_FORMAT_TIME, &self->duration);
}
if (gst_element_query_position (self->pipeline, GST_FORMAT_TIME, &position)) {
/* The UI expects these values in milliseconds, and GStreamer provides nanoseconds */
[self setCurrentUIPosition:position / GST_MSECOND duration:self->duration / GST_MSECOND];
}
@ -1021,21 +1004,20 @@ static gboolean refresh_ui (GStreamerBackend *self) {
}
```
If it is unknown, the clip duration is retrieved, as explained in
[](sdk-basic-tutorial-time-management.md). The current position is
retrieved next, and the UI is informed of both through its
`setCurrentUIPosition` callback.
Bear in mind that all time-related measures returned by GStreamer are in
nanoseconds, whereas, for simplicity, we decided to make the UI code
work in milliseconds.
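The conversion itself is a plain integer division. A minimal sketch, with `GST_MSECOND` redefined locally so the snippet stands alone (in GStreamer it is provided by the clock headers):

```c
/* GStreamer clock times are 64-bit nanosecond counts; GST_MSECOND is
 * the number of nanoseconds in a millisecond (1,000,000). */
#define GST_MSECOND ((long long) 1000000)

/* Convert a GStreamer position/duration in ns to the ms the UI expects. */
static long long ns_to_ms(long long t_ns)
{
    return t_ns / GST_MSECOND;
}
```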
### Seeking with the Seek Bar
The UI code already takes care of most of the complexity of seeking by
dragging the thumb of the Seek Bar. From the `GStreamerBackend`, we just
need to honor the calls to `setPosition` and instruct the pipeline to
jump to the indicated position.
There are, though, a couple of caveats. Firstly, seeks are only possible
@ -1047,7 +1029,7 @@ see how to overcome these problems.
#### Delayed seeks
In `setPosition`:
```
-(void) setPosition:(NSInteger)milliseconds
@ -1064,8 +1046,8 @@ In `setPosition`:
If we are already in the correct state for seeking, execute it right
away; otherwise, store the desired position in
the `desired_position` variable. Then, in
the `state_changed_cb()` callback:
```
if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED)
@ -1080,7 +1062,7 @@ the `state_changed_cb()` callback:
Once the pipeline moves from the READY to the PAUSED state, we check if
there is a pending seek operation and execute it.
The `desired_position` variable is reset inside `execute_seek()`.
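The delayed-seek mechanism just described can be summarized in a small C sketch; the struct, enum, and function names below are illustrative stand-ins for the tutorial's Objective-C instance variables and callbacks:

```c
/* Illustrative model: seeks are gated on pipeline state. A request made
 * before reaching PAUSED is stored in desired_position and replayed when
 * the READY -> PAUSED transition completes. -1 means "no pending seek". */
typedef enum { ST_READY, ST_PAUSED, ST_PLAYING } St;

typedef struct {
    St state;
    long long desired_position;   /* pending seek target, ms */
    long long executed_position;  /* last position actually seeked to */
} Backend;

static void set_position(Backend *b, long long ms)
{
    if (b->state >= ST_PAUSED) {
        b->executed_position = ms;  /* seek right away */
        b->desired_position = -1;
    } else {
        b->desired_position = ms;   /* remember for later */
    }
}

/* Would be invoked from the state-changed handler on READY -> PAUSED. */
static void on_reached_paused(Backend *b)
{
    if (b->desired_position >= 0)
        set_position(b, b->desired_position);  /* honor the pending seek */
}
```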
#### Seek throttling
@ -1097,11 +1079,11 @@ second one, it is up to it to finish the first one, start the second one
or abort both, which is a bad thing. A simple method to avoid this issue
is *throttling*, which means that we will only allow one seek every half
a second (for example): after performing a seek, only the last seek
request received during the next 500ms is stored, and will be honored
once this period elapses.
To achieve this, all seek requests are routed through
the `execute_seek()` method:
```
/* Perform seek, if we are not too close to the previous seek. Otherwise, schedule the seek for
@ -1141,34 +1123,28 @@ static void execute_seek (gint64 position, GStreamerBackend *self) {
```
The time at which the last seek was performed is stored in
the `last_seek_time` variable. This is wall clock time, not to be
confused with the stream time carried in the media time stamps, and is
obtained with `gst_util_get_timestamp()`.
If enough time has passed since the last seek operation, the new one is
directly executed and `last_seek_time` is updated. Otherwise, the new
seek is scheduled for later. If there is no previously scheduled seek, a
one-shot timer is set up to trigger 500ms after the last seek operation.
If another seek was already scheduled, its desired position is simply
updated with the new one.
The one-shot timer calls `delayed_seek_cb()`, which simply
calls `execute_seek()` again.
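The throttling rule can be sketched in plain C. Names such as `SeekState` and `throttled_seek` are illustrative; the tutorial implements this inside `execute_seek()` using GLib timers:

```c
#define SEEK_MIN_INTERVAL_MS 500  /* throttling period used by the tutorial */

/* Illustrative state; the tutorial keeps last_seek_time and
 * desired_position as GStreamerBackend instance variables. */
typedef struct {
    long long last_seek_time;    /* wall-clock ms of last executed seek */
    long long desired_position;  /* pending seek target in ms, -1 if none */
} SeekState;

/* Returns 1 if the seek runs now, 0 if it is stored for the one-shot
 * timer (scheduling the timer itself is omitted from this sketch). */
static int throttled_seek(SeekState *s, long long position, long long now)
{
    if (now - s->last_seek_time < SEEK_MIN_INTERVAL_MS) {
        s->desired_position = position;  /* latest request wins */
        return 0;
    }
    s->last_seek_time = now;
    s->desired_position = -1;
    /* ... the real code performs the pipeline seek here ... */
    return 1;
}
```

A second request arriving 200 ms after the first is deferred; one arriving 600 ms later runs immediately.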
> ![information]
> Ideally, `execute_seek()` will now find that enough time has indeed passed since the last seek and the scheduled one will proceed. It might happen, though, that after 500ms of the previous seek, and before the timer wakes up, yet another seek comes through and is executed. `delayed_seek_cb()` needs to check for this condition to avoid performing two very close seeks, and therefore calls `execute_seek()` instead of performing the seek itself.
>
>This is not a complete solution: the scheduled seek will still be executed, even though a more-recent seek has already been executed that should have cancelled it. However, it is a good tradeoff between functionality and simplicity.
### Network resilience
[](sdk-basic-tutorial-streaming.md) has already
shown how to adapt to the variable nature of the network bandwidth by
using buffering. The same procedure is used here, by listening to the
buffering
@ -1182,7 +1158,7 @@ And pausing the pipeline until buffering is complete (unless this is a
live
source):
```
/* Called when buffering messages are received. We inform the UI about the current buffering level and
@ -1207,14 +1183,14 @@ static void buffering_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self)
}
```
`target_state` is the state in which we have been instructed to set the
pipeline, which might be different from the current state, because
buffering forces us to go to PAUSED. Once buffering is complete we set
the pipeline to the `target_state`.
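The state decision made in `buffering_cb()` can be condensed into a small C sketch (the enum and helper are illustrative, not GStreamer API):

```c
/* Illustrative: pick the state the pipeline should be in, given the
 * buffering percentage from the BUFFERING message, the user-requested
 * target_state, and whether the source is live. */
typedef enum { STATE_READY, STATE_PAUSED, STATE_PLAYING } PipelineState;

static PipelineState buffering_next_state(int percent,
                                          PipelineState target_state,
                                          int is_live)
{
    if (is_live)
        return target_state;  /* live sources are never paused for buffering */
    if (percent < 100)
        return STATE_PAUSED;  /* still buffering: hold the pipeline */
    return target_state;      /* done: restore the user-requested state */
}
```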
## Conclusion
This tutorial has shown how to embed a `playbin` pipeline into an iOS
application. This effectively turns such an application into a basic
media player, capable of streaming and decoding all the formats
GStreamer understands. More particularly, it has shown:
@ -1230,8 +1206,5 @@ GStreamer understands. More particularly, it has shown:
The next tutorial adds the missing bits to turn the application built
here into an acceptable iOS media player.
[information]: images/icons/emoticons/information.png
[screenshot]: images/sdk-ios-tutorial-a-basic-media-player-screenshot.png


@ -1,20 +1,22 @@
# iOS tutorial 5: A Complete media player
## Goal
![screenshot0]
![screenshot1]
This tutorial wants to be the “demo application” that showcases what can
be done with GStreamer on the iOS platform.
It is intended to be built and run, rather than analyzed for its
pedagogical value, since it adds very little GStreamer knowledge over
what has already been shown in [](sdk-ios-tutorial-a-basic-media-player.md).
It demonstrates the main functionality that a conventional media player
has, but it is not a complete application yet, therefore it has not been
uploaded to the AppStore.
## Introduction
The previous tutorial already implemented a basic media player. This one
simply adds a few finishing touches. In particular, it adds the
@ -25,12 +27,11 @@ These are not features directly related to GStreamer, and are therefore
outside the scope of these tutorials. Only a few implementation pointers
are given here.
## Selecting the media to play
A new `UIView` has been added, derived from `UITableViewController`
which shows a list of clips. When one is selected, the
`VideoViewController` from [](sdk-ios-tutorial-a-basic-media-player.md) appears
and its URI property is set to the URI of the selected clip.
The list of clips is populated from three sources: Media from the
@ -39,17 +40,17 @@ device's Photo library, Media from the application's Documents folder
Internet addresses, selected to showcase different container and codec
formats, and a couple of bogus ones, to illustrate error reporting.
## Preventing the screen from turning off
While watching a movie, there is typically no user activity. After a
short period of such inactivity, iOS will dim the screen, and then turn
it off completely. To prevent this, the `idleTimerDisabled` property of
the `UIApplication` class is used. The application sets it to YES
(screen locking disabled) when the Play button is pressed, so the screen
is never turned off, and sets it back to NO when the Pause button is
pressed.
## Conclusion
This finishes the series of iOS tutorials. Each one of the preceding
tutorials has evolved on top of the previous one, showing how to
@ -57,16 +58,7 @@ implement a particular set of features, and concluding in this Tutorial
5. The goal of Tutorial 5 is to build a complete media player which can
already be used to showcase the integration of GStreamer and iOS.
It has been a pleasure having you here, and see you soon!
[screenshot0]: images/sdk-ios-tutorial-a-complete-media-player-screenshot-0.png
[screenshot1]: images/sdk-ios-tutorial-a-complete-media-player-screenshot-1.png
# iOS tutorial 2: A running pipeline
## Goal
![screenshot]
As seen in the [Basic](sdk-basic-tutorials.md) and
[Playback](sdk-playback-tutorials.md) tutorials, GStreamer integrates
nicely with GLib's main loops, so pipeline operation and user interface
can be monitored simultaneously in a very simple way. However, platforms
like iOS or Android do not use GLib and therefore extra care must be
This tutorial shows:
- How to communicate between the Objective-C UI code and the C
GStreamer code
## Introduction
When using a Graphical User Interface (UI), if the application waits for
GStreamer calls to complete the user experience will suffer. The usual
approach, with the [GTK+ toolkit](http://www.gtk.org/) for example, is
to relinquish control to a GLib `GMainLoop` and let it control the
events coming from the UI or GStreamer.
Other graphical toolkits that are not based on GLib, like the [Cocoa
Touch](https://developer.apple.com/library/ios/documentation/General/Conceptual/DevPedia-CocoaCore/Cocoa.html)
framework used on iOS devices, cannot use this option, though. The
solution used in this tutorial uses a GLib `GMainLoop` for its
simplicity, but moves it to a separate thread (a [Dispatch
Queue](http://developer.apple.com/library/ios/#documentation/General/Conceptual/ConcurrencyProgrammingGuide/OperationQueues/OperationQueues.html)
different from the main one) so it does not block the user interface
operation.
Additionally, this tutorial shows a few places where caution has to be
taken when calling from Objective-C to C and vice versa.
The code below builds a pipeline with an `audiotestsrc` and
an `autoaudiosink` (it plays an audible tone). Two buttons in the UI
allow setting the pipeline to PLAYING or PAUSED. A Label in the UI shows
messages sent from the C code (for errors and state changes).
## The User Interface
A toolbar at the bottom of the screen contains a Play and a Pause
button. Over the toolbar there is a Label used to display messages from
GStreamer. This tutorial does not require more elements, but the
following lessons will build their User Interfaces on top of this one,
adding more components.
## The View Controller
The `ViewController` class manages the UI, instantiates
the `GStreamerBackend` and also performs some UI-related tasks on its
behalf:
**ViewController.m**
```
@end
```
An instance of the `GStreamerBackend` is stored inside the class:
```
@interface ViewController () {
}
```
This instance is created in the `viewDidLoad` function through a custom
`init:` method in the `GStreamerBackend`:
```
- (void)viewDidLoad
```
This custom method is required to pass the object that has to be used as
the UI delegate (in this case, ourselves, the `ViewController`).
The Play and Pause buttons are also disabled in the
`viewDidLoad` function, and they are not re-enabled until the
`GStreamerBackend` reports that it is initialized and ready.
```
/* Called when the Play button is pressed */
}
```
The `gstreamerInitialized` method is defined in the
`GStreamerBackendDelegate` protocol and indicates that the backend is
ready to accept commands. In this case, the Play and Pause buttons are
re-enabled and the Label text is set to “Ready”. This method is called
from a Dispatch Queue other than the Main one; therefore the need for
the
[dispatch_async()](https://developer.apple.com/library/mac/documentation/Darwin/Reference/ManPages/man3/dispatch_async.3.html) call
wrapping all UI code.
```
}
```
The `gstreamerSetUIMessage:` method also belongs to the
`GStreamerBackendDelegate` protocol. It is called when the backend wants
to report some message to the user. In this case, the message is copied
onto the Label in the UI, again, from within a
[dispatch_async()](https://developer.apple.com/library/mac/documentation/Darwin/Reference/ManPages/man3/dispatch_async.3.html) call.
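The pattern can be sketched as follows (the `message_label` ivar name
is illustrative; the tutorial uses its own outlet):

```
-(void) gstreamerSetUIMessage:(NSString *)message
{
    /* Called from the GStreamer Dispatch Queue: hop to the main queue
     * before touching any UIKit object. */
    dispatch_async(dispatch_get_main_queue(), ^{
        message_label.text = message;
    });
}
```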
## The GStreamer Backend
The `GStreamerBackend` class performs all GStreamer-related tasks and
offers a simplified interface to the application, which does not need to
deal with all the GStreamer details. When it needs to perform any UI
action, it does so through a delegate, which is expected to adhere to
the `GStreamerBackendDelegate` protocol:
**GStreamerBackend.m**
```
@end
```
#### Interface methods:
```
}
```
The `init:` method creates the instance by calling `[super init]`,
stores the delegate object that will handle the UI interaction and
launches the `app_function`, from a separate, concurrent, Dispatch
Queue. The `app_function` monitors the GStreamer bus for messages and
warns the application when interesting things happen.
`init:` also registers a new GStreamer debug category and sets its
threshold, so we can see the debug output from within Xcode and keep
track of our application progress.
```
}
```
The `dealloc` method takes care of bringing the pipeline to the NULL
state and releasing it.
```
}
```
The `play` and `pause` methods simply try to set the pipeline to the
desired state and warn the application if something fails.
#### Private methods:
```
}
```
`setUIMessage:` turns the C strings that GStreamer uses (UTF8 `char *`)
into `NSString *` and displays them through the
`gstreamerSetUIMessage` method of the `GStreamerBackendDelegate`. The
implementation of this method is marked as `@optional`, and hence the
check for its existence in the delegate with `respondsToSelector:`.
```
/* Retrieve errors from the bus and show them on the UI */
}
```
The `error_cb()` and `state_changed_cb()` are callbacks registered to
the `error` and `state-changed` events in GStreamer, and their goal is
to inform the user about these events. These callbacks have been widely
used in the [Basic tutorials](sdk-basic-tutorials.md) and their
implementation is very similar, except for two points:
Firstly, the messages are conveyed to the user through the
`setUIMessage:` private method discussed above.
Secondly, they require an instance of a `GStreamerBackend` object in
order to call its instance method `setUIMessage:`, which is passed
through the `userdata` pointer of the callbacks (the `self` pointer in
these implementations). This is discussed below when registering the
callbacks in the `app_function`.
```
}
```
`check_initialization_complete()` verifies that all conditions are met
to consider the backend ready to accept commands and tell the
application if so. In this simple tutorial the only conditions are that
the main loop exists and that we have not already told the application
about this fact. Later (more complex) tutorials include additional
conditions.
Finally, most of the GStreamer work is performed in the app_function.
It exists with almost identical content in the Android tutorial, which
exemplifies how the same code can run on both platforms with little
change.
```
g_main_context_push_thread_default(context);
```
It first creates a GLib context so all `GSource`s are kept in the same
place. This also helps clean up after `GSource`s created by other
libraries which might not have been properly disposed of. A new context
is created with `g_main_context_new()` and then it is made the default
one for the thread with `g_main_context_push_thread_default()`.
```
/* Build pipeline */
}
```
It then creates a pipeline the easy way, with `gst_parse_launch()`. In
this case, it is simply an `audiotestsrc` (which produces a continuous
tone) and an `autoaudiosink`, with accompanying adapter
elements.
These lines create a bus signal watch and connect to some interesting
signals, just like we have been doing in the [Basic
tutorials](sdk-basic-tutorials.md). The creation of the watch is done
step by step instead of using `gst_bus_add_signal_watch()` to exemplify
how to use a custom GLib context. The interesting bit here is the usage
of a
[__bridge](http://clang.llvm.org/docs/AutomaticReferenceCounting.html#bridged-casts)
cast to convert an Objective-C object into a plain C pointer. In this
case we do not worry much about transferal of ownership of the object,
because it travels through C-land untouched. It re-emerges at the
different callbacks through the `userdata` pointer and cast again to a
`GStreamerBackend *`.
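Condensed, the two directions of the cast look like this (fragments for
illustration, not a complete file):

```
/* Objective-C object -> plain C pointer, no ownership transfer: */
g_signal_connect (G_OBJECT (bus), "message::error",
    (GCallback)error_cb, (__bridge void *)self);

/* ...and back, when the pointer re-emerges in the C callback: */
static void error_cb (GstBus *bus, GstMessage *msg, gpointer userdata)
{
    GStreamerBackend *self = (__bridge GStreamerBackend *)userdata;
    /* self is usable as a normal Objective-C object again */
}
```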
Finally, the main loop is created and set to run. Before entering the
main loop, though, `check_initialization_complete()` is called. Upon
exit, the main loop is disposed of.
And this is it! This has been a rather long tutorial, but we covered a
lot of territory. Building on top of this one, the following ones are
shorter and focus only on the new topics.
## Conclusion
This tutorial has shown:
- How to handle GStreamer code from a separate thread using a
[Dispatch
Queue](http://developer.apple.com/library/ios/#documentation/General/Conceptual/ConcurrencyProgrammingGuide/OperationQueues/OperationQueues.html) other
than the Main one.
- How to pass objects between the Objective-C UI code and the C
GStreamer code.
Most of the methods introduced in this tutorial,
like `check_initialization_complete()` and `app_function()`, and the
interface methods `init:`, `play:`, `pause:`,
`gstreamerInitialized:` and `setUIMessage:` will continue to be used in
the following tutorials with minimal modifications, so better get used
to them!
It has been a pleasure having you here, and see you soon!
[screenshot]: images/sdk-ios-tutorial-a-running-pipeline-screenshot.png
# iOS tutorial 1: Link against GStreamer
## Goal
![screenshot]
The first iOS tutorial is simple. The objective is to get the GStreamer
version and display it on screen. It exemplifies how to link against the
GStreamer library from Xcode using Objective-C.
## Hello GStreamer!
The code for this project can be found in the tutorials folder of
**FIXME: where**. It was created using the GStreamer Single View
Application template. The view contains only a `UILabel` that will be
used to display the GStreamer version to the user.
## The User Interface
The UI uses storyboards and contains a single `View` with a centered
`UILabel`. The `ViewController` for the `View` links its
`label` variable to this `UILabel` as an `IBOutlet`.
**ViewController.h**
```
#import <UIKit/UIKit.h>
 
@interface ViewController : UIViewController {
IBOutlet UILabel *label;
}
@end
```
## The GStreamer backend
All GStreamer-handling code is kept in a single Objective-C class called
`GStreamerBackend`. In successive tutorials it will get expanded, but,
for now, it only contains a method to retrieve the GStreamer version.
The `GStreamerBackend` is made in Objective-C so it can take care of the
few C-to-Objective-C conversions that might be necessary (like `char
*` to `NSString *`, for example). This eases the usage of this class by
the UI code, which is typically made in pure Objective-C.
`GStreamerBackend` serves exactly the same purpose as the JNI code in
the [](sdk-android-tutorials.md).
**GStreamerBackend.m**
```
@end
```
The `getGStreamerVersion()` method simply calls
`gst_version_string()` to obtain a string describing this version of
GStreamer. This [Modified
UTF8](http://en.wikipedia.org/wiki/UTF-8#Modified_UTF-8) string is then
converted to an `NSString *` with `stringWithUTF8String:` and
returned. Objective-C will take care of freeing the memory used by the
new `NSString *`, but we need to free the `char *` returned
by `gst_version_string()`.
## The View Controller
The view controller instantiates the `GStreamerBackend` and asks it for
the GStreamer version to display at the label. That's it!
**ViewController.m**
```
@end
```
## Conclusion
This ends the first iOS tutorial. It has shown that, due to the
compatibility of C and Objective-C, adding GStreamer support to an iOS
app is as easy as it is on a Desktop application. An extra Objective-C
wrapper has been added (the `GStreamerBackend` class) for clarity, but
calls to the GStreamer framework are valid from any part of the
application code.
The following tutorials detail the few places in which care has to be
taken when developing specifically for the iOS platform.
It has been a pleasure having you here, and see you soon!
[screenshot]: images/sdk-ios-tutorial-link-against-gstreamer-screenshot.png
# iOS tutorial 3: Video
## Goal
![screenshot]
Except for [](sdk-basic-tutorial-toolkit-integration.md),
which embedded a video window on a GTK application, all tutorials so far
relied on GStreamer video sinks to create a window to display their
contents. The video sink on iOS is not capable of creating its own
window, so a drawing surface always needs to be provided.
Since iOS does not provide a windowing system, a GStreamer video sink
cannot create pop-up windows as it would do on a Desktop platform.
Fortunately, the `VideoOverlay` interface allows providing video sinks with
an already created window onto which they can draw, as we have seen
in [](sdk-basic-tutorial-toolkit-integration.md).
In this tutorial, a `UIView` widget (actually, a subclass of it) is
placed on the main storyboard. In the `viewDidLoad` method of the
`ViewController`, we pass a pointer to this `UIView` to the instance of
the `GStreamerBackend`, so it can tell the video sink where to draw.
## The User Interface
The storyboard from the previous tutorial is expanded: A `UIView` is
added over the toolbar and pinned to all sides so it takes up all
available space (`video_container_view` outlet). Inside it, another
`UIView` is added (`video_view` outlet) which contains the actual video,
centered to its parent, and with a size that adapts to the media size
(through the `video_width_constraint` and `video_height_constraint`
outlets):
**ViewController.h**
## The View Controller
The `ViewController` class manages the UI, instantiates
the `GStreamerBackend` and also performs some UI-related tasks on its
behalf:
**ViewController.m**
```
}
```
As shown below, the `GStreamerBackend` constructor has also been
expanded to accept another parameter: the `UIView *` where the video
sink should draw.
The rest of the `ViewController` code is the same as the previous
tutorial, except for the code that adapts the `video_view` size to the
media size, respecting its aspect ratio:
```
}
```
The `viewDidLayoutSubviews` method is called every time the main view
size has changed (for example, due to a device orientation change) and
the entire layout has been recalculated. At this point, we can access
the `bounds` property of the `video_container_view` to retrieve its new
size and change the `video_view` size accordingly.
The simple algorithm above maximizes either the width or the height of
the `video_view`, while changing the other axis so the aspect ratio of
the media is preserved. This provides the video sink
with a surface of the correct proportions, so it does not need to add
black borders (*letterboxing*), which is a waste of processing power.
The final size is reported to the layout engine by changing the
`constant` field in the width and height `Constraints` of the
`video_view`. These constraints have been created in the storyboard and
are accessible to the `ViewController` through IBOutlets, as is usually
done with other widgets.
## The GStreamer Backend
The `GStreamerBackend` class performs all GStreamer-related tasks and
offers a simplified interface to the application, which does not need to
deal with all the GStreamer details. When it needs to perform any UI
action, it does so through a delegate, which is expected to adhere to
the `GStreamerBackendDelegate` protocol:
**GStreamerBackend.m**
```
#import "GStreamerBackend.h"
#include <gst/gst.h>
#include <gst/video/video.h>
GST_DEBUG_CATEGORY_STATIC (debug_category);
#define GST_CAT_DEFAULT debug_category
@implementation GStreamerBackend {
id ui_delegate; /* Class that we use to interact with the user interface */
GstElement *pipeline; /* The running pipeline */
GstElement *video_sink;/* The video sink element which receives VideoOverlay commands */
GMainContext *context; /* GLib context used to run the main loop */
GMainLoop *main_loop; /* GLib main loop */
gboolean initialized; /* To avoid informing the UI multiple times about the initialization */
g_main_context_push_thread_default(context);
/* Build pipeline */
pipeline = gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error);
if (error) {
gchar *message = g_strdup_printf("Unable to build pipeline: %s", error->message);
g_clear_error (&error);
/* Set the pipeline to READY, so it can already accept a window handle */
gst_element_set_state(pipeline, GST_STATE_READY);
video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_VIDEO_OVERLAY);
if (!video_sink) {
GST_ERROR ("Could not retrieve video sink");
return;
}
gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(video_sink), (guintptr) (id) ui_video_view);
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
bus = gst_element_get_bus (pipeline);
@ -442,13 +442,13 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
```
The main differences with the previous tutorial are related to the
handling of the `VideoOverlay` interface:
```
@implementation GStreamerBackend {
id ui_delegate; /* Class that we use to interact with the user interface */
GstElement *pipeline; /* The running pipeline */
GstElement *video_sink;/* The video sink element which receives VideoOverlay commands */
GMainContext *context; /* GLib context used to run the main loop */
GMainLoop *main_loop; /* GLib main loop */
gboolean initialized; /* To avoid informing the UI multiple times about the initialization */
```
The class is expanded to keep track of the video sink element in the
pipeline and the `UIView *` onto which rendering is to occur.
```
-(id) init:(id) uiDelegate videoView:(UIView *)video_view
}
```
The constructor accepts the `UIView *` as a new parameter, which, at
this point, is simply remembered in `ui_video_view`.
```
/* Build pipeline */
pipeline = gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error);
```
Then, in the `app_function`, the pipeline is constructed. This time we
build a video pipeline using a simple `videotestsrc` element with a
`warptv` to add some spice. The video sink is `autovideosink`, which
chooses the appropriate sink for the platform (currently,
`glimagesink` is the only option for
iOS).
```
/* Set the pipeline to READY, so it can already accept a window handle */
gst_element_set_state(pipeline, GST_STATE_READY);
video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_VIDEO_OVERLAY);
if (!video_sink) {
GST_ERROR ("Could not retrieve video sink");
return;
}
gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(video_sink), (guintptr) (id) ui_video_view);
```
Once the pipeline is built, we set it to READY. In this state, dataflow
has not started yet, but the caps of adjacent elements have been
verified to be compatible and their pads have been linked. Also, the
`autovideosink` has already instantiated the actual video sink so we can
ask for it immediately.
The `gst_bin_get_by_interface()` method will examine the whole pipeline
and return a pointer to an element which supports the requested
interface. We are asking for the `VideoOverlay` interface, explained in
[](sdk-basic-tutorial-toolkit-integration.md),
which controls how to perform rendering into foreign (non-GStreamer)
windows. The internal video sink instantiated by `autovideosink` is the
only element in this pipeline implementing it, so it will be returned.
Once we have the video sink, we inform it of the `UIView` to use for
rendering, through the `gst_video_overlay_set_window_handle()` method.
## EaglUIView
One last detail remains. In order for `glimagesink` to be able to draw
on the
[`UIView`](http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UIView_Class/UIView/UIView.html),
the
[`Layer`](http://developer.apple.com/library/ios/#documentation/GraphicsImaging/Reference/CALayer_class/Introduction/Introduction.html#//apple_ref/occ/cl/CALayer) associated
with this view must be of the
[`CAEAGLLayer`](http://developer.apple.com/library/ios/#documentation/QuartzCore/Reference/CAEAGLLayer_Class/CAEGLLayer/CAEGLLayer.html#//apple_ref/occ/cl/CAEAGLLayer) class.
To this avail, we create the `EaglUIView` class, derived from
`UIView `and overriding the `layerClass` method:
**EaglUIView.m**

```
#import "EaglUIView.h"

#import <QuartzCore/QuartzCore.h>

@implementation EaglUIView

/* Make the layer backing this view a CAEAGLLayer,
 * so GStreamer can render into it with OpenGL ES. */
+ (Class) layerClass
{
    return [CAEAGLLayer class];
}

@end
```
When creating storyboards, bear in mind that the `UIView` which should
contain the video must have `EaglUIView` as its custom class. This is
easy to set up from the Xcode interface builder. Take a look at the
tutorial storyboard to see how to achieve this.
And this is it! Using GStreamer to output video onto an iOS application
is as simple as it seems.
This tutorial has shown:
- How to display video on iOS using a `UIView` and
  the `VideoOverlay` interface.
- How to report the media size to the iOS layout engine through
runtime manipulation of width and height constraints.
The following tutorial builds on this one: it plays an actual clip and
adds a few more controls in order to build a simple media player.
It has been a pleasure having you here, and see you soon!
[screenshot]: images/sdk-ios-tutorial-video-screenshot.png
# iOS tutorials
These tutorials describe iOS-specific topics. General GStreamer
concepts will not be explained in these tutorials, so the
[](sdk-basic-tutorials.md) should be reviewed first. The reader should
also be familiar with basic iOS programming techniques.
The iOS tutorials have the same structure as the
[](sdk-android-tutorials.md): Each one builds on top of the previous
one and adds progressively more functionality, until a working media
player application is obtained in
[](sdk-ios-tutorial-a-complete-media-player.md).
Make sure to have read the instructions in
[](sdk-installing-for-ios-development.md) before jumping into the iOS
tutorials.
All iOS tutorials are split into the following classes:
- The `GStreamerBackend` class performs all GStreamer-related tasks
and offers a simplified interface to the application, which does not
need to deal with all the GStreamer details. When it needs to
perform any UI action, it does so through a delegate, which is
expected to adhere to the `GStreamerBackendDelegate` protocol.
- The `ViewController` class manages the UI, instantiates the
`GStreamerBackend` and also performs some UI-related tasks on its
behalf.
- The `GStreamerBackendDelegate` protocol defines which methods a
class can implement in order to serve as a UI delegate for the
`GStreamerBackend`.
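As an illustration of how these classes fit together, the delegate protocol can be sketched roughly as follows (the method names here are indicative — the exact set of callbacks varies between tutorials and is defined in each tutorial's sources):

```
@protocol GStreamerBackendDelegate <NSObject>

@optional
/* Called when the GStreamer backend has finished initializing
 * and is ready to accept orders. */
-(void) gstreamerInitialized;

/* Called when the GStreamer backend wants to output some message
 * to the screen. */
-(void) gstreamerSetUIMessage:(NSString *)message;

@end
```

The `ViewController` adopts this protocol, so the `GStreamerBackend` can report events without knowing anything about the UI.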