Updated iOS tutorials

Olivier Crête 2016-06-17 15:32:33 -04:00
parent 1912801c9c
commit 1c35f99e7a
16 changed files with 310 additions and 364 deletions

TODO.md

@@ -4,11 +4,6 @@ This is just a simple TODO list to follow progress of the port from
gstreamer.com content to hotdoc
Pages to review:
- [installing]
- sdk-installing-for-ios-development.md
- sdk-installing-on-linux.md
@@ -24,6 +19,7 @@ Screenshots:
- Create new ones with the official GStreamer logo and not saying "0.10.36". Affected:
- Android tutorial 1
- Android tutorial 2
- iOS tutorial 1
- Fix filenames of all attachments to make sense
Code:
@@ -57,6 +53,11 @@ Reviewed pages:
- sdk-android-tutorial-video.md
- sdk-android-tutorial-a-complete-media-player.md
- sdk-android-tutorial-media-player.md
- sdk-ios-tutorials.md
- sdk-ios-tutorial-link-against-gstreamer.md
- sdk-ios-tutorial-a-running-pipeline.md
- sdk-ios-tutorial-video.md
- sdk-ios-tutorial-a-basic-media-player.md
- sdk-playback-tutorials.md
- sdk-playback-tutorial-playbin-usage.md
- sdk-playback-tutorial-subtitle-management.md


@@ -1,46 +1,47 @@
# iOS tutorial 4: A basic media player
## Goal
![screenshot]
Enough testing with synthetic images and audio tones! This tutorial
finally plays actual media, streamed directly from the Internet, in your
iOS device. It shows:
- How to keep the User Interface regularly updated with the current
playback position and duration
- How to implement a [Time
Slider](http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UISlider_Class/Reference/Reference.html)
- How to report the media size to adapt the display surface
It also uses the knowledge gathered in the [](sdk-basic-tutorials.md) regarding:
- How to use `playbin` to play any kind of media
- How to handle network resilience problems
## Introduction
From the previous tutorials, we already have almost all necessary
pieces to build a media player. The most complex part is assembling a
pipeline which retrieves, decodes and displays the media, but we
already know that the `playbin` element can take care of all that for
us. We only need to replace the manual pipeline we used in
[](sdk-ios-tutorial-video.md) with a single-element `playbin` pipeline
and we are good to go!
However, we can do better than that. We will add a [Time
Slider](http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UISlider_Class/Reference/Reference.html),
with a moving thumb that will advance as our current position in the
media advances. We will also allow the user to drag the thumb, to jump
(or *seek*) to a different position.
And finally, we will make the video surface adapt to the media size, so
the video sink is not forced to draw black borders around the clip.
This also allows the iOS layout to adapt more nicely to the actual
media content. You can still force the video surface to have a specific
size if you really want to.
## The User Interface
The User Interface from the previous tutorial is expanded again. A
`UISlider` has been added to the toolbar, to keep track of the current
@@ -83,15 +84,15 @@ duration.
```
Note how we register callbacks for some of the Actions the
[UISlider](http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UISlider_Class/Reference/Reference.html) generates.
Also note that the class has been renamed from `ViewController` to
`VideoViewController`, since the next tutorial adds another
`ViewController` and we will need to differentiate.
## The Video View Controller
The `ViewController` class manages the UI, instantiates
the `GStreamerBackend` and also performs some UI-related tasks on its
behalf:
![](images/icons/grey_arrow_down.gif)Due to the extension of this code,
@@ -300,14 +301,14 @@ this view is collapsed by default. Click here to expand…
Supporting arbitrary media URIs
The `GStreamerBackend` provides the `setUri()` method so we can
indicate the URI of the media to play. Since `playbin` will be taking
care of retrieving the media, we can use local or remote URIs
indistinctly (`file://` or `http://`, for example). From the UI code,
though, we want to keep track of whether the file is local or remote,
because we will not offer the same functionalities. We keep track of
this in the `is_local_media` variable, which is set when the URI is set,
in the `gstreamerInitialized` method:
```
-(void) gstreamerInitialized
@@ -327,7 +328,7 @@ Reporting media size
Every time the size of the media changes (which could happen mid-stream,
for some kind of streams), or when it is first detected,
`GStreamerBackend` calls our `mediaSizeChanged()` callback:
```
-(void) mediaSizeChanged:(NSInteger)width height:(NSInteger)height
@@ -343,18 +344,16 @@ for some kind of streams), or when it is first detected,
```
Here we simply store the new size and ask the layout to be recalculated.
As we have already seen in [](sdk-ios-tutorial-a-running-pipeline.md),
methods which change the UI must be called from the main thread, and we
are now in a callback from some GStreamer internal thread. Hence, the
usage
of `dispatch_async()`.
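As a concrete illustration of that pattern, a minimal sketch (not the project's exact code; the `media_width` and `media_height` instance variables are placeholders) could look like this:

```
-(void) mediaSizeChanged:(NSInteger)width height:(NSInteger)height
{
    media_width = width;     /* remember the new size (placeholder ivars) */
    media_height = height;
    dispatch_async(dispatch_get_main_queue(), ^{
        /* UIKit must only be touched from the main queue */
        [self.view setNeedsLayout];
    });
}
```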
### Refreshing the Time Slider
[](sdk-basic-tutorial-toolkit-integration.md) has
already shown how to implement a Seek Bar (or [Time
Slider](http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UISlider_Class/Reference/Reference.html)
in this tutorial) using the GTK+ toolkit. The implementation on iOS is
very similar.
@@ -363,8 +362,8 @@ The Seek Bar accomplishes two functions: First, it moves on its own to
reflect the current playback position in the media. Second, it can be
dragged by the user to seek to a different position.
To realize the first function, `GStreamerBackend` will periodically
call our `setCurrentPosition` method so we can update the position of
the thumb in the Seek Bar. Again we do so from the UI thread, using
`dispatch_async()`.
@@ -383,15 +382,15 @@ the thumb in the Seek Bar. Again we do so from the UI thread, using
```
Also note that if the user is currently dragging the slider (the
`dragging_slider` variable is explained below) we ignore
`setCurrentPosition` calls from `GStreamerBackend`, as they would
interfere with the user's actions.
To the left of the Seek Bar (refer to the screenshot at the top of this
page), there is
a [TextField](https://developer.apple.com/library/ios/#documentation/UIKit/Reference/UITextField_Class/Reference/UITextField.html) widget
which we will use to display the current position and duration in
`"HH:mm:ss / HH:mm:ss"` textual format. The `updateTimeWidget` method
takes care of it, and must be called every time the Seek Bar is
updated:
@@ -429,7 +428,7 @@ updated:
Seeking with the Seek Bar
To perform the second function of the Seek Bar (allowing the user to
seek by dragging the thumb), we register some callbacks through IBAction
outlets. Refer to the storyboard in this tutorial's project to see which
outlets are connected. We will be notified when the user starts dragging
@@ -444,11 +443,11 @@ the Slider.
}
```
`sliderTouchDown` is called when the user starts dragging. Here we pause
the pipeline because if the user is searching for a particular scene, we
do not want it to keep moving. We also mark that a drag operation is in
progress in the
`dragging_slider` variable.
```
/* Called when the time slider position has changed, either because the user dragged it or
@@ -462,10 +461,10 @@ progress in the
}
```
`sliderValueChanged` is called every time the Slider's thumb moves, be
it because the user dragged it, or because we changed its value from the
program. We discard the latter case using the
`dragging_slider` variable.
As the comment says, if this is a local media, we allow scrub seeking,
that is, we jump to the indicated position as soon as the thumb moves.
@@ -486,24 +485,21 @@ widget.
}
```
Finally, `sliderTouchUp` is called when the thumb is released. We
perform the seek operation if the file was non-local, restore the
pipeline to the desired playing state and end the dragging operation by
setting `dragging_slider` to NO.
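A sketch of how such a handler can look, using the variable and method names mentioned in the text (the outlet wiring and the `gst_backend`/`time_slider`/`is_playing_desired` names are assumptions, not necessarily the project's exact identifiers):

```
- (IBAction)sliderTouchUp:(id)sender {
    if (!is_local_media)
        [gst_backend setPosition:time_slider.value]; /* remote media: seek only now */
    if (is_playing_desired)
        [gst_backend play];      /* restore the state the user wanted */
    else
        [gst_backend pause];
    dragging_slider = NO;        /* drag operation is over */
}
```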
This concludes the User interface part of this tutorial. Let's review
now the `GStreamerBackend` class that allows this to work.
## The GStreamer Backend
The `GStreamerBackend` class performs all GStreamer-related tasks and
offers a simplified interface to the application, which does not need to
deal with all the GStreamer details. When it needs to perform any UI
action, it does so through a delegate, which is expected to adhere to
the `GStreamerBackendDelegate` protocol.
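The protocol itself is declared in a header that this page does not show; a rough reconstruction, based only on the delegate methods mentioned in this tutorial, could look like:

```
@protocol GStreamerBackendDelegate <NSObject>
@optional
/* Backend finished initializing and is ready to accept commands */
-(void) gstreamerInitialized;
/* Backend wants to display a message to the user */
-(void) gstreamerSetUIMessage:(NSString *)message;
/* Media size (in pixels) has been detected or has changed */
-(void) mediaSizeChanged:(NSInteger)width height:(NSInteger)height;
/* Current position and clip duration, in milliseconds */
-(void) setCurrentPosition:(NSInteger)position duration:(NSInteger)duration;
@end
```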
**GStreamerBackend.m**
@@ -511,7 +507,6 @@ this view is collapsed by default. Click here to expand…
#import "GStreamerBackend.h"
#include <gst/gst.h>
#include <gst/video/video.h>
GST_DEBUG_CATEGORY_STATIC (debug_category);
@@ -530,7 +525,7 @@ GST_DEBUG_CATEGORY_STATIC (debug_category);
@implementation GStreamerBackend {
    id ui_delegate;           /* Class that we use to interact with the user interface */
    GstElement *pipeline;     /* The running pipeline */
    GstElement *video_sink;   /* The video sink element which receives VideoOverlay commands */
    GMainContext *context;    /* GLib context used to run the main loop */
    GMainLoop *main_loop;     /* GLib main loop */
    gboolean initialized;     /* To avoid informing the UI multiple times about the initialization */
@@ -630,7 +625,6 @@ GST_DEBUG_CATEGORY_STATIC (debug_category);
/* If we have pipeline and it is running, query the current position and clip duration and inform
 * the application */
static gboolean refresh_ui (GStreamerBackend *self) {
    gint64 position;
    /* We do not want to update anything unless we have a working pipeline in the PAUSED or PLAYING state */
@@ -639,10 +633,10 @@ static gboolean refresh_ui (GStreamerBackend *self) {
    /* If we didn't know it yet, query the stream duration */
    if (!GST_CLOCK_TIME_IS_VALID (self->duration)) {
        gst_element_query_duration (self->pipeline, GST_FORMAT_TIME, &self->duration);
    }
    if (gst_element_query_position (self->pipeline, GST_FORMAT_TIME, &position)) {
        /* The UI expects these values in milliseconds, and GStreamer provides nanoseconds */
        [self setCurrentUIPosition:position / GST_MSECOND duration:self->duration / GST_MSECOND];
    }
@@ -756,9 +750,7 @@ static void check_media_size (GStreamerBackend *self) {
    GstElement *video_sink;
    GstPad *video_sink_pad;
    GstCaps *caps;
    GstVideoInfo info;
    /* Retrieve the Caps at the entrance of the video sink */
    g_object_get (self->pipeline, "video-sink", &video_sink, NULL);
@@ -767,18 +759,15 @@ static void check_media_size (GStreamerBackend *self) {
    if (!video_sink) return;
    video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
    caps = gst_pad_get_current_caps (video_sink_pad);
    if (gst_video_info_from_caps(&info, caps)) {
        info.width = info.width * info.par_n / info.par_d;
        GST_DEBUG ("Media size is %dx%d, notifying application", info.width, info.height);
        if (self->ui_delegate && [self->ui_delegate respondsToSelector:@selector(mediaSizeChanged:height:)])
        {
            [self->ui_delegate mediaSizeChanged:info.width height:info.height];
        }
    }
@@ -851,12 +840,12 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
    /* Set the pipeline to READY, so it can already accept a window handle */
    gst_element_set_state(pipeline, GST_STATE_READY);
    video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_VIDEO_OVERLAY);
    if (!video_sink) {
        GST_ERROR ("Could not retrieve video sink");
        return;
    }
    gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(video_sink), (guintptr) (id) ui_video_view);
    /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
    bus = gst_element_get_bus (pipeline);
@@ -905,7 +894,7 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
Supporting arbitrary media URIs
The UI code will call `setUri` whenever it wants to change the playing
URI (in this tutorial the URI never changes, but it does in the next
one):
@@ -918,11 +907,11 @@ one):
}
```
We first need to obtain a plain `char *` from within the `NSString *` we
get, using the `UTF8String` method.
`playbin`'s URI is exposed as a common GObject property, so we simply
set it with `g_object_set()`.
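The body of `setUri` therefore boils down to something like the following sketch (error handling omitted):

```
const char *char_uri = [uri UTF8String];              /* NSString* -> plain char* */
g_object_set(self->pipeline, "uri", char_uri, NULL);  /* "uri" is a regular GObject property of playbin */
```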
### Reporting media size
@@ -930,7 +919,7 @@ Some codecs allow the media size (width and height of the video) to
change during playback. For simplicity, this tutorial assumes that they
do not. Therefore, in the READY to PAUSED state change, once the Caps of
the decoded media are known, we inspect them
in `check_media_size()`:
```
/* Retrieve the video sink's Caps and tell the application about the media size */
@@ -938,9 +927,7 @@ static void check_media_size (GStreamerBackend *self) {
    GstElement *video_sink;
    GstPad *video_sink_pad;
    GstCaps *caps;
    GstVideoInfo info;
    /* Retrieve the Caps at the entrance of the video sink */
    g_object_get (self->pipeline, "video-sink", &video_sink, NULL);
@@ -949,18 +936,15 @@ static void check_media_size (GStreamerBackend *self) {
    if (!video_sink) return;
    video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
    caps = gst_pad_get_current_caps (video_sink_pad);
    if (gst_video_info_from_caps(&info, caps)) {
        info.width = info.width * info.par_n / info.par_d;
        GST_DEBUG ("Media size is %dx%d, notifying application", info.width, info.height);
        if (self->ui_delegate && [self->ui_delegate respondsToSelector:@selector(mediaSizeChanged:height:)])
        {
            [self->ui_delegate mediaSizeChanged:info.width height:info.height];
        }
    }
@@ -971,19 +955,19 @@ static void check_media_size (GStreamerBackend *self) {
```
We first retrieve the video sink element from the pipeline, using
the `video-sink` property of `playbin`, and then its sink Pad. The
negotiated Caps of this Pad, which we recover using
`gst_pad_get_current_caps()`, are the Caps of the decoded media.
The helper function `gst_video_info_from_caps()` turns the Caps into a
`GstVideoInfo` structure with manageable integers, which we pass to the
application through its `mediaSizeChanged` callback.
### Refreshing the Seek Bar
To keep the UI updated, a GLib timer is installed in
the `app_function` that fires 4 times per second (or every 250ms),
right before entering the main loop:
```
@@ -1001,7 +985,6 @@ method:
/* If we have pipeline and it is running, query the current position and clip duration and inform
 * the application */
static gboolean refresh_ui (GStreamerBackend *self) {
    gint64 position;
    /* We do not want to update anything unless we have a working pipeline in the PAUSED or PLAYING state */
@@ -1010,10 +993,10 @@ static gboolean refresh_ui (GStreamerBackend *self) {
    /* If we didn't know it yet, query the stream duration */
    if (!GST_CLOCK_TIME_IS_VALID (self->duration)) {
        gst_element_query_duration (self->pipeline, GST_FORMAT_TIME, &self->duration);
    }
    if (gst_element_query_position (self->pipeline, GST_FORMAT_TIME, &position)) {
        /* The UI expects these values in milliseconds, and GStreamer provides nanoseconds */
        [self setCurrentUIPosition:position / GST_MSECOND duration:self->duration / GST_MSECOND];
    }
@@ -1021,21 +1004,20 @@ static gboolean refresh_ui (GStreamerBackend *self) {
}
```
If it is unknown, the clip duration is retrieved, as explained in
[](sdk-basic-tutorial-time-management.md). The current position is
retrieved next, and the UI is informed of both through its
`setCurrentUIPosition` callback.
Bear in mind that all time-related measures returned by GStreamer are in
nanoseconds, whereas, for simplicity, we decided to make the UI code
work in milliseconds.
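The conversion in both directions is a simple division or multiplication by `GST_MSECOND`, for example:

```
gint64 position_ns = 0;                          /* what GStreamer reports */
gint64 position_ms = position_ns / GST_MSECOND;  /* nanoseconds -> milliseconds for the UI */
gint64 desired_ns  = position_ms * GST_MSECOND;  /* milliseconds -> nanoseconds for seeking */
```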
### Seeking with the Seek Bar
The UI code already takes care of most of the complexity of seeking by
dragging the thumb of the Seek Bar. From the `GStreamerBackend`, we just
need to honor the calls to `setPosition` and instruct the pipeline to
jump to the indicated position.
There are, though, a couple of caveats. Firstly, seeks are only possible
@@ -1047,7 +1029,7 @@ see how to overcome these problems.
#### Delayed seeks
In `setPosition`:
```
-(void) setPosition:(NSInteger)milliseconds
@@ -1064,8 +1046,8 @@ In `setPosition`:
If we are already in the correct state for seeking, execute it right
away; otherwise, store the desired position in
the `desired_position` variable. Then, in
the `state_changed_cb()` callback:
```
if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED)
@@ -1080,7 +1062,7 @@ the `state_changed_cb()` callback:
Once the pipeline moves from the READY to the PAUSED state, we check if
there is a pending seek operation and execute it.
The `desired_position` variable is reset inside `execute_seek()`.
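The seek that `execute_seek()` eventually performs is a standard `playbin` seek, along the lines of this sketch (the flags shown are a typical choice, not necessarily the project's):

```
/* position is in nanoseconds here */
gst_element_seek_simple(self->pipeline, GST_FORMAT_TIME,
    GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, position);
```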
#### Seek throttling
@@ -1097,11 +1079,11 @@ second one, it is up to it to finish the first one, start the second one
or abort both, which is a bad thing. A simple method to avoid this issue
is *throttling*, which means that we will only allow one seek every half
a second (for example): after performing a seek, only the last seek
request received during the next 500ms is stored, and will be honored
once this period elapses.
To achieve this, all seek requests are routed through
the `execute_seek()` method:
```
/* Perform seek, if we are not too close to the previous seek. Otherwise, schedule the seek for
@@ -1141,34 +1123,28 @@ static void execute_seek (gint64 position, GStreamerBackend *self) {
```
The time at which the last seek was performed is stored in
the `last_seek_time` variable. This is wall clock time, not to be
confused with the stream time carried in the media time stamps, and is
obtained with `gst_util_get_timestamp()`.
If enough time has passed since the last seek operation, the new one is
directly executed and `last_seek_time` is updated. Otherwise, the new
seek is scheduled for later. If there is no previously scheduled seek, a
one-shot timer is set up to trigger 500ms after the last seek operation.
If another seek was already scheduled, its desired position is simply
updated with the new one.
The one-shot timer calls `delayed_seek_cb()`, which simply
calls `execute_seek()` again.
> ![information]
> Ideally, `execute_seek()` will now find that enough time has indeed passed since the last seek and the scheduled one will proceed. It might happen, though, that more than 500ms after the previous seek, and before the timer wakes up, yet another seek comes through and is executed. `delayed_seek_cb()` needs to check for this condition to avoid performing two very close seeks, and therefore calls `execute_seek()` instead of performing the seek itself.
>
> This is not a complete solution: the scheduled seek will still be executed, even though a more-recent seek has already been executed that should have cancelled it. However, it is a good tradeoff between functionality and simplicity.
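Scheduling that one-shot timer on the backend's own GLib context can be sketched as follows (variable names are illustrative; `delayed_seek_cb()` is expected to return `FALSE` so the source fires only once):

```
GSource *timeout_source = g_timeout_source_new(500);   /* fire ~500 ms from now */
g_source_set_callback(timeout_source, (GSourceFunc)delayed_seek_cb, self, NULL);
g_source_attach(timeout_source, self->context);        /* run it in the backend's main loop */
g_source_unref(timeout_source);
```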
### Network resilience
[](sdk-basic-tutorial-streaming.md) has already
shown how to adapt to the variable nature of the network bandwidth by
using buffering. The same procedure is used here, by listening to the
buffering
@@ -1182,7 +1158,7 @@ And pausing the pipeline until buffering is complete (unless this is a
live
source):
```
/* Called when buffering messages are received. We inform the UI about the current buffering level and
@@ -1207,14 +1183,14 @@ static void buffering_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self)
}
```
`target_state` is the state in which we have been instructed to set the
pipeline, which might be different to the current state, because
buffering forces us to go to PAUSED. Once buffering is complete we set
the pipeline to the `target_state`.
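The body of `buffering_cb()` is not reproduced in full on this page; its core logic can be sketched like this (the `is_live` flag is an assumption for the live-source check described above, and the real callback also reports the buffering percentage to the UI):

```
static void buffering_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self) {
  gint percent;

  if (self->is_live)           /* live sources must not be paused for buffering */
    return;

  gst_message_parse_buffering (msg, &percent);
  if (percent < 100 && self->target_state >= GST_STATE_PAUSED) {
    gst_element_set_state (self->pipeline, GST_STATE_PAUSED);   /* wait for more data */
  } else if (self->target_state >= GST_STATE_PLAYING) {
    gst_element_set_state (self->pipeline, GST_STATE_PLAYING);  /* buffering done, resume */
  } else if (self->target_state >= GST_STATE_PAUSED) {
    gst_element_set_state (self->pipeline, GST_STATE_PAUSED);
  }
}
```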
## Conclusion
This tutorial has shown how to embed a `playbin` pipeline into an iOS
application. This, effectively, turns such an application into a basic
media player, capable of streaming and decoding all the formats
GStreamer understands. More particularly, it has shown:
@@ -1230,8 +1206,5 @@ GStreamer understands. More particularly, it has shown:
The next tutorial adds the missing bits to turn the application built
here into an acceptable iOS media player.
[information]: images/icons/emoticons/information.png
[screenshot]: images/sdk-ios-tutorial-a-basic-media-player-screenshot.png

@@ -1,20 +1,22 @@
# iOS tutorial 5: A Complete media player
## Goal
![screenshot0]
![screenshot1]
This tutorial wants to be the “demo application” that showcases what can
be done with GStreamer on the iOS platform.
It is intended to be built and run, rather than analyzed for its
pedagogical value, since it adds very little GStreamer knowledge over
what has already been shown in [](sdk-ios-tutorial-a-basic-media-player.md).
It demonstrates the main functionality that a conventional media player
has, but it is not a complete application yet, therefore it has not been
uploaded to the AppStore.
## Introduction
The previous tutorial already implemented a basic media player. This one
simply adds a few finishing touches. In particular, it adds the
@@ -25,12 +27,11 @@ These are not features directly related to GStreamer, and are therefore
outside the scope of these tutorials. Only a few implementation pointers
are given here.
## Selecting the media to play
A new view controller has been added, derived from `UITableViewController`,
which shows a list of clips. When one is selected, the
`VideoViewController` from [](sdk-ios-tutorial-a-basic-media-player.md) appears
and its URI property is set to the URI of the selected clip.
The list of clips is populated from three sources: Media from the
@@ -39,17 +40,17 @@ device's Photo library, Media from the application's Documents folder
Internet addresses, selected to showcase different container and codec
formats, and a couple of bogus ones, to illustrate error reporting.
## Preventing the screen from turning off
While watching a movie, there is typically no user activity. After a
short period of such inactivity, iOS will dim the screen, and then turn
it off completely. To prevent this, the `idleTimerDisabled` property of
the `UIApplication` class is used. The application sets it to YES
(screen locking disabled) when the Play button is pressed, so the screen
is never turned off, and sets it back to NO when the Pause button is
pressed.
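In code, this amounts to a single property toggle in the Play and Pause handlers, roughly:

```
[UIApplication sharedApplication].idleTimerDisabled = YES;  /* Play pressed: keep the screen on */
[UIApplication sharedApplication].idleTimerDisabled = NO;   /* Pause pressed: normal behavior again */
```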
## Conclusion
This finishes the series of iOS tutorials. Each one of the preceding
tutorials has evolved on top of the previous one, showing how to
@@ -57,16 +58,7 @@ implement a particular set of features, and concluding in this Tutorial
5. The goal of Tutorial 5 is to build a complete media player which can
already be used to showcase the integration of GStreamer and iOS.
It has been a pleasure having you here, and see you soon!
[screenshot0]: images/sdk-ios-tutorial-a-complete-media-player-screenshot-0.png
[screenshot1]: images/sdk-ios-tutorial-a-complete-media-player-screenshot-1.png

@@ -1,9 +1,11 @@
# iOS tutorial 2: A running pipeline
## Goal
![screenshot]
As seen in the [Basic](sdk-basic-tutorials.md) and
[Playback](sdk-playback-tutorials.md) tutorials, GStreamer integrates
nicely with GLib's main loops, so pipeline operation and user interface
can be monitored simultaneously in a very simple way. However, platforms
like iOS or Android do not use GLib and therefore extra care must be
@@ -17,18 +19,18 @@ This tutorial shows:
- How to communicate between the Objective-C UI code and the C
GStreamer code
## Introduction
When using a Graphical User Interface (UI), if the application waits for
GStreamer calls to complete, the user experience will suffer. The usual
approach, with the [GTK+ toolkit](http://www.gtk.org/) for example, is
to relinquish control to a GLib `GMainLoop` and let it control the
events coming from the UI or GStreamer.
Other graphical toolkits that are not based on GLib, like the [Cocoa
Touch](https://developer.apple.com/library/ios/documentation/General/Conceptual/DevPedia-CocoaCore/Cocoa.html)
framework used on iOS devices, cannot use this option, though. The
solution used in this tutorial uses a GLib `GMainLoop` for its
simplicity, but moves it to a separate thread (a [Dispatch
Queue](http://developer.apple.com/library/ios/#documentation/General/Conceptual/ConcurrencyProgrammingGuide/OperationQueues/OperationQueues.html)
different than the main one) so it does not block the user interface
@@ -37,12 +39,12 @@ operation.
Additionally, this tutorial shows a few places where caution has to be
taken when calling from Objective-C to C and vice versa.
The code below builds a pipeline with an `audiotestsrc` and
an `autoaudiosink` (it plays an audible tone). Two buttons in the UI
allow setting the pipeline to PLAYING or PAUSED. A Label in the UI shows
messages sent from the C code (for errors and state changes).
## The User Interface
A toolbar at the bottom of the screen contains a Play and a Pause
button. Over the toolbar there is a Label used to display messages from
@@ -50,10 +52,10 @@ GStreamer. This tutorial does not require more elements, but the
following lessons will build their User Interfaces on top of this one,
adding more components.
## The View Controller
The `ViewController` class manages the UI, instantiates
the `GStreamerBackend` and also performs some UI-related tasks on its
behalf:
**ViewController.m**
@@ -126,7 +128,7 @@ behalf:
@end
```
An instance of the `GStreamerBackend` is stored inside the class:
```
@interface ViewController () {
@@ -134,8 +136,8 @@ An instance of the `GStreamerBackend` is stored inside the class:
}
```
This instance is created in the `viewDidLoad` function through a custom
`init:` method in the `GStreamerBackend`:
```
- (void)viewDidLoad
@@ -150,11 +152,11 @@ This instance is created in the `viewDidLoad` function through a custom
```
This custom method is required to pass the object that has to be used as
the UI delegate (in this case, ourselves, the `ViewController`).
The Play and Pause buttons are also disabled in the
`viewDidLoad` function, and they are not re-enabled until the
`GStreamerBackend` reports that it is initialized and ready.
```
/* Called when the Play button is pressed */
@@ -185,13 +187,13 @@ buttons, and simply forward the call to the appropriate method in the
}
```
The `gstreamerInitialized` method is defined in the
`GStreamerBackendDelegate` protocol and indicates that the backend is
ready to accept commands. In this case, the Play and Pause buttons are
re-enabled and the Label text is set to “Ready”. This method is called
from a Dispatch Queue other than the Main one; therefore the need for
the
[dispatch_async()](https://developer.apple.com/library/mac/documentation/Darwin/Reference/ManPages/man3/dispatch_async.3.html) call
wrapping all UI code.
```
@@ -203,19 +205,19 @@ wrapping all UI code.
}
```
The `gstreamerSetUIMessage:` method also belongs to the
`GStreamerBackendDelegate` protocol. It is called when the backend wants
to report some message to the user. In this case, the message is copied
onto the Label in the UI, again, from within a
[dispatch_async()](https://developer.apple.com/library/mac/documentation/Darwin/Reference/ManPages/man3/dispatch_async.3.html) call.
## The GStreamer Backend
The `GStreamerBackend` class performs all GStreamer-related tasks and
offers a simplified interface to the application, which does not need to
deal with all the GStreamer details. When it needs to perform any UI
action, it does so through a delegate, which is expected to adhere to
the `GStreamerBackendDelegate` protocol:
**GStreamerBackend.m**
@@ -398,7 +400,7 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
@end
```
#### Interface methods:
@@ -422,13 +424,13 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
}
```
The `init:` method creates the instance by calling `[super init]`,
stores the delegate object that will handle the UI interaction and
launches the `app_function`, from a separate, concurrent, Dispatch
Queue. The `app_function` monitors the GStreamer bus for messages and
warns the application when interesting things happen.
`init:` also registers a new GStreamer debug category and sets its
threshold, so we can see the debug output from within Xcode and keep
track of our application progress.
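That registration typically looks like the following sketch (the category tag "tutorial-2" is illustrative):

```
GST_DEBUG_CATEGORY_INIT (debug_category, "tutorial-2", 0, "iOS tutorial 2");
gst_debug_set_threshold_for_name ("tutorial-2", GST_LEVEL_DEBUG);
```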
@@ -444,7 +446,7 @@ track of our application progress.
}
```
The `dealloc` method takes care of bringing the pipeline to the NULL
state and releasing it.
```
@@ -463,7 +465,7 @@ state and releasing it.
}
```
The `play` and `pause` methods simply try to set the pipeline to the
desired state and warn the application if something fails.
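For reference, a sketch of what such a method can look like (the exact wording of the error message is an assumption):

```
-(void) play
{
    if (gst_element_set_state(pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
        [self setUIMessage:"Failed to set the pipeline to PLAYING"];
    }
}
```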
#### Private methods: #### Private methods:
@ -480,11 +482,11 @@ desired state and warn the application if something fails.
} }
``` ```
`setUIMessage:` turns the C strings that GStreamer uses (UTF8 `char *`) `setUIMessage:` turns the C strings that GStreamer uses (UTF8 `char *`)
into `NSString *` and displays them through the into `NSString *` and displays them through the
`gstreamerSetUIMessage` method of the `GStreamerBackendDelegate`. The `gstreamerSetUIMessage` method of the `GStreamerBackendDelegate`. The
implementation of this method is marked as `@optional`, and hence the implementation of this method is marked as `@optional`, and hence the
check for its existence in the delegate with `respondsToSelector:` check for its existence in the delegate with `respondsToSelector:`
```
/* Retrieve errors from the bus and show them on the UI */
@@ -517,18 +519,18 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
}
```
The `error_cb()` and `state_changed_cb()` are callbacks registered to
the `error` and `state-changed` events in GStreamer, and their goal is
to inform the user about these events. These callbacks have been widely
used in the [Basic tutorials](sdk-basic-tutorials.md) and their
implementation is very similar, except for two points:
Firstly, the messages are conveyed to the user through the
`setUIMessage:` private method discussed above.
Secondly, they require an instance of a `GStreamerBackend` object in
order to call its instance method `setUIMessage:`, which is passed
through the `userdata` pointer of the callbacks (the `self` pointer in
these implementations). This is discussed below when registering the
callbacks in the `app_function`.
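As an illustration, `error_cb()` receives the `GStreamerBackend` instance through its last parameter and relays the message via `setUIMessage:` (a sketch; the exact wording of the message is illustrative):

```
/* Retrieve errors from the bus and show them on the UI */
static void error_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self)
{
    GError *err;
    gchar *debug_info;
    gchar *message_string;

    gst_message_parse_error (msg, &err, &debug_info);
    message_string = g_strdup_printf ("Error received from element %s: %s",
                                      GST_OBJECT_NAME (msg->src), err->message);
    g_clear_error (&err);
    g_free (debug_info);
    [self setUIMessage:message_string];
    g_free (message_string);
}
```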
@@ -548,14 +550,14 @@ callbacks in the `app_function`.
}
```
`check_initialization_complete()` verifies that all conditions are met
to consider the backend ready to accept commands and tell the
application if so. In this simple tutorial the only conditions are that
the main loop exists and that we have not already told the application
about this fact. Later (more complex) tutorials include additional
conditions.
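A sketch of this check, written here as an Objective-C method and assuming the delegate method `gstreamerInitialized` takes no arguments (the exact form in the sources may differ):

```
-(void) check_initialization_complete
{
    if (!initialized && main_loop) {
        GST_DEBUG ("Initialization complete, notifying application.");
        if (ui_delegate && [ui_delegate respondsToSelector:@selector(gstreamerInitialized)])
        {
            [ui_delegate gstreamerInitialized];
        }
        initialized = TRUE;
    }
}
```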
Finally, most of the GStreamer work is performed in the `app_function`.
It exists with almost identical content in the Android tutorial, which
exemplifies how the same code can run on both platforms with little
change.
@@ -566,11 +568,11 @@ change.
g_main_context_push_thread_default(context);
```
It first creates a GLib context so all `GSource`s are kept in the same
place. This also helps clean up after `GSource`s created by other
libraries which might not have been properly disposed of. A new context
is created with `g_main_context_new()` and then it is made the default
one for the thread with `g_main_context_push_thread_default()`.
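In other words, the context is created and pushed at the start of `app_function`, and the complementary calls release it before the function returns (a sketch):

```
/* Create our own GLib Main Context and make it the default one */
context = g_main_context_new ();
g_main_context_push_thread_default (context);

/* ... build the pipeline, attach the bus watch and run the main loop here ... */

/* Undo the push and release the context before leaving app_function */
g_main_context_pop_thread_default (context);
g_main_context_unref (context);
```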
```
/* Build pipeline */
@@ -584,9 +586,9 @@ one for the thread with `g_main_context_push_thread_default()`.
}
```
It then creates a pipeline the easy way, with `gst_parse_launch()`. In
this case, it is simply an `audiotestsrc` (which produces a continuous
tone) and an `autoaudiosink`, with accompanying adapter
elements.
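A sketch of this step with the usual error handling; the exact pipeline description is an assumption based on the text above (an `audiotestsrc`, adapter elements and an `autoaudiosink`):

```
GError *error = NULL;

pipeline = gst_parse_launch ("audiotestsrc ! audioconvert ! audioresample ! autoaudiosink", &error);
if (error) {
    gchar *message = g_strdup_printf ("Unable to build pipeline: %s", error->message);
    g_clear_error (&error);
    [self setUIMessage:message];
    g_free (message);
    return;
}
```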
```
These lines create a bus signal watch and connect to some interesting
signals, just like we have been doing in the [Basic
tutorials](sdk-basic-tutorials.md). The creation of the watch is done
step by step instead of using `gst_bus_add_signal_watch()` to exemplify
how to use a custom GLib context. The interesting bit here is the usage
of a
[__bridge](http://clang.llvm.org/docs/AutomaticReferenceCounting.html#bridged-casts)
cast to convert an Objective-C object into a plain C pointer. In this
case we do not worry much about transferal of ownership of the object,
because it travels through C-land untouched. It re-emerges at the
@@ -626,38 +628,32 @@ different callbacks through the userdata pointer and cast again to a
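The registration described in this paragraph looks roughly like this (a sketch; `error_cb` and `state_changed_cb` are the callbacks discussed above):

```
GstBus *bus;
GSource *bus_source;

/* Instruct the bus to emit signals for each received message,
 * attaching the watch to our custom GLib context */
bus = gst_element_get_bus (pipeline);
bus_source = gst_bus_create_watch (bus);
g_source_set_callback (bus_source, (GSourceFunc) gst_bus_async_signal_func, NULL, NULL);
g_source_attach (bus_source, context);
g_source_unref (bus_source);

/* The __bridge cast passes self as a plain C pointer in the callbacks' userdata */
g_signal_connect (G_OBJECT (bus), "message::error", (GCallback) error_cb, (__bridge void *) self);
g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback) state_changed_cb, (__bridge void *) self);
gst_object_unref (bus);
```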
```
Finally, the main loop is created and set to run. Before entering the
main loop, though, `check_initialization_complete()` is called. Upon
exit, the main loop is disposed of.
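A sketch of that sequence:

```
/* Create a GLib Main Loop and set it to run */
GST_DEBUG ("Entering main loop...");
main_loop = g_main_loop_new (context, FALSE);
[self check_initialization_complete];
g_main_loop_run (main_loop);
GST_DEBUG ("Exited main loop");
g_main_loop_unref (main_loop);
main_loop = NULL;
```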
And this is it! This has been a rather long tutorial, but we covered a
lot of territory. Building on top of this one, the following ones are
shorter and focus only on the new topics.
## Conclusion
This tutorial has shown:
- How to handle GStreamer code from a separate thread using a
  [Dispatch
  Queue](http://developer.apple.com/library/ios/#documentation/General/Conceptual/ConcurrencyProgrammingGuide/OperationQueues/OperationQueues.html) other
  than the Main one.
- How to pass objects between the Objective-C UI code and the C
  GStreamer code.
Most of the methods introduced in this tutorial,
like `check_initialization_complete()` and `app_function()`, and the
interface methods `init:`, `play:`, `pause:`,
`gstreamerInitialized:` and `setUIMessage:` will continue to be used in
the following tutorials with minimal modifications, so better get used
to them!
It has been a pleasure having you here, and see you soon!
[screenshot]: images/sdk-ios-tutorial-a-running-pipeline-screenshot.png
@@ -1,30 +1,31 @@
# iOS tutorial 1: Link against GStreamer
## Goal
![screenshot]
The first iOS tutorial is simple. The objective is to get the GStreamer
version and display it on screen. It exemplifies how to link against the
GStreamer library from Xcode using Objective-C.
## Hello GStreamer!
The code for this project can be found in the tutorials folder of
**FIXME: where**. It was created using the GStreamer Single View
Application template. The view contains only a `UILabel` that will be
used to display the GStreamer version to the user.
## The User Interface
The UI uses storyboards and contains a single `View` with a centered
`UILabel`. The `ViewController` for the `View` links its
`label` variable to this `UILabel` as an `IBOutlet`.
**ViewController.h**
```
#import <UIKit/UIKit.h>

@interface ViewController : UIViewController {
    IBOutlet UILabel *label;
}
@@ -34,18 +35,18 @@ The UI uses storyboards and contains a single `View` with a centered
@end
```
## The GStreamer backend
All GStreamer-handling code is kept in a single Objective-C class called
`GStreamerBackend`. In successive tutorials it will get expanded, but,
for now, it only contains a method to retrieve the GStreamer version.
The `GStreamerBackend` is made in Objective-C so it can take care of the
few C-to-Objective-C conversions that might be necessary (like `char
*` to `NSString *`, for example). This eases the usage of this class by
the UI code, which is typically made in pure Objective-C.
`GStreamerBackend` serves exactly the same purpose as the JNI code in
the [](sdk-android-tutorials.md).
**GStreamerBackend.m**
@@ -67,19 +68,19 @@ the [](sdk-android-tutorials.md).
@end
```
The `getGStreamerVersion()` method simply calls
`gst_version_string()` to obtain a string describing this version of
GStreamer. This string is then converted to an `NSString *` with
`[NSString stringWithUTF8String:]` and returned. Objective-C will take
care of freeing the memory used by the new `NSString *`, but we need to
free the `char *` returned by `gst_version_string()`.
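A sketch of this method:

```
-(NSString *) getGStreamerVersion
{
    char *c_version = gst_version_string ();   /* Newly allocated, must be freed by us */
    NSString *version = [NSString stringWithUTF8String:c_version];
    g_free (c_version);
    return version;
}
```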
## The View Controller
The view controller instantiates the `GStreamerBackend` and asks it for
the GStreamer version to display in the label. That's it!
**ViewController.m**
@@ -115,28 +116,18 @@ GStreamer version to display in the label. That's it!
@end
```
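The interesting part is `viewDidLoad`, which might look roughly like this (the welcome text is illustrative):

```
- (void)viewDidLoad
{
    [super viewDidLoad];

    /* Instantiate the backend and show the version it reports */
    GStreamerBackend *gst_backend = [[GStreamerBackend alloc] init];
    label.text = [NSString stringWithFormat:@"Welcome to %@!", [gst_backend getGStreamerVersion]];
}
```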
## Conclusion
This ends the first iOS tutorial. It has shown that, due to the
compatibility of C and Objective-C, adding GStreamer support to an iOS
app is as easy as it is on a Desktop application. An extra Objective-C
wrapper has been added (the `GStreamerBackend` class) for clarity, but
calls to the GStreamer framework are valid from any part of the
application code.
The following tutorials detail the few places in which care has to be
taken when developing specifically for the iOS platform.
It has been a pleasure having you here, and see you soon!
[screenshot]: images/sdk-ios-tutorial-link-against-gstreamer-screenshot.png
@@ -1,9 +1,10 @@
# iOS tutorial 3: Video
## Goal
![screenshot]
Except for [](sdk-basic-tutorial-toolkit-integration.md),
which embedded a video window on a GTK application, all tutorials so far
relied on GStreamer video sinks to create a window to display their
contents. The video sink on iOS is not capable of creating its own
@@ -17,24 +18,23 @@ shows:
Since iOS does not provide a windowing system, a GStreamer video sink
cannot create pop-up windows as it would do on a Desktop platform.
Fortunately, the `VideoOverlay` interface allows providing video sinks with
an already created window onto which they can draw, as we have seen
in [](sdk-basic-tutorial-toolkit-integration.md).
In this tutorial, a `UIView` widget (actually, a subclass of it) is
placed on the main storyboard. In the `viewDidLoad` method of the
`ViewController`, we pass a pointer to this `UIView` to the instance of
the `GStreamerBackend`, so it can tell the video sink where to draw.
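A sketch of how this hand-over can look in `viewDidLoad` (assuming `gst_backend` is an instance variable of the `ViewController`, `video_view` is the outlet named below, and using the `init:videoView:` initializer shown later in this tutorial):

```
- (void)viewDidLoad
{
    [super viewDidLoad];

    /* Give the backend the view onto which the video sink should draw */
    gst_backend = [[GStreamerBackend alloc] init:self videoView:video_view];
}
```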
## The User Interface
The storyboard from the previous tutorial is expanded: A `UIView` is
added over the toolbar and pinned to all sides so it takes up all
available space (`video_container_view` outlet). Inside it, another
`UIView` is added (`video_view` outlet) which contains the actual video,
centered to its parent, and with a size that adapts to the media size
(through the `video_width_constraint` and `video_height_constraint`
outlets):
**ViewController.h**
@@ -65,8 +65,8 @@ outlets):
## The View Controller
The `ViewController` class manages the UI, instantiates
the `GStreamerBackend` and also performs some UI-related tasks on its
behalf:
**ViewController.m**
@@ -193,12 +193,12 @@ media is constant and initialized in `viewDidLoad`:
}
```
As shown below, the `GStreamerBackend` constructor has also been
expanded to accept another parameter: the `UIView *` where the video
sink should draw.
The rest of the `ViewController` code is the same as the previous
tutorial, except for the code that adapts the `video_view` size to the
media size, respecting its aspect ratio:
```
@@ -220,11 +220,11 @@ media size, respecting its aspect ratio:
}
```
The `viewDidLayoutSubviews` method is called every time the main view
size has changed (for example, due to a device orientation change) and
the entire layout has been recalculated. At this point, we can access
the `bounds` property of the `video_container_view` to retrieve its new
size and change the `video_view` size accordingly.
The simple algorithm above maximizes either the width or the height of
the `video_view`, while changing the other axis so the aspect ratio of
@@ -233,18 +233,18 @@ with a surface of the correct proportions, so it does not need to add
black borders (*letterboxing*), which is a waste of processing power.
The final size is reported to the layout engine by changing the
`constant` field in the width and height `Constraints` of the
`video_view`. These constraints have been created in the storyboard and
are accessible to the `ViewController` through IBOutlets, as is usually
done with other widgets.
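A sketch of that aspect-ratio logic (assuming `media_width` and `media_height` are instance variables holding the media dimensions, and using the constraint outlets named above):

```
- (void)viewDidLayoutSubviews
{
    CGFloat view_width = video_container_view.bounds.size.width;
    CGFloat view_height = video_container_view.bounds.size.height;

    CGFloat correct_height = view_width * media_height / media_width;
    CGFloat correct_width = view_height * media_width / media_height;

    if (correct_height < view_height) {
        /* Media is wider than the container: use the full width */
        video_height_constraint.constant = correct_height;
        video_width_constraint.constant = view_width;
    } else {
        /* Media is taller than the container: use the full height */
        video_width_constraint.constant = correct_width;
        video_height_constraint.constant = view_height;
    }
}
```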
## The GStreamer Backend
The `GStreamerBackend` class performs all GStreamer-related tasks and
offers a simplified interface to the application, which does not need to
deal with all the GStreamer details. When it needs to perform any UI
action, it does so through a delegate, which is expected to adhere to
the `GStreamerBackendDelegate` protocol:
**GStreamerBackend.m**
@@ -252,7 +252,7 @@ the `GStreamerBackendDelegate` protocol:
#import "GStreamerBackend.h"
#include <gst/gst.h>
#include <gst/video/video.h>
GST_DEBUG_CATEGORY_STATIC (debug_category);
#define GST_CAT_DEFAULT debug_category
@@ -266,7 +266,7 @@ GST_DEBUG_CATEGORY_STATIC (debug_category);
@implementation GStreamerBackend {
    id ui_delegate;        /* Class that we use to interact with the user interface */
    GstElement *pipeline;  /* The running pipeline */
    GstElement *video_sink;/* The video sink element which receives VideoOverlay commands */
    GMainContext *context; /* GLib context used to run the main loop */
    GMainLoop *main_loop;  /* GLib main loop */
    gboolean initialized;  /* To avoid informing the UI multiple times about the initialization */
@@ -391,7 +391,7 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
g_main_context_push_thread_default(context);
/* Build pipeline */
pipeline = gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error);
if (error) {
    gchar *message = g_strdup_printf("Unable to build pipeline: %s", error->message);
    g_clear_error (&error);
@@ -403,12 +403,12 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
/* Set the pipeline to READY, so it can already accept a window handle */
gst_element_set_state(pipeline, GST_STATE_READY);
video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_VIDEO_OVERLAY);
if (!video_sink) {
    GST_ERROR ("Could not retrieve video sink");
    return;
}
gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(video_sink), (guintptr) (id) ui_video_view);
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
bus = gst_element_get_bus (pipeline);
@@ -442,13 +442,13 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
```
The main differences with the previous tutorial are related to the
handling of the `VideoOverlay` interface:
```
@implementation GStreamerBackend {
    id ui_delegate;        /* Class that we use to interact with the user interface */
    GstElement *pipeline;  /* The running pipeline */
    GstElement *video_sink;/* The video sink element which receives VideoOverlay commands */
    GMainContext *context; /* GLib context used to run the main loop */
    GMainLoop *main_loop;  /* GLib main loop */
    gboolean initialized;  /* To avoid informing the UI multiple times about the initialization */
@@ -457,7 +457,7 @@ handling of the `VideoOverlay` interface:
```
The class is expanded to keep track of the video sink element in the
pipeline and the `UIView *` onto which rendering is to occur.
```
-(id) init:(id) uiDelegate videoView:(UIView *)video_view
@@ -480,62 +480,61 @@ pipeline and the `UIView *` onto which rendering is to occur.
}
```
The constructor accepts the `UIView *` as a new parameter, which, at
this point, is simply remembered in `ui_video_view`.
```
/* Build pipeline */
pipeline = gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error);
```
Then, in the `app_function`, the pipeline is constructed. This time we
build a video pipeline using a simple `videotestsrc` element with a
`warptv` to add some spice. The video sink is `autovideosink`, which
chooses the appropriate sink for the platform (currently,
`glimagesink` is the only option for iOS).
```
/* Set the pipeline to READY, so it can already accept a window handle */
gst_element_set_state(pipeline, GST_STATE_READY);
video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_VIDEO_OVERLAY);
if (!video_sink) {
    GST_ERROR ("Could not retrieve video sink");
    return;
}
gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(video_sink), (guintptr) (id) ui_video_view);
```
Once the pipeline is built, we set it to READY. In this state, dataflow
has not started yet, but the caps of adjacent elements have been
verified to be compatible and their pads have been linked. Also, the
`autovideosink` has already instantiated the actual video sink so we can
ask for it immediately.
The `gst_bin_get_by_interface()` method will examine the whole pipeline
and return a pointer to an element which supports the requested
interface. We are asking for the `VideoOverlay` interface, explained in
[](sdk-basic-tutorial-toolkit-integration.md),
which controls how to perform rendering into foreign (non-GStreamer)
windows. The internal video sink instantiated by `autovideosink` is the
only element in this pipeline implementing it, so it will be returned.
Once we have the video sink, we inform it of the `UIView` to use for
rendering, through the `gst_video_overlay_set_window_handle()` method.
## EaglUIView
One last detail remains. In order for `glimagesink` to be able to draw
on the
[`UIView`](http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UIView_Class/UIView/UIView.html),
the
[`Layer`](http://developer.apple.com/library/ios/#documentation/GraphicsImaging/Reference/CALayer_class/Introduction/Introduction.html#//apple_ref/occ/cl/CALayer) associated
with this view must be of the
[`CAEAGLLayer`](http://developer.apple.com/library/ios/#documentation/QuartzCore/Reference/CAEAGLLayer_Class/CAEGLLayer/CAEGLLayer.html#//apple_ref/occ/cl/CAEAGLLayer) class.
To this end, we create the `EaglUIView` class, derived from
`UIView` and overriding the `layerClass` method:
**EaglUIView.m**
@@ -554,9 +553,9 @@ To this end, we create the `EaglUIView` class, derived from
@end
```
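The override itself is tiny; a sketch of the whole implementation file (the header name is assumed from the class name):

```
#import "EaglUIView.h"
#import <QuartzCore/QuartzCore.h>

@implementation EaglUIView

/* Make the backing layer a CAEAGLLayer so the GStreamer video sink can render into it */
+ (Class) layerClass
{
    return [CAEAGLLayer class];
}

@end
```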
When creating storyboards, bear in mind that the `UIView` which should
contain the video must have `EaglUIView` as its custom class. This is
easy to set up from the Xcode interface builder. Take a look at the
tutorial storyboard to see how to achieve this.
And this is it, using GStreamer to output video onto an iOS application
@@ -566,18 +565,14 @@ is as simple as it seems.
This tutorial has shown:
- How to display video on iOS using a `UIView` and
  the `VideoOverlay` interface.
- How to report the media size to the iOS layout engine through
  runtime manipulation of width and height constraints.
The following tutorial plays an actual clip and adds a few more controls
to this tutorial in order to build a simple media player.
It has been a pleasure having you here, and see you soon!
[screenshot]: images/sdk-ios-tutorial-video-screenshot.png
@@ -1,34 +1,32 @@
# iOS tutorials
## Welcome to the GStreamer SDK iOS tutorials
These tutorials describe iOS-specific topics. General GStreamer
concepts will not be explained in these tutorials, so the
[](sdk-basic-tutorials.md) should be reviewed first. The reader should
also be familiar with basic iOS programming techniques.
The iOS tutorials have the same structure as the
[](sdk-android-tutorials.md): Each one builds on top of the previous
one and adds progressively more functionality, until a working media
player application is obtained in
[](sdk-ios-tutorial-a-complete-media-player.md).
Make sure to have read the instructions in
[](sdk-installing-for-ios-development.md) before jumping into the iOS
tutorials.
All iOS tutorials are split into the following classes:
- The `GStreamerBackend` class performs all GStreamer-related tasks
  and offers a simplified interface to the application, which does not
  need to deal with all the GStreamer details. When it needs to
  perform any UI action, it does so through a delegate, which is
  expected to adhere to the `GStreamerBackendDelegate` protocol.
- The `ViewController` class manages the UI, instantiates the
  `GStreamerBackend` and also performs some UI-related tasks on its
  behalf.
- The `GStreamerBackendDelegate` protocol defines which methods a
  class can implement in order to serve as a UI delegate for the
  `GStreamerBackend`.