Finish updating Android tutorials:

This commit is contained in:
Olivier Crête 2016-06-16 20:01:54 -04:00
parent 39eedcc9d7
commit 1912801c9c
22 changed files with 293 additions and 365 deletions

TODO.md
@ -4,10 +4,6 @@ This is just a simple TODO list to follow progress of the port from
gstreamer.com content to hotdoc
Pages to review:
- sdk-android-tutorials.md
- sdk-android-tutorial-video.md
- sdk-android-tutorial-media-player.md
- sdk-android-tutorial-a-complete-media-player.md
- sdk-ios-tutorials.md
- sdk-ios-tutorial-link-against-gstreamer.md
- sdk-ios-tutorial-a-running-pipeline.md
@ -55,8 +51,12 @@ Reviewed pages:
- sdk-building-from-source-using-cerbero.md
- sdk-table-of-concepts.md
- sdk-tutorials.md
- sdk-android-tutorial-link-against-gstreamer.md
- sdk-android-tutorial-a-running-pipeline.md
- sdk-android-tutorials.md
- sdk-android-tutorial-link-against-gstreamer.md
- sdk-android-tutorial-a-running-pipeline.md
- sdk-android-tutorial-video.md
- sdk-android-tutorial-a-complete-media-player.md
- sdk-android-tutorial-media-player.md
- sdk-playback-tutorials.md
- sdk-playback-tutorial-playbin-usage.md
- sdk-playback-tutorial-subtitle-management.md

(Binary image files deleted or modified; contents not shown.)

@ -1,30 +1,20 @@
# Android tutorial 5: A Complete media player
# Goal![](attachments/thumbnails/2687069/2654436)
## Goal
![screenshot]
This tutorial is meant to be the “demo application” that showcases what can
be done with GStreamer on the Android platform.
It is intended to be downloaded in its final, compiled form rather than
analyzed for its pedagogical value, since it adds very little GStreamer
knowledge over what has already been shown in [Android tutorial 4: A
basic media
player](Android%2Btutorial%2B4%253A%2BA%2Bbasic%2Bmedia%2Bplayer.html).
knowledge over what has already been shown in [](sdk-android-tutorial-media-player.md).
<table>
<thead>
<tr class="header">
<th>Tutorial 5</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><a href="http://cdn.gstreamer.com/android/arm/com.gst_sdk_tutorials.tutorial_5.Tutorial5-2012.11.apk" class="external-link">GStreamer SDK 2013.6 (Congo) for Android ARM (Tutorial 5 Installable APK)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/android/arm/com.gst_sdk_tutorials.tutorial_5.Tutorial5-2012.11.apk" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/android/arm/com.gst_sdk_tutorials.tutorial_5.Tutorial5-2012.11.apk.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/android/arm/com.gst_sdk_tutorials.tutorial_5.Tutorial5-2012.11.apk.sha1" class="external-link">sha1</a></td>
</tr>
</tbody>
</table>
# Introduction
**FIXME: Do we want to provide a binary of the app?**
## Introduction
The previous tutorial already implemented a basic media player. This one
simply adds a few finishing touches. In particular, it adds the
@ -35,24 +25,24 @@ These are not features directly related to GStreamer, and are therefore
outside the scope of these tutorials. Only a few implementation pointers
are given here.
# Registering as a media player
## Registering as a media player
The `AndroidManifest.xml` tells the Android system the capabilities of
the application. By specifying in the `intent-filter` of the activity
that it understands the `audio/*`, `video/*` and `image/*` MIME types,
The `AndroidManifest.xml` tells the Android system the capabilities of
the application. By specifying in the `intent-filter` of the activity
that it understands the `audio/*`, `video/*` and `image/*` MIME types,
the tutorial will be offered as an option whenever an application
requires such media to be viewed.
“Unfortunately”, GStreamer knows more file formats than Android does,
so, for some files, Android will not provide a MIME type. For these
cases, a new `intent-filter` has to be provided which ignores MIME types
cases, a new `intent-filter` has to be provided which ignores MIME types
and focuses only on the filename extension. This is inconvenient because
the list of extensions can be large, but there does not seem to be
another option. In this tutorial, only a very short list of extensions
is provided, for simplicity.
Finally, GStreamer can also playback remote files, so URI schemes like
`http` are supported in another `intent-filter`. Android does not
`http` are supported in another `intent-filter`. Android does not
provide MIME types for remote files, so the filename extension list has
to be provided again.
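The three filters just described might look roughly like this in `AndroidManifest.xml`. This is a hedged sketch, not the tutorial's actual manifest: the activity name and the extension patterns shown are illustrative only.

```xml
<activity android:name=".Tutorial5">
  <!-- Media whose MIME type Android recognizes -->
  <intent-filter>
    <action android:name="android.intent.action.VIEW"/>
    <category android:name="android.intent.category.DEFAULT"/>
    <data android:mimeType="audio/*"/>
    <data android:mimeType="video/*"/>
    <data android:mimeType="image/*"/>
  </intent-filter>
  <!-- Local files Android cannot type: match by filename extension -->
  <intent-filter>
    <action android:name="android.intent.action.VIEW"/>
    <category android:name="android.intent.category.DEFAULT"/>
    <data android:scheme="file" android:pathPattern=".*\\.mkv"/>
    <data android:scheme="file" android:pathPattern=".*\\.ogg"/>
  </intent-filter>
  <!-- Remote media: match by URI scheme, with the extension list repeated -->
  <intent-filter>
    <action android:name="android.intent.action.VIEW"/>
    <category android:name="android.intent.category.DEFAULT"/>
    <data android:scheme="http" android:pathPattern=".*\\.mkv"/>
  </intent-filter>
</activity>
```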
@ -60,14 +50,13 @@ Once we have informed the system of our capabilities, it will start
sending
[Intents](http://developer.android.com/reference/android/content/Intent.html)
to invoke our activity, which will contain the desired URI to play. In
the `onCreate()` method the intent that invoked the activity is
the `onCreate()` method the intent that invoked the activity is
retrieved and checked for such URI.
# Implementing a file chooser dialog
## Implementing a file chooser dialog
The UI includes a new button ![](attachments/2687069/2654437.png) which
was not present in [Android tutorial 4: A basic media
player](Android%2Btutorial%2B4%253A%2BA%2Bbasic%2Bmedia%2Bplayer.html). It
The UI includes a new button ![media-next] which
was not present in [](sdk-android-tutorial-media-player.md). It
invokes a file chooser dialog (based on the [Android File
Dialog](http://code.google.com/p/android-file-dialog/) project) that
allows you to choose a local media file, no matter what extension or
@ -78,7 +67,7 @@ will set the pipeline to READY, pass the URI onto `playbin`, and bring
the pipeline back to the previous state). The current position is also
reset, so the new clip does not start in the previous position.
# Preventing the screen from turning off
## Preventing the screen from turning off
While watching a movie, there is typically no user activity. After a
short period of such inactivity, Android will dim the screen, and then
@ -88,22 +77,16 @@ is used. The application acquires the lock when the Play button is
pressed, so the screen is never turned off, and releases it when the
Pause button is pressed.
# Conclusion
## Conclusion
This finishes the series of Android tutorials. Each one of the preceding
tutorials has evolved on top of the previous one, showing how to
implement a particular set of features, and concluding in this tutorial
5. The goal of tutorial 5 is to build a complete media player which can
already be used to showcase the integration of GStreamer and Android.
This finishes the series of Android tutorials. Each one of the
preceding tutorials has evolved on top of the previous one, showing
how to implement a particular set of features, and concluding in this
tutorial 5. The goal of tutorial 5 is to build a complete media player
which can already be used to showcase the integration of GStreamer and
Android.
It has been a pleasure having you here, and see you soon\!
It has been a pleasure having you here, and see you soon!
## Attachments:
![](images/icons/bullet_blue.gif)
[tutorial5-screenshot.png](attachments/2687069/2654436.png)
(image/png)
![](images/icons/bullet_blue.gif)
[ic\_media\_next.png](attachments/2687069/2654438.png) (image/png)
![](images/icons/bullet_blue.gif)
[ic\_media\_next.png](attachments/2687069/2654437.png) (image/png)
[screenshot]: images/sdk-android-tutorial-a-complete-media-player-screenshot.png
[media-next]: images/media-next.png


@ -1,9 +1,11 @@
# Android tutorial 2: A running pipeline
# Goal ![](attachments/thumbnails/2687063/2654324)
## Goal
The tutorials seen in the [Basic](Basic%2Btutorials.html) and
[Playback](Playback%2Btutorials.html) sections are intended for Desktop
![screenshot]
The tutorials seen in the [Basic](sdk-basic-tutorials.md) and
[Playback](sdk-playback-tutorials.md) sections are intended for Desktop
platforms and, therefore, their main thread is allowed to block (using
`gst_bus_pop_filtered()`) or relinquish control to a GLib main loop. On
Android this would lead to the application being tagged as
@ -15,15 +17,15 @@ learn:
- How to move the native code to its own thread
- How to allow threads created from C code to communicate with Java
- How to access Java code from C
- How to allocate a `CustomData` structure from C and have Java host
- How to allocate a `CustomData` structure from C and have Java host
it
# Introduction
## Introduction
When using a Graphical User Interface (UI), if the application waits for
GStreamer calls to complete the user experience will suffer. The usual
approach, with the [GTK+ toolkit](http://www.gtk.org) for example, is to
relinquish control to a GLib `GMainLoop` and let it control the events
relinquish control to a GLib `GMainLoop` and let it control the events
coming from the UI or GStreamer.
This approach can be very cumbersome when GStreamer and the Android UI
@ -45,12 +47,12 @@ code, which involves locating the desired methods ID in the class.
These IDs never change, so they are cached as global variables in the C
code and obtained in the static initializer of the class.
The code below builds a pipeline with an `audiotestsrc` and an
`autoaudiosink` (it plays an audible tone). Two buttons in the UI allow
The code below builds a pipeline with an `audiotestsrc` and an
`autoaudiosink` (it plays an audible tone). Two buttons in the UI allow
setting the pipeline to PLAYING or PAUSED. A TextView in the UI shows
messages sent from the C code (for errors and state changes).
# A pipeline on Android \[Java code\]
## A pipeline on Android \[Java code\]
**src/org/freedesktop/gstreamer/tutorials/tutorial\_2/Tutorial2.java**
@ -188,11 +190,11 @@ static {
```
As explained in the previous tutorial, the two native libraries are
loaded and their `JNI_OnLoad()` methods are executed. Here, we also call
loaded and their `JNI_OnLoad()` methods are executed. Here, we also call
the native method `nativeClassInit()`, previously declared with the
`native` keyword in line 19. We will later see what its purpose is
`native` keyword in line 19. We will later see what its purpose is.
In the `onCreate()` method GStreamer is initialized as in the previous
In the `onCreate()` method GStreamer is initialized as in the previous
tutorial with `GStreamer.init(this)`, and then the layout is inflated
and listeners are setup for the two UI buttons:
@ -215,7 +217,7 @@ pause.setOnClickListener(new OnClickListener() {
Each button instructs the native code to set the pipeline to the desired
state, and also remembers this state in the
`is_playing_desired` variable.  This is required so, when the
`is_playing_desired` variable. This is required so, when the
application is restarted (for example, due to an orientation change), it
can set the pipeline again to the desired state. This approach is easier
and safer than tracking the actual pipeline state, because orientation
@ -241,14 +243,14 @@ native code reports itself as initialized we will use
nativeInit();
```
As will be shown in the C code, `nativeInit()` creates a dedicated
As will be shown in the C code, `nativeInit()` creates a dedicated
thread, a GStreamer pipeline, a GLib main loop, and, right before
calling `g_main_loop_run()` and going to sleep, it warns the Java code
calling `g_main_loop_run()` and going to sleep, it warns the Java code
that the native code is initialized and ready to accept commands.
This finishes the `onCreate()` method and the Java initialization. The
This finishes the `onCreate()` method and the Java initialization. The
UI buttons are disabled, so nothing will happen until native code is
ready and `onGStreamerInitialized()` is called:
ready and `onGStreamerInitialized()` is called:
``` java
private void onGStreamerInitialized () {
@ -291,11 +293,11 @@ method which lets bits of code to be executed from the correct thread. A
instance has to be constructed and any parameter can be passed either by
sub-classing
[Runnable](http://developer.android.com/reference/java/lang/Runnable.html)
and adding a dedicated constructor, or by using the `final` modifier, as
and adding a dedicated constructor, or by using the `final` modifier, as
shown in the above snippet.
The same problem exists when the native code wants to output a string in
our TextView using the `setMessage()` method: it has to be done from the
our TextView using the `setMessage()` method: it has to be done from the
UI thread. The solution is the same:
``` java
@ -309,7 +311,7 @@ private void setMessage(final String message) {
}
```
Finally, a few remaining bits:
Finally, a few remaining bits:
``` java
protected void onSaveInstanceState (Bundle outState) {
@ -335,7 +337,7 @@ all allocated resources.
This concludes the UI part of the tutorial.
# A pipeline on Android \[C code\]
## A pipeline on Android \[C code\]
**jni/tutorial-2.c**
@ -617,7 +619,7 @@ jint JNI_OnLoad(JavaVM *vm, void *reserved) {
}
```
Lets start with the `CustomData` structure. We have seen it in most of
Let's start with the `CustomData` structure. We have seen it in most of
the basic tutorials, and it is used to hold all our information in one
place, so we can easily pass it around to
callbacks:
@ -634,9 +636,9 @@ typedef struct _CustomData {
```
We will see the meaning of each member as we go. What is interesting now
is that `CustomData` belongs to the application, so a pointer is kept in
is that `CustomData` belongs to the application, so a pointer is kept in
the Tutorial2 Java class in the `private long
native_custom_data` attribute. Java only holds this pointer for us; it
native_custom_data` attribute. Java only holds this pointer for us; it
is completely handled in C code.
From C, this pointer can be set and retrieved with the
@ -644,8 +646,8 @@ From C, this pointer can be set and retrieved with the
and
[GetLongField()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/functions.html#wp16572)
JNI functions, but two convenience macros have been defined,
`SET_CUSTOM_DATA` and `GET_CUSTOM_DATA`. These macros are handy because
the `long` type used in Java is always 64 bits wide, but the pointer
`SET_CUSTOM_DATA` and `GET_CUSTOM_DATA`. These macros are handy because
the `long` type used in Java is always 64 bits wide, but the pointer
used in C can be either 32 or 64 bits wide. The macros take care of the
conversion without warnings.
@ -669,10 +671,10 @@ jint JNI_OnLoad(JavaVM *vm, void *reserved) {
}
```
The `JNI_OnLoad` function is almost the same as the previous tutorial.
The `JNI_OnLoad` function is almost the same as the previous tutorial.
It registers the list of native methods (which is longer in this
tutorial). It also
uses [pthread\_key\_create()](http://pubs.opengroup.org/onlinepubs/9699919799/functions/pthread_key_create.html)
uses [pthread\_key\_create()](http://pubs.opengroup.org/onlinepubs/9699919799/functions/pthread_key_create.html)
to be able to store per-thread information, which is crucial to properly
manage the JNI Environment, as shown later.
@ -701,7 +703,7 @@ order for C code to be able to call a Java method, it needs to know the
method's
[MethodID](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/types.html#wp1064).
This ID is obtained from the method's name and signature and can be
cached. The purpose of the `gst_native_class_init()` function is to
cached. The purpose of the `gst_native_class_init()` function is to
obtain the IDs of all the methods and fields that the C code will need.
If some ID cannot be retrieved, the calling Java class does not offer
the expected interface and execution should halt (which is not currently
@ -720,15 +722,15 @@ static void gst_native_init (JNIEnv* env, jobject thiz) {
SET_CUSTOM_DATA (env, thiz, custom_data_field_id, data);
```
It first allocates memory for the `CustomData` structure and passes the
It first allocates memory for the `CustomData` structure and passes the
pointer to the Java class with `SET_CUSTOM_DATA`, so it is remembered.
``` c
data->app = (*env)->NewGlobalRef (env, thiz);
```
A pointer to the application class (the `Tutorial2` class) is also kept
in `CustomData` (a [Global
A pointer to the application class (the `Tutorial2` class) is also kept
in `CustomData` (a [Global
Reference](http://developer.android.com/guide/practices/jni.html#local_and_global_references)
is used) so its methods can be called later.
@ -737,7 +739,7 @@ pthread_create (&gst_app_thread, NULL, &app_function, data);
```
Finally, a thread is created and it starts running the
`app_function()` method.
`app_function()` method.
### `app_function()`
@ -757,10 +759,10 @@ static void *app_function (void *userdata) {
g_main_context_push_thread_default(data->context);
```
It first creates a GLib context so all `GSource` are kept in the same
It first creates a GLib context so all `GSource` are kept in the same
place. This also helps clean up after GSources created by other
libraries which might not have been properly disposed of. A new context
is created with `g_main_context_new()` and then it is made the default
is created with `g_main_context_new()` and then it is made the default
one for the thread with
`g_main_context_push_thread_default()`.
@ -776,7 +778,7 @@ if (error) {
```
It then creates a pipeline the easy way, with `gst-parse-launch()`. In
this case, it is simply an `audiotestsrc` (which produces a continuous
this case, it is simply an `audiotestsrc` (which produces a continuous
tone) and an `autoaudiosink`, with accompanying adapter elements.
``` c
@ -793,7 +795,7 @@ gst_object_unref (bus);
These lines create a bus signal watch and connect to some interesting
signals, just like we have been doing in the basic tutorials. The
creation of the watch is done step by step instead of using
`gst_bus_add_signal_watch()` to exemplify how to use a custom GLib
`gst_bus_add_signal_watch()` to exemplify how to use a custom GLib
context.
``` c
@ -809,10 +811,10 @@ data->main_loop = NULL;
Finally, the main loop is created and set to run. When it exits (because
somebody else calls `g_main_loop_quit()`) the main loop is disposed of.
Before entering the main loop, though,
`check_initialization_complete()` is called. This method checks if all
`check_initialization_complete()` is called. This method checks if all
conditions are met to consider the native code “ready” to accept
commands. Since having a running main loop is one of the conditions,
`check_initialization_complete()` is called here. This method is
`check_initialization_complete()` is called here. This method is
reviewed below.
Once the main loop has quit, all resources are freed in lines 178 to
@ -842,24 +844,24 @@ if so, notify the UI code.
In tutorial 2, the only conditions are 1) the code is not already
initialized and 2) the main loop is running. If these two are met, the
Java `onGStreamerInitialized()` method is called via the
Java `onGStreamerInitialized()` method is called via the
[CallVoidMethod()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/functions.html#wp4256)
JNI call.
Here comes a tricky bit. JNI calls require a JNI Environment, **which is
different for every thread**. C methods called from Java receive a
`JNIEnv` pointer as a parameter, but this is not the situation with
`JNIEnv` pointer as a parameter, but this is not the situation with
`check_initialization_complete()`. Here, we are in a thread which has
never been called from Java, so we have no `JNIEnv`. We need to use the
`JavaVM` pointer (passed to us in the `JNI_OnLoad()` method, and shared
`JavaVM` pointer (passed to us in the `JNI_OnLoad()` method, and shared
among all threads) to attach this thread to the Java Virtual Machine and
obtain a `JNIEnv`. This `JNIEnv` is stored in the [Thread-Local
Storage](http://en.wikipedia.org/wiki/Thread-local_storage) (TLS) using
obtain a `JNIEnv`. This `JNIEnv` is stored in the [Thread-Local
Storage](http://en.wikipedia.org/wiki/Thread-local_storage) (TLS) using
the pthread key we created in `JNI_OnLoad()`, so we do not need to
attach the thread anymore.
This behavior is implemented in the `get_jni_env()` method, used for
example in `check_initialization_complete()` as we have just seen. Lets
This behavior is implemented in the `get_jni_env()` method, used for
example in `check_initialization_complete()` as we have just seen. Let's
see how it works, step by step:
### `get_jni_env()`
@ -875,12 +877,12 @@ static JNIEnv *get_jni_env (void) {
}
```
It first retrieves the current `JNIEnv` from the TLS using
It first retrieves the current `JNIEnv` from the TLS using
[pthread\_getspecific()](http://pubs.opengroup.org/onlinepubs/9699919799/functions/pthread_getspecific.html)
and the key we obtained from
[pthread\_key\_create()](http://pubs.opengroup.org/onlinepubs/9699919799/functions/pthread_key_create.html).
If it returns NULL, we never attached this thread, so we do now with
`attach_current_thread()` and then store the new `JNIEnv` into the TLS
`attach_current_thread()` and then store the new `JNIEnv` into the TLS
with
[pthread\_setspecific()](http://pubs.opengroup.org/onlinepubs/9699919799/functions/pthread_setspecific.html).
@ -926,27 +928,27 @@ about to be destroyed. Here, we:
earliest convenience.
- Wait for the thread to finish with
[pthread\_join()](http://pubs.opengroup.org/onlinepubs/9699919799/functions/pthread_join.html).
This call blocks until the `app_function()` method returns, meaning
This call blocks until the `app_function()` method returns, meaning
that the main loop has exited, and the thread has been destroyed.
- Dispose of the global reference we kept for the Java application
class (`Tutorial2`) in `CustomData`.
- Free `CustomData` and set the Java pointer inside the
`Tutorial2` class to NULL with
- Free `CustomData` and set the Java pointer inside the
`Tutorial2` class to NULL with
`SET_CUSTOM_DATA()`.
### `gst_native_play` and `gst_native_pause()` (`nativePlay` and `nativePause()` from Java)
### `gst_native_play` and `gst_native_pause()` (`nativePlay` and `nativePause()` from Java)
These two simple methods retrieve `CustomData` from the passed-in object
with `GET_CUSTOM_DATA()` and set the pipeline found inside `CustomData`
These two simple methods retrieve `CustomData` from the passed-in object
with `GET_CUSTOM_DATA()` and set the pipeline found inside `CustomData`
to the desired state, returning immediately.
Finally, let's see how the GStreamer callbacks are handled:
### `error_cb` and `state_changed_cb`
### `error_cb` and `state_changed_cb`
This tutorial does not do much in these callbacks. They simply parse the
error or state changed message and display a message in the UI using the
`set_ui_message()` method:
`set_ui_message()` method:
### `set_ui_message()`
@ -964,11 +966,11 @@ static void set_ui_message (const gchar *message, CustomData *data) {
}
```
 
This is the other method (besides `check_initialization_complete()`) 
This is the other method (besides `check_initialization_complete()`)
that needs to call a Java function from a thread which never received an
`JNIEnv` pointer directly. Notice how all the complexities of attaching
`JNIEnv` pointer directly. Notice how all the complexities of attaching
the thread to the JavaVM and storing the JNI environment in the TLS are
hidden in the simple call to `get_jni_env()`.
@ -980,10 +982,10 @@ Java using the
[NewStringUTF()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/functions.html#wp17220)
JNI call.
The `setMessage()` Java method is called via the JNI
The `setMessage()` Java method is called via the JNI
[CallVoidMethod()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/functions.html#wp4256)
using the global reference to the class we are keeping in
`CustomData` (`data->app`) and the `set_message_method_id` we cached in
`CustomData` (`data->app`) and the `set_message_method_id` we cached in
`gst_native_class_init()`.
We check for exceptions with the JNI
@ -991,7 +993,7 @@ We check for exceptions with the JNI
method and free the UTF16 message with
[DeleteLocalRef()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/functions.html#DeleteLocalRef).
# A pipeline on Android \[Android.mk\]
## A pipeline on Android \[Android.mk\]
**jni/Android.mk**
@ -1018,44 +1020,29 @@ GSTREAMER_PLUGINS := $(GSTREAMER_PLUGINS_CORE) $(GSTREAMER_PLUGINS_SYS)
include $(GSTREAMER_NDK_BUILD_PATH)/gstreamer-1.0.mk
```
Notice how the required `GSTREAMER_PLUGINS` are now
`$(GSTREAMER_PLUGINS_CORE)` (For the test source and converter elements)
and `$(GSTREAMER_PLUGINS_SYS)` (for the audio sink).
Notice how the required `GSTREAMER_PLUGINS` are now
`$(GSTREAMER_PLUGINS_CORE)` (For the test source and converter elements)
and `$(GSTREAMER_PLUGINS_SYS)` (for the audio sink).
And this is it\! This has been a rather long tutorial, but we covered a
lot of territory. Building on top of this one, the following ones are
shorter and focus only on the new topics.
# Conclusion
## Conclusion
This tutorial has shown:
- How to manage multiple threads from C code and have them interact
with Java.
- How to access Java code from any C thread
using [AttachCurrentThread()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/invocation.html#attach_current_thread).
using [AttachCurrentThread()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/invocation.html#attach_current_thread).
- How to allocate a CustomData structure from C and have Java host it,
so it is available to all threads.
Most of the methods introduced in this tutorial, like `get_jni_env()`,
`check_initialization_complete()`, `app_function()` and the API methods
`gst_native_init()`, `gst_native_finalize()` and
`gst_native_class_init()` will continue to be used in the following
`check_initialization_complete()`, `app_function()` and the API methods
`gst_native_init()`, `gst_native_finalize()` and
`gst_native_class_init()` will continue to be used in the following
tutorials with minimal modifications, so better get used to them\!
As usual, it has been a pleasure having you here, and see you soon\!
## Attachments:
![](images/icons/bullet_blue.gif)
[tutorial2-screenshot.png](attachments/2687063/2654325.png)
(image/png)
![](images/icons/bullet_blue.gif)
[tutorial2-screenshot.png](attachments/2687063/2654412.png)
(image/png)
![](images/icons/bullet_blue.gif)
[tutorial2-screenshot.png](attachments/2687063/2654417.png)
(image/png)
![](images/icons/bullet_blue.gif)
[tutorial2-screenshot.png](attachments/2687063/2654324.png)
(image/png)


@ -1,28 +1,28 @@
# Android tutorial 1: Link against GStreamer
# Goal![](attachments/thumbnails/2687057/2654326)
## Goal
![screenshot]
This first Android tutorial is extremely simple: it just retrieves the
GStreamer version and displays it on the screen. It exemplifies how to
access GStreamer C code from Java and verifies that there have been no
linkage problems. 
linkage problems.
# Hello GStreamer \[Java code\]
## Hello GStreamer \[Java code\]
In the `share/gst-sdk/tutorials` folder of your GStreamer SDK
installation path you should find an `android-tutorial-1` directory,
with the usual Android NDK structure: a `src` folder for the Java code,
a `jni` folder for the C code and a `res` folder for UI resources.
At **FIXME: add path** folder you should find an `android-tutorial-1` directory,
with the usual Android NDK structure: a `src` folder for the Java code,
a `jni` folder for the C code and a `res` folder for UI resources.
We recommend that you open this project in Eclipse (as explained
in [Installing for Android
development](Installing%2Bfor%2BAndroid%2Bdevelopment.html)) so you can
in [](sdk-installing-for-android-development.md)) so you can
easily see how all the pieces fit together.
Let's first introduce the Java code, then the C code and finally the
makefile that allows GStreamer integration.
**src/org/freedesktop/gstreamer/tutorials/tutorial\_1/Tutorial1.java**
**src/org/freedesktop/gstreamer/tutorials/tutorial_1/Tutorial1.java**
``` java
package org.freedesktop.gstreamer.tutorials.tutorial_1;
@ -91,9 +91,9 @@ It loads `libgstreamer_android.so`, which contains all GStreamer
methods, and `libtutorial-1.so`, which contains the C part of this
tutorial, explained below.
Upon loading, each of these libraries `JNI_OnLoad()` method is
Upon loading, each of these libraries' `JNI_OnLoad()` method is
executed. It basically registers the native methods that these libraries
expose. The GStreamer library only exposes a `init()` method, which
expose. The GStreamer library only exposes a `init()` method, which
initializes GStreamer and registers all plugins (The tutorial library is
explained later below).
@ -107,7 +107,7 @@ try {
}
```
Next, in the `OnCreate()` method of the
Next, in the `OnCreate()` method of the
[Activity](http://developer.android.com/reference/android/app/Activity.html)
we actually initialize GStreamer by calling `GStreamer.init()`. This
method requires a
@ -116,7 +116,7 @@ so it cannot be called from the static initializer, but there is no
danger in calling it multiple times, as all but the first time the calls
will be ignored.
Should initialization fail, the `init()` method would throw an
Should initialization fail, the `init()` method would throw an
[Exception](http://developer.android.com/reference/java/lang/Exception.html)
with the details provided by the GStreamer library.
@ -133,7 +133,7 @@ in the UI.
This finishes the UI part of this tutorial. Let's take a look at the C
code:
# Hello GStreamer \[C code\]
## Hello GStreamer \[C code\]
**jni/tutorial-1.c**
@ -171,7 +171,7 @@ jint JNI_OnLoad(JavaVM *vm, void *reserved) {
}
```
The `JNI_OnLoad()` method is executed every time the Java Virtual
The `JNI_OnLoad()` method is executed every time the Java Virtual
Machine (VM) loads a library.
Here, we retrieve the JNI environment needed to make calls that interact
@ -183,7 +183,7 @@ JNIEnv *env = NULL;
if ((*vm)->GetEnv(vm, (void**) &env, JNI_VERSION_1_4) != JNI_OK) {
__android_log_print (ANDROID_LOG_ERROR, "tutorial-1", "Could not retrieve JNIEnv");
return 0;
} 
}
```
And then locate the class containing the UI part of this tutorial using
@ -194,17 +194,17 @@ FindClass()`:
jclass klass = (*env)->FindClass (env, "org/freedesktop/gstreamer/tutorials/tutorial_1/Tutorial1");
```
Finally, we register our native methods with `RegisterNatives()`, this
Finally, we register our native methods with `RegisterNatives()`; that
is, we provide the code for the methods we advertised in Java using the
**`native`**
 keyword:
keyword:
``` c
(*env)->RegisterNatives (env, klass, native_methods, G_N_ELEMENTS(native_methods));
```
The `native_methods` array describes each one of the methods to register
(only one in this tutorial).  For each method, it provides its Java
The `native_methods` array describes each one of the methods to register
(only one in this tutorial). For each method, it provides its Java
name, its [type
signature](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/types.html#wp276)
and a pointer to the C function implementing it:
``` c
static JNINativeMethod native_methods[] = {
  { "nativeGetGStreamerInfo", "()Ljava/lang/String;", (void *) gst_native_get_gstreamer_info}
};
```
The only native method used in this tutorial
is `nativeGetGStreamerInfo()`:
``` c
jstring gst_native_get_gstreamer_info (JNIEnv* env, jobject thiz) {
  char *version_utf8 = gst_version_string();
  jstring version_jstring = (*env)->NewStringUTF(env, version_utf8);
  g_free (version_utf8);
  return version_jstring;
}
```
It simply calls `gst_version_string()` to obtain a string describing
this version of GStreamer. This [Modified
UTF8](http://en.wikipedia.org/wiki/UTF-8#Modified_UTF-8) string is then
converted to [UTF16](http://en.wikipedia.org/wiki/UTF-16) by `
NewStringUTF()` as required by Java and returned. Java will be
responsible for freeing the memory used by the new UTF16 String, but we
must free the `char *` returned by `gst_version_string()`.
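The ownership rule can be sketched in plain C with a stand-in for `gst_version_string()` (the helper names here are illustrative, not GStreamer API):

``` c
#include <stdlib.h>
#include <string.h>

/* Stand-in for gst_version_string(): returns a newly allocated string
 * that the *caller* must free, mirroring the GStreamer contract. */
static char *get_version_string (void)
{
  const char *version = "GStreamer 1.8.0";
  char *copy = malloc (strlen (version) + 1);
  strcpy (copy, version);
  return copy;
}

/* Typical call site: use the string, then release it ourselves,
 * just as tutorial-1.c frees the result of gst_version_string(). */
static size_t report_version_length (void)
{
  char *info = get_version_string ();
  size_t len = strlen (info);
  free (info);   /* our responsibility, not the callee's */
  return len;
}
```

As with the real function, forgetting the `free()` would leak the string on every call.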
## Hello GStreamer \[Android.mk\]
**jni/Android.mk**
``` make
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_MODULE    := tutorial-1
LOCAL_SRC_FILES := tutorial-1.c
LOCAL_SHARED_LIBRARIES := gstreamer_android
LOCAL_LDLIBS := -llog
include $(BUILD_SHARED_LIBRARY)

ifndef GSTREAMER_ROOT
ifndef GSTREAMER_ROOT_ANDROID
$(error GSTREAMER_ROOT_ANDROID is not defined!)
endif
GSTREAMER_ROOT        := $(GSTREAMER_ROOT_ANDROID)
endif
GSTREAMER_NDK_BUILD_PATH  := $(GSTREAMER_ROOT)/share/gst-android/ndk-build
GSTREAMER_PLUGINS         := coreelements
include $(GSTREAMER_NDK_BUILD_PATH)/gstreamer-1.0.mk
```
This is a barebones makefile for a project with GStreamer support. It
simply states that it depends on the `libgstreamer_android.so` library
(line 7), and requires the `coreelements` plugin (line 18). More complex
applications will probably add more libraries and plugins
to `Android.mk`.
## Conclusion
This ends the first Android tutorial. It has shown that, besides the
interconnection between Java and C (which abides by the standard JNI
procedure), special care has to be
taken when developing specifically for the Android platform.
As usual, it has been a pleasure having you here, and see you soon!
[screenshot]: images/sdk-android-tutorial-link-against-gstreamer-screenshot.png


# Android tutorial 4: A basic media player
## Goal
![screenshot]
Enough testing with synthetic images and audio tones! This tutorial
finally plays actual media, streamed directly from the Internet, in your
Android device. It shows:
- How to implement a [Seek
Bar](http://developer.android.com/reference/android/widget/SeekBar.html)
- How to report the media size to adapt the display surface
It also uses the knowledge gathered in the [](sdk-basic-tutorials.md) regarding:
- How to use `playbin` to play any kind of media
- How to handle network resilience problems
## Introduction
From the previous tutorials, we already have almost all necessary pieces
to build a media player. The most complex part is assembling a pipeline
which retrieves, decodes and displays the media, but we already know
that the `playbin` element can take care of all that for us. We only
need to replace the manual pipeline we used in
[](sdk-android-tutorial-video.md) with a single-element
`playbin` pipeline and we are good to go!
However, we can do better than that. We will add a [Seek
Bar](http://developer.android.com/reference/android/widget/SeekBar.html),
which will be updated as the
media advances. We will also allow the user to drag the thumb, to jump
to a different position.
And finally, we will make the video surface adapt to the media size, so
the video sink is not forced to draw black borders around the clip.
This also allows the Android layout to adapt more nicely to the actual
media content. You can still force the video surface to have a specific
size if you really want to.
## A basic media player \[Java code\]
**src/com/gst\_sdk\_tutorials/tutorial\_4/Tutorial4.java**
``` java
import android.widget.SeekBar.OnSeekBarChangeListener;
import android.widget.TextView;
import android.widget.Toast;

import org.freedesktop.gstreamer.GStreamer;

public class Tutorial4 extends Activity implements SurfaceHolder.Callback, OnSeekBarChangeListener {
    private native void nativeInit(); // Initialize native code, build pipeline, etc
    // ...
}
```
### Supporting arbitrary media URIs
The C code provides the `nativeSetUri()` method so we can indicate the
URI of the media to play. Since `playbin` will be taking care of
retrieving the media, we can use local or remote URIs indistinctly
(`file://` or `http://`, for example). From Java, though, we want to
keep track of whether the file is local or remote, because we will not
offer the same functionalities. We keep track of this in the
`is_local_media` variable, and update it every time we change the media
URI:
``` java
private void setMediaUri() {
  nativeSetUri (mediaUri);
  is_local_media = mediaUri.startsWith("file://");
}
```
We call `setMediaUri()` in the `onGStreamerInitialized()` callback, once
the pipeline is ready to accept commands.
### Reporting media size
Every time the size of the media changes (which could happen mid-stream,
for some kind of streams), or when it is first detected, C code calls
our `onMediaSizeChanged()` callback:
``` java
private void onMediaSizeChanged (int width, int height) {
  Log.i ("GStreamer", "Media size changed to " + width + "x" + height);
  final GStreamerSurfaceView gsv = (GStreamerSurfaceView) this.findViewById(R.id.surface_video);
  gsv.media_width = width;
  gsv.media_height = height;
  runOnUiThread(new Runnable() {
    public void run() {
      gsv.requestLayout();
    }
  });
}
```
Here we simply pass the new size onto the `GStreamerSurfaceView` in
charge of displaying the media, and ask the Android layout to be
recalculated. Eventually, the `onMeasure()` method in
GStreamerSurfaceView will be called and the new size will be taken
into account. As we have already seen in
[](sdk-android-tutorial-a-running-pipeline.md), methods which change
the UI must be called from the main thread, and we are now in a
callback from some GStreamer internal thread. Hence, the usage of
[runOnUiThread()](http://developer.android.com/reference/android/app/Activity.html#runOnUiThread\(java.lang.Runnable\)).
### Refreshing the Seek Bar
[](sdk-basic-tutorial-toolkit-integration.md)
has already shown how to implement a [Seek
Bar](http://developer.android.com/reference/android/widget/SeekBar.html) using
the GTK+ toolkit. The implementation on Android is very similar.
The Seek Bar accomplishes two functions: First, it moves on its own to
reflect the current playback position in the media. Second, it can be
dragged by the user to seek to a different position.
To realize the first function, C code will periodically call our
`setCurrentPosition()` method so we can update the position of the thumb
in the Seek Bar. Again we do so from the UI thread, using
`RunOnUiThread()`.
To the left of the Seek Bar (refer to the screenshot at the top of this
page), there is a
[TextView](http://developer.android.com/reference/android/widget/TextView.html)
widget which we will use to display the current position and duration in
`HH:mm:ss / HH:mm:ss` textual format. The `updateTimeWidget()` method
takes care of it, and must be called every time the Seek Bar is updated:
``` java
private void updateTimeWidget () {
    final TextView tv = (TextView) this.findViewById(R.id.textview_time);
    final SeekBar sb = (SeekBar) this.findViewById(R.id.seek_bar);

    SimpleDateFormat df = new SimpleDateFormat("HH:mm:ss");
    df.setTimeZone(TimeZone.getTimeZone("UTC"));
    final String message = df.format(new Date (sb.getProgress())) + " / " + df.format(new Date (sb.getMax()));
    tv.setText(message);
}
```
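For reference, the same `HH:mm:ss` formatting can be sketched in plain C (a hypothetical helper, not part of the tutorial sources):

``` c
#include <stdio.h>
#include <string.h>

/* Render a position given in milliseconds as HH:mm:ss, mirroring what
 * the Java updateTimeWidget() method does with SimpleDateFormat. */
static void format_hms (long milliseconds, char out[16])
{
  long total_seconds = milliseconds / 1000;
  long h = total_seconds / 3600;
  long m = (total_seconds % 3600) / 60;
  long s = total_seconds % 60;
  snprintf (out, 16, "%02ld:%02ld:%02ld", h, m, s);
}
```

For example, a position of 3723000 ms renders as `01:02:03`.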
### Seeking with the Seek Bar
To perform the second function of the [Seek
Bar](http://developer.android.com/reference/android/widget/SeekBar.html) (allowing
the user to seek by dragging the thumb), we implement the
[OnSeekBarChangeListener](http://developer.android.com/reference/android/widget/SeekBar.OnSeekBarChangeListener.html)
interface in the
main class. Its three callbacks are invoked when the Seek Bar is
manipulated by the user:
``` java
public void onStartTrackingTouch(SeekBar sb) {
nativePause();
}
```
[onStartTrackingTouch()](http://developer.android.com/reference/android/widget/SeekBar.OnSeekBarChangeListener.html#onStartTrackingTouch\(android.widget.SeekBar\))
``` java
public void onProgressChanged(SeekBar sb, int progress, boolean fromUser) {
    if (fromUser == false) return;
    desired_position = progress;
// If this is a local file, allow scrub seeking, that is, seek as soon as the slider is moved.
if (is_local_media) nativeSetPosition(desired_position);
updateTimeWidget();
}
```
[onProgressChanged()](http://developer.android.com/reference/android/widget/SeekBar.OnSeekBarChangeListener.html#onProgressChanged\(android.widget.SeekBar,%20int,%20boolean\)) is
called every time the thumb moves, be it because the user dragged it, or
because we called `setProgress()` on the Seek Bar. We discard the latter
case with the handy `fromUser` parameter.
As the comment says, if this is a local media, we allow scrub seeking,
that is, we jump to the indicated position as soon as the thumb moves.
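The seeking policy just described can be condensed into a small decision table (a sketch with illustrative names, not code from the tutorial):

``` c
#include <stdbool.h>

typedef enum { DO_NOTHING, SEEK_NOW } seek_action;

/* Thumb moved: programmatic updates are ignored, and only local media
 * seeks immediately ("scrub" seeking). */
static seek_action on_thumb_moved (bool is_local_media, bool from_user)
{
  if (!from_user)
    return DO_NOTHING;          /* our own setProgress() call */
  return is_local_media ? SEEK_NOW : DO_NOTHING;
}

/* Thumb released: remote media performs its single, deferred seek now. */
static seek_action on_thumb_released (bool is_local_media)
{
  return is_local_media ? DO_NOTHING : SEEK_NOW;
}
```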
``` java
public void onStopTrackingTouch(SeekBar sb) {
    // If this is a remote file, scrub seeking is probably not going to work
    // smoothly enough. Therefore, perform only the seek when the slider is released.
    if (!is_local_media) nativeSetPosition(desired_position);
    if (is_playing_desired) nativePlay();
}
```
Finally, [onStopTrackingTouch()](http://developer.android.com/reference/android/widget/SeekBar.OnSeekBarChangeListener.html#onStopTrackingTouch\(android.widget.SeekBar\))
is called when the thumb is released. We simply perform the seek
operation if the file was non-local, and restore the pipeline to the
desired playing state.
This concludes the User interface part of this tutorial. Let's review
now the under-the-hood C code that allows this to work.
## A basic media player \[C code\]
**jni/tutorial-4.c**
### Supporting arbitrary media URIs
Java code will call `gst_native_set_uri()` whenever it wants to change
the playing URI (in this tutorial the URI never changes, but it could):
``` c
void gst_native_set_uri (JNIEnv* env, jobject thiz, jstring uri) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;
  const gchar *char_uri = (*env)->GetStringUTFChars (env, uri, NULL);
  GST_DEBUG ("Setting URI to %s", char_uri);
  if (data->target_state >= GST_STATE_READY)
    gst_element_set_state (data->pipeline, GST_STATE_READY);
  g_object_set(data->pipeline, "uri", char_uri, NULL);
  (*env)->ReleaseStringUTFChars (env, uri, char_uri);
  data->duration = GST_CLOCK_TIME_NONE;
  data->is_live = (gst_element_set_state (data->pipeline, data->target_state) == GST_STATE_CHANGE_NO_PREROLL);
}
```
We first need to convert between the
[UTF16](http://en.wikipedia.org/wiki/UTF-16) encoding used by Java and
the [Modified
UTF8](http://en.wikipedia.org/wiki/UTF-8#Modified_UTF-8) used by
GStreamer with
[GetStringUTFChars()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/functions.html#wp17265)
and
[ReleaseStringUTFChars()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/functions.html#wp17294).
`playbin` will only care about URI changes in the READY to PAUSED state
change, because the new URI might need a completely different playback
pipeline (think about switching from a local Matroska file to a remote
OGG file: this would require, at least, different source and demuxing
elements). Thus, before passing the new URI to `playbin` we set its
state to READY (if we were in PAUSED or PLAYING).
`playbin`'s URI is exposed as a common GObject property, so we simply
set it with `g_object_set()`, and then return
the pipeline to the playing state it had before. In this last step, we
also take note of whether the new URI corresponds to a live source or
not. Live sources must not use buffering (otherwise latency is
introduced, which is unacceptable for them), so we keep track of this
information in the `is_live` variable.
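The live-source rule can be sketched as a small, GStreamer-free decision helper (illustrative names; the real logic lives in the `buffering_cb()` callback shown later):

``` c
#include <stdbool.h>
#include <string.h>

/* Decide which state the pipeline should be in when a buffering message
 * arrives: live sources ignore buffering entirely; otherwise stay PAUSED
 * until the buffer is full, then resume the target state. */
static const char *state_on_buffering (bool is_live, int percent,
                                       const char *target_state)
{
  if (is_live)
    return target_state;
  return (percent < 100) ? "PAUSED" : target_state;
}
```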
### Reporting media size
``` c
static void check_media_size (CustomData *data) {
  JNIEnv *env = get_jni_env ();
  GstElement *video_sink;
  GstPad *video_sink_pad;
  GstCaps *caps;
  GstVideoFormat fmt;
  int width, height;

  /* Retrieve the Caps at the entrance of the video sink */
  g_object_get (data->pipeline, "video-sink", &video_sink, NULL);
  video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
  caps = gst_pad_get_negotiated_caps (video_sink_pad);

  if (gst_video_format_parse_caps (caps, &fmt, &width, &height)) {
    int par_n, par_d;
    if (gst_video_parse_caps_pixel_aspect_ratio (caps, &par_n, &par_d)) {
      width = width * par_n / par_d;
    }
    GST_DEBUG ("Media size is %dx%d, notifying application", width, height);

    (*env)->CallVoidMethod (env, data->app, on_media_size_changed_method_id, (jint)width, (jint)height);
    if ((*env)->ExceptionCheck (env)) {
      GST_ERROR ("Failed to call Java method");
      (*env)->ExceptionClear (env);
    }
  }

  gst_caps_unref (caps);
  gst_object_unref (video_sink_pad);
  gst_object_unref (video_sink);
}
```
We first retrieve the video sink element from the pipeline, using the
`video-sink` property of `playbin`, and then its sink Pad. The
negotiated Caps of this Pad, which we recover using
`gst_pad_get_negotiated_caps()`, are the Caps of the decoded media.
The helper functions `gst_video_format_parse_caps()` and
`gst_video_parse_caps_pixel_aspect_ratio()` turn the Caps into
manageable integers, which we pass to Java through
its `onMediaSizeChanged()` callback.
### Refreshing the Seek Bar
To keep the UI updated, a GLib timer is installed in the
`app_function()` that fires 4 times per second (or every 250ms), right
before entering the main loop:
``` c
timeout_source = g_timeout_source_new (250);
g_source_set_callback (timeout_source, (GSourceFunc)refresh_ui, data, NULL);
g_source_attach (timeout_source, data->context);
g_source_unref (timeout_source);
```
Then, in the `refresh_ui()` method:
If it is unknown, the clip duration is retrieved, as explained in
[](sdk-basic-tutorial-time-management.md). The
current position is retrieved next, and the UI is informed of both
through its `setCurrentPosition()` callback.
Bear in mind that all time-related measures returned by GStreamer are in
nanoseconds, whereas, for simplicity, we decided to make the UI code
work in milliseconds.
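A sketch of the conversion (GStreamer defines `GST_MSECOND` as 1000000 nanoseconds; a local constant is used here so the snippet stands alone):

``` c
/* GStreamer reports positions and durations in nanoseconds; the UI code
 * works in milliseconds, so every value crosses this boundary. */
#define MSECOND 1000000LL

static long long ns_to_ms (long long ns) { return ns / MSECOND; }
static long long ms_to_ns (long long ms) { return ms * MSECOND; }
```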
### Seeking with the Seek Bar
The Java UI code already takes care of most of the complexity of seeking
by dragging the thumb of the Seek Bar. From C code, we just need to
honor the calls to `nativeSetPosition()` and instruct the pipeline to
jump to the indicated position.
There are, though, a couple of caveats. Firstly, seeks are only possible
when the pipeline is in the PAUSED or PLAYING state, and we might
receive seek requests before that happens. Secondly, dragging the Seek
Bar can generate a very high number of seek requests in a short period
of time, which is visually useless and will impair responsiveness. Let's
see how to overcome these problems.
``` c
void gst_native_set_position (JNIEnv* env, jobject thiz, int milliseconds) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;
  gint64 desired_position = (gint64)(milliseconds * GST_MSECOND);
  if (data->state >= GST_STATE_PAUSED) {
    execute_seek(desired_position, data);
  } else {
GST_DEBUG ("Scheduling seek to %" GST_TIME_FORMAT " for later", GST_TIME_ARGS (desired_position));
data->desired_position = desired_position;
}
}
```
If we are already in the correct state for seeking, execute it right
away; otherwise, store the desired position in the
`desired_position` variable. Then, in the
`state_changed_cb()` callback:
``` c
if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED) {
  /* By now the sink already knows the media size */
  check_media_size(data);

  /* If there is a pending seek operation, execute it now */
  if (GST_CLOCK_TIME_IS_VALID (data->desired_position))
    execute_seek (data->desired_position, data);
}
```
Once the pipeline moves from the READY to the PAUSED state, we check if
there is a pending seek operation and execute it. The
`desired_position` variable is reset inside `execute_seek()`.
#### Seek throttling
If the pipeline is still processing a previous seek when it receives a
second one, it is up to it to finish the first one, start the second one
or abort both, which is a bad thing. A simple method to avoid this issue
is *throttling*, which means that we will only allow one seek every half
a second (for example): after performing a seek, only the last seek
request received during the next 500ms is stored, and will be honored
once this period elapses.
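Stripped of GStreamer types, the throttling bookkeeping looks like this (a sketch with illustrative names; the real code uses `GstClockTime` and a GLib one-shot timer):

``` c
#include <stdbool.h>

#define SEEK_MIN_DELAY 500  /* wall-clock ms between executed seeks */

/* Minimal throttle state: when the last seek was run and which position,
 * if any, is still pending. -1 means "none". */
typedef struct {
  long last_seek_time;
  long desired_position;
} throttle;

/* Returns true when the seek may run now; otherwise the position is only
 * remembered, overwriting any previously scheduled one. */
static bool request_seek (throttle *t, long now_ms, long position)
{
  if (t->last_seek_time < 0 || now_ms - t->last_seek_time >= SEEK_MIN_DELAY) {
    t->last_seek_time = now_ms;     /* execute immediately */
    t->desired_position = -1;
    return true;
  }
  t->desired_position = position;   /* only the latest request survives */
  return false;
}
```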
To achieve this, all seek requests are routed through the
`execute_seek()` method:
``` c
static void execute_seek (gint64 desired_position, CustomData *data) {
  gint64 diff = gst_util_get_timestamp () - data->last_seek_time;

  if (GST_CLOCK_TIME_IS_VALID (data->last_seek_time) && diff < SEEK_MIN_DELAY) {
    /* The previous seek was too close, delay this one */
    GSource *timeout_source;

    if (data->desired_position == GST_CLOCK_TIME_NONE) {
      /* There was no previous seek scheduled. Setup a timer for some time in the future */
      timeout_source = g_timeout_source_new ((SEEK_MIN_DELAY - diff) / GST_MSECOND);
      g_source_set_callback (timeout_source, (GSourceFunc)delayed_seek_cb, data, NULL);
      g_source_attach (timeout_source, data->context);
      g_source_unref (timeout_source);
    }
    /* Update the desired seek position. If multiple requests arrive before it is time
     * to perform the seek, only the last one is remembered. */
    data->desired_position = desired_position;
  } else {
    /* Perform the seek now */
    GST_DEBUG ("Seeking to %" GST_TIME_FORMAT, GST_TIME_ARGS (desired_position));
    data->last_seek_time = gst_util_get_timestamp ();
    gst_element_seek_simple (data->pipeline, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, desired_position);
    data->desired_position = GST_CLOCK_TIME_NONE;
  }
}
```
The time at which the last seek was performed is stored in the
`last_seek_time` variable. This is wall clock time, not to be confused
with the stream time carried in the media time stamps, and is obtained
with `gst_util_get_timestamp()`.
If enough time has passed since the last seek operation, the new one is
directly executed and `last_seek_time` is updated. Otherwise, the new
seek is scheduled for later. If there is no previously scheduled seek, a
one-shot timer is setup to trigger 500ms after the last seek operation.
If another seek was already scheduled, its desired position is simply
updated with the new one.
The one-shot timer calls `delayed_seek_cb()`, which simply calls
`execute_seek()` again.
> ![information]
> Ideally, `execute_seek()` will now find that enough time has indeed passed since the last seek and the scheduled one will proceed. It might happen, though, that after 500ms of the previous seek, and before the timer wakes up, yet another seek comes through and is executed. `delayed_seek_cb()` needs to check for this condition to avoid performing two very close seeks, and therefore calls `execute_seek()` instead of performing it itself.
>
> This is not a complete solution: the scheduled seek will still be executed, even though a more-recent seek has already been executed that should have cancelled it. However, it is a good tradeoff between functionality and simplicity.
### Network resilience
[](sdk-basic-tutorial-streaming.md) has already
shown how to adapt to the variable nature of the network bandwidth by
using buffering. The same procedure is used here, by listening to the
buffering
messages, like this:

``` c
static void buffering_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  gint percent;

  if (data->is_live)
    return;

  gst_message_parse_buffering (msg, &percent);
  if (percent < 100 && data->target_state >= GST_STATE_PAUSED) {
    gchar * message_string = g_strdup_printf ("Buffering %d%%", percent);
    gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
    set_ui_message (message_string, data);
    g_free (message_string);
  } else if (data->target_state >= GST_STATE_PLAYING) {
    gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
  } else if (data->target_state >= GST_STATE_PAUSED) {
    set_ui_message ("Buffering complete", data);
  }
}
```
`target_state` is the state in which we have been instructed to set the
pipeline, which might be different to the current state, because
buffering forces us to go to PAUSED. Once buffering is complete we set
the pipeline to the `target_state`.
## A basic media player \[Android.mk\]
The only line worth mentioning in the makefile
is `GSTREAMER_PLUGINS`:
**jni/Android.mk**
``` make
GSTREAMER_PLUGINS := $(GSTREAMER_PLUGINS_CORE) $(GSTREAMER_PLUGINS_PLAYBACK) $(GSTREAMER_PLUGINS_CODECS) $(GSTREAMER_PLUGINS_NET) $(GSTREAMER_PLUGINS_SYS)
```

In which all plugins required for playback are loaded, because it is not
known at build time what would be needed for an unspecified URI (again,
in this tutorial the URI does not change, but it will in the next one).
## Conclusion
This tutorial has shown how to embed a `playbin` pipeline into an
Android application. This, effectively, turns such application into a
basic media player, capable of streaming and decoding all the formats
GStreamer understands. More particularly, it has shown:

- How to control `playbin` from Java
- How to report the media size and adapt the display surface
- How to implement a Seek Bar, with seek throttling
- How to stay resilient to network problems by using buffering

The next tutorial adds the missing bits to turn the application built
here into an acceptable Android media player.
As usual, it has been a pleasure having you here, and see you soon!
[screenshot]: images/sdk-android-tutorial-media-player-screenshot.png
[information]: images/icons/emoticons/information.png


# Android tutorial 3: Video
## Goal
![screenshot]
Except for [](sdk-basic-tutorial-toolkit-integration.md),
which embedded a video window on a GTK application, all tutorials so far
relied on GStreamer video sinks to create a window to display their
contents. The video sink on Android is not capable of creating its own
window, so a drawing surface always needs to be provided. This
tutorial shows:

- How to allocate a drawing surface on the Android layout and pass it
  to GStreamer
- How to keep GStreamer posted on changes to the surface
## Introduction
Since Android does not provide a windowing system, a GStreamer video
sink cannot create pop-up windows as it would do on a Desktop platform.
Fortunately, the `VideoOverlay` interface allows providing video sinks with
an already created window onto which they can draw, as we have seen in
[](sdk-basic-tutorial-toolkit-integration.md).
In this tutorial, a
[SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html)
widget (actually, a subclass of it) is placed on the main layout. When
Android informs the application that a surface has been created for this
widget, we pass it to the C code which stores it. The
`check_initialization_complete()` method explained in the previous
tutorial is extended so that GStreamer is not considered initialized
until a main loop is running and a drawing surface has been received.
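The extended gating logic can be sketched without any GStreamer or JNI types (field names are illustrative):

``` c
#include <stdbool.h>

/* GStreamer is only reported as initialized once BOTH the main loop is
 * running and a native window has been received. */
typedef struct {
  bool initialized;
  bool main_loop_running;
  bool have_native_window;
} init_state;

/* Returns true exactly once, at the moment everything became ready;
 * the real code would notify the Java side at that point. */
static bool check_ready (init_state *s)
{
  if (!s->initialized && s->main_loop_running && s->have_native_window) {
    s->initialized = true;
    return true;
  }
  return false;
}
```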
## A video surface on Android \[Java code\]
**src/com/gst\_sdk\_tutorials/tutorial\_3/Tutorial3.java**
``` java
import android.widget.ImageButton;
import android.widget.TextView;
import android.widget.Toast;
import org.freedesktop.gstreamer.GStreamer;
public class Tutorial3 extends Activity implements SurfaceHolder.Callback {
private native void nativeInit(); // Initialize native code, build pipeline, etc
// ...
private native void nativeSurfaceInit(Object surface);
private native void nativeSurfaceFinalize();
```
Two new entry points to the C code are defined,
`nativeSurfaceInit()` and `nativeSurfaceFinalize()`, which we will call
when the video surface becomes available and when it is about to be
destroyed, respectively.
This interface is composed of the three methods above, which get called
when the geometry of the surface changes, when the surface is created
and when it is about to be destroyed. `surfaceChanged()` always gets
called at least once, right after `surfaceCreated()`, so we will use it
to notify GStreamer about the new surface. We use
`surfaceDestroyed()` to tell GStreamer to stop using this surface.
Let's review the C code to see what these functions do.
## A video surface on Android \[C code\]
**jni/tutorial-3.c**
``` c
#include <string.h>
#include <stdint.h>
#include <jni.h>
#include <android/log.h>
#include <android/native_window.h>
#include <android/native_window_jni.h>
#include <gst/gst.h>
#include <gst/video/video.h>
#include <pthread.h>
```

``` c
typedef struct _CustomData {
jobject app; /* Application instance, used to call its methods */
GstElement *pipeline; /* The running pipeline */
GMainContext *context; /* GLib context used to run the main loop */
GMainLoop *main_loop; /* GLib main loop */
gboolean initialized; /* To avoid informing the UI multiple times about the initialization */
GstElement *video_sink; /* The video sink element which receives VideoOverlay commands */
ANativeWindow *native_window; /* The Android native window where video will be rendered */
} CustomData;
```

``` c
static void check_initialization_complete (CustomData *data) {
JNIEnv *env = get_jni_env ();
if (!data->initialized && data->native_window && data->main_loop) {
GST_DEBUG ("Initialization complete, notifying application. native_window:%p main_loop:%p", data->native_window, data->main_loop);
/* The main loop is running and we received a native window, inform the sink about it */
gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->video_sink), (guintptr)data->native_window);
(*env)->CallVoidMethod (env, data->app, on_gstreamer_initialized_method_id);
if ((*env)->ExceptionCheck (env)) {
GST_ERROR ("Failed to call Java method");
(*env)->ExceptionClear (env);
}
data->initialized = TRUE;
}
}
```

``` c
static void *app_function (void *userdata) {
/* Set the pipeline to READY, so it can already accept a window handle, if we have one */
gst_element_set_state(data->pipeline, GST_STATE_READY);
data->video_sink = gst_bin_get_by_interface(GST_BIN(data->pipeline), GST_TYPE_VIDEO_OVERLAY);
if (!data->video_sink) {
GST_ERROR ("Could not retrieve video sink");
return NULL;
}
/* ... */
}
```

``` c
static void gst_native_surface_init (JNIEnv *env, jobject thiz, jobject surface) {
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
if (!data) return;
ANativeWindow *new_native_window = ANativeWindow_fromSurface(env, surface);

if (data->native_window) {
ANativeWindow_release (data->native_window);
if (data->native_window == new_native_window) {
GST_DEBUG ("New native window is the same as the previous one", data->native_window);
if (data->video_sink) {
gst_video_overlay_expose(GST_VIDEO_OVERLAY (data->video_sink));
gst_video_overlay_expose(GST_VIDEO_OVERLAY (data->video_sink));
}
return;
} else {
GST_DEBUG ("Released previous native window %p", data->native_window);
data->initialized = FALSE;
}
}
data->native_window = new_native_window;

check_initialization_complete (data);
}
```

``` c
static void gst_native_surface_finalize (JNIEnv *env, jobject thiz) {
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
if (!data) return;
GST_DEBUG ("Releasing Native Window %p", data->native_window);
if (data->video_sink) {
gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->video_sink), (guintptr)NULL);
gst_element_set_state (data->pipeline, GST_STATE_READY);
}

ANativeWindow_release (data->native_window);
data->native_window = NULL;
data->initialized = FALSE;
}
```
First, our `CustomData` structure is augmented to keep a pointer to the
video sink element and the native window
handle:
``` c
GstElement *video_sink; /* The video sink element which receives VideoOverlay commands */
ANativeWindow *native_window; /* The Android native window where video will be rendered */
```
The `check_initialization_complete()` method is also augmented so that
it requires a native window before considering GStreamer to be
initialized:
``` c
static void check_initialization_complete (CustomData *data) {
JNIEnv *env = get_jni_env ();
if (!data->initialized && data->native_window && data->main_loop) {
GST_DEBUG ("Initialization complete, notifying application. native_window:%p main_loop:%p", data->native_window, data->main_loop);
/* The main loop is running and we received a native window, inform the sink about it */
gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->video_sink), (guintptr)data->native_window);
(*env)->CallVoidMethod (env, data->app, on_gstreamer_initialized_method_id);
if ((*env)->ExceptionCheck (env)) {
GST_ERROR ("Failed to call Java method");
(*env)->ExceptionClear (env);
}
data->initialized = TRUE;
}
}
```
Also, once the pipeline has been built and a native window has been
received, we inform the video sink of the window handle to use via the
`gst_video_overlay_set_window_handle()` method.
The GStreamer pipeline for this tutorial involves a `videotestsrc`, a
`warptv` psychedelic distorter effect (check out other cool video
effects in the `GSTREAMER_PLUGINS_EFFECTS` package), and an
`autovideosink` which will instantiate the adequate video sink for the
platform:
``` c
data->pipeline = gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error);
```

Shortly after building the pipeline, the following code is interesting:

``` c
/* Set the pipeline to READY, so it can already accept a window handle, if we have one */
gst_element_set_state(data->pipeline, GST_STATE_READY);
data->video_sink = gst_bin_get_by_interface(GST_BIN(data->pipeline), GST_TYPE_VIDEO_OVERLAY);
if (!data->video_sink) {
GST_ERROR ("Could not retrieve video sink");
return NULL;
}
```
We start by setting the pipeline to the READY state. No data flow occurs
yet, but the `autovideosink` will instantiate the actual sink so we can
ask for it immediately.
The `gst_bin_get_by_interface()` method will examine the whole pipeline
and return a pointer to an element which supports the requested
interface. We are asking for the `VideoOverlay` interface, explained in
[](sdk-basic-tutorial-toolkit-integration.md),
which controls how to perform rendering into foreign (non-GStreamer)
windows. The internal video sink instantiated by `autovideosink` is the
only element in this pipeline implementing it, so it will be returned.
Now we will implement the two native functions called by the Java code
``` c
if (data->native_window == new_native_window) {
  GST_DEBUG ("New native window is the same as the previous one %p", data->native_window);
  if (data->video_sink) {
    gst_video_overlay_expose(GST_VIDEO_OVERLAY (data->video_sink));
    gst_video_overlay_expose(GST_VIDEO_OVERLAY (data->video_sink));
  }
  return;
} else {
  data->initialized = FALSE;
}
```
This method is responsible for providing the video sink with the window
handle coming from the Java code. We are passed a
[Surface](http://developer.android.com/reference/android/view/Surface.html)
object, and we use `ANativeWindow_fromSurface()` to obtain the
underlying native window pointer. There is no official online
documentation for the NDK, but fortunately the header files are well
commented. Native window management functions can be found in
`$(ANDROID_NDK_ROOT)\platforms\android-9\arch-arm\usr\include\android\native_window.h` and `native_window_jni.h`
If we had already stored a native window, the one we just received can
either be a new one, or just an update of the one we have. If the
pointers are the same, we assume the geometry of the surface has
changed, and simply instruct the video sink to redraw itself, via the
`gst_video_overlay_expose()` method. The video sink will recover the new
size from the surface itself, so we do not need to bother about it
here. We need to call `gst_video_overlay_expose()` twice because of the way
the surface changes propagate down the OpenGL ES / EGL pipeline (The
only video sink available for Android in the GStreamer SDK uses OpenGL
ES). By the time we call the first expose, the surface that the sink
will pick up still contains the old size.
If the pointers are different, we mark GStreamer as
not being initialized. Next time we call
`check_initialization_complete()`, the video sink will be informed of
the new window handle.
We finally store the new window handle and call
`check_initialization_complete()` to inform the Java code that
everything is set up, if that is the case.
``` c
static void gst_native_surface_finalize (JNIEnv *env, jobject thiz) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;

  GST_DEBUG ("Releasing Native Window %p", data->native_window);

  if (data->video_sink) {
    gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->video_sink), (guintptr)NULL);
    gst_element_set_state (data->pipeline, GST_STATE_READY);
  }

  ANativeWindow_release (data->native_window);
  data->native_window = NULL;
  data->initialized = FALSE;
}
```
The complementary function, `gst_native_surface_finalize()` is called
when a surface is about to be destroyed and should not be used anymore.
Here, we simply instruct the video sink to stop using the window handle
and set the pipeline to READY so no rendering occurs. We release the
window pointer we had stored with `ANativeWindow_release()`, and mark
GStreamer as not being initialized anymore.
And this is all there is to it, regarding the main code. Only a couple
of details remain: the subclass we made for SurfaceView and the
`Android.mk` file.
## GStreamerSurfaceView, a convenient SurfaceView wrapper \[Java code\]
By default,
[SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html) does
not have any particular size, so it expands to use all the space the
layout can give it. While this might be convenient sometimes, it does
not allow a great deal of control. In particular, when the surface does
not have the same aspect ratio as the media, the sink will add black
borders (the known “letterbox” or “pillarbox” effect), which is
unnecessary work (and a waste of battery).
The subclass of
[SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html) presented
here overrides the
[onMeasure()](http://developer.android.com/reference/android/view/SurfaceView.html#onMeasure\(int,%20int\)) method
to report the actual media size, so the surface can adapt to any layout
while preserving the media aspect ratio.
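It may help to isolate the arithmetic such an `onMeasure()` override
performs: given the media size and the maximum space the layout offers,
pick the largest view size that preserves the media aspect ratio. A
minimal sketch as a plain function (the `AspectRatioFit.fitInside()`
helper and its names are hypothetical, not part of the tutorial
sources):

``` java
final class AspectRatioFit {
    /* Largest (width, height) pair that fits inside (maxWidth, maxHeight)
     * while keeping the media's aspect ratio. Falls back to the full
     * available space when the media size is not yet known. */
    static int[] fitInside(int mediaWidth, int mediaHeight,
                           int maxWidth, int maxHeight) {
        if (mediaWidth <= 0 || mediaHeight <= 0)
            return new int[] { maxWidth, maxHeight };
        /* Compare aspect ratios by cross-multiplication to avoid
         * floating point; use long to avoid overflow. */
        if ((long) maxWidth * mediaHeight <= (long) maxHeight * mediaWidth) {
            /* Width is the limiting dimension */
            return new int[] { maxWidth, maxWidth * mediaHeight / mediaWidth };
        } else {
            /* Height is the limiting dimension */
            return new int[] { maxHeight * mediaWidth / mediaHeight, maxHeight };
        }
    }
}
```

A real `onMeasure()` additionally has to honor the `MeasureSpec` modes
(`EXACTLY`, `AT_MOST`, `UNSPECIFIED`) of each dimension before calling
`setMeasuredDimension()`, as the full listing below the tutorial shows.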
## A video surface on Android \[Android.mk\]
**/jni/Android.mk**
```
GSTREAMER_NDK_BUILD_PATH := $(GSTREAMER_SDK_ROOT)/share/gst-android/ndk-build/
include $(GSTREAMER_NDK_BUILD_PATH)/plugins.mk
GSTREAMER_PLUGINS := $(GSTREAMER_PLUGINS_CORE) $(GSTREAMER_PLUGINS_SYS) $(GSTREAMER_PLUGINS_EFFECTS)
GSTREAMER_EXTRA_DEPS := gstreamer-video-1.0
include $(GSTREAMER_NDK_BUILD_PATH)/gstreamer.mk
```
Worth mentioning are the `-landroid` library, used to allow
interaction with the native windows, and the different plugin
packages: `GSTREAMER_PLUGINS_SYS` for the system-dependent video sink
and `GSTREAMER_PLUGINS_EFFECTS` for the `warptv` element. This tutorial
requires the `gstreamer-video` library to use the
`VideoOverlay` interface and the video helper methods.
## Conclusion
This tutorial has shown:
- How to display video on Android using a
[SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html) and
the `VideoOverlay` interface.
- How to be aware of changes in the surface's size using
  [SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html)
  callbacks.
The next tutorial adds more features
to this tutorial in order to build a simple media player.
It has been a pleasure having you here, and see you soon!
[screenshot]: images/sdk-android-tutorial-video-screenshot.png
# Android tutorials
## Welcome to the GStreamer SDK Android tutorials
These tutorials describe Android-specific topics. General GStreamer
concepts will not be explained in these tutorials, so the
[](sdk-basic-tutorials.md) should be reviewed first. The reader should
also be familiar with basic Android programming techniques.
Each Android tutorial builds on top of the previous one and adds
progressively more functionality, until a working media player
application is obtained in [](sdk-android-tutorial-a-complete-media-player.md).
This is the same media player application used to advertise
GStreamer on Android, and the download link can be found in
the [](sdk-android-tutorial-a-complete-media-player.md) page.
Make sure to have read the instructions in
[](sdk-installing-for-android-development.md) before jumping into the
Android tutorials.
### A note on the documentation
Documentation for the Android API can be found at
the [Android reference
site](http://developer.android.com/reference/packages.html).
Unfortunately, there is no official online documentation for the NDK.
The header files, though, are well commented. If you installed the
Android NDK in the `$(ANDROID_NDK_ROOT)` folder, you can find the header
files
in `$(ANDROID_NDK_ROOT)\platforms\android-9\arch-arm\usr\include\android`.