Merge remote-tracking branch 'thib/master'

Olivier Crête 2016-05-27 10:49:04 -04:00
commit 0267b3fc7d
76 changed files with 1462 additions and 1516 deletions

View file

@@ -1,4 +1,4 @@
-# GStreamer SDK documentation : 2012.11 Brahmaputra
+# 2012.11 Brahmaputra
This page last changed on Nov 28, 2012 by slomo.
@@ -74,11 +74,11 @@ the following development environments
- Microsoft Visual Studio 2010 or 2012 (including the free Visual C++
Express
edition)
  <http://www.microsoft.com/visualstudio/eng/products/visual-studio-overview>
- MinGW/MSYS
  [http://mingw.org](http://mingw.org/)
For installation instructions and development environment setup
@@ -282,4 +282,3 @@ Bug
tracker: <https://bugs.freedesktop.org/enter_bug.cgi?product=GStreamer%20SDK>
Document generated by Confluence on Oct 08, 2015 10:27

View file

@@ -1,4 +1,4 @@
-# GStreamer SDK documentation : 2012.5 Amazon
+# 2012.5 Amazon
This page last changed on Jun 15, 2012 by slomo.
@@ -64,11 +64,11 @@ the following development environments
- Microsoft Visual Studio 2010 (including the free Visual C++ Express
edition)
  <http://www.microsoft.com/visualstudio/en-us/products/2010-editions>
- MinGW/MSYS
  [http://mingw.org](http://mingw.org/)
For installation instructions and development environment setup
@@ -121,8 +121,6 @@ These use-cases are currently not officially supported by the GStreamer
SDK but will usually work and will be officially supported in future
releases of the GStreamer SDK.
The GStreamer SDK Amazon contains the following major components, some
of them being optional or not used on some platforms. 
@@ -246,4 +244,3 @@ tracker: <https://bugs.freedesktop.org/enter_bug.cgi?product=GStreamer%20SDK>
 
Document generated by Confluence on Oct 08, 2015 10:28

View file

@@ -1,4 +1,4 @@
-# GStreamer SDK documentation : 2012.7 Amazon (Bugfix Release 1)
+# 2012.7 Amazon (Bugfix Release 1)
This page last changed on Jul 11, 2012 by slomo.
@@ -64,11 +64,11 @@ the following development environments
- Microsoft Visual Studio 2010 (including the free Visual C++ Express
edition)
  <http://www.microsoft.com/visualstudio/en-us/products/2010-editions>
- MinGW/MSYS
  [http://mingw.org](http://mingw.org/)
For installation instructions and development environment setup
@@ -138,8 +138,6 @@ These use-cases are currently not officially supported by the GStreamer
SDK but will usually work and will be officially supported in future
releases of the GStreamer SDK.
The GStreamer SDK Amazon contains the following major components, some
of them being optional or not used on some platforms. 
@@ -264,4 +262,3 @@ tracker: <https://bugs.freedesktop.org/enter_bug.cgi?product=GStreamer%20SDK>
 
Document generated by Confluence on Oct 08, 2015 10:28

View file

@@ -1,4 +1,4 @@
-# GStreamer SDK documentation : 2012.9 Amazon (Bugfix Release 2)
+# 2012.9 Amazon (Bugfix Release 2)
This page last changed on Sep 18, 2012 by ylatuya.
@@ -64,11 +64,11 @@ the following development environments
- Microsoft Visual Studio 2010 (including the free Visual C++ Express
edition)
  <http://www.microsoft.com/visualstudio/en-us/products/2010-editions>
- MinGW/MSYS
  [http://mingw.org](http://mingw.org/)
For installation instructions and development environment setup
@@ -136,8 +136,6 @@ These use-cases are currently not officially supported by the GStreamer
SDK but will usually work and will be officially supported in future
releases of the GStreamer SDK.
The GStreamer SDK Amazon contains the following major components, some
of them being optional or not used on some platforms. 
@@ -262,4 +260,3 @@ tracker: <https://bugs.freedesktop.org/enter_bug.cgi?product=GStreamer%20SDK>
 
Document generated by Confluence on Oct 08, 2015 10:28

View file

@@ -1,4 +1,4 @@
-# GStreamer SDK documentation : 2013.6 Congo
+# 2013.6 Congo
This page last changed on Jun 11, 2013 by ylatuya.
@@ -76,11 +76,11 @@ the following development environments
- Microsoft Visual Studio 2010 or 2012 (including the free Visual C++
Express
edition)
  <http://www.microsoft.com/visualstudio/eng/products/visual-studio-overview>
- MinGW/MSYS
  [http://mingw.org](http://mingw.org/)
For installation instructions and development environment setup
@@ -303,4 +303,3 @@ Bug
tracker: <https://bugs.freedesktop.org/enter_bug.cgi?product=GStreamer%20SDK>
Document generated by Confluence on Oct 08, 2015 10:27

View file

@@ -1,4 +1,4 @@
-# GStreamer SDK documentation : Android tutorial 1: Link against GStreamer
+# Android tutorial 1: Link against GStreamer
This page last changed on May 02, 2013 by xartigas.
@@ -26,51 +26,51 @@ makefile that allows GStreamer integration.
**src/com/gst\_sdk\_tutorials/tutorial\_1/Tutorial1.java**
-``` theme: Default; brush: java; gutter: true
+``` lang=java
package com.gst_sdk_tutorials.tutorial_1;
import android.app.Activity;
import android.os.Bundle;
import android.widget.TextView;
import android.widget.Toast;
import com.gstreamer.GStreamer;
public class Tutorial1 extends Activity {
private native String nativeGetGStreamerInfo();
// Called when the activity is first created.
@Override
public void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
try {
GStreamer.init(this);
} catch (Exception e) {
Toast.makeText(this, e.getMessage(), Toast.LENGTH_LONG).show();
-finish();
+finish();
return;
}
setContentView(R.layout.main);
TextView tv = (TextView)findViewById(R.id.textview_info);
tv.setText("Welcome to " + nativeGetGStreamerInfo() + " !");
}
static {
System.loadLibrary("gstreamer_android");
System.loadLibrary("tutorial-1");
}
}
```
Calls from Java to C happen through native methods, like the one
declared here:
-``` first-line: 11; theme: Default; brush: java; gutter: true
+``` lang=java
private native String nativeGetGStreamerInfo();
```
@@ -82,7 +82,7 @@ shown later.
The first bit of code that gets actually executed is the static
initializer of the class:
-``` first-line: 33; theme: Default; brush: java; gutter: true
+``` lang=java
static {
System.loadLibrary("gstreamer_android");
System.loadLibrary("tutorial-1");
@@ -99,14 +99,14 @@ expose. The GStreamer library only exposes an `init()` method, which
initializes GStreamer and registers all plugins (The tutorial library is
explained later below).
-``` first-line: 19; theme: Default; brush: java; gutter: true
-try {
-GStreamer.init(this);
-} catch (Exception e) {
+``` lang=java
+try {
+GStreamer.init(this);
+} catch (Exception e) {
Toast.makeText(this, e.getMessage(), Toast.LENGTH_LONG).show();
-finish();
-return;
-}
+finish();
+return;
+}
```
Next, in the `OnCreate()` method of the
@@ -122,9 +122,9 @@ Should initialization fail, the `init()` method would throw an
[Exception](http://developer.android.com/reference/java/lang/Exception.html)
with the details provided by the GStreamer library.
-``` first-line: 29; theme: Default; brush: java; gutter: true
-TextView tv = (TextView)findViewById(R.id.textview_info);
-tv.setText("Welcome to " + nativeGetGStreamerInfo() + " !");
+``` lang=java
+TextView tv = (TextView)findViewById(R.id.textview_info);
+tv.setText("Welcome to " + nativeGetGStreamerInfo() + " !");
```
Then, the native method `nativeGetGStreamerInfo()` is called and a
@@ -139,7 +139,7 @@ code:
**jni/tutorial-1.c**
-``` theme: Default; brush: cpp; gutter: true
+``` lang=c
#include <string.h>
#include <jni.h>
#include <android/log.h>
@@ -179,9 +179,9 @@ Machine (VM) loads a library.
Here, we retrieve the JNI environment needed to make calls that interact
with Java:
-``` first-line: 21; theme: Default; brush: cpp; gutter: true
+``` lang=c
JNIEnv *env = NULL;
if ((*vm)->GetEnv(vm, (void**) &env, JNI_VERSION_1_4) != JNI_OK) {
__android_log_print (ANDROID_LOG_ERROR, "tutorial-1", "Could not retrieve JNIEnv");
return 0;
@@ -192,7 +192,7 @@ And then locate the class containing the UI part of this tutorial using
`FindClass()`:
-``` first-line: 27; theme: Default; brush: cpp; gutter: true
+``` lang=c
jclass klass = (*env)->FindClass (env, "com/gst_sdk_tutorials/tutorial_1/Tutorial1");
```
@@ -201,7 +201,7 @@ is, we provide the code for the methods we advertised in Java using the
**`native`**
 keyword:
-``` first-line: 28; theme: Default; brush: cpp; gutter: true
+``` lang=c
(*env)->RegisterNatives (env, klass, native_methods, G_N_ELEMENTS(native_methods));
```
@@ -211,7 +211,7 @@ name, its [type
signature](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/types.html#wp276)
and a pointer to the C function implementing it:
-``` first-line: 16; theme: Default; brush: cpp; gutter: true
+``` lang=c
static JNINativeMethod native_methods[] = {
{ "nativeGetGStreamerInfo", "()Ljava/lang/String;", (void *) gst_native_get_gstreamer_info}
};
@@ -220,7 +220,7 @@ static JNINativeMethod native_methods[] = {
The only native method used in this tutorial
is `nativeGetGStreamerInfo()`:
-``` first-line: 9; theme: Default; brush: cpp; gutter: true
+``` lang=c
jstring gst_native_get_gstreamer_info (JNIEnv* env, jobject thiz) {
char *version_utf8 = gst_version_string();
jstring version_jstring = (*env)->NewStringUTF(env, version_utf8);
@@ -241,7 +241,7 @@ must free the `char *` returned by `gst_version_string()`.
**jni/Android.mk**
-``` theme: Default; brush: ruby; gutter: true
+``` lang=ruby
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
@@ -285,13 +285,12 @@ As usual, it has been a pleasure having you here, and see you soon\!
![](images/icons/bullet_blue.gif)
[tutorial1-screenshot.png](attachments/2687057/2654411.png)
-(image/png)
+(image/png)
![](images/icons/bullet_blue.gif)
[tutorial1-screenshot.png](attachments/2687057/2654416.png)
-(image/png)
+(image/png)
![](images/icons/bullet_blue.gif)
[tutorial1-screenshot.png](attachments/2687057/2654326.png)
-(image/png)
+(image/png)
Document generated by Confluence on Oct 08, 2015 10:27

View file

@@ -1,4 +1,4 @@
-# GStreamer SDK documentation : Android tutorial 2: A running pipeline
+# Android tutorial 2: A running pipeline
This page last changed on May 07, 2013 by xartigas.
@@ -56,7 +56,7 @@ messages sent from the C code (for errors and state changes).
**src/com/gst\_sdk\_tutorials/tutorial\_2/Tutorial2.java**
-``` theme: Default; brush: java; gutter: true
+``` lang=java
package com.gst_sdk_tutorials.tutorial_2;
import android.app.Activity;
@@ -91,7 +91,7 @@ public class Tutorial2 extends Activity {
GStreamer.init(this);
} catch (Exception e) {
Toast.makeText(this, e.getMessage(), Toast.LENGTH_LONG).show();
-finish();
+finish();
return;
}
@@ -181,7 +181,7 @@ As usual, the first bit that gets executed is the static initializer of
As usual, the first bit that gets executed is the static initializer of
the class:
-``` first-line: 113; theme: Default; brush: java; gutter: true
+``` lang=java
static {
System.loadLibrary("gstreamer_android");
System.loadLibrary("tutorial-2");
@@ -198,7 +198,7 @@ In the `onCreate()` method GStreamer is initialized as in the previous
tutorial with `GStreamer.init(this)`, and then the layout is inflated
and listeners are set up for the two UI buttons:
-``` first-line: 41; theme: Default; brush: java; gutter: true
+``` lang=java
ImageButton play = (ImageButton) this.findViewById(R.id.button_play);
play.setOnClickListener(new OnClickListener() {
public void onClick(View v) {
@@ -224,7 +224,7 @@ and safer than tracking the actual pipeline state, because orientation
changes can happen before the pipeline has moved to the desired state,
for example.
-``` first-line: 57; theme: Default; brush: java; gutter: true
+``` lang=java
if (savedInstanceState != null) {
is_playing_desired = savedInstanceState.getBoolean("playing");
Log.i ("GStreamer", "Activity created. Saved state is playing:" + is_playing_desired);
@@ -239,7 +239,7 @@ We will first build the GStreamer pipeline (below) and only when the
native code reports itself as initialized we will use
`is_playing_desired`.
-``` first-line: 69; theme: Default; brush: java; gutter: true
+``` lang=java
nativeInit();
```
@@ -252,7 +252,7 @@ This finishes the `onCreate()` method and the Java initialization. The
UI buttons are disabled, so nothing will happen until native code is
ready and `onGStreamerInitialized()` is called:
-``` first-line: 94; theme: Default; brush: java; gutter: true
+``` lang=java
private void onGStreamerInitialized () {
Log.i ("GStreamer", "Gst initialized. Restoring state, playing:" + is_playing_desired);
```
@@ -261,7 +261,7 @@ This is called by the native code when its main loop is finally running.
We first retrieve the desired playing state from `is_playing_desired`,
and then set that state:
-``` first-line: 96; theme: Default; brush: java; gutter: true
+``` lang=java
// Restore previous playing state
if (is_playing_desired) {
nativePlay();
@@ -272,7 +272,7 @@ if (is_playing_desired) {
Here comes the first caveat, when re-enabling the UI buttons:
-``` first-line: 103; theme: Default; brush: java; gutter: true
+``` lang=java
// Re-enable buttons, now that GStreamer is initialized
final Activity activity = this;
runOnUiThread(new Runnable() {
@@ -300,7 +300,7 @@ The same problem exists when the native code wants to output a string in
our TextView using the `setMessage()` method: it has to be done from the
UI thread. The solution is the same:
-``` first-line: 83; theme: Default; brush: java; gutter: true
+``` lang=java
private void setMessage(final String message) {
final TextView tv = (TextView) this.findViewById(R.id.textview_message);
runOnUiThread (new Runnable() {
@@ -313,7 +313,7 @@ private void setMessage(final String message) {
Finally, a few remaining bits:
-``` first-line: 72; theme: Default; brush: java; gutter: true
+``` lang=java
protected void onSaveInstanceState (Bundle outState) {
Log.d ("GStreamer", "Saving state, playing:" + is_playing_desired);
outState.putBoolean("playing", is_playing_desired);
@@ -324,7 +324,7 @@ This method stores the currently desired playing state when Android is
about to shut us down, so next time it restarts (after an orientation
change, for example), it can restore the same state.
-``` first-line: 77; theme: Default; brush: java; gutter: true
+``` lang=java
protected void onDestroy() {
nativeFinalize();
super.onDestroy();
@@ -341,7 +341,7 @@ This concludes the UI part of the tutorial.
**jni/tutorial-2.c**
-``` theme: Default; brush: cpp; gutter: true
+``` lang=c
#include <string.h>
#include <jni.h>
#include <android/log.h>
@@ -624,7 +624,7 @@ the basic tutorials, and it is used to hold all our information in one
place, so we can easily pass it around to
callbacks:
-``` first-line: 22; theme: Default; brush: cpp; gutter: true
+``` lang=c
/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
jobject app; /* Application instance, used to call its methods. A global reference is kept. */
@@ -651,7 +651,7 @@ the `long` type used in Java is always 64 bits wide, but the pointer
used in C can be either 32 or 64 bits wide. The macros take care of the
conversion without warnings.
-``` first-line: 259; theme: Default; brush: cpp; gutter: true
+``` lang=c
/* Library initializer */
jint JNI_OnLoad(JavaVM *vm, void *reserved) {
JNIEnv *env = NULL;
@@ -678,13 +678,13 @@ uses [pthread\_key\_create()](http://pubs.opengroup.org/onlinepubs/9699919799/f
to be able to store per-thread information, which is crucial to properly
manage the JNI Environment, as shown later.
-``` first-line: 234; theme: Default; brush: cpp; gutter: true
+``` lang=c
/* Static class initializer: retrieve method and field IDs */
static jboolean gst_native_class_init (JNIEnv* env, jclass klass) {
custom_data_field_id = (*env)->GetFieldID (env, klass, "native_custom_data", "J");
set_message_method_id = (*env)->GetMethodID (env, klass, "setMessage", "(Ljava/lang/String;)V");
on_gstreamer_initialized_method_id = (*env)->GetMethodID (env, klass, "onGStreamerInitialized", "()V");
if (!custom_data_field_id || !set_message_method_id || !on_gstreamer_initialized_method_id) {
/* We emit this message through the Android log instead of the GStreamer log because the latter
* has not been initialized yet.
@@ -716,7 +716,7 @@ from Java:
This method is called at the end of Java's `onCreate()`.
-``` first-line: 191; theme: Default; brush: cpp; gutter: true
+``` lang=c
static void gst_native_init (JNIEnv* env, jobject thiz) {
CustomData *data = g_new0 (CustomData, 1);
SET_CUSTOM_DATA (env, thiz, custom_data_field_id, data);
@@ -725,7 +725,7 @@ static void gst_native_init (JNIEnv* env, jobject thiz) {
It first allocates memory for the `CustomData` structure and passes the
pointer to the Java class with `SET_CUSTOM_DATA`, so it is remembered.
-``` first-line: 197; theme: Default; brush: cpp; gutter: true
+``` lang=c
data->app = (*env)->NewGlobalRef (env, thiz);
```
@@ -734,16 +734,16 @@ in `CustomData` (a [Global
Reference](http://developer.android.com/guide/practices/jni.html#local_and_global_references)
is used) so its methods can be called later.
-``` first-line: 199; theme: Default; brush: cpp; gutter: true
+``` lang=c
pthread_create (&gst_app_thread, NULL, &app_function, data);
```
Finally, a thread is created and it starts running the
`app_function()` method.
-### `app_function()`
+### `app_function()`
-``` first-line: 134; theme: Default; brush: cpp; gutter: true
+``` lang=c
/* Main method for the native code. This is executed on its own thread. */
static void *app_function (void *userdata) {
JavaVMAttachArgs args;
@@ -751,9 +751,9 @@ static void *app_function (void *userdata) {
CustomData *data = (CustomData *)userdata;
GSource *bus_source;
GError *error = NULL;
GST_DEBUG ("Creating pipeline in CustomData at %p", data);
/* Create our own GLib Main Context and make it the default one */
data->context = g_main_context_new ();
g_main_context_push_thread_default(data->context);
@@ -766,7 +766,7 @@ is created with `g_main_context_new()` and then it is made the default
one for the thread with
`g_main_context_push_thread_default()`.
-``` first-line: 149; theme: Default; brush: cpp; gutter: true
+``` lang=c
data->pipeline = gst_parse_launch("audiotestsrc ! audioconvert ! audioresample ! autoaudiosink", &error);
if (error) {
gchar *message = g_strdup_printf("Unable to build pipeline: %s", error->message);
@@ -781,7 +781,7 @@ It then creates a pipeline the easy way, with `gst_parse_launch()`. In
this case, it is simply an `audiotestsrc` (which produces a continuous
tone) and an `autoaudiosink`, with accompanying adapter elements.
-``` first-line: 159; theme: Default; brush: cpp; gutter: true
+``` lang=c
bus = gst_element_get_bus (data->pipeline);
bus_source = gst_bus_create_watch (bus);
g_source_set_callback (bus_source, (GSourceFunc) gst_bus_async_signal_func, NULL, NULL);
@@ -798,7 +798,7 @@ creation of the watch is done step by step instead of using
`gst_bus_add_signal_watch()` to exemplify how to use a custom GLib
context.
-``` first-line: 169; theme: Default; brush: cpp; gutter: true
+``` lang=c
GST_DEBUG ("Entering main loop... (CustomData:%p)", data);
data->main_loop = g_main_loop_new (data->context, FALSE);
check_initialization_complete (data);
@@ -822,7 +822,7 @@ Once the main loop has quit, all resources are freed in lines 178 to
### `check_initialization_complete()`
-``` first-line: 121; theme: Default; brush: cpp; gutter: true
+``` lang=c
static void check_initialization_complete (CustomData *data) {
JNIEnv *env = get_jni_env ();
if (!data->initialized && data->main_loop) {
@@ -866,7 +866,7 @@ see how it works, step by step:
### `get_jni_env()`
-``` first-line: 68; theme: Default; brush: cpp; gutter: true
+``` lang=c
static JNIEnv *get_jni_env (void) {
JNIEnv *env;
if ((env = pthread_getspecific (current_jni_env)) == NULL) {
@@ -903,7 +903,7 @@ Let's now review the rest of the native methods accessible from Java:
### `gst_native_finalize()` (`nativeFinalize()` from Java)
-``` first-line: 203; theme: Default; brush: cpp; gutter: true
+``` lang=c
static void gst_native_finalize (JNIEnv* env, jobject thiz) {
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
if (!data) return;
@@ -952,7 +952,7 @@ error or state changed message and display a message in the UI using the
### `set_ui_message()`
-``` first-line: 80; theme: Default; brush: cpp; gutter: true
+``` lang=c
static void set_ui_message (const gchar *message, CustomData *data) {
JNIEnv *env = get_jni_env ();
GST_DEBUG ("Setting message to: %s", message);
@@ -997,7 +997,7 @@ method and free the UTF16 message with
**jni/Android.mk**
-``` theme: Default; brush: ruby; gutter: true
+``` lang=ruby
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
@@ -1051,16 +1051,15 @@ As usual, it has been a pleasure having you here, and see you soon\!
![](images/icons/bullet_blue.gif)
[tutorial2-screenshot.png](attachments/2687063/2654325.png)
-(image/png)
+(image/png)
![](images/icons/bullet_blue.gif)
[tutorial2-screenshot.png](attachments/2687063/2654412.png)
-(image/png)
+(image/png)
![](images/icons/bullet_blue.gif)
[tutorial2-screenshot.png](attachments/2687063/2654417.png)
-(image/png)
+(image/png)
![](images/icons/bullet_blue.gif)
[tutorial2-screenshot.png](attachments/2687063/2654324.png)
-(image/png)
+(image/png)
Document generated by Confluence on Oct 08, 2015 10:27

View file

@@ -1,4 +1,4 @@
-# GStreamer SDK documentation : Android tutorial 3: Video
+# Android tutorial 3: Video
This page last changed on Nov 05, 2012 by xartigas.
@@ -38,7 +38,7 @@ until a main loop is running and a drawing surface has been received.
**src/com/gst\_sdk\_tutorials/tutorial\_3/Tutorial3.java**
-``` theme: Default; brush: java; gutter: true
+``` lang=java
package com.gst_sdk_tutorials.tutorial_3;
import android.app.Activity;
@@ -77,7 +77,7 @@ public class Tutorial3 extends Activity implements SurfaceHolder.Callback {
GStreamer.init(this);
} catch (Exception e) {
Toast.makeText(this, e.getMessage(), Toast.LENGTH_LONG).show();
-finish();
+finish();
return;
}
@@ -189,7 +189,7 @@ surface to the layout and changing the GStreamer pipeline to produce
video instead of audio. Only the parts of the code that are new will be
discussed.
-``` first-line: 22; theme: Default; brush: java; gutter: true
+``` lang=java
private native void nativeSurfaceInit(Object surface);
private native void nativeSurfaceFinalize();
```
@@ -199,7 +199,7 @@ Two new entry points to the C code are defined,
when the video surface becomes available and when it is about to be
destroyed, respectively.
-``` first-line: 61; theme: Default; brush: java; gutter: true
+``` lang=java
SurfaceView sv = (SurfaceView) this.findViewById(R.id.surface_video);
SurfaceHolder sh = sv.getHolder();
sh.addCallback(this);
@@ -214,7 +214,7 @@ interface. This is why we declared this Activity as implementing the
[SurfaceHolder.Callback](http://developer.android.com/reference/android/view/SurfaceHolder.Callback.html)
interface in line 16.
-``` first-line: 127; theme: Default; brush: java; gutter: true
+``` lang=java
public void surfaceChanged(SurfaceHolder holder, int format, int width,
int height) {
Log.d("GStreamer", "Surface changed to format " + format + " width "
@@ -245,7 +245,7 @@ Let's review the C code to see what these functions do.
**jni/tutorial-3.c**
-``` theme: Default; brush: cpp; gutter: true
+``` lang=c
#include <string.h>
#include <jni.h>
#include <android/log.h>
@@ -589,7 +589,7 @@ First, our `CustomData` structure is augmented to keep a pointer to the
video sink element and the native window
handle:
-``` first-line: 33; theme: Default; brush: cpp; gutter: true
+``` lang=c
GstElement *video_sink; /* The video sink element which receives XOverlay commands */
ANativeWindow *native_window; /* The Android native window where video will be rendered */
```
@@ -598,7 +598,7 @@ The `check_initialization_complete()` method is also augmented so that
it requires a native window before considering GStreamer to be
initialized:
-``` first-line: 127; theme: Default; brush: cpp; gutter: true
+``` lang=c
static void check_initialization_complete (CustomData *data) {
JNIEnv *env = get_jni_env ();
if (!data->initialized && data->native_window && data->main_loop) {
@@ -627,14 +627,14 @@ effects in the `GSTREAMER_PLUGINS_EFFECTS` package), and an
`autovideosink` which will instantiate the adequate video sink for the
platform:
-``` first-line: 159; theme: Default; brush: cpp; gutter: true
+``` lang=c
data->pipeline = gst_parse_launch("videotestsrc ! warptv ! ffmpegcolorspace ! autovideosink ", &error);
```
Here things start to get more
interesting:
-``` first-line: 168; theme: Default; brush: cpp; gutter: true
+``` lang=c
/* Set the pipeline to READY, so it can already accept a window handle, if we have one */
gst_element_set_state(data->pipeline, GST_STATE_READY);
@@ -662,7 +662,7 @@ Now we will implement the two native functions called by the Java code
when the drawing surface becomes available or is about to be
destroyed:
-``` first-line: 270; theme: Default; brush: cpp; gutter: true
+``` lang=c
static void gst_native_surface_init (JNIEnv *env, jobject thiz, jobject surface) {
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
if (!data) return;
@@ -719,7 +719,7 @@ We finally store the new window handle and call
`check_initialization_complete()` to inform the Java code that
everything is set up, if that is the case.
-``` first-line: 295; theme: Default; brush: cpp; gutter: true
+``` lang=c
static void gst_native_surface_finalize (JNIEnv *env, jobject thiz) {
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
if (!data) return;
@@ -772,7 +772,7 @@ surface.
**src/com/gst\_sdk\_tutorials/tutorial\_3/GStreamerSurfaceView.java**
-``` theme: Default; brush: java; gutter: true
+``` lang=java
package com.gst_sdk_tutorials.tutorial_3;
import android.content.Context;
@@ -864,7 +864,7 @@ public class GStreamerSurfaceView extends SurfaceView {
**/jni/Android.mk**
-``` theme: Default; brush: ruby; gutter: true
+``` lang=ruby
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
@@ -917,16 +917,15 @@ It has been a pleasure having you here, and see you soon\!
![](images/icons/bullet_blue.gif)
[tutorial3-screenshot.png](attachments/2687065/2654414.png)
-(image/png)
+(image/png)
![](images/icons/bullet_blue.gif)
[tutorial3-screenshot.png](attachments/2687065/2654415.png)
-(image/png)
+(image/png)
![](images/icons/bullet_blue.gif)
[tutorial3-screenshot.png](attachments/2687065/2654418.png)
-(image/png)
+(image/png)
![](images/icons/bullet_blue.gif)
[tutorial3-screenshot.png](attachments/2687065/2654413.png)
-(image/png)
+(image/png)
Document generated by Confluence on Oct 08, 2015 10:27

View file

@@ -1,4 +1,4 @@
-# GStreamer SDK documentation : Android tutorial 4: A basic media player
+# Android tutorial 4: A basic media player
This page last changed on May 21, 2013 by xartigas.
@@ -49,7 +49,7 @@ this view is collapsed by default. Click here to expand…
**src/com/gst\_sdk\_tutorials/tutorial\_4/Tutorial4.java**
-``` theme: Default; brush: java; gutter: true
+``` lang=java
package com.gst_sdk_tutorials.tutorial_4;
import java.text.SimpleDateFormat;
@@ -103,7 +103,7 @@ public class Tutorial4 extends Activity implements SurfaceHolder.Callback, OnSee
GStreamer.init(this);
} catch (Exception e) {
Toast.makeText(this, e.getMessage(), Toast.LENGTH_LONG).show();
-finish();
+finish();
return;
}
@@ -313,7 +313,7 @@ offer the same functionalities. We keep track of this in the
`is_local_media` variable, and update it every time we change the media
URI:
-``` first-line: 132; theme: Default; brush: java; gutter: true
+``` lang=java
private void setMediaUri() {
nativeSetUri (mediaUri);
is_local_media = mediaUri.startsWith("file://");
@@ -329,7 +329,7 @@ Every time the size of the media changes (which could happen mid-stream,
for some kinds of streams), or when it is first detected, C code calls
our `onMediaSizeChanged()` callback:
-``` first-line: 217; theme: Default; brush: java; gutter: true
+``` lang=java
private void onMediaSizeChanged (int width, int height) {
Log.i ("GStreamer", "Media size changed to " + width + "x" + height);
final GStreamerSurfaceView gsv = (GStreamerSurfaceView) this.findViewById(R.id.surface_video);
@@ -371,7 +371,7 @@ To realize the first function, C code will periodically call our
in the Seek Bar. Again we do so from the UI thread, using
`RunOnUiThread()`.
-``` first-line: 176; theme: Default; brush: java; gutter: true
+``` lang=java
private void setCurrentPosition(final int position, final int duration) {
final SeekBar sb = (SeekBar) this.findViewById(R.id.seek_bar);
@@ -397,12 +397,12 @@ widget which we will use to display the current position and duration in
`HH:mm:ss / HH:mm:ss` textual format. The `updateTimeWidget()` method
takes care of it, and must be called every time the Seek Bar is updated:
-``` first-line: 164; theme: Default; brush: java; gutter: true
+``` lang=java
private void updateTimeWidget () {
final TextView tv = (TextView) this.findViewById(R.id.textview_time);
final SeekBar sb = (SeekBar) this.findViewById(R.id.seek_bar);
final int pos = sb.getProgress();
SimpleDateFormat df = new SimpleDateFormat("HH:mm:ss");
df.setTimeZone(TimeZone.getTimeZone("UTC"));
final String message = df.format(new Date (pos)) + " / " + df.format(new Date (duration));
@@ -419,7 +419,7 @@ the user to seek by dragging the thumb), we implement the
interface in the
Activity:
-``` first-line: 22; theme: Default; brush: java; gutter: true
+``` lang=java
public class Tutorial4 extends Activity implements SurfaceHolder.Callback, OnSeekBarChangeListener {
```
@@ -427,7 +427,7 @@ And we register the Activity as the listener for the [Seek
Bar](http://developer.android.com/reference/android/widget/SeekBar.html)s
events in the `onCreate()` method:
-``` first-line: 80; theme: Default; brush: java; gutter: true
+``` lang=java
SeekBar sb = (SeekBar) this.findViewById(R.id.seek_bar);
sb.setOnSeekBarChangeListener(this);
```
@@ -436,7 +436,7 @@ We will now be notified of three events: when the user starts dragging
the thumb, every time the thumb moves and when the thumb is released by
the user:
-``` first-line: 239; theme: Default; brush: java; gutter: true
+``` lang=java
public void onStartTrackingTouch(SeekBar sb) {
nativePause();
} 
@@ -448,7 +448,7 @@ pause the pipeline. If the user is searching for a particular scene, we
do not want it to keep
moving.
-``` first-line: 230; theme: Default; brush: java; gutter: true
+``` lang=java
public void onProgressChanged(SeekBar sb, int progress, boolean fromUser) {
if (fromUser == false) return;
desired_position = progress;
@@ -468,7 +468,7 @@ that is, we jump to the indicated position as soon as the thumb moves.
Otherwise, the seek will be performed when the thumb is released, and
the only thing we do here is update the textual time widget.
-``` first-line: 244; theme: Default; brush: java; gutter: true
+``` lang=java
public void onStopTrackingTouch(SeekBar sb) {
// If this is a remote file, scrub seeking is probably not going to work smoothly enough.
// Therefore, perform only the seek when the slider is released.
@@ -492,7 +492,7 @@ this view is collapsed by default. Click here to expand…
**jni/tutorial-4.c**
-``` theme: Default; brush: cpp; gutter: true
+``` lang=c
#include <string.h>
#include <jni.h>
#include <android/log.h>
@@ -1068,7 +1068,7 @@ jint JNI_OnLoad(JavaVM *vm, void *reserved) {
Java code will call `gst_native_set_uri()` whenever it wants to change
the playing URI (in this tutorial the URI never changes, but it could):
-``` first-line: 436; theme: Default; brush: cpp; gutter: true
+``` lang=c
void gst_native_set_uri (JNIEnv* env, jobject thiz, jstring uri) {
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
if (!data || !data->pipeline) return;
@@ -1116,7 +1116,7 @@ change during playback. For simplicity, this tutorial assumes that they
do not. Therefore, in the READY to PAUSED state change, once the Caps of
the decoded media are known, we inspect them in `check_media_size()`:
-``` first-line: 252; theme: Default; brush: cpp; gutter: true
+``` lang=c
static void check_media_size (CustomData *data) {
JNIEnv *env = get_jni_env ();
GstElement *video_sink;
@@ -1167,7 +1167,7 @@ To keep the UI updated, a GLib timer is installed in the
`app_function()` that fires 4 times per second (or every 250ms), right
before entering the main loop:
-``` first-line: 377; theme: Default; brush: cpp; gutter: true
+``` lang=c
timeout_source = g_timeout_source_new (250);
g_source_set_callback (timeout_source, (GSourceFunc)refresh_ui, data, NULL);
g_source_attach (timeout_source, data->context);
@ -1176,7 +1176,7 @@ g_source_unref (timeout_source); 
Then, in the refresh\_ui method:
``` lang=c
static gboolean refresh_ui (CustomData *data) {
GstFormat fmt = GST_FORMAT_TIME;
gint64 current = -1;
@ -1230,7 +1230,7 @@ see how to overcome these problems.
In
`gst_native_set_position()`:
``` lang=c
void gst_native_set_position (JNIEnv* env, jobject thiz, int milliseconds) {
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
if (!data) return;
@ -1249,7 +1249,7 @@ away; otherwise, store the desired position in the
`desired_position` variable. Then, in the
`state_changed_cb()` callback:
``` lang=c
if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED) {
/* By now the sink already knows the media size */
check_media_size(data);
@ -1286,7 +1286,7 @@ once this period elapses.
To achieve this, all seek requests are routed through the
`execute_seek()` method:
``` lang=c
static void execute_seek (gint64 desired_position, CustomData *data) {
gint64 diff;
@ -1355,7 +1355,7 @@ using buffering. The same procedure is used here, by listening to the
buffering
messages:
``` lang=c
g_signal_connect (G_OBJECT (bus), "message::buffering", (GCallback)buffering_cb, data);
```
@ -1363,7 +1363,7 @@ And pausing the pipeline until buffering is complete (unless this is a
live
source):
``` lang=c
static void buffering_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
gint percent;
@ -1396,7 +1396,7 @@ is `GSTREAMER_PLUGINS`:
**jni/Android.mk**
```
GSTREAMER_PLUGINS := $(GSTREAMER_PLUGINS_CORE) $(GSTREAMER_PLUGINS_PLAYBACK) $(GSTREAMER_PLUGINS_CODECS) $(GSTREAMER_PLUGINS_NET) $(GSTREAMER_PLUGINS_SYS)
```
@ -1428,7 +1428,6 @@ As usual, it has been a pleasure having you here, and see you soon\!
![](images/icons/bullet_blue.gif)
[tutorial4-screenshot.png](attachments/2687067/2654419.png)
(image/png)
Document generated by Confluence on Oct 08, 2015 10:27

@ -1,4 +1,4 @@
# Android tutorial 5: A Complete media player
This page last changed on Nov 28, 2012 by xartigas.
@ -104,11 +104,10 @@ It has been a pleasure having you here, and see you soon\!
![](images/icons/bullet_blue.gif)
[tutorial5-screenshot.png](attachments/2687069/2654436.png)
(image/png)
![](images/icons/bullet_blue.gif)
[ic\_media\_next.png](attachments/2687069/2654438.png) (image/png)
![](images/icons/bullet_blue.gif)
[ic\_media\_next.png](attachments/2687069/2654437.png) (image/png)
Document generated by Confluence on Oct 08, 2015 10:27

@ -1,4 +1,4 @@
# Android tutorials
This page last changed on May 02, 2013 by xartigas.
@ -35,4 +35,3 @@ files
in `$(ANDROID_NDK_ROOT)\platforms\android-9\arch-arm\usr\include\android`.
Document generated by Confluence on Oct 08, 2015 10:27

@ -1,4 +1,4 @@
# Basic Media Player
This page last changed on May 24, 2013 by xartigas.
@ -28,7 +28,7 @@ each file to expand.
**CMakeLists.txt**
```
project(qtgst-example-player)
find_package(QtGStreamer REQUIRED)
# automoc is now a built-in tool since CMake 2.8.6.
@ -52,7 +52,7 @@ target_link_libraries(player ${QTGSTREAMER_UI_LIBRARIES} ${QT_QTOPENGL_LIBRARIES
**main.cpp**
``` lang=c
#include "mediaapp.h"
#include <QtWidgets/QApplication>
#include <QGst/Init>
@ -73,7 +73,7 @@ int main(int argc, char *argv[])
**mediaapp.h**
``` lang=c
#ifndef MEDIAAPP_H
#define MEDIAAPP_H
#include <QtCore/QTimer>
@ -126,7 +126,7 @@ private:
**mediaapp.cpp**
``` lang=c
#include "mediaapp.h"
#include "player.h"
#if (QT_VERSION >= QT_VERSION_CHECK(5, 0, 0))
@ -326,7 +326,7 @@ void MediaApp::createUI(QBoxLayout *appLayout)
**player.h**
``` lang=c
#ifndef PLAYER_H
#define PLAYER_H
#include <QtCore/QTimer>
@ -374,7 +374,7 @@ private:
**player.cpp**
``` lang=c
#include "player.h"
#include <QtCore/QDir>
#include <QtCore/QUrl>
@ -555,7 +555,7 @@ We begin by looking at `main()`:
**main.cpp**
``` lang=c
int main(int argc, char *argv[])
{
QApplication app(argc, argv);
@ -584,7 +584,7 @@ the UI:
**MediaApp::MediaApp()**
``` lang=c
//create the player
m_player = new Player(this);
connect(m_player, SIGNAL(positionChanged()), this, SLOT(onPositionChanged()));
@ -596,7 +596,7 @@ line, if any:
**MediaApp::openFile()**
``` lang=c
void MediaApp::openFile(const QString & fileName)
{
m_baseDir = QFileInfo(fileName).path();
@ -610,7 +610,7 @@ This in turn instructs the `Player` to construct our GStreamer pipeline:
**Player::setUri()**
``` lang=c
void Player::setUri(const QString & uri)
{
QString realUri = uri;
@ -650,7 +650,7 @@ rendering. For clarity, here is a portion of the implementation:
**prepare-xwindow-id handling**
``` lang=c
QGlib::connect(pipeline->bus(), "sync-message",
this, &PipelineWatch::onBusSyncMessage);
...
@ -666,7 +666,7 @@ void PipelineWatch::onBusSyncMessage(const MessagePtr & msg)
Once the pipeline is created, we connect to the bus' message signal (via
`QGlib::connect()`) to dispatch state change signals:
``` lang=c
void Player::onBusMessage(const QGst::MessagePtr & message)
{
switch (message->type()) {
@ -708,7 +708,7 @@ void Player::handlePipelineStateChange(const QGst::StateChangedMessagePtr & scm)
Finally, we tell `playbin2` what to play by setting the `uri` property:
``` lang=c
m_pipeline->setProperty("uri", realUri);
```
@ -719,7 +719,7 @@ After `Player::setUri()` is called, `MediaApp::openFile()` calls
**Player::play()**
``` lang=c
void Player::play()
{
if (m_pipeline) {
@ -732,7 +732,7 @@ The other state control methods are equally simple:
**Player state functions**
``` lang=c
void Player::pause()
{
if (m_pipeline) {
@ -756,7 +756,7 @@ is emitted on the GStreamer bus which gets picked up by the `Player`:
**Player::onBusMessage()**
``` lang=c
void Player::onBusMessage(const QGst::MessagePtr & message)
{
switch (message->type()) {
@ -783,7 +783,7 @@ handled:
**MediaApp::onStateChanged()**
``` lang=c
void MediaApp::onStateChanged()
{
QGst::State newState = m_player->state();
@ -812,7 +812,7 @@ UI to handle:
**MediaApp::onPositionChanged()**
``` lang=c
void MediaApp::onPositionChanged()
{
QTime length(0,0);
@ -844,7 +844,7 @@ to `gst_element_query_position()`:
**Player::position()**
``` lang=c
QTime Player::position() const
{
if (m_pipeline) {
@ -874,4 +874,3 @@ This tutorial has shown:
It has been a pleasure having you here, and see you soon\!
Document generated by Confluence on Oct 08, 2015 10:27

@ -1,4 +1,4 @@
# Basic tutorial 1: Hello world!
## Goal
@ -23,25 +23,25 @@ in the SDK installation).
```
#include <gst/gst.h>
int main(int argc, char *argv[]) {
GstElement *pipeline;
GstBus *bus;
GstMessage *msg;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Build the pipeline */
pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Start playing */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
/* Wait until error or EOS */
bus = gst_element_get_bus (pipeline);
msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
/* Free resources */
if (msg != NULL)
gst_message_unref (msg);
@ -240,4 +240,3 @@ The next tutorial will keep introducing more basic GStreamer elements,
and show you how to build a pipeline manually.
It has been a pleasure having you here, and see you soon!

@ -1,4 +1,4 @@
# Basic tutorial 10: GStreamer tools
This page last changed on Jun 01, 2012 by xartigas.
@ -74,7 +74,7 @@ In simple form, a PIPELINE-DESCRIPTION is a list of element types
separated by exclamation marks (\!). Go ahead and type in the following
command:
```
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! autovideosink
```
@ -98,7 +98,7 @@ spaces). Use the `gst-inspect` tool (explained next) to find out the
available properties for an
element.
```
gst-launch-0.10 videotestsrc pattern=11 ! ffmpegcolorspace ! autovideosink
```
@ -115,7 +115,7 @@ example.
Named elements are referred to using their name followed by a
dot.
```
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! tee name=t ! queue ! autovideosink t. ! queue ! autovideosink
```
@ -149,7 +149,7 @@ This is useful, for example, when you want to retrieve one particular
stream out of a
demuxer:
```
gst-launch-0.10.exe souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_00 ! matroskamux ! filesink location=sintel_video.mkv
```
@ -169,7 +169,7 @@ All in all, we took a webm file, stripped it of audio, and generated a
new matroska file with the video. If we wanted to keep only the
audio:
```
gst-launch-0.10.exe souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d d.audio_00 ! vorbisparse ! matroskamux ! filesink location=sintel_audio.mka
```
@ -195,7 +195,7 @@ saying that GStreamer will choose one output pad at random.
Consider the following
pipeline:
```
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux ! filesink location=test
```
@ -209,7 +209,7 @@ You can remove this ambiguity, though, by using named pads, as in the
previous sub-section, or by using **Caps
Filters**:
```
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux ! video/x-vp8 ! matroskamux ! filesink location=sintel_video.mkv
```
@ -230,7 +230,7 @@ producing for a particular pipeline, run `gst-launch` as usual, with the
Play a media file using `playbin2` (as in [Basic tutorial 1: Hello
world\!](Basic%2Btutorial%2B1%253A%2BHello%2Bworld%2521.html)):
```
gst-launch-0.10 playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm
```
@ -238,7 +238,7 @@ A fully operational playback pipeline, with audio and video (more or less
the same pipeline that `playbin2` will create
internally):
```
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d ! queue ! vp8dec ! ffmpegcolorspace ! autovideosink d. ! queue ! vorbisdec ! audioconvert ! audioresample ! autoaudiosink
```
@ -248,7 +248,7 @@ with a different codec, and puts them back together in an Ogg container
(just for the sake of
it).
```
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm name=d ! queue ! theoraenc ! oggmux name=m ! filesink location=sintel.ogg d. ! queue ! audioconvert ! audioresample ! flacenc ! m.
```
@ -257,7 +257,7 @@ operation whenever the frame size is different in the input and the
output caps. The output caps are set by the Caps Filter to
320x200.
```
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! queue ! videoscale ! video/x-raw-yuv,width=320,height=200 ! ffmpegcolorspace ! autovideosink
```
@ -279,7 +279,7 @@ This tool has three modes of operation:
Let's see an example of the third mode:
```
gst-inspect-0.10 vp8dec
 
Factory Details:
@ -364,7 +364,7 @@ Element Properties:
(0x00000004): addnoise - Add noise
deblocking-level : Deblocking level
flags: readable, writable
Unsigned Integer. Range: 0 - 16 Default: 4
noise-level : Noise level
flags: readable, writable
Unsigned Integer. Range: 0 - 16 Default: 0  
@ -400,7 +400,7 @@ which basically control the amount of verbosity of the output.
Let's see an
example:
```
gst-discoverer-0.10 http://docs.gstreamer.com/media/sintel_trailer-480p.webm -v
Analyzing http://docs.gstreamer.com/media/sintel_trailer-480p.webm
@ -465,4 +465,3 @@ This tutorial has shown:
It has been a pleasure having you here, and see you soon\!
Document generated by Confluence on Oct 08, 2015 10:27

@ -1,4 +1,4 @@
# Basic tutorial 11: Debugging tools
This page last changed on Jun 04, 2012 by xartigas.
@ -28,7 +28,7 @@ The debug output is controlled with the `GST_DEBUG` environment
variable. Here's an example with
`GST_DEBUG=2`:
```
0:00:00.868050000 1592 09F62420 WARN filesrc gstfilesrc.c:1044:gst_file_src_start:<filesrc0> error: No such file "non-existing-file.webm"
```
@ -97,7 +97,7 @@ specific messages.
The content of each line in the debug output
is:
```
0:00:00.868050000 1592 09F62420 WARN filesrc gstfilesrc.c:1044:gst_file_src_start:<filesrc0> error: No such file "non-existing-file.webm"
```
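Since this format is fixed, the first five whitespace-separated fields can be pulled apart mechanically. The sketch below is our own illustration (the struct and function names are hypothetical, not part of GStreamer): it splits one debug line into timestamp, PID, thread, level and category, keeping the source location and message verbatim.

``` lang=c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical helper: split one GST_DEBUG line into its leading fields.
 * Everything after the category (source location and message) is kept
 * verbatim in `rest`. */
typedef struct {
  char timestamp[32];
  long pid;
  char thread[32];
  char level[16];
  char category[32];
  char rest[256];
} DebugLine;

static int
parse_debug_line (const char *line, DebugLine *out)
{
  /* The literal spaces in the format skip runs of whitespace, so `rest`
   * starts right at the source-location field. */
  return sscanf (line, "%31s %ld %31s %15s %31s %255[^\n]",
                 out->timestamp, &out->pid, out->thread,
                 out->level, out->category, out->rest) == 6;
}
```

Running it on the example line above yields level `WARN` and category `filesrc`, which is exactly how the table in the next section labels those columns.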
@ -159,7 +159,7 @@ as the Debug category in the output log).
To change the category to something more meaningful, add these two lines
at the top of your code:
``` lang=c
GST_DEBUG_CATEGORY_STATIC (my_category);
#define GST_CAT_DEFAULT my_category
```
@ -167,7 +167,7 @@ GST_DEBUG_CATEGORY_STATIC (my_category);
And then this one after you have initialized GStreamer with
`gst_init()`:
``` lang=c
GST_DEBUG_CATEGORY_INIT (my_category, "my category", 0, "This is my very own");
```
@ -226,7 +226,6 @@ It has been a pleasure having you here, and see you soon\!
## Attachments:
![](images/icons/bullet_blue.gif)
[playbin2.png](attachments/327830/2424840.png) (image/png)
Document generated by Confluence on Oct 08, 2015 10:27

@ -1,4 +1,4 @@
# Basic tutorial 12: Streaming
This page last changed on Sep 28, 2012 by xartigas.
@ -62,28 +62,28 @@ Copy this code into a text file named `basic-tutorial-12.c`.
**basic-tutorial-12.c**
``` lang=c
#include <gst/gst.h>
#include <string.h>
typedef struct _CustomData {
gboolean is_live;
GstElement *pipeline;
GMainLoop *loop;
} CustomData;
static void cb_message (GstBus *bus, GstMessage *msg, CustomData *data) {
switch (GST_MESSAGE_TYPE (msg)) {
case GST_MESSAGE_ERROR: {
GError *err;
gchar *debug;
gst_message_parse_error (msg, &err, &debug);
g_print ("Error: %s\n", err->message);
g_error_free (err);
g_free (debug);
gst_element_set_state (data->pipeline, GST_STATE_READY);
g_main_loop_quit (data->loop);
break;
@ -95,10 +95,10 @@ static void cb_message (GstBus *bus, GstMessage *msg, CustomData *data) {
break;
case GST_MESSAGE_BUFFERING: {
gint percent = 0;
/* If the stream is live, we do not care about buffering. */
if (data->is_live) break;
gst_message_parse_buffering (msg, &percent);
g_print ("Buffering (%3d%%)\r", percent);
/* Wait until buffering is complete before start/resume playing */
@ -118,24 +118,24 @@ static void cb_message (GstBus *bus, GstMessage *msg, CustomData *data) {
break;
}
}
int main(int argc, char *argv[]) {
GstElement *pipeline;
GstBus *bus;
GstStateChangeReturn ret;
GMainLoop *main_loop;
CustomData data;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Initialize our data structure */
memset (&data, 0, sizeof (data));
/* Build the pipeline */
pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
bus = gst_element_get_bus (pipeline);
/* Start playing */
ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
@ -145,16 +145,16 @@ int main(int argc, char *argv[]) {
} else if (ret == GST_STATE_CHANGE_NO_PREROLL) {
data.is_live = TRUE;
}
main_loop = g_main_loop_new (NULL, FALSE);
data.loop = main_loop;
data.pipeline = pipeline;
gst_bus_add_signal_watch (bus);
g_signal_connect (bus, "message", G_CALLBACK (cb_message), &data);
g_main_loop_run (main_loop);
/* Free resources */
g_main_loop_unref (main_loop);
gst_object_unref (bus);
@ -195,7 +195,7 @@ therefore, the initialization code is very simple and should be
self-explanatory by now. The only new bit is the detection of live
streams:
``` lang=c
/* Start playing */
ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
@ -221,13 +221,13 @@ them, so we take note of the result of `gst_element_set_state()` in the
Let's now review the interesting parts of the message parsing callback:
``` lang=c
case GST_MESSAGE_BUFFERING: {
gint percent = 0;
/* If the stream is live, we do not care about buffering. */
if (data->is_live) break;
gst_message_parse_buffering (msg, &percent);
g_print ("Buffering (%3d%%)\r", percent);
/* Wait until buffering is complete before start/resume playing */
@ -254,7 +254,7 @@ network becomes slow or unresponsive and our buffer depletes, we will
receive new buffering messages with levels below 100% so we will pause
the pipeline again until enough buffer has been built up.
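The pause/resume rule just described boils down to a two-input decision. This standalone sketch is ours, not tutorial code; it merely restates the branch inside `cb_message()` with hypothetical names:

``` lang=c
#include <assert.h>

/* Hypothetical helper: which state should the pipeline target, given the
 * latest buffering report? Live sources ignore buffering entirely;
 * everything else stays paused until the buffer reaches 100%. */
typedef enum { TARGET_PAUSED, TARGET_PLAYING } PipelineTarget;

static PipelineTarget
buffering_target (int is_live, int percent)
{
  if (is_live)
    return TARGET_PLAYING;   /* never stall a live stream on buffering */
  return (percent < 100) ? TARGET_PAUSED : TARGET_PLAYING;
}
```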
``` lang=c
case GST_MESSAGE_CLOCK_LOST:
/* Get a new clock */
gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
@ -282,9 +282,8 @@ It has been a pleasure having you here, and see you soon\!
## Attachments:
![](images/icons/bullet_blue.gif)
[basic-tutorial-12.c](attachments/327806/2424843.c) (text/plain)
![](images/icons/bullet_blue.gif)
[vs2010.zip](attachments/327806/2424844.zip) (application/zip)
Document generated by Confluence on Oct 08, 2015 10:27

@ -1,4 +1,4 @@
# Basic tutorial 13: Playback speed
This page last changed on Jul 06, 2012 by xartigas.
@ -69,31 +69,31 @@ Copy this code into a text file named `basic-tutorial-13.c`.
**basic-tutorial-13.c**
``` lang=c
#include <string.h>
#include <gst/gst.h>
typedef struct _CustomData {
GstElement *pipeline;
GstElement *video_sink;
GMainLoop *loop;
gboolean playing; /* Playing or Paused */
gdouble rate; /* Current playback rate (can be negative) */
} CustomData;
/* Send seek event to change rate */
static void send_seek_event (CustomData *data) {
gint64 position;
GstFormat format = GST_FORMAT_TIME;
GstEvent *seek_event;
/* Obtain the current position, needed for the seek event */
if (!gst_element_query_position (data->pipeline, &format, &position)) {
g_printerr ("Unable to retrieve current position.\n");
return;
}
/* Create the seek event */
if (data->rate > 0) {
seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
@ -102,26 +102,26 @@ static void send_seek_event (CustomData *data) {
seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
GST_SEEK_TYPE_SET, 0, GST_SEEK_TYPE_SET, position);
}
if (data->video_sink == NULL) {
/* If we have not done so, obtain the sink through which we will send the seek events */
g_object_get (data->pipeline, "video-sink", &data->video_sink, NULL);
}
/* Send the event */
gst_element_send_event (data->video_sink, seek_event);
g_print ("Current rate: %g\n", data->rate);
}
/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
gchar *str = NULL;
if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) != G_IO_STATUS_NORMAL) {
return TRUE;
}
switch (g_ascii_tolower (str[0])) {
case 'p':
data->playing = !data->playing;
@ -145,7 +145,7 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
/* If we have not done so, obtain the sink through which we will send the step events */
g_object_get (data->pipeline, "video-sink", &data->video_sink, NULL);
}
gst_element_send_event (data->video_sink,
gst_event_new_step (GST_FORMAT_BUFFERS, 1, data->rate, TRUE, FALSE));
g_print ("Stepping one frame\n");
@ -156,23 +156,23 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
default:
break;
}
g_free (str);
return TRUE;
}
int main(int argc, char *argv[]) {
CustomData data;
GstStateChangeReturn ret;
GIOChannel *io_stdin;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Initialize our data structure */
memset (&data, 0, sizeof (data));
/* Print usage map */
g_print (
"USAGE: Choose one of the following options, then press enter:\n"
@ -181,10 +181,10 @@ int main(int argc, char *argv[]) {
" 'D' to toggle playback direction\n"
" 'N' to move to next frame (in the current direction, better in PAUSE)\n"
" 'Q' to quit\n");
/* Build the pipeline */
data.pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
@ -192,7 +192,7 @@ int main(int argc, char *argv[]) {
io_stdin = g_io_channel_unix_new (fileno (stdin));
#endif
g_io_add_watch (io_stdin, G_IO_IN, (GIOFunc)handle_keyboard, &data);
/* Start playing */
ret = gst_element_set_state (data.pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
@ -202,11 +202,11 @@ int main(int argc, char *argv[]) {
}
data.playing = TRUE;
data.rate = 1.0;
/* Create a GLib Main Loop and set it to run */
data.loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.loop);
/* Free resources */
g_main_loop_unref (data.loop);
g_io_channel_unref (io_stdin);
@ -250,15 +250,15 @@ keystrokes and a GLib main loop is executed.
Then, in the keyboard handler function:
``` lang=c
/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
gchar *str = NULL;
if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) != G_IO_STATUS_NORMAL) {
return TRUE;
}
switch (g_ascii_tolower (str[0])) {
case 'p':
data->playing = !data->playing;
@ -270,7 +270,7 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
Pause / Playing toggle is handled with `gst_element_set_state()` as in
previous tutorials.
``` lang=c
case 's':
if (g_ascii_isupper (str[0])) {
data->rate *= 2.0;
@ -290,13 +290,13 @@ reverse the current playback direction. In both cases, the
`rate` variable is updated and `send_seek_event` is called. Let's
review this function.
``` lang=c
/* Send seek event to change rate */
static void send_seek_event (CustomData *data) {
gint64 position;
GstFormat format = GST_FORMAT_TIME;
GstEvent *seek_event;
/* Obtain the current position, needed for the seek event */
if (!gst_element_query_position (data->pipeline, &format, &position)) {
g_printerr ("Unable to retrieve current position.\n");
@ -312,7 +312,7 @@ want to move, we jump to the current position. Using a Step Event would
be simpler, but this event is not currently fully functional, as
explained in the Introduction.
``` lang=c
/* Create the seek event */
if (data->rate > 0) {
seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
@ -329,7 +329,7 @@ position. Regardless of the playback direction, the start position must
be smaller than the stop position, so the two playback directions are
treated differently.
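The direction-dependent choice of start and stop values can be isolated into a small helper. This is our own sketch under the stated assumption that `-1` stands in for `GST_CLOCK_TIME_NONE`; the tutorial passes the equivalent values to `gst_event_new_seek()`:

``` lang=c
#include <assert.h>

/* Hypothetical sketch of the seek range: forward playback runs from the
 * current position to the end of the stream, reverse playback from the
 * start up to the current position, so start < stop holds either way. */
typedef struct { long long start; long long stop; } SeekBounds;

static SeekBounds
seek_bounds (double rate, long long position)
{
  SeekBounds b;
  if (rate > 0) {
    b.start = position;
    b.stop = -1;            /* -1: play until the end (GST_CLOCK_TIME_NONE) */
  } else {
    b.start = 0;
    b.stop = position;
  }
  return b;
}
```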
``` lang=c
if (data->video_sink == NULL) {
/* If we have not done so, obtain the sink through which we will send the seek events */
g_object_get (data->pipeline, "video-sink", &data->video_sink, NULL);
@ -343,7 +343,7 @@ at this time instead at initialization time because the actual sink may
change depending on the media contents, and this won't be known until
the pipeline is PLAYING and some media has been read.
``` lang=c
/* Send the event */
gst_element_send_event (data->video_sink, seek_event);
```
@ -354,13 +354,13 @@ The new Event is finally sent to the selected sink with
Back to the keyboard handler, we are still missing the frame stepping code,
which is really simple:
``` lang=c
case 'n':
if (data->video_sink == NULL) {
/* If we have not done so, obtain the sink through which we will send the step events */
g_object_get (data->pipeline, "video-sink", &data->video_sink, NULL);
}
gst_element_send_event (data->video_sink,
gst_event_new_step (GST_FORMAT_BUFFERS, 1, data->rate, TRUE, FALSE));
g_print ("Stepping one frame\n");
@ -401,9 +401,8 @@ It has been a pleasure having you here, and see you soon\!
## Attachments:
![](images/icons/bullet_blue.gif)
[basic-tutorial-13.c](attachments/327800/2424883.c) (text/plain)
![](images/icons/bullet_blue.gif)
[vs2010.zip](attachments/327800/2424884.zip) (application/zip)
Document generated by Confluence on Oct 08, 2015 10:27

@ -1,4 +1,4 @@
# Basic tutorial 14: Handy elements
This page last changed on May 13, 2014 by xartigas.
@ -37,11 +37,11 @@ a `decodebin2` element. It acts like a demuxer, so it offers as many
source pads as streams are found in the
media.
``` lang=bash
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! ffmpegcolorspace ! autovideosink
```
``` lang=bash
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! audioconvert ! autoaudiosink
```
@ -55,7 +55,7 @@ replaces the old `decodebin` element. It acts like a demuxer, so it
offers as many source pads as streams are found in the
media.
``` lang=bash
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! decodebin2 ! autovideosink
```
@ -69,7 +69,7 @@ using a `typefind` element or by setting the `typefind` property
of `filesrc` to
`TRUE`.
``` lang=c
gst-launch-0.10 filesrc location=f:\\media\\sintel\\sintel_trailer-480p.webm ! decodebin2 ! autovideosink
```
@ -79,7 +79,7 @@ This element writes to a file all the media it receives. Use the
`location` property to specify the file
name.
```
gst-launch-0.10 audiotestsrc ! vorbisenc ! oggmux ! filesink location=test.ogg
```
@ -91,7 +91,7 @@ This element receives data as a client over the network via HTTP using
the SOUP library. Set the URL to retrieve through the `location`
property.
``` lang=bash
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! decodebin2 ! autovideosink
```
@ -106,7 +106,7 @@ are “guaranteed” to work.
This element produces a video pattern (selectable among many different
options with the `pattern` property). Use it to test video pipelines.
``` lang=bash
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! autovideosink
```
@ -115,7 +115,7 @@ gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! autovideosink
This element produces an audio wave (selectable among many different
options with the `wave` property). Use it to test audio pipelines.
``` lang=bash
gst-launch-0.10 audiotestsrc ! audioconvert ! autoaudiosink
```
@ -137,7 +137,7 @@ elements whose Caps are unknown at design time, like `autovideosink`, or
that can vary depending on external factors, like decoding a
user-provided file.
``` lang=bash
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! autovideosink
```
@ -157,7 +157,7 @@ It is therefore a good idea to always use it whenever the actual frame
rate is unknown at design time, just in
case.
``` lang=c
gst-launch-0.10 videotestsrc ! video/x-raw-rgb,framerate=30/1 ! videorate ! video/x-raw-rgb,framerate=1/1 ! ffmpegcolorspace ! autovideosink
```
@ -178,7 +178,7 @@ user, it is a good idea to use a `videoscale` element, since not all
video sinks are capable of performing scaling
operations.
``` lang=bash
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! videoscale ! video/x-raw-yuv,width=178,height=100 ! ffmpegcolorspace ! autovideosink
```
@ -195,7 +195,7 @@ Like `ffmpegcolorspace` does for video, you use this to solve
negotiation problems with audio, and it is generally safe to use it
liberally, since this element does nothing if it is not needed.
``` lang=bash
gst-launch-0.10 audiotestsrc ! audioconvert ! autoaudiosink
```
@ -208,7 +208,7 @@ Again, use it to solve negotiation problems regarding sampling rates and
do not fear to use it
generously.
``` lang=bash
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! audioresample ! audio/x-raw-float,rate=4000 ! audioconvert ! autoaudiosink
```
@ -295,7 +295,7 @@ separate threads for each branch. Otherwise a blocked dataflow in one
branch would stall the other
branches.
```
gst-launch-0.10 audiotestsrc ! tee name=t ! queue ! audioconvert ! autoaudiosink t. ! queue ! wavescope ! ffmpegcolorspace ! autovideosink
```
@ -311,7 +311,7 @@ the `capsfilter` element. This element does not modify data as such,
but enforces limitations on the data
format.
``` theme: Default; brush: bash; gutter: false
``` lang=bash
gst-launch-0.10 videotestsrc ! video/x-raw-gray ! ffmpegcolorspace ! autovideosink
```
@ -338,7 +338,7 @@ equation. It can be very verbose when combined with the `-v` switch
of `gst-launch`, so use the `silent` property to remove any unwanted
noise.
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 audiotestsrc num-buffers=1000 ! fakesink sync=false
```
@ -350,7 +350,7 @@ checking, or buffer dropping. Read its documentation to learn all the
things this seemingly harmless element can
do.
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 audiotestsrc ! identity drop-probability=0.1 ! audioconvert ! autoaudiosink
```
@ -364,4 +364,3 @@ debugging purposes.
It has been a pleasure having you here, and see you soon\!
Document generated by Confluence on Oct 08, 2015 10:27


@ -1,4 +1,4 @@
# GStreamer SDK documentation : Basic tutorial 15: Clutter integration
# Basic tutorial 15: Clutter integration
This page last changed on Jul 11, 2012 by xartigas.
@ -36,33 +36,33 @@ Copy this code into a text file named `basic-tutorial-15.c`..
**basic-tutorial-15.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <clutter-gst/clutter-gst.h>
/* Setup the video texture once its size is known */
void size_change (ClutterActor *texture, gint width, gint height, gpointer user_data) {
ClutterActor *stage;
gfloat new_x, new_y, new_width, new_height;
gfloat stage_width, stage_height;
ClutterAnimation *animation = NULL;
stage = clutter_actor_get_stage (texture);
if (stage == NULL)
return;
clutter_actor_get_size (stage, &stage_width, &stage_height);
/* Center video on window and calculate new size preserving aspect ratio */
new_height = (height * stage_width) / width;
if (new_height <= stage_height) {
new_width = stage_width;
new_x = 0;
new_y = (stage_height - new_height) / 2;
} else {
new_width = (width * stage_height) / height;
new_height = stage_height;
new_x = (stage_width - new_width) / 2;
new_y = 0;
}
@ -73,31 +73,31 @@ void size_change (ClutterActor *texture, gint width, gint height, gpointer user_
animation = clutter_actor_animate (texture, CLUTTER_LINEAR, 10000, "rotation-angle-y", 360.0, NULL);
clutter_animation_set_loop (animation, TRUE);
}
int main(int argc, char *argv[]) {
GstElement *pipeline, *sink;
ClutterTimeline *timeline;
ClutterActor *stage, *texture;
/* clutter-gst takes care of initializing Clutter and GStreamer */
if (clutter_gst_init (&argc, &argv) != CLUTTER_INIT_SUCCESS) {
g_error ("Failed to initialize clutter\n");
return -1;
}
stage = clutter_stage_get_default ();
/* Make a timeline */
timeline = clutter_timeline_new (1000);
g_object_set(timeline, "loop", TRUE, NULL);
/* Create new texture and disable slicing so the video is properly mapped onto it */
texture = CLUTTER_ACTOR (g_object_new (CLUTTER_TYPE_TEXTURE, "disable-slicing", TRUE, NULL));
g_signal_connect (texture, "size-change", G_CALLBACK (size_change), NULL);
/* Build the GStreamer pipeline */
pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Instantiate the Clutter sink */
sink = gst_element_factory_make ("autocluttersink", NULL);
if (sink == NULL) {
@ -108,25 +108,25 @@ int main(int argc, char *argv[]) {
g_printerr ("Unable to find a Clutter sink.\n");
return -1;
}
/* Link GStreamer with Clutter by passing the Clutter texture to the Clutter sink*/
g_object_set (sink, "texture", texture, NULL);
/* Add the Clutter sink to the pipeline */
g_object_set (pipeline, "video-sink", sink, NULL);
/* Start playing */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
/* start the timeline */
clutter_timeline_start (timeline);
/* Add texture to the stage, and show it */
clutter_group_add (CLUTTER_GROUP (stage), texture);
clutter_actor_show_all (stage);
clutter_main();
/* Free resources */
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (pipeline);
@ -165,7 +165,7 @@ how to integrate GStreamer with it. This is accomplished through the
clutter-gst library, so its header must be included (and the program
must link against it):
``` first-line: 1; theme: Default; brush: cpp; gutter: true
``` lang=c
#include <clutter-gst/clutter-gst.h>
```
@ -173,7 +173,7 @@ The first thing this library does is initialize both GStreamer and
Clutter, so you must call `clutter_gst_init()` instead of initializing
these libraries yourself.
``` first-line: 43; theme: Default; brush: cpp; gutter: true
``` lang=c
/* clutter-gst takes care of initializing Clutter and GStreamer */
if (clutter_gst_init (&argc, &argv) != CLUTTER_INIT_SUCCESS) {
g_error ("Failed to initialize clutter\n");
@ -186,7 +186,7 @@ create a texture. Just remember to disable texture slicing to allow for
proper
integration:
``` first-line: 55; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Create new texture and disable slicing so the video is properly mapped onto it */
texture = CLUTTER_ACTOR (g_object_new (CLUTTER_TYPE_TEXTURE, "disable-slicing", TRUE, NULL));
g_signal_connect (texture, "size-change", G_CALLBACK (size_change), NULL);
@ -195,7 +195,7 @@ g_signal_connect (texture, "size-change", G_CALLBACK (size_change), NULL);
We connect to the size-change signal so we can perform final setup once
the video size is known.
``` theme: Default; brush: cpp; gutter: true
``` lang=c
/* Instantiate the Clutter sink */
sink = gst_element_factory_make ("autocluttersink", NULL);
if (sink == NULL) {
@ -216,14 +216,14 @@ release of the SDK, so, if it cannot be found, the
simpler `cluttersink` element is created
instead.
``` first-line: 73; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Link GStreamer with Clutter by passing the Clutter texture to the Clutter sink*/
g_object_set (sink, "texture", texture, NULL);
```
This texture is everything GStreamer needs to know about Clutter.
``` first-line: 76; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Add the Clutter sink to the pipeline */
g_object_set (pipeline, "video-sink", sink, NULL);
```
@ -252,4 +252,3 @@ This tutorial has shown:
It has been a pleasure having you here, and see you soon\!
Document generated by Confluence on Oct 08, 2015 10:27


@ -1,4 +1,4 @@
# GStreamer SDK documentation : Basic tutorial 16: Platform-specific elements
# Basic tutorial 16: Platform-specific elements
This page last changed on May 30, 2013 by xartigas.
@ -209,4 +209,3 @@ instancing them manually.
It has been a pleasure having you here, and see you soon\!
Document generated by Confluence on Oct 08, 2015 10:27


@ -1,4 +1,4 @@
# Basic tutorial 2: GStreamer concepts
# Basic tutorial 2: GStreamer concepts
## Goal
@ -24,28 +24,28 @@ in the SDK installation).
```
#include <gst/gst.h>
int main(int argc, char *argv[]) {
GstElement *pipeline, *source, *sink;
GstBus *bus;
GstMessage *msg;
GstStateChangeReturn ret;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Create the elements */
source = gst_element_factory_make ("videotestsrc", "source");
sink = gst_element_factory_make ("autovideosink", "sink");
/* Create the empty pipeline */
pipeline = gst_pipeline_new ("test-pipeline");
if (!pipeline || !source || !sink) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Build the pipeline */
gst_bin_add_many (GST_BIN (pipeline), source, sink, NULL);
if (gst_element_link (source, sink) != TRUE) {
@ -53,10 +53,10 @@ int main(int argc, char *argv[]) {
gst_object_unref (pipeline);
return -1;
}
/* Modify the source's properties */
g_object_set (source, "pattern", 0, NULL);
/* Start playing */
ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
@ -64,16 +64,16 @@ int main(int argc, char *argv[]) {
gst_object_unref (pipeline);
return -1;
}
/* Wait until error or EOS */
bus = gst_element_get_bus (pipeline);
msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
/* Parse message */
if (msg != NULL) {
GError *err;
gchar *debug_info;
switch (GST_MESSAGE_TYPE (msg)) {
case GST_MESSAGE_ERROR:
gst_message_parse_error (msg, &err, &debug_info);
@ -92,7 +92,7 @@ int main(int argc, char *argv[]) {
}
gst_message_unref (msg);
}
/* Free resources */
gst_object_unref (bus);
gst_element_set_state (pipeline, GST_STATE_NULL);
@ -101,7 +101,6 @@ int main(int argc, char *argv[]) {
}
```
> ![Information](images/icons/emoticons/information.png)
> Need help?
>
@ -253,12 +252,12 @@ pipelines](Basic+tutorial+3+Dynamic+pipelines.markdown).
/* Wait until error or EOS */
bus = gst_element_get_bus (pipeline);
msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
/* Parse message */
if (msg != NULL) {
GError *err;
gchar *debug_info;
switch (GST_MESSAGE_TYPE (msg)) {
case GST_MESSAGE_ERROR:
gst_message_parse_error (msg, &err, &debug_info);
@ -294,7 +293,7 @@ In this case, once we know the message contains an error (by using the
`GST_MESSAGE_TYPE()` macro), we can use
`gst_message_parse_error()` which returns a GLib `GError` error
structure and a string useful for debugging. Examine the code to see how
these are used and freed afterward.
these are used and freed afterward.
### The GStreamer bus
@ -349,4 +348,4 @@ concepts. The second one comes next.
Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon!
It has been a pleasure having you here, and see you soon!


@ -1,4 +1,4 @@
# Basic tutorial 3: Dynamic pipelines
# Basic tutorial 3: Dynamic pipelines
## Goal
@ -87,7 +87,7 @@ in the SDK installation).
```
#include <gst/gst.h>
/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
GstElement *pipeline;
@ -95,33 +95,33 @@ typedef struct _CustomData {
GstElement *convert;
GstElement *sink;
} CustomData;
/* Handler for the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *pad, CustomData *data);
int main(int argc, char *argv[]) {
CustomData data;
GstBus *bus;
GstMessage *msg;
GstStateChangeReturn ret;
gboolean terminate = FALSE;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Create the elements */
data.source = gst_element_factory_make ("uridecodebin", "source");
data.convert = gst_element_factory_make ("audioconvert", "convert");
data.sink = gst_element_factory_make ("autoaudiosink", "sink");
/* Create the empty pipeline */
data.pipeline = gst_pipeline_new ("test-pipeline");
if (!data.pipeline || !data.source || !data.convert || !data.sink) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Build the pipeline. Note that we are NOT linking the source at this
* point. We will do it later. */
gst_bin_add_many (GST_BIN (data.pipeline), data.source, data.convert , data.sink, NULL);
@ -130,13 +130,13 @@ int main(int argc, char *argv[]) {
gst_object_unref (data.pipeline);
return -1;
}
/* Set the URI to play */
g_object_set (data.source, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Connect to the pad-added signal */
g_signal_connect (data.source, "pad-added", G_CALLBACK (pad_added_handler), &data);
/* Start playing */
ret = gst_element_set_state (data.pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
@ -144,18 +144,18 @@ int main(int argc, char *argv[]) {
gst_object_unref (data.pipeline);
return -1;
}
/* Listen to the bus */
bus = gst_element_get_bus (data.pipeline);
do {
msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
GST_MESSAGE_STATE_CHANGED | GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
/* Parse message */
if (msg != NULL) {
GError *err;
gchar *debug_info;
switch (GST_MESSAGE_TYPE (msg)) {
case GST_MESSAGE_ERROR:
gst_message_parse_error (msg, &err, &debug_info);
@ -186,14 +186,14 @@ int main(int argc, char *argv[]) {
gst_message_unref (msg);
}
} while (!terminate);
/* Free resources */
gst_object_unref (bus);
gst_element_set_state (data.pipeline, GST_STATE_NULL);
gst_object_unref (data.pipeline);
return 0;
}
/* This function will be called by the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *new_pad, CustomData *data) {
GstPad *sink_pad = gst_element_get_static_pad (data->convert, "sink");
@ -201,15 +201,15 @@ static void pad_added_handler (GstElement *src, GstPad *new_pad, CustomData *dat
GstCaps *new_pad_caps = NULL;
GstStructure *new_pad_struct = NULL;
const gchar *new_pad_type = NULL;
g_print ("Received new pad '%s' from '%s':\n", GST_PAD_NAME (new_pad), GST_ELEMENT_NAME (src));
/* If our converter is already linked, we have nothing to do here */
if (gst_pad_is_linked (sink_pad)) {
g_print (" We are already linked. Ignoring.\n");
goto exit;
}
/* Check the new pad's type */
new_pad_caps = gst_pad_query_caps (new_pad, NULL);
new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
@ -218,7 +218,7 @@ static void pad_added_handler (GstElement *src, GstPad *new_pad, CustomData *dat
g_print (" It has type '%s' which is not raw audio. Ignoring.\n", new_pad_type);
goto exit;
}
/* Attempt the link */
ret = gst_pad_link (new_pad, sink_pad);
if (GST_PAD_LINK_FAILED (ret)) {
@ -226,18 +226,17 @@ static void pad_added_handler (GstElement *src, GstPad *new_pad, CustomData *dat
} else {
g_print (" Link succeeded (type '%s').\n", new_pad_type);
}
exit:
/* Unreference the new pad's caps, if we got them */
if (new_pad_caps != NULL)
gst_caps_unref (new_pad_caps);
/* Unreference the sink pad */
gst_object_unref (sink_pad);
}
```
> ![Information](images/icons/emoticons/information.png)
> Need help?
>
@ -519,5 +518,5 @@ to the [Playback tutorials](Playback+tutorials.markdown), and gain more
insight about the `playbin2` element.
Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon!
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon!


@ -1,4 +1,4 @@
# Basic tutorial 4: Time management
# Basic tutorial 4: Time management
## Goal
@ -35,9 +35,9 @@ in the SDK installation).
**basic-tutorial-4.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstElement *playbin; /* Our one and only element */
@ -47,36 +47,36 @@ typedef struct _CustomData {
gboolean seek_done; /* Have we performed the seek already? */
gint64 duration; /* How long does this media last, in nanoseconds */
} CustomData;
/* Forward definition of the message processing function */
static void handle_message (CustomData *data, GstMessage *msg);
int main(int argc, char *argv[]) {
CustomData data;
GstBus *bus;
GstMessage *msg;
GstStateChangeReturn ret;
data.playing = FALSE;
data.terminate = FALSE;
data.seek_enabled = FALSE;
data.seek_done = FALSE;
data.duration = GST_CLOCK_TIME_NONE;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Create the elements */
data.playbin = gst_element_factory_make ("playbin", "playbin");
if (!data.playbin) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Set the URI to play */
g_object_set (data.playbin, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Start playing */
ret = gst_element_set_state (data.playbin, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
@ -84,13 +84,13 @@ int main(int argc, char *argv[]) {
gst_object_unref (data.playbin);
return -1;
}
/* Listen to the bus */
bus = gst_element_get_bus (data.playbin);
do {
msg = gst_bus_timed_pop_filtered (bus, 100 * GST_MSECOND,
GST_MESSAGE_STATE_CHANGED | GST_MESSAGE_ERROR | GST_MESSAGE_EOS | GST_MESSAGE_DURATION);
/* Parse message */
if (msg != NULL) {
handle_message (&data, msg);
@ -98,23 +98,23 @@ int main(int argc, char *argv[]) {
/* We got no message, this means the timeout expired */
if (data.playing) {
gint64 current = -1;
/* Query the current position of the stream */
if (!gst_element_query_position (data.playbin, GST_TIME_FORMAT, &current)) {
g_printerr ("Could not query current position.\n");
}
/* If we didn't know it yet, query the stream duration */
if (!GST_CLOCK_TIME_IS_VALID (data.duration)) {
if (!gst_element_query_duration (data.playbin, GST_TIME_FORMAT, &data.duration)) {
g_printerr ("Could not query current duration.\n");
}
}
/* Print current position and total duration */
g_print ("Position %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\r",
GST_TIME_ARGS (current), GST_TIME_ARGS (data.duration));
/* If seeking is enabled, we have not done it yet, and the time is right, seek */
if (data.seek_enabled && !data.seek_done && current > 10 * GST_SECOND) {
g_print ("\nReached 10s, performing seek...\n");
@ -125,18 +125,18 @@ int main(int argc, char *argv[]) {
}
}
} while (!data.terminate);
/* Free resources */
gst_object_unref (bus);
gst_element_set_state (data.playbin, GST_STATE_NULL);
gst_object_unref (data.playbin);
return 0;
}
static void handle_message (CustomData *data, GstMessage *msg) {
GError *err;
gchar *debug_info;
switch (GST_MESSAGE_TYPE (msg)) {
case GST_MESSAGE_ERROR:
gst_message_parse_error (msg, &err, &debug_info);
@ -160,10 +160,10 @@ static void handle_message (CustomData *data, GstMessage *msg) {
if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin)) {
g_print ("Pipeline state changed from %s to %s:\n",
gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));
/* Remember whether we are in the PLAYING state or not */
data->playing = (new_state == GST_STATE_PLAYING);
if (data->playing) {
/* We just moved to PLAYING. Check if seeking is possible */
GstQuery *query;
@ -219,7 +219,7 @@ typedef struct _CustomData {
gboolean seek_done; /* Have we performed the seek already? */
gint64 duration; /* How long does this media last, in nanoseconds */
} CustomData;
/* Forward definition of the message processing function */
static void handle_message (CustomData *data, GstMessage *msg);
```
@ -378,7 +378,7 @@ case GST_MESSAGE_STATE_CHANGED: {
if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin)) {
g_print ("Pipeline state changed from %s to %s:\n",
gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));
/* Remember whether we are in the PLAYING state or not */
data->playing = (new_state == GST_STATE_PLAYING);
```
@ -414,7 +414,6 @@ if (data->playing) {
}
```
`gst_query_new_seeking()` creates a new query object of the "seeking"
type, with `GST_FORMAT_TIME` format. This indicates that we are
interested in seeking by specifying the new time to which we want to
@ -455,4 +454,3 @@ Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon!


@ -1,4 +1,4 @@
# Basic tutorial 5: GUI toolkit integration
# Basic tutorial 5: GUI toolkit integration
## Goal
@ -43,7 +43,6 @@ rendering.
>
> A GObject *interface* (which GStreamer uses) is a set of functions that an element can implement. If it does, then it is said to support that particular interface. For example, video sinks usually create their own windows to display video, but, if they are also capable of rendering to an external window, they can choose to implement the `GstVideoOverlay` interface and provide functions to specify this external window. From the application developer point of view, if a certain interface is supported, you can use it and forget about which kind of element is implementing it. Moreover, if you are using `playbin`, it will automatically expose some of the interfaces supported by its internal elements: You can use your interface functions directly on `playbin` without knowing who is implementing them!
Another issue is that GUI toolkits usually only allow manipulation of
the graphical “widgets” through the main (or application) thread,
whereas GStreamer usually spawns multiple threads to take care of
@ -72,11 +71,11 @@ in the SDK installation).
```
#include <string.h>
#include <gtk/gtk.h>
#include <gst/gst.h>
#include <gst/video/videooverlay.h>
#include <gdk/gdk.h>
#if defined (GDK_WINDOWING_X11)
#include <gdk/gdkx.h>
@ -85,29 +84,29 @@ in the SDK installation).
#elif defined (GDK_WINDOWING_QUARTZ)
#include <gdk/gdkquartz.h>
#endif
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstElement *playbin; /* Our one and only pipeline */
GtkWidget *slider; /* Slider widget to keep track of current position */
GtkWidget *streams_list; /* Text widget to display info about the streams */
gulong slider_update_signal_id; /* Signal ID for the slider update signal */
GstState state; /* Current state of the pipeline */
gint64 duration; /* Duration of the clip, in nanoseconds */
} CustomData;
/* This function is called when the GUI toolkit creates the physical window that will hold the video.
* At this point we can retrieve its handle (which has a different meaning depending on the windowing system)
* and pass it to GStreamer through the GstVideoOverlay interface. */
static void realize_cb (GtkWidget *widget, CustomData *data) {
GdkWindow *window = gtk_widget_get_window (widget);
guintptr window_handle;
if (!gdk_window_ensure_native (window))
g_error ("Couldn't create native window needed for GstVideoOverlay!");
/* Retrieve window handle from GDK */
#if defined (GDK_WINDOWING_WIN32)
window_handle = (guintptr)GDK_WINDOW_HWND (window);
@ -119,35 +118,35 @@ static void realize_cb (GtkWidget *widget, CustomData *data) {
/* Pass it to playbin, which implements GstVideoOverlay and will forward it to the video sink */
gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->playbin), window_handle);
}
/* This function is called when the PLAY button is clicked */
static void play_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_PLAYING);
}
/* This function is called when the PAUSE button is clicked */
static void pause_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_PAUSED);
}
/* This function is called when the STOP button is clicked */
static void stop_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_READY);
}
/* This function is called when the main window is closed */
static void delete_event_cb (GtkWidget *widget, GdkEvent *event, CustomData *data) {
stop_cb (NULL, data);
gtk_main_quit ();
}
/* This function is called every time the video window needs to be redrawn (due to damage/exposure,
* rescaling, etc). GStreamer takes care of this in the PAUSED and PLAYING states, otherwise,
* we simply draw a black rectangle to avoid garbage showing up. */
static gboolean draw_cb (GtkWidget *widget, cairo_t *cr, CustomData *data) {
if (data->state < GST_STATE_PAUSED) {
GtkAllocation allocation;
/* Cairo is a 2D graphics library which we use here to clean the video window.
* It is used by GStreamer for other reasons, so it will always be available to us. */
gtk_widget_get_allocation (widget, &allocation);
@ -156,10 +155,10 @@ static gboolean draw_cb (GtkWidget *widget, cairo_t *cr, CustomData *data) {
cairo_fill (cr);
cairo_destroy (cr);
}
return FALSE;
}
/* This function is called when the slider changes its position. We perform a seek to the
* new position here. */
static void slider_cb (GtkRange *range, CustomData *data) {
@ -167,7 +166,7 @@ static void slider_cb (GtkRange *range, CustomData *data) {
gst_element_seek_simple (data->playbin, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
(gint64)(value * GST_SECOND));
}
/* This creates all the GTK+ widgets that compose our application, and registers the callbacks */
static void create_ui (CustomData *data) {
GtkWidget *main_window; /* The uppermost window, containing all other windows */
@ -176,58 +175,58 @@ static void create_ui (CustomData *data) {
GtkWidget *main_hbox; /* HBox to hold the video_window and the stream info text widget */
GtkWidget *controls; /* HBox to hold the buttons and the slider */
GtkWidget *play_button, *pause_button, *stop_button; /* Buttons */
main_window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
g_signal_connect (G_OBJECT (main_window), "delete-event", G_CALLBACK (delete_event_cb), data);
video_window = gtk_drawing_area_new ();
gtk_widget_set_double_buffered (video_window, FALSE);
g_signal_connect (video_window, "realize", G_CALLBACK (realize_cb), data);
g_signal_connect (video_window, "draw", G_CALLBACK (draw_cb), data);
play_button = gtk_button_new_from_stock (GTK_STOCK_MEDIA_PLAY);
g_signal_connect (G_OBJECT (play_button), "clicked", G_CALLBACK (play_cb), data);
pause_button = gtk_button_new_from_stock (GTK_STOCK_MEDIA_PAUSE);
g_signal_connect (G_OBJECT (pause_button), "clicked", G_CALLBACK (pause_cb), data);
stop_button = gtk_button_new_from_stock (GTK_STOCK_MEDIA_STOP);
g_signal_connect (G_OBJECT (stop_button), "clicked", G_CALLBACK (stop_cb), data);
data->slider = gtk_hscale_new_with_range (0, 100, 1);
gtk_scale_set_draw_value (GTK_SCALE (data->slider), 0);
data->slider_update_signal_id = g_signal_connect (G_OBJECT (data->slider), "value-changed", G_CALLBACK (slider_cb), data);
data->streams_list = gtk_text_view_new ();
gtk_text_view_set_editable (GTK_TEXT_VIEW (data->streams_list), FALSE);
controls = gtk_box_new (GTK_ORIENTATION_HORIZONTAL, 0);
gtk_box_pack_start (GTK_BOX (controls), play_button, FALSE, FALSE, 2);
gtk_box_pack_start (GTK_BOX (controls), pause_button, FALSE, FALSE, 2);
gtk_box_pack_start (GTK_BOX (controls), stop_button, FALSE, FALSE, 2);
gtk_box_pack_start (GTK_BOX (controls), data->slider, TRUE, TRUE, 2);
main_hbox = gtk_box_new (GTK_ORIENTATION_HORIZONTAL, 0);
gtk_box_pack_start (GTK_BOX (main_hbox), video_window, TRUE, TRUE, 0);
gtk_box_pack_start (GTK_BOX (main_hbox), data->streams_list, FALSE, FALSE, 2);
main_box = gtk_box_new (GTK_ORIENTATION_VERTICAL, 0);
gtk_box_pack_start (GTK_BOX (main_box), main_hbox, TRUE, TRUE, 0);
gtk_box_pack_start (GTK_BOX (main_box), controls, FALSE, FALSE, 0);
gtk_container_add (GTK_CONTAINER (main_window), main_box);
gtk_window_set_default_size (GTK_WINDOW (main_window), 640, 480);
gtk_widget_show_all (main_window);
}
/* This function is called periodically to refresh the GUI */
static gboolean refresh_ui (CustomData *data) {
gint64 current = -1;
/* We do not want to update anything unless we are in the PAUSED or PLAYING states */
if (data->state < GST_STATE_PAUSED)
return TRUE;
/* If we didn't know it yet, query the stream duration */
if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
if (!gst_element_query_duration (data->playbin, GST_FORMAT_TIME, &data->duration)) {
@ -237,7 +236,7 @@ static gboolean refresh_ui (CustomData *data) {
gtk_range_set_range (GTK_RANGE (data->slider), 0, (gdouble)data->duration / GST_SECOND);
}
}
if (gst_element_query_position (data->playbin, GST_FORMAT_TIME, &current)) {
/* Block the "value-changed" signal, so the slider_cb function is not called
* (which would trigger a seek the user has not requested) */
@ -249,7 +248,7 @@ static gboolean refresh_ui (CustomData *data) {
}
return TRUE;
}
/* This function is called when new metadata is discovered in the stream */
static void tags_cb (GstElement *playbin, gint stream, CustomData *data) {
/* We are possibly in a GStreamer working thread, so we notify the main
@ -258,30 +257,30 @@ static void tags_cb (GstElement *playbin, gint stream, CustomData *data) {
gst_message_new_application (GST_OBJECT (playbin),
gst_structure_new ("tags-changed", NULL)));
}
/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
GError *err;
gchar *debug_info;
/* Print error details on the screen */
gst_message_parse_error (msg, &err, &debug_info);
g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
g_clear_error (&err);
g_free (debug_info);
/* Set the pipeline to READY (which stops playback) */
gst_element_set_state (data->playbin, GST_STATE_READY);
}
/* This function is called when an End-Of-Stream message is posted on the bus.
* We just set the pipeline to READY (which stops playback) */
static void eos_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
g_print ("End-Of-Stream reached.\n");
gst_element_set_state (data->playbin, GST_STATE_READY);
}
/* This function is called when the pipeline changes states. We use it to
* keep track of the current state. */
static void state_changed_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
@ -296,7 +295,7 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
}
}
}
/* Extract metadata from all the streams and write it to the text widget in the GUI */
static void analyze_streams (CustomData *data) {
gint i;
@ -305,16 +304,16 @@ static void analyze_streams (CustomData *data) {
guint rate;
gint n_video, n_audio, n_text;
GtkTextBuffer *text;
/* Clean current contents of the widget */
text = gtk_text_view_get_buffer (GTK_TEXT_VIEW (data->streams_list));
gtk_text_buffer_set_text (text, "", -1);
/* Read some properties */
g_object_get (data->playbin, "n-video", &n_video, NULL);
g_object_get (data->playbin, "n-audio", &n_audio, NULL);
g_object_get (data->playbin, "n-text", &n_text, NULL);
for (i = 0; i < n_video; i++) {
tags = NULL;
/* Retrieve the stream's video tags */
@ -331,7 +330,7 @@ static void analyze_streams (CustomData *data) {
gst_tag_list_free (tags);
}
}
for (i = 0; i < n_audio; i++) {
tags = NULL;
/* Retrieve the stream's audio tags */
@ -360,7 +359,7 @@ static void analyze_streams (CustomData *data) {
gst_tag_list_free (tags);
}
}
for (i = 0; i < n_text; i++) {
tags = NULL;
/* Retrieve the stream's subtitle tags */
@ -379,7 +378,7 @@ static void analyze_streams (CustomData *data) {
}
}
}
/* This function is called when an "application" message is posted on the bus.
* Here we retrieve the message posted by the tags_cb callback */
static void application_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
@ -389,41 +388,41 @@ static void application_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
analyze_streams (data);
}
}
int main(int argc, char *argv[]) {
CustomData data;
GstStateChangeReturn ret;
GstBus *bus;
/* Initialize GTK */
gtk_init (&argc, &argv);
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Initialize our data structure */
memset (&data, 0, sizeof (data));
data.duration = GST_CLOCK_TIME_NONE;
/* Create the elements */
data.playbin = gst_element_factory_make ("playbin", "playbin");
if (!data.playbin) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Set the URI to play */
g_object_set (data.playbin, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Connect to interesting signals in playbin */
g_signal_connect (G_OBJECT (data.playbin), "video-tags-changed", (GCallback) tags_cb, &data);
g_signal_connect (G_OBJECT (data.playbin), "audio-tags-changed", (GCallback) tags_cb, &data);
g_signal_connect (G_OBJECT (data.playbin), "text-tags-changed", (GCallback) tags_cb, &data);
/* Create the GUI */
create_ui (&data);
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
bus = gst_element_get_bus (data.playbin);
gst_bus_add_signal_watch (bus);
@ -432,7 +431,7 @@ int main(int argc, char *argv[]) {
g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, &data);
g_signal_connect (G_OBJECT (bus), "message::application", (GCallback)application_cb, &data);
gst_object_unref (bus);
/* Start playing */
ret = gst_element_set_state (data.playbin, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
@ -440,13 +439,13 @@ int main(int argc, char *argv[]) {
gst_object_unref (data.playbin);
return -1;
}
/* Register a function that GLib will call every second */
g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);
/* Start the GTK main loop. We will not regain control until gtk_main_quit is called. */
gtk_main ();
/* Free resources */
gst_element_set_state (data.playbin, GST_STATE_NULL);
gst_object_unref (data.playbin);
@ -504,25 +503,25 @@ int main(int argc, char *argv[]) {
CustomData data;
GstStateChangeReturn ret;
GstBus *bus;
/* Initialize GTK */
gtk_init (&argc, &argv);
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Initialize our data structure */
memset (&data, 0, sizeof (data));
data.duration = GST_CLOCK_TIME_NONE;
/* Create the elements */
data.playbin = gst_element_factory_make ("playbin", "playbin");
if (!data.playbin) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Set the URI to play */
g_object_set (data.playbin, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
```
@ -608,10 +607,10 @@ documentation of the signal.
static void realize_cb (GtkWidget *widget, CustomData *data) {
GdkWindow *window = gtk_widget_get_window (widget);
guintptr window_handle;
if (!gdk_window_ensure_native (window))
g_error ("Couldn't create native window needed for GstVideoOverlay!");
/* Retrieve window handle from GDK */
#if defined (GDK_WINDOWING_WIN32)
window_handle = (guintptr)GDK_WINDOW_HWND (window);
@ -642,12 +641,12 @@ this process a lot!
static void play_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_PLAYING);
}
/* This function is called when the PAUSE button is clicked */
static void pause_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_PAUSED);
}
/* This function is called when the STOP button is clicked */
static void stop_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_READY);
@ -684,7 +683,7 @@ static gboolean expose_cb (GtkWidget *widget, GdkEventExpose *event, CustomData
GtkAllocation allocation;
GdkWindow *window = gtk_widget_get_window (widget);
cairo_t *cr;
/* Cairo is a 2D graphics library which we use here to clean the video window.
* It is used by GStreamer for other reasons, so it will always be available to us. */
gtk_widget_get_allocation (widget, &allocation);
@ -694,7 +693,7 @@ static gboolean expose_cb (GtkWidget *widget, GdkEventExpose *event, CustomData
cairo_fill (cr);
cairo_destroy (cr);
}
return FALSE;
}
```
@ -735,7 +734,7 @@ allow any seek to complete before a new one is queued.
/* This function is called periodically to refresh the GUI */
static gboolean refresh_ui (CustomData *data) {
gint64 current = -1;
/* We do not want to update anything unless we are in the PAUSED or PLAYING states */
if (data->state < GST_STATE_PAUSED)
return TRUE;
@ -886,5 +885,3 @@ The following basic tutorials keep focusing on other individual
GStreamer topics
It has been a pleasure having you here, and see you soon!

@ -1,4 +1,4 @@
# Basic tutorial 6: Media formats and Pad Capabilities
## Goal
@ -89,7 +89,7 @@ SRC template: 'src'
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
format: { I420, NV12, NV21, YV12, YUY2, Y42B, Y444, YUV9, YVU9, Y41B, Y800, Y8, GREY, Y16 , UYVY, YVYU, IYU1, v308, AYUV, A420 }
```
`video/x-raw` indicates that this source pad outputs raw video. It
@ -124,21 +124,21 @@ in the SDK installation).
```
#include <gst/gst.h>
/* Functions below print the Capabilities in a human-friendly format */
static gboolean print_field (GQuark field, const GValue * value, gpointer pfx) {
gchar *str = gst_value_serialize (value);
g_print ("%s %15s: %s\n", (gchar *) pfx, g_quark_to_string (field), str);
g_free (str);
return TRUE;
}
static void print_caps (const GstCaps * caps, const gchar * pfx) {
guint i;
g_return_if_fail (caps != NULL);
if (gst_caps_is_any (caps)) {
g_print ("%sANY\n", pfx);
return;
@ -147,38 +147,38 @@ static void print_caps (const GstCaps * caps, const gchar * pfx) {
g_print ("%sEMPTY\n", pfx);
return;
}
for (i = 0; i < gst_caps_get_size (caps); i++) {
GstStructure *structure = gst_caps_get_structure (caps, i);
g_print ("%s%s\n", pfx, gst_structure_get_name (structure));
gst_structure_foreach (structure, print_field, (gpointer) pfx);
}
}
/* Prints information about a Pad Template, including its Capabilities */
static void print_pad_templates_information (GstElementFactory * factory) {
const GList *pads;
GstStaticPadTemplate *padtemplate;
g_print ("Pad Templates for %s:\n", gst_element_factory_get_longname (factory));
if (!gst_element_factory_get_num_pad_templates (factory)) {
g_print (" none\n");
return;
}
pads = gst_element_factory_get_static_pad_templates (factory);
while (pads) {
padtemplate = pads->data;
pads = g_list_next (pads);
if (padtemplate->direction == GST_PAD_SRC)
g_print (" SRC template: '%s'\n", padtemplate->name_template);
else if (padtemplate->direction == GST_PAD_SINK)
g_print (" SINK template: '%s'\n", padtemplate->name_template);
else
g_print (" UNKNOWN!!! template: '%s'\n", padtemplate->name_template);
if (padtemplate->presence == GST_PAD_ALWAYS)
g_print (" Availability: Always\n");
else if (padtemplate->presence == GST_PAD_SOMETIMES)
@ -187,7 +187,7 @@ static void print_pad_templates_information (GstElementFactory * factory) {
g_print (" Availability: On request\n");
} else
g_print (" Availability: UNKNOWN!!!\n");
if (padtemplate->static_caps.string) {
GstCaps *caps;
g_print (" Capabilities:\n");
@ -196,35 +196,35 @@ static void print_pad_templates_information (GstElementFactory * factory) {
gst_caps_unref (caps);
}
g_print ("\n");
}
}
/* Shows the CURRENT capabilities of the requested pad in the given element */
static void print_pad_capabilities (GstElement *element, gchar *pad_name) {
GstPad *pad = NULL;
GstCaps *caps = NULL;
/* Retrieve pad */
pad = gst_element_get_static_pad (element, pad_name);
if (!pad) {
g_printerr ("Could not retrieve pad '%s'\n", pad_name);
return;
}
/* Retrieve negotiated caps (or acceptable caps if negotiation is not finished yet) */
caps = gst_pad_get_current_caps (pad);
if (!caps)
caps = gst_pad_query_caps (pad, NULL);
/* Print and free */
g_print ("Caps for the %s pad:\n", pad_name);
print_caps (caps, " ");
gst_caps_unref (caps);
gst_object_unref (pad);
}
int main(int argc, char *argv[]) {
GstElement *pipeline, *source, *sink;
GstElementFactory *source_factory, *sink_factory;
@ -232,10 +232,10 @@ int main(int argc, char *argv[]) {
GstMessage *msg;
GstStateChangeReturn ret;
gboolean terminate = FALSE;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Create the element factories */
source_factory = gst_element_factory_find ("audiotestsrc");
sink_factory = gst_element_factory_find ("autoaudiosink");
@ -243,23 +243,23 @@ int main(int argc, char *argv[]) {
g_printerr ("Not all element factories could be created.\n");
return -1;
}
/* Print information about the pad templates of these factories */
print_pad_templates_information (source_factory);
print_pad_templates_information (sink_factory);
/* Ask the factories to instantiate actual elements */
source = gst_element_factory_create (source_factory, "source");
sink = gst_element_factory_create (sink_factory, "sink");
/* Create the empty pipeline */
pipeline = gst_pipeline_new ("test-pipeline");
if (!pipeline || !source || !sink) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Build the pipeline */
gst_bin_add_many (GST_BIN (pipeline), source, sink, NULL);
if (gst_element_link (source, sink) != TRUE) {
@ -267,28 +267,28 @@ int main(int argc, char *argv[]) {
gst_object_unref (pipeline);
return -1;
}
/* Print initial negotiated caps (in NULL state) */
g_print ("In NULL state:\n");
print_pad_capabilities (sink, "sink");
/* Start playing */
ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
g_printerr ("Unable to set the pipeline to the playing state (check the bus for error messages).\n");
}
/* Wait until error, EOS or State Change */
bus = gst_element_get_bus (pipeline);
do {
msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS |
GST_MESSAGE_STATE_CHANGED);
/* Parse message */
if (msg != NULL) {
GError *err;
gchar *debug_info;
switch (GST_MESSAGE_TYPE (msg)) {
case GST_MESSAGE_ERROR:
gst_message_parse_error (msg, &err, &debug_info);
@ -321,7 +321,7 @@ int main(int argc, char *argv[]) {
gst_message_unref (msg);
}
} while (!terminate);
/* Free resources */
gst_object_unref (bus);
gst_element_set_state (pipeline, GST_STATE_NULL);
@ -332,7 +332,6 @@ int main(int argc, char *argv[]) {
}
```
> ![Information](images/icons/emoticons/information.png)
> Need help?
>
@ -359,19 +358,19 @@ Caps.
static void print_pad_capabilities (GstElement *element, gchar *pad_name) {
GstPad *pad = NULL;
GstCaps *caps = NULL;
/* Retrieve pad */
pad = gst_element_get_static_pad (element, pad_name);
if (!pad) {
g_printerr ("Could not retrieve pad '%s'\n", pad_name);
return;
}
/* Retrieve negotiated caps (or acceptable caps if negotiation is not finished yet) */
caps = gst_pad_get_current_caps (pad);
if (!caps)
caps = gst_pad_query_caps (pad, NULL);
/* Print and free */
g_print ("Caps for the %s pad:\n", pad_name);
print_caps (caps, " ");
@ -403,11 +402,11 @@ if (!source_factory || !sink_factory) {
g_printerr ("Not all element factories could be created.\n");
return -1;
}
/* Print information about the pad templates of these factories */
print_pad_templates_information (source_factory);
print_pad_templates_information (sink_factory);
/* Ask the factories to instantiate actual elements */
source = gst_element_factory_create (source_factory, "source");
sink = gst_element_factory_create (sink_factory, "sink");
@ -473,6 +472,5 @@ Next tutorial shows how data can be manually injected into and extracted
from the GStreamer pipeline.
Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon!

@ -1,4 +1,4 @@
# Basic tutorial 7: Multithreading and Pad Availability
## Goal
@ -84,9 +84,9 @@ in the SDK installation).
**basic-tutorial-7.c**
```
``` lang=c
#include <gst/gst.h>
int main(int argc, char *argv[]) {
GstElement *pipeline, *audio_source, *tee, *audio_queue, *audio_convert, *audio_resample, *audio_sink;
GstElement *video_queue, *visual, *video_convert, *video_sink;
@ -95,10 +95,10 @@ int main(int argc, char *argv[]) {
GstPadTemplate *tee_src_pad_template;
GstPad *tee_audio_pad, *tee_video_pad;
GstPad *queue_audio_pad, *queue_video_pad;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Create the elements */
audio_source = gst_element_factory_make ("audiotestsrc", "audio_source");
tee = gst_element_factory_make ("tee", "tee");
@ -110,20 +110,20 @@ int main(int argc, char *argv[]) {
visual = gst_element_factory_make ("wavescope", "visual");
video_convert = gst_element_factory_make ("videoconvert", "csp");
video_sink = gst_element_factory_make ("autovideosink", "video_sink");
/* Create the empty pipeline */
pipeline = gst_pipeline_new ("test-pipeline");
if (!pipeline || !audio_source || !tee || !audio_queue || !audio_convert || !audio_resample || !audio_sink ||
!video_queue || !visual || !video_convert || !video_sink) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Configure elements */
g_object_set (audio_source, "freq", 215.0f, NULL);
g_object_set (visual, "shader", 0, "style", 1, NULL);
/* Link all elements that can be automatically linked because they have "Always" pads */
gst_bin_add_many (GST_BIN (pipeline), audio_source, tee, audio_queue, audio_convert, audio_resample, audio_sink,
video_queue, visual, video_convert, video_sink, NULL);
@ -134,7 +134,7 @@ int main(int argc, char *argv[]) {
gst_object_unref (pipeline);
return -1;
}
/* Manually link the Tee, which has "Request" pads */
tee_src_pad_template = gst_element_class_get_pad_template (GST_ELEMENT_GET_CLASS (tee), "src_%d");
tee_audio_pad = gst_element_request_pad (tee, tee_src_pad_template, NULL, NULL);
@ -151,26 +151,26 @@ int main(int argc, char *argv[]) {
}
gst_object_unref (queue_audio_pad);
gst_object_unref (queue_video_pad);
/* Start playing the pipeline */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
/* Wait until error or EOS */
bus = gst_element_get_bus (pipeline);
msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
/* Release the request pads from the Tee, and unref them */
gst_element_release_request_pad (tee, tee_audio_pad);
gst_element_release_request_pad (tee, tee_video_pad);
gst_object_unref (tee_audio_pad);
gst_object_unref (tee_video_pad);
/* Free resources */
if (msg != NULL)
gst_message_unref (msg);
gst_object_unref (bus);
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (pipeline);
return 0;
}
@ -185,7 +185,7 @@ int main(int argc, char *argv[]) {
>
>If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Run), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Run) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Run).
>
> This tutorial plays an audible tone through the audio card and opens a window with a waveform representation of the tone. The waveform should be a sinusoid, but due to the refreshing of the window might not appear so.
>
> Required libraries: `gstreamer-1.0`
@ -330,5 +330,3 @@ The next tutorial builds on top of this one to show how data can be
manually injected into and extracted from a running pipeline.
It has been a pleasure having you here, and see you soon!

@ -1,4 +1,4 @@
# Basic tutorial 8: Short-cutting the pipeline
## Goal
@ -98,24 +98,24 @@ in the SDK installation).
#include <gst/gst.h>
#include <gst/audio/audio.h>
#include <string.h>
#define CHUNK_SIZE 1024 /* Amount of bytes we are sending in each buffer */
#define SAMPLE_RATE 44100 /* Samples per second we are sending */
/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
GstElement *pipeline, *app_source, *tee, *audio_queue, *audio_convert1, *audio_resample, *audio_sink;
GstElement *video_queue, *audio_convert2, *visual, *video_convert, *video_sink;
GstElement *app_queue, *app_sink;
guint64 num_samples; /* Number of samples generated so far (for timestamp generation) */
gfloat a, b, c, d; /* For waveform generation */
guint sourceid; /* To control the GSource */
GMainLoop *main_loop; /* GLib's Main Loop */
} CustomData;
/* This method is called by the idle GSource in the mainloop, to feed CHUNK_SIZE bytes into appsrc.
* The idle handler is added to the mainloop when appsrc requests us to start sending data (need-data signal)
* and is removed when appsrc has enough data (enough-data signal).
@ -128,14 +128,14 @@ static gboolean push_data (CustomData *data) {
gint16 *raw;
gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
gfloat freq;
/* Create a new empty buffer */
buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);
/* Set its timestamp and duration */
GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (data->num_samples, GST_SECOND, SAMPLE_RATE);
GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (CHUNK_SIZE, GST_SECOND, SAMPLE_RATE);
/* Generate some psychedelic waveforms */
gst_buffer_map (buffer, &map, GST_MAP_WRITE);
raw = (gint16 *)map.data;
@ -149,21 +149,21 @@ static gboolean push_data (CustomData *data) {
}
gst_buffer_unmap (buffer, &map);
data->num_samples += num_samples;
/* Push the buffer into the appsrc */
g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);
/* Free the buffer now that we are done with it */
gst_buffer_unref (buffer);
if (ret != GST_FLOW_OK) {
/* We got some error, stop sending data */
return FALSE;
}
return TRUE;
}
/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
* to the mainloop to start pushing data into the appsrc */
static void start_feed (GstElement *source, guint size, CustomData *data) {
@ -172,7 +172,7 @@ static void start_feed (GstElement *source, guint size, CustomData *data) {
data->sourceid = g_idle_add ((GSourceFunc) push_data, data);
}
}
/* This callback triggers when appsrc has enough data and we can stop sending.
* We remove the idle handler from the mainloop */
static void stop_feed (GstElement *source, CustomData *data) {
@ -182,11 +182,11 @@ static void stop_feed (GstElement *source, CustomData *data) {
data->sourceid = 0;
}
}
/* The appsink has received a buffer */
static void new_sample (GstElement *sink, CustomData *data) {
GstSample *sample;
/* Retrieve the buffer */
g_signal_emit_by_name (sink, "pull-sample", &sample);
if (sample) {
@ -195,22 +195,22 @@ static void new_sample (GstElement *sink, CustomData *data) {
gst_sample_unref (sample);
}
}
/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
GError *err;
gchar *debug_info;
/* Print error details on the screen */
gst_message_parse_error (msg, &err, &debug_info);
g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
g_clear_error (&err);
g_free (debug_info);
g_main_loop_quit (data->main_loop);
}
int main(int argc, char *argv[]) {
CustomData data;
GstPadTemplate *tee_src_pad_template;
@ -219,15 +219,15 @@ int main(int argc, char *argv[]) {
GstAudioInfo info;
GstCaps *audio_caps;
GstBus *bus;
/* Initialize custom data structure */
memset (&data, 0, sizeof (data));
data.b = 1; /* For waveform generation */
data.d = 1;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Create the elements */
data.app_source = gst_element_factory_make ("appsrc", "audio_source");
data.tee = gst_element_factory_make ("tee", "tee");
@ -242,35 +242,35 @@ int main(int argc, char *argv[]) {
data.video_sink = gst_element_factory_make ("autovideosink", "video_sink");
data.app_queue = gst_element_factory_make ("queue", "app_queue");
data.app_sink = gst_element_factory_make ("appsink", "app_sink");
/* Create the empty pipeline */
data.pipeline = gst_pipeline_new ("test-pipeline");
if (!data.pipeline || !data.app_source || !data.tee || !data.audio_queue || !data.audio_convert1 ||
!data.audio_resample || !data.audio_sink || !data.video_queue || !data.audio_convert2 || !data.visual ||
!data.video_convert || !data.video_sink || !data.app_queue || !data.app_sink) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Configure wavescope */
g_object_set (data.visual, "shader", 0, "style", 0, NULL);
/* Configure appsrc */
gst_audio_info_set_format (&info, GST_AUDIO_FORMAT_S16, SAMPLE_RATE, 1, NULL);
audio_caps = gst_audio_info_to_caps (&info);
g_object_set (data.app_source, "caps", audio_caps, "format", GST_FORMAT_TIME, NULL);
g_signal_connect (data.app_source, "need-data", G_CALLBACK (start_feed), &data);
g_signal_connect (data.app_source, "enough-data", G_CALLBACK (stop_feed), &data);
/* Configure appsink */
g_object_set (data.app_sink, "emit-signals", TRUE, "caps", audio_caps, NULL);
g_signal_connect (data.app_sink, "new-sample", G_CALLBACK (new_sample), &data);
gst_caps_unref (audio_caps);
g_free (audio_caps_text);
/* Link all elements that can be automatically linked because they have "Always" pads */
gst_bin_add_many (GST_BIN (data.pipeline), data.app_source, data.tee, data.audio_queue, data.audio_convert1, data.audio_resample,
data.audio_sink, data.video_queue, data.audio_convert2, data.visual, data.video_convert, data.video_sink, data.app_queue,
data.app_sink, NULL);
if (gst_element_link_many (data.app_source, data.tee, NULL) != TRUE ||
@ -281,7 +281,7 @@ int main(int argc, char *argv[]) {
gst_object_unref (data.pipeline);
return -1;
}
/* Manually link the Tee, which has "Request" pads */
tee_src_pad_template = gst_element_class_get_pad_template (GST_ELEMENT_GET_CLASS (data.tee), "src_%d");
tee_audio_pad = gst_element_request_pad (data.tee, tee_src_pad_template, NULL, NULL);
@ -303,20 +303,20 @@ int main(int argc, char *argv[]) {
gst_object_unref (queue_audio_pad);
gst_object_unref (queue_video_pad);
gst_object_unref (queue_app_pad);
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
bus = gst_element_get_bus (data.pipeline);
gst_bus_add_signal_watch (bus);
g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
gst_object_unref (bus);
/* Start playing the pipeline */
gst_element_set_state (data.pipeline, GST_STATE_PLAYING);
/* Create a GLib Main Loop and set it to run */
data.main_loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.main_loop);
/* Release the request pads from the Tee, and unref them */
gst_element_release_request_pad (data.tee, tee_audio_pad);
gst_element_release_request_pad (data.tee, tee_video_pad);
@ -324,7 +324,7 @@ int main(int argc, char *argv[]) {
gst_object_unref (tee_audio_pad);
gst_object_unref (tee_video_pad);
gst_object_unref (tee_app_pad);
/* Free resources */
gst_element_set_state (data.pipeline, GST_STATE_NULL);
gst_object_unref (data.pipeline);
@ -355,7 +355,7 @@ Always Pads, and manually link the Request Pads of the `tee` element.
Regarding the configuration of the `appsrc` and `appsink` elements:
```
``` lang=c
/* Configure appsrc */
audio_caps_text = g_strdup_printf (AUDIO_CAPS, SAMPLE_RATE);
audio_caps = gst_caps_from_string (audio_caps_text);
@ -376,7 +376,7 @@ fired by `appsrc` when its internal queue of data is running low or
almost full, respectively. We will use these signals to start and stop
(respectively) our signal generation process.
```
``` lang=c
/* Configure appsink */
g_object_set (data.app_sink, "emit-signals", TRUE, "caps", audio_caps, NULL);
g_signal_connect (data.app_sink, "new-sample", G_CALLBACK (new_sample), &data);
@ -393,7 +393,7 @@ Starting the pipeline, waiting for messages and final cleanup is done as
usual. Let's review the callbacks we have just
registered:
```
``` lang=c
/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
* to the mainloop to start pushing data into the appsrc */
static void start_feed (GstElement *source, guint size, CustomData *data) {
@ -422,7 +422,7 @@ We take note of the sourceid that `g_idle_add()` returns, so we can
disable it
later.
```
``` lang=c
/* This callback triggers when appsrc has enough data and we can stop sending.
* We remove the idle handler from the mainloop */
static void stop_feed (GstElement *source, CustomData *data) {
@ -439,7 +439,7 @@ enough so we stop pushing data. Here we simply remove the idle function
by using `g_source_remove()` (The idle function is implemented as a
`GSource`).
```
``` lang=c
/* This method is called by the idle GSource in the mainloop, to feed CHUNK_SIZE bytes into appsrc.
* The idle handler is added to the mainloop when appsrc requests us to start sending data (need-data signal)
* and is removed when appsrc has enough data (enough-data signal).
@ -451,14 +451,14 @@ static gboolean push_data (CustomData *data) {
gint16 *raw;
gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
gfloat freq;
/* Create a new empty buffer */
buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);
/* Set its timestamp and duration */
GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (data->num_samples, GST_SECOND, SAMPLE_RATE);
GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (CHUNK_SIZE, GST_SECOND, SAMPLE_RATE);
/* Generate some psychedelic waveforms */
raw = (gint16 *)GST_BUFFER_DATA (buffer);
```
@ -489,10 +489,10 @@ We will skip over the waveform generation, since it is outside the scope
of this tutorial (it is simply a funny way of generating a pretty
psychedelic wave).
```
``` lang=c
/* Push the buffer into the appsrc */
g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);
/* Free the buffer now that we are done with it */
gst_buffer_unref (buffer);
```
@ -503,11 +503,10 @@ tutorial 1: Playbin2
usage](Playback+tutorial+1+Playbin2+usage.markdown)), and then
`gst_buffer_unref()` it since we no longer need it.
```
``` lang=c
/* The appsink has received a buffer */
static void new_sample (GstElement *sink, CustomData *data) {
GstSample *sample;
/* Retrieve the buffer */
g_signal_emit_by_name (sink, "pull-sample", &sample);
if (sample) {
@ -544,4 +543,4 @@ different way. [Playback tutorial 3: Short-cutting the
pipeline](Playback+tutorial+3+Short-cutting+the+pipeline.markdown) shows
how to do it.
It has been a pleasure having you here, and see you soon\!

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Basic tutorial 9: Media information gathering
# Basic tutorial 9: Media information gathering
This page last changed on May 30, 2012 by xartigas.
@ -81,44 +81,44 @@ in the SDK installation).
**basic-tutorial-9.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <string.h>
#include <gst/gst.h>
#include <gst/pbutils/pbutils.h>
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstDiscoverer *discoverer;
GMainLoop *loop;
} CustomData;
/* Print a tag in a human-readable format (name: value) */
static void print_tag_foreach (const GstTagList *tags, const gchar *tag, gpointer user_data) {
GValue val = { 0, };
gchar *str;
gint depth = GPOINTER_TO_INT (user_data);
gst_tag_list_copy_value (&val, tags, tag);
if (G_VALUE_HOLDS_STRING (&val))
str = g_value_dup_string (&val);
else
str = gst_value_serialize (&val);
g_print ("%*s%s: %s\n", 2 * depth, " ", gst_tag_get_nick (tag), str);
g_free (str);
g_value_unset (&val);
}
/* Print information regarding a stream */
static void print_stream_info (GstDiscovererStreamInfo *info, gint depth) {
gchar *desc = NULL;
GstCaps *caps;
const GstTagList *tags;
caps = gst_discoverer_stream_info_get_caps (info);
if (caps) {
if (gst_caps_is_fixed (caps))
desc = gst_pb_utils_get_codec_description (caps);
@ -126,37 +126,37 @@ static void print_stream_info (GstDiscovererStreamInfo *info, gint depth) {
desc = gst_caps_to_string (caps);
gst_caps_unref (caps);
}
g_print ("%*s%s: %s\n", 2 * depth, " ", gst_discoverer_stream_info_get_stream_type_nick (info), (desc ? desc : ""));
if (desc) {
g_free (desc);
desc = NULL;
}
tags = gst_discoverer_stream_info_get_tags (info);
if (tags) {
g_print ("%*sTags:\n", 2 * (depth + 1), " ");
gst_tag_list_foreach (tags, print_tag_foreach, GINT_TO_POINTER (depth + 2));
}
}
/* Print information regarding a stream and its substreams, if any */
static void print_topology (GstDiscovererStreamInfo *info, gint depth) {
GstDiscovererStreamInfo *next;
if (!info)
return;
print_stream_info (info, depth);
next = gst_discoverer_stream_info_get_next (info);
if (next) {
print_topology (next, depth + 1);
gst_discoverer_stream_info_unref (next);
} else if (GST_IS_DISCOVERER_CONTAINER_INFO (info)) {
GList *tmp, *streams;
streams = gst_discoverer_container_info_get_streams (GST_DISCOVERER_CONTAINER_INFO (info));
for (tmp = streams; tmp; tmp = tmp->next) {
GstDiscovererStreamInfo *tmpinf = (GstDiscovererStreamInfo *) tmp->data;
@ -165,7 +165,7 @@ static void print_topology (GstDiscovererStreamInfo *info, gint depth) {
gst_discoverer_stream_info_list_free (streams);
}
}
/* This function is called every time the discoverer has information regarding
* one of the URIs we provided.*/
static void on_discovered_cb (GstDiscoverer *discoverer, GstDiscovererInfo *info, GError *err, CustomData *data) {
@ -173,7 +173,7 @@ static void on_discovered_cb (GstDiscoverer *discoverer, GstDiscovererInfo *info
const gchar *uri;
const GstTagList *tags;
GstDiscovererStreamInfo *sinfo;
uri = gst_discoverer_info_get_uri (info);
result = gst_discoverer_info_get_result (info);
switch (result) {
@ -192,10 +192,10 @@ static void on_discovered_cb (GstDiscoverer *discoverer, GstDiscovererInfo *info
case GST_DISCOVERER_MISSING_PLUGINS:{
const GstStructure *s;
gchar *str;
s = gst_discoverer_info_get_misc (info);
str = gst_structure_to_string (s);
g_print ("Missing plugins: %s\n", str);
g_free (str);
break;
@ -204,65 +204,65 @@ static void on_discovered_cb (GstDiscoverer *discoverer, GstDiscovererInfo *info
g_print ("Discovered '%s'\n", uri);
break;
}
if (result != GST_DISCOVERER_OK) {
g_printerr ("This URI cannot be played\n");
return;
}
/* If we got no error, show the retrieved information */
g_print ("\nDuration: %" GST_TIME_FORMAT "\n", GST_TIME_ARGS (gst_discoverer_info_get_duration (info)));
tags = gst_discoverer_info_get_tags (info);
if (tags) {
g_print ("Tags:\n");
gst_tag_list_foreach (tags, print_tag_foreach, GINT_TO_POINTER (1));
}
g_print ("Seekable: %s\n", (gst_discoverer_info_get_seekable (info) ? "yes" : "no"));
g_print ("\n");
sinfo = gst_discoverer_info_get_stream_info (info);
if (!sinfo)
return;
g_print ("Stream information:\n");
print_topology (sinfo, 1);
gst_discoverer_stream_info_unref (sinfo);
g_print ("\n");
}
/* This function is called when the discoverer has finished examining
* all the URIs we provided.*/
static void on_finished_cb (GstDiscoverer *discoverer, CustomData *data) {
g_print ("Finished discovering\n");
g_main_loop_quit (data->loop);
}
int main (int argc, char **argv) {
CustomData data;
GError *err = NULL;
gchar *uri = "http://docs.gstreamer.com/media/sintel_trailer-480p.webm";
/* if a URI was provided, use it instead of the default one */
if (argc > 1) {
uri = argv[1];
}
/* Initialize custom data structure */
memset (&data, 0, sizeof (data));
/* Initialize GStreamer */
gst_init (&argc, &argv);
g_print ("Discovering '%s'\n", uri);
/* Instantiate the Discoverer */
data.discoverer = gst_discoverer_new (5 * GST_SECOND, &err);
if (!data.discoverer) {
@ -270,32 +270,32 @@ int main (int argc, char **argv) {
g_clear_error (&err);
return -1;
}
/* Connect to the interesting signals */
g_signal_connect (data.discoverer, "discovered", G_CALLBACK (on_discovered_cb), &data);
g_signal_connect (data.discoverer, "finished", G_CALLBACK (on_finished_cb), &data);
/* Start the discoverer process (nothing to do yet) */
gst_discoverer_start (data.discoverer);
/* Add a request to process asynchronously the URI passed through the command line */
if (!gst_discoverer_discover_uri_async (data.discoverer, uri)) {
g_print ("Failed to start discovering URI '%s'\n", uri);
g_object_unref (data.discoverer);
return -1;
}
/* Create a GLib Main Loop and set it to run, so we can wait for the signals */
data.loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.loop);
/* Stop the discoverer process */
gst_discoverer_stop (data.discoverer);
/* Free resources */
g_object_unref (data.discoverer);
g_main_loop_unref (data.loop);
return 0;
}
```
@ -328,7 +328,7 @@ int main (int argc, char **argv) {
These are the main steps to use the `GstDiscoverer`:
``` first-line: 182; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Instantiate the Discoverer */
data.discoverer = gst_discoverer_new (5 * GST_SECOND, &err);
if (!data.discoverer) {
@ -342,7 +342,7 @@ if (!data.discoverer) {
parameter is the timeout per file, in nanoseconds (use the
`GST_SECOND` macro for simplicity).
``` first-line: 190; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Connect to the interesting signals */
g_signal_connect (data.discoverer, "discovered", G_CALLBACK (on_discovered_cb), &data);
g_signal_connect (data.discoverer, "finished", G_CALLBACK (on_finished_cb), &data);
@ -351,7 +351,7 @@ g_signal_connect (data.discoverer, "finished", G_CALLBACK (on_finished_cb), &dat
Connect to the interesting signals, as usual. We discuss them in the
snippet for their callbacks.
``` first-line: 194; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Start the discoverer process (nothing to do yet) */
gst_discoverer_start (data.discoverer);
```
@ -360,7 +360,7 @@ gst_discoverer_start (data.discoverer);
not provided any URI to discover yet. This is done
next:
``` first-line: 197; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Add a request to process asynchronously the URI passed through the command line */
if (!gst_discoverer_discover_uri_async (data.discoverer, uri)) {
g_print ("Failed to start discovering URI '%s'\n", uri);
@ -375,7 +375,7 @@ discovery process for each of them finishes, the registered callback
functions will be fired
up.
``` first-line: 204; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Create a GLib Main Loop and set it to run, so we can wait for the signals */
data.loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.loop);
@ -385,7 +385,7 @@ The usual GLib main loop is instantiated and executed. We will get out
of it when `g_main_loop_quit()` is called from the
`on_finished_cb` callback.
``` first-line: 208; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Stop the discoverer process */
gst_discoverer_stop (data.discoverer);
```
@ -396,7 +396,7 @@ Once we are done with the discoverer, we stop it with
Let's now review the callbacks we have
registered:
``` first-line: 85; theme: Default; brush: cpp; gutter: true
``` lang=c
/* This function is called every time the discoverer has information regarding
* one of the URIs we provided.*/
static void on_discovered_cb (GstDiscoverer *discoverer, GstDiscovererInfo *info, GError *err, CustomData *data) {
@ -404,7 +404,7 @@ static void on_discovered_cb (GstDiscoverer *discoverer, GstDiscovererInfo *info
const gchar *uri;
const GstTagList *tags;
GstDiscovererStreamInfo *sinfo;
uri = gst_discoverer_info_get_uri (info);
result = gst_discoverer_info_get_result (info);
```
@ -417,7 +417,7 @@ case we had multiple discover process running, which is not the case in
this example) with `gst_discoverer_info_get_uri()` and the discovery
result with `gst_discoverer_info_get_result()`.
``` first-line: 95; theme: Default; brush: cpp; gutter: true
``` lang=c
switch (result) {
case GST_DISCOVERER_URI_INVALID:
g_print ("Invalid URI '%s'\n", uri);
@ -434,10 +434,10 @@ switch (result) {
case GST_DISCOVERER_MISSING_PLUGINS:{
const GstStructure *s;
gchar *str;
s = gst_discoverer_info_get_misc (info);
str = gst_structure_to_string (s);
g_print ("Missing plugins: %s\n", str);
g_free (str);
break;
@ -447,7 +447,6 @@ switch (result) {
break;
}
if (result != GST_DISCOVERER_OK) {
g_printerr ("This URI cannot be played\n");
return;
@ -468,7 +467,7 @@ If no error happened, information can be retrieved from the
Bits of information which are made of lists, like tags and stream info,
need some extra parsing:
``` first-line: 133; theme: Default; brush: cpp; gutter: true
``` lang=c
tags = gst_discoverer_info_get_tags (info);
if (tags) {
g_print ("Tags:\n");
@ -483,15 +482,15 @@ or a specific tag could be searched for with
`gst_tag_list_get_string()`). The code for `print_tag_foreach` is pretty
much self-explanatory.
``` first-line: 143; theme: Default; brush: cpp; gutter: false
``` lang=c
sinfo = gst_discoverer_info_get_stream_info (info);
if (!sinfo)
return;
g_print ("Stream information:\n");
print_topology (sinfo, 1);
gst_discoverer_stream_info_unref (sinfo);
```
@ -500,23 +499,23 @@ a `GstDiscovererStreamInfo` structure that is parsed in
the `print_topology` function, and then discarded
with `gst_discoverer_stream_info_unref()`.
``` first-line: 60; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Print information regarding a stream and its substreams, if any */
static void print_topology (GstDiscovererStreamInfo *info, gint depth) {
GstDiscovererStreamInfo *next;
if (!info)
return;
print_stream_info (info, depth);
next = gst_discoverer_stream_info_get_next (info);
if (next) {
print_topology (next, depth + 1);
gst_discoverer_stream_info_unref (next);
} else if (GST_IS_DISCOVERER_CONTAINER_INFO (info)) {
GList *tmp, *streams;
streams = gst_discoverer_container_info_get_streams (GST_DISCOVERER_CONTAINER_INFO (info));
for (tmp = streams; tmp; tmp = tmp->next) {
GstDiscovererStreamInfo *tmpinf = (GstDiscovererStreamInfo *) tmp->data;
@ -551,4 +550,3 @@ This tutorial has shown:
It has been a pleasure having you here, and see you soon\!
Document generated by Confluence on Oct 08, 2015 10:27

@ -1,7 +1,6 @@
# Basic tutorials
## Welcome to the GStreamer SDK Basic tutorials
These tutorials describe general topics required to understand the rest
of tutorials in the GStreamer SDK.

@ -1,55 +1,70 @@
# Building from source using Cerbero
> ![Warning](images/icons/emoticons/warning.png)
> This section is intended for advanced users.
## Build requirements
The GStreamer SDK build system provides bootstrapping facilities for all
platforms, but it still needs a minimum base to bootstrap:
- python >= 2.6 and python's `argparse` module, which is already
included in python2.7.
- git
> ![Information](images/icons/emoticons/information.png)
> **Windows users**
>
> Cerbero can be used on Windows using the Msys/MinGW shell (a Unix-like shell for Windows). There is a bit of setup that you need to do before Cerbero can take control.
>
> You need to install the following programs:
> - [Python 2.7](http://www.python.org/getit/releases/2.7/)
> - [Git](http://code.google.com/p/msysgit/downloads/list?q=full+installer+official+git) (Select the install option &quot;Checkout as-is, Commit as-is&quot; and install it in a path without spaces, eg: c:\Git)
> - [Msys/MinGW](https://sourceforge.net/projects/mingw/files/Installer/mingw-get-inst/) (Install it with all the options enabled)
> - [CMake](http://www.cmake.org/cmake/resources/software.htm) (Select the option &quot;Add CMake in system path for the current user&quot;)
> - [Yasm](http://yasm.tortall.net/Download.html) (Download the win32 or win64 version for your platform, name it <code>yasm.exe</code>, and place it in your MinGW <code>bin</code> directory, typically, <code>C:\MinGW\bin</code>)
> - [WiX 3.5](http://wix.codeplex.com/releases/view/60102)
> - [Microsoft SDK 7.1](http://www.microsoft.com/en-us/download/details.aspx?id=8279) (Install the SDK samples and the Visual C++ Compilers, required to build the DirectShow base classes. Might need installing the .NET 4 Framework first if the SDK installer doesn't find it)
> - [Windows Driver Kit 7.1.0](http://msdn.microsoft.com/en-us/windows/hardware/hh852365)
>
> Your user ID can't have spaces (eg: John Smith). Paths with spaces are not correctly handled in the build system and msys uses the user ID for the home folder.
>
>Cerbero must be run in the MinGW shell, which is accessible from the main menu once MinGW is installed.
>
>The last step is making `python` and `git` available from the shell, for which you will need to create a `.profile` file. Issue this command from within the MinGW shell:
>
> `echo "export PATH=\"\$PATH:/c/Python27:/c/Git/bin\"" >> ~/.profile`
>
> Using the appropriate paths to where you installed `python` and `git`
>
> (Note that inside the shell, / is mapped to c:\Mingw\msys\1.0\ )
### Windows users
Cerbero can be used on Windows using the Msys/MinGW shell (a Unix-like
shell for Windows). There is a bit of setup that you need to do before
Cerbero can take control.
> ![Information](images/icons/emoticons/information.png)
> **OS X users**
>
>To use cerbero on OS X you need to install the "Command Line Tools" from XCode. They are available from the "Preferences" dialog under "Downloads".
You need to install the following programs:
- [Python 2.7]
- [Git] (Select the install option "Checkout as-is, Commit as-is" and
install it in a path without spaces, eg: c:\Git)
- [Msys/MinGW] (Install it with all the options enabled)
- [CMake] (Select the option "Add CMake in system path for the
current user")
- [Yasm] (Download the win32 or win64 version for your platform, name
it `yasm.exe`, and place it in your MinGW `bin` directory,
typically, `C:\MinGW\bin`)
- [WiX 3.5]
- [Microsoft SDK 7.1] (Install the SDK samples and the Visual C++
Compilers, required to build the DirectShow base classes. Might need
installing the .NET 4 Framework first if the SDK installer doesn't
find it)
- [Windows Driver Kit 7.1.0]
> ![Information](images/icons/emoticons/information.png)
> **iOS developers**
>
>If you want to build the GStreamer-SDK for iOS, you also need the iOS SDK. The minimum required iOS SDK version is 6.0 and is included in [XCode](https://developer.apple.com/devcenter/ios/index.action#downloads) since version 4.
Your user ID can't have spaces (eg: John Smith). Paths with spaces are
not correctly handled in the build system and msys uses the user ID for
the home folder.
Cerbero must be run in the MinGW shell, which is accessible from the
main menu once MinGW is installed.
The last step is making `python` and `git` available from the shell, for
which you will need to create a `.profile` file. Issue this command from
within the MinGW shell:
`echo "export PATH=\"\$PATH:/c/Python27:/c/Git/bin\"" >> ~/.profile`
Using the appropriate paths to where you installed `python` and `git`
(Note that inside the shell, / is mapped to c:\Mingw\msys\1.0 )
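If the quoting in that command looks opaque, this sketch shows exactly what line ends up in the file, using a throwaway file instead of your real `~/.profile` (the paths are examples):

```
# Append the PATH export to a scratch file instead of ~/.profile.
profile=$(mktemp)
echo "export PATH=\"\$PATH:/c/Python27:/c/Git/bin\"" >> "$profile"
# The escaped \$ and \" survive as literal characters, so the file
# now contains: export PATH="$PATH:/c/Python27:/c/Git/bin"
cat "$profile"
rm -f "$profile"
```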
### OS X users
To use cerbero on OS X you need to install the "Command Line Tools" from
XCode. They are available from the "Preferences" dialog under
"Downloads".
### iOS developers
If you want to build the GStreamer-SDK for iOS, you also need the iOS
SDK. The minimum required iOS SDK version is 6.0 and is included in
[XCode] since version 4.
## Download the sources
@ -60,18 +75,14 @@ architectures and distributions.
Get a copy of Cerbero by cloning the git repository:
```
git clone git://anongit.freedesktop.org/gstreamer/cerbero
```
git clone git://anongit.freedesktop.org/gstreamer/cerbero
Cerbero can be run uninstalled and for convenience you can create an
alias in your `.bashrc` file. If you prefer to skip this step,
remember that you need to replace the calls to `cerbero` with
`./cerbero-uninstalled` in the next steps.
```
echo "alias cerbero='~/git/cerbero/cerbero-uninstalled'" >> ~/.bashrc
```
echo "alias cerbero='~/git/cerbero/cerbero-uninstalled'" >> ~/.bashrc
## Setup environment
@ -87,15 +98,12 @@ are running and will use default build options such as the default build
directory. The default options should work fine on the supported
distributions.
An example configuration file with detailed comments can be found
[here](http://www.freedesktop.org/software/gstreamer-sdk/cerbero.cbc.template)
An example configuration file with detailed comments can be found [here]
To fire up the bootstrapping process, go to the directory where you
cloned/unpacked Cerbero and type:
```
cerbero bootstrap
```
cerbero bootstrap
Enter the superuser/root password when prompted.
@ -106,45 +114,33 @@ the GStreamer SDK.
To generate the SDK, use the following command:
```
cerbero package gstreamer-1.0
```
cerbero package gstreamer-1.0
This should build all required SDK components and create packages for
your distribution at the Cerbero source directory.
A list of supported packages to build can be retrieved using:
```
cerbero list-packages
```
cerbero list-packages
Packages are composed of 0 (in case of a meta package) or more
components that can be built separately if desired. The components are
defined as individual recipes and can be listed with:
```
cerbero list
```
cerbero list
To build an individual recipe and its dependencies, do the following:
```
cerbero build <recipe_name>
```
cerbero build <recipe_name>
Or to build or force a rebuild of a recipe without building its
dependencies use:
```
cerbero buildone <recipe_name>
```
cerbero buildone <recipe_name>
To wipe everything and start from scratch:
```
cerbero wipe
```
cerbero wipe
Once built, the output of the recipes will be installed at the prefix
defined in the Cerbero configuration file `$HOME/.cerbero/cerbero.cbc`
@ -157,53 +153,43 @@ can be very slow on Windows, so if you only need to rebuild a single
project (eg: gst-plugins-good to patch qtdemux) there is a much faster
way of doing it. You will need to follow the steps detailed in this
page, but skipping the step "**Build the SDK**", and installing the
SDK's development files as explained in [Installing the
SDK](Installing+the+SDK.markdown).
SDK's development files as explained in [Installing the SDK].
By default, Cerbero uses as prefix a folder in the user directory with
the following schema ~/cerbero/dist/$platform\_$arch, but for the SDK we
must change this prefix to use its installation directory. This can be
done with a custom configuration file named *custom.cbc*:
the following schema \~/cerbero/dist/$platform\_$arch, but for the SDK
we must change this prefix to use its installation directory. This can
be done with a custom configuration file named *custom.cbc*:
```
# For Windows x86
prefix='/c/gstreamer/1.0/x86/'
# For Windows x86
prefix='/c/gstreamer/1.0/x86/'
# For Windows x86_64
#prefix='/c/gstreamer/1.0/x86_64'
# For Windows x86_64
#prefix='/c/gstreamer/1.0/x86_64'
# For Linux
#prefix='/opt/gstreamer'
# For Linux
#prefix='/opt/gstreamer'
# For OS X
#prefix='/Library/Frameworks/GStreamer.framework/Versions/1.0'
```
# For OS X
#prefix='/Library/Frameworks/GStreamer.framework/Versions/1.0'
The prefix path might not be writable by your current user. Make sure
you fix it beforehand, for instance with:
```
$ sudo chown -R <username> /Library/Frameworks/GStreamer.framework/
```
$ sudo chown -R <username> /Library/Frameworks/GStreamer.framework/
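A quick pre-flight check along these lines can save a failed build later (sketched here with a temporary directory standing in for the real prefix):

```
# Check that the configured prefix is writable before building into it;
# a temp dir stands in for e.g. /Library/Frameworks/GStreamer.framework/.
prefix=$(mktemp -d)
if [ -w "$prefix" ]; then
  echo "prefix is writable"
else
  echo "fix ownership first, e.g.: sudo chown -R \$USER $prefix"
fi
rm -rf "$prefix"
```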
Cerbero has a shell command that starts a new shell with all the
environment set up to target the SDK. You can start a new shell using
the installation prefix defined in *custom.cbc* with the following
command:
```
$ cerbero -c custom.cbc shell
```
$ cerbero -c custom.cbc shell
Once you are in Cerbero's shell you can compile new projects targeting
the SDK using the regular build process:
```
$ git clone -b sdk-0.10.31 git://anongit.freedesktop.org/gstreamer-sdk/gst-plugins-good; cd gst-plugins-good
$ sh autogen.sh --disable-gtk-doc --prefix=<prefix>
$ make -C gst/isomp4
```
$ git clone -b sdk-0.10.31 git://anongit.freedesktop.org/gstreamer-sdk/gst-plugins-good; cd gst-plugins-good
$ sh autogen.sh --disable-gtk-doc --prefix=<prefix>
$ make -C gst/isomp4
### Cross-compilation of the SDK
@ -219,9 +205,7 @@ You can cross-compile the SDK for Android from a Linux host using the
configuration file `config/cross-android.cbc`. Replace all the previous
commands with:
```
cerbero -c config/cross-android.cbc <command>
```
cerbero -c config/cross-android.cbc <command>
#### Windows
@ -232,21 +216,28 @@ only be created from Windows.
Replace all the above commands for Windows 32bits with:
```
cerbero -c config/cross-win32.cbc <command>
```
cerbero -c config/cross-win32.cbc <command>
Or with using the following for Windows 64bits:
```
cerbero -c config/cross-win64.cbc <command>
```
cerbero -c config/cross-win64.cbc <command>
#### iOS
To cross compile for iOS from OS X, use the configuration file
`config/cross-ios-universal.cbc`. Replace all previous commands with:
```
cerbero -c config/cross-ios-universal.cbc <command>
```
cerbero -c config/cross-ios-universal.cbc <command>
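If you switch between targets often, a small helper keeps the `-c` flag straight. `cerbero_for` below is a made-up convenience, not part of Cerbero; it only echoes the command it would run:

```
# Compose the cross-compile command line for a given target.
cerbero_for () {
  target=$1; shift
  echo cerbero -c "config/cross-${target}.cbc" "$@"
}

cerbero_for android bootstrap
cerbero_for win64 package gstreamer-1.0
# prints:
#   cerbero -c config/cross-android.cbc bootstrap
#   cerbero -c config/cross-win64.cbc package gstreamer-1.0
```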
[Warning]: images/icons/emoticons/warning.png
[Python 2.7]: http://www.python.org/getit/releases/2.7/
[Git]: http://code.google.com/p/msysgit/downloads/list?q=full+installer+official+git
[Msys/MinGW]: https://sourceforge.net/projects/mingw/files/Installer/mingw-get-inst/
[CMake]: http://www.cmake.org/cmake/resources/software.htm
[Yasm]: http://yasm.tortall.net/Download.html
[WiX 3.5]: http://wix.codeplex.com/releases/view/60102
[Microsoft SDK 7.1]: http://www.microsoft.com/en-us/download/details.aspx?id=8279
[Windows Driver Kit 7.1.0]: http://msdn.microsoft.com/en-us/windows/hardware/hh852365
[XCode]: https://developer.apple.com/devcenter/ios/index.action#downloads
[here]: http://www.freedesktop.org/software/gstreamer-sdk/cerbero.cbc.template
[Installing the SDK]: Installing+the+SDK.markdown

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Contact
# Contact
This page last changed on Dec 03, 2012 by xartigas.
@ -35,4 +35,3 @@ We want to hear from you!</p></td>
</table>
Document generated by Confluence on Oct 08, 2015 10:28

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Deploying your application
# Deploying your application
This page last changed on Jun 12, 2013 by xartigas.
@ -128,4 +128,3 @@ options.
 
Document generated by Confluence on Oct 08, 2015 10:27

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Frequently Asked Questions
# Frequently Asked Questions
This page last changed on Jun 12, 2013 by xartigas.
@ -110,22 +110,22 @@ In summary:
Some cool media apps using GStreamer:
- [Banshee](http://banshee.fm/)
- [Songbird](http://getsongbird.com/)
- [Snappy](http://live.gnome.org/snappy)  
- [Empathy](https://live.gnome.org/Empathy)
- [Totem](http://projects.gnome.org/totem/)
- [Transmageddon](http://www.linuxrising.org/)
- [Flumotion](http://www.flumotion.net/)
- [Landell](http://landell.holoscopio.com/)
- [Longo match](http://longomatch.org/)
- [Rygel](https://live.gnome.org/Rygel)
- [Sound
juicer](http://www.burtonini.com/blog/computers/sound-juicer)
- [Buzztard](http://wiki.buzztard.org/index.php/Overview)
- [Moovida](http://www.moovida.com/) (Based on Banshee)
- [Fluendo DVD
Player](http://www.fluendo.com/shop/product/fluendo-dvd-player/)
- and many [more](http://gstreamer.freedesktop.org/apps/)
# What is the target audience?
@ -154,4 +154,3 @@ The GStreamer SDK supports the iOS platform since [version 2013.6
(Congo)](2013.6%2BCongo.html).
Document generated by Confluence on Oct 08, 2015 10:28

@ -1,4 +1,4 @@
# GStreamer SDK documentation : GStreamer reference
# GStreamer reference
This page last changed on Jun 25, 2012 by xartigas.
@ -54,4 +54,3 @@ generated from the source code of GStreamer.
</table>
Document generated by Confluence on Oct 08, 2015 10:28

@ -1,4 +1,4 @@
# GStreamer SDK
# GStreamer SDK documentation
## Welcome to the GStreamer SDK documentation!
@ -17,4 +17,3 @@ all about.
| [![](attachments/faq.png)](Frequently+Asked+Questions.markdown) | [Frequently Asked Questions](Frequently+Asked+Questions.markdown) |
| [![](attachments/legal.png)](Legal+information.markdown) | [Patents, Licenses and legal F.A.Q.](Legal+information.markdown) |
| [![](attachments/contact.png)](Contact.markdown) | [Community support, bug reporting...](Contact.markdown) |

@ -1,4 +1,6 @@
# Installing for Android development
![](images/icons/emoticons/information.png) All versions starting from 2.3.1 Gingerbread are supported
## Prerequisites
@ -22,7 +24,6 @@ Optionally, you can use the [Android Studio] (FIX LINK). As stated in the Androi
documentation, *developing in Android Studio is highly recommended and
is the fastest way to get started*.
Before continuing, make sure you can compile and run the samples
included in the Android NDK, and that you understand how the integration
of C and Java works via the [Java Native
@ -34,24 +35,9 @@ here](http://developer.android.com/guide/practices/jni.html).
## Download and install the SDK
The SDK has two variants: **Debug** and **Release**. The Debug variant
produces lots of debug output and is useful while developing your
application. The Release variant is what you will use to produce the
final version of your application, since GStreamer code runs slightly
faster and the libraries are smaller.
Get the compressed file below and just unzip it into any folder of your
choice (Choose your preferred file format; both files have exactly the
same content)
### Debug variant
FIXME: Link to download files
If you intend to build the tutorials in this same folder, make sure you
have writing permissions.
FIXME: Is this all TRUE ?
The GStreamer project provides [prebuilt binaries](https://gstreamer.freedesktop.org/data/pkg/android/);
you should download the latest version and unzip it into any folder of
your choice.
In the process of building GStreamer-enabled Android applications, some
tools will need to know where you installed the SDK. You must define an
@ -62,10 +48,12 @@ system-wide by adding it to your `~/.profile` file (on Linux and Mac) or
to the Environment Variables in the System Properties dialog (on
Windows).
- Point `GSTREAMER_SDK_ROOT_ANDROID` to the folder where you unzipped
the SDK.
Point `GSTREAMER_SDK_ROOT_ANDROID` to the folder where you unzipped the SDK.
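For a single shell session this is one `export` (the path below is only an example; use wherever you actually unzipped the SDK):

```
# Tell the build tools where the unpacked SDK lives.
export GSTREAMER_SDK_ROOT_ANDROID="$HOME/gstreamer-sdk-android"
echo "GSTREAMER_SDK_ROOT_ANDROID=$GSTREAMER_SDK_ROOT_ANDROID"
```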
> ![](images/icons/emoticons/information.png) If you plan to use Eclipse and do not want to define this environment variable globally, you can set it inside Eclipse. Go to Window → Preferences → C/C++ → Build → Build Variables and define `GSTREAMER_SDK_ROOT_ANDROID` there.
> ![](images/icons/emoticons/information.png) If you plan to use Android Studio and
> do not want to define this environment variable globally, you can set
> it inside your IDE. In Eclipse, go to Window → Preferences → C/C++ → Build → Build Variables
> and define `GSTREAMER_SDK_ROOT_ANDROID` there.
## Configure your development environment
@ -133,7 +121,9 @@ OpenGL ES).
#### Using the command line
> ![](images/icons/emoticons/warning.png) Note that, on Windows, this procedure requires a working Cygwin shell, as explained in the [Android NDK System Requirements](http://developer.android.com/tools/sdk/ndk/index.html#Reqs)
> ![](images/icons/emoticons/warning.png) Note that, on Windows,
> this procedure requires a working Cygwin shell, as explained in
> the [Android NDK System Requirements](http://developer.android.com/tools/sdk/ndk/index.html#Reqs)
For each tutorial, move to its folder and run:
@ -188,7 +178,6 @@ tutorial in an Android Virtual Device (AVD), make sure to create the
device with support for audio playback and GPU Emulation (to enable
OpenGL ES).
> ![](images/icons/emoticons/warning.png) Windows linkage problems
>
> Due to problems related to the standard linker, Google's <a href="http://en.wikipedia.org/wiki/Gold_(linker)" class="external-link">Gold Linker</a> is used to build GStreamer applications. Unfortunately, the Android NDK toolchain for Windows does not include the gold linker and the standard one has to be used.

@ -1,4 +1,6 @@
# Installing for iOS development
![](images/icons/emoticons/information.png) All versions starting from iOS 6 are supported
## Prerequisites
@ -86,4 +88,3 @@ Once a project has been created using a GStreamer SDK Template, it is
ready to build and run. All necessary infrastructure is already in
place. To understand what files have been created and how they interact,
take a look at the [iOS tutorials](iOS+tutorials.markdown).

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Installing on Linux
# Installing on Linux
This page last changed on Jun 12, 2013 by slomo.
@ -61,7 +61,7 @@ distribution:
And copy it to the `/etc/apt/sources.list.d/` directory on your system:
``` theme: Default; brush: plain; gutter: false
```
sudo cp gstreamer-sdk.list /etc/apt/sources.list.d/
```
@ -70,7 +70,7 @@ be added and the apt repository list needs to be refreshed. This can be
done by
running:
``` theme: Default; brush: plain; gutter: false
```
wget -q -O - http://www.freedesktop.org/software/gstreamer-sdk/sdk.gpg | sudo apt-key add -
sudo apt-get update
```
@ -78,7 +78,7 @@ sudo apt-get update
Now that the new repositories are available, install the SDK with the
following command:
``` theme: Default; brush: plain; gutter: false
```
sudo apt-get install gstreamer-sdk-dev
```
@ -103,7 +103,7 @@ distribution:
And copy it to the `/etc/apt/sources.list.d/` directory on your system:
``` theme: Default; brush: plain; gutter: false
```
su -c 'cp gstreamer-sdk.list /etc/apt/sources.list.d/'
```
@ -112,7 +112,7 @@ be added and the apt repository list needs to be refreshed. This can be
done by
running:
``` theme: Default; brush: plain; gutter: false
```
su -c 'wget -q -O - http://www.freedesktop.org/software/gstreamer-sdk/sdk.gpg | apt-key add -'
su -c 'apt-get update'
```
@ -120,7 +120,7 @@ su -c 'apt-get update'
Now that the new repositories are available, install the SDK with the
following command:
``` theme: Default; brush: plain; gutter: false
```
su -c 'apt-get install gstreamer-sdk-dev'
```
@ -141,21 +141,21 @@ distribution:
And copy it to the `/etc/yum.repos.d/` directory on your system:
``` theme: Default; brush: plain; gutter: false
```
su -c 'cp gstreamer-sdk.repo /etc/yum.repos.d/'
```
After adding the repositories, the yum repository list needs to be
refreshed. This can be done by running:
``` theme: Default; brush: plain; gutter: false
```
su -c 'yum update'
```
Now that the new repositories are available, install the SDK with the
following command:
``` theme: Default; brush: plain; gutter: false
```
su -c 'yum install gstreamer-sdk-devel'
```
@ -170,7 +170,7 @@ installed in a non-standard location `/opt/gstreamer-sdk`. The shell
script `gst-sdk-shell` sets the required environment variables for
building applications with the GStreamer SDK:
``` theme: Default; brush: bash; gutter: false
``` lang=bash
/opt/gstreamer-sdk/bin/gst-sdk-shell
```
@ -179,7 +179,7 @@ the `gcc` compiler and a text editor. In order to compile code that
requires the GStreamer SDK and uses the GStreamer core library, remember
to add this string to your `gcc` command:
``` theme: Default; brush: plain; gutter: false
```
`pkg-config --cflags --libs gstreamer-0.10`
```
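The backticks make the shell substitute `pkg-config`'s output into the `gcc` command line. To see what gets substituted without having the SDK installed, this self-contained sketch queries a tiny fake `demo.pc` (the real build queries `gstreamer-0.10.pc`, shipped with the SDK):

```
# Create a minimal .pc file and ask pkg-config for its flags.
dir=$(mktemp -d)
cat > "$dir/demo.pc" <<'EOF'
Name: demo
Description: demo library
Version: 1.0
Cflags: -I/opt/demo/include
Libs: -L/opt/demo/lib -ldemo
EOF
PKG_CONFIG_PATH="$dir" pkg-config --cflags --libs demo
rm -rf "$dir"
```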
@ -214,7 +214,7 @@ available in a GIT repository and distributed with the SDK.
The GIT repository can be cloned with:
``` theme: Default; brush: plain; gutter: false
```
git clone git://anongit.freedesktop.org/gstreamer-sdk/gst-sdk-tutorials
```
@ -231,7 +231,7 @@ Run `/opt/gstreamer-sdk/bin/gst-sdk-shell` to enter this shell.
Then go to the folder where you copied/cloned the tutorials and
write:
``` theme: Default; brush: plain; gutter: false
```
gcc basic-tutorial-1.c -o basic-tutorial-1 `pkg-config --cflags --libs gstreamer-0.10`
```
@ -254,7 +254,7 @@ Using the file name of the tutorial you are interested in
To run the tutorials, simply execute the desired tutorial (**from within
the `gst-sdk-shell`**):
``` theme: Default; brush: cpp; gutter: false
``` lang=c
./basic-tutorial-1
```
@ -282,4 +282,3 @@ the `gensdkshell` command of the Cerbero build system, if you built
the SDK yourself as explained above.
Document generated by Confluence on Oct 08, 2015 10:27

@ -1,4 +1,10 @@
# Installing on Mac OS X
## Supported platforms
* 10.6 (Snow Leopard)
* 10.7 (Lion)
* 10.8 (Mountain Lion)
## Prerequisites
@ -51,7 +57,7 @@ click in the installer to start the installation process.
These are some paths of the GStreamer framework that you might find
useful:
- /Library/Frameworks/GStreamer.framework/: Framework's root path
- /Library/Frameworks/GStreamer.framework/Versions: path with all the
versions of the framework
- /Library/Frameworks/GStreamer.framework/Versions/Current: link to
@ -81,7 +87,7 @@ folder `/Library/Frameworks/GStreamer.framework/Current/share/gst-sdk/tutorials`
You can fire up XCode and load the project file.
Press the **Run** button to build and run the first tutorial. You can
switch the tutorial to build by selecting one of the available schemes.
### Creating new projects
@ -97,4 +103,4 @@ path `/Library/Frameworks/GStreamer.framework/Headers`
- XCode: Add the headers path to **Search Paths-\> Header Search
Paths**
- GCC: Using the compiler
option **-I/Library/Frameworks/GStreamer.framework/Headers**

@ -1,4 +1,11 @@
# Installing on Windows
## Supported platforms
* Windows XP
* Windows Vista
* Windows 7
* Windows 8
## Prerequisites
@ -48,7 +55,6 @@ There are 3 sets of files in the SDK:
Get **the Runtime and Development files** installers appropriate for
your architecture from here:
**FIXME: Add links **
> ![Warning](images/icons/emoticons/warning.png)
@ -60,7 +66,6 @@ default is usually OK.
> ![Warning](images/icons/emoticons/warning.png)
> If you plan to use Visual Studio, **close it before installing the GStreamer SDK**. The installer will define new environment variables which will not be picked up by Visual Studio if it is open.
> On **Windows 8** and **Windows 10**, it might be necessary to log out and log back in to your account after the installation for the newly defined environment variables to be picked up by Visual Studio.
It is the application's responsibility to ensure that, at runtime,
@ -140,7 +145,6 @@ from within Visual Studio. You use the `%...%` notation from Windows
Explorer)
You should now be able to run the tutorials.
### Creating new projects manually
@ -161,7 +165,7 @@ load `gstreamer-1.0.props `
This property sheet contains the directories where the headers and
libraries are located, and the necessary options for the compiler and
linker, so you do not need to change anything else in your project.
If you cannot find the Property Manager, you might need to enable Expert
Settings. Go to Tools → Settings → Expert Settings. Upon first
installation of Visual Studio, Expert Settings are disabled by
@ -169,7 +173,6 @@ default.
![](attachments/WindowsInstall10.png)
> ![Warning](images/icons/emoticons/warning.png)
> **Depending on the GStreamer libraries you need to use, you will have to add more property pages, besides `gstreamer-1.0`** (each property page corresponds to one GStreamer library).
>
@ -225,4 +228,4 @@ necessary project settings, both for 32 and 64 bits architectures.
The generated project file includes the two required Property Sheets
described in the previous section, so, in order to link to the correct
`MSVCRT.DLL`, **you still need to install the Windows Device Driver
Kit** and change the appropriate property sheets.
Kit** and change the appropriate property sheets.

@ -1,81 +1,16 @@
# Installing the SDK
### Choose your platform
## Choose your platform by clicking on the corresponding logo
[![](attachments/1540163.png)](Installing+on+Mac+OS+X.markdown) [![](attachments/1540164.png)](Installing+on+Windows.markdown) [![](attachments/2654239.png)](Installing+for+Android+development.markdown) [![](attachments/3539150.jpeg)](Installing+for+iOS+development.markdown)
## Linux
The GStreamer community does not provide the SDK for GNU/Linux platforms,
as it will always be available through the package managers of all
distributions. It is also always installed by default on desktop
environments; you just need to make sure you have the development
packages installed (refer to your distribution's documentation for more
information). If you really want to run the SDK on Linux, you can
always follow the instructions to
[build from source using cerbero](Building+from+source+using+Cerbero.markdown).

View file

@ -1,4 +1,4 @@
# Legal information
This page last changed on Jun 11, 2012 by xartigas.
@ -219,4 +219,3 @@ that the license of the conditions of the resulting program must allow
decompilation to debug modifications to the library.
Document generated by Confluence on Oct 08, 2015 10:28

View file

@ -1,4 +1,4 @@
# Mac OS X deployment
This page last changed on Nov 28, 2012 by xartigas.
@ -31,7 +31,7 @@ should replace `$INSTALL_PATH` with the path where your installer copied
the SDK's disk image files (the `/tmp` directory is good place to
install it as it will be removed at the end of the installation):
``` lang=bash
hdiutil attach $INSTALL_PATH/gstreamer-sdk-2012.7-x86.dmg
cd /Volumes/gstreamer-sdk-2012.7-x86/
installer -pkg gstreamer-sdk-2012.7-x86.pkg -target "/"
@ -47,7 +47,7 @@ simply copy the framework to the application's Frameworks folder as
defined in the [bundle programming
guide](https://developer.apple.com/library/mac/documentation/CoreFoundation/Conceptual/CFBundles/BundleTypes/BundleTypes.html#//apple_ref/doc/uid/10000123i-CH101-SW19):
``` lang=bash
cp -r /Library/Frameworks/GStreamer.framework ~/MyApp.app/Contents/Frameworks
```
@ -56,7 +56,7 @@ different architectures, installed in the system. Make sure you only
copy the version you need and that you update accordingly the link
`GStreamer.framework/Version/Current`:
``` lang=bash
$ ls -l Frameworks/GStreamer.framework/Version/Current
lrwxr-xr-x 1 fluendo staff 21 Jun 5 18:46 Frameworks/GStreamer.framework/Versions/Current -> ../Versions/0.10/x86
```
@ -274,8 +274,8 @@ We can get the list of paths used by an object file to locate its
dependent dynamic libraries
using [otool](https://developer.apple.com/library/mac/#documentation/darwin/reference/manpages/man1/otool.1.html):
``` lang=bash
$ otool -L /Library/Frameworks/GStreamer.framework/Commands/gst-launch-0.10
/Library/Frameworks/GStreamer.framework/Commands/gst-launch-0.10:
/System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation (compatibility version 150.0.0, current version 550.43.0)
/Library/Frameworks/GStreamer.framework/Versions/0.10/x86/lib/libgstreamer-0.10.0.dylib (compatibility version 31.0.0, current version 31.0.0)
@ -293,8 +293,8 @@ This full path is extracted from the dynamic library  ***install name***
install name of a library can be retrieved with
[otool](https://developer.apple.com/library/mac/#documentation/darwin/reference/manpages/man1/otool.1.html) too:
``` lang=bash
$ otool -D /Library/Frameworks/GStreamer.framework/Libraries/libgstreamer-0.10.dylib
/Library/Frameworks/GStreamer.framework/Libraries/libgstreamer-0.10.dylib:
/Library/Frameworks/GStreamer.framework/Versions/0.10/x86/lib/libgstreamer-0.10.0.dylib
```
@ -348,7 +348,7 @@ When looking for binaries to fix, we will run the script in the
following
directories:
``` lang=bash
$ osxrelocator.py MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current/lib /Library/Frameworks/GStreamer.framework/ @executable_path/../Frameworks/GStreamer.framework/ -r
$ osxrelocator.py MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current/libexec /Library/Frameworks/GStreamer.framework/ @executable_path/../Frameworks/GStreamer.framework/ -r
$ osxrelocator.py MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current/bin /Library/Frameworks/GStreamer.framework/ @executable_path/../Frameworks/GStreamer.framework/ -r
@ -376,9 +376,8 @@ You can use the following functions:
## Attachments:
![](images/icons/bullet_blue.gif)
[PackageMaker1.png](attachments/2097292/2424841.png) (image/png)
![](images/icons/bullet_blue.gif)
[PackageMaker2.png](attachments/2097292/2424842.png) (image/png)
Document generated by Confluence on Oct 08, 2015 10:27

View file

@ -1,4 +1,4 @@
# Multiplatform deployment using Cerbero
This page last changed on Nov 21, 2012 by slomo.
@ -28,7 +28,7 @@ In the Cerbero installation directory you will find the
`cerbero-uninstalled` script. Execute it without parameters to see the
list of commands it accepts:
``` lang=bash
./cerbero-uninstalled
```
@ -37,7 +37,7 @@ list of commands it accepts:
The first step is to create an empty recipe that you can then tailor to
your needs:
``` lang=bash
./cerbero-uninstalled add-recipe my-app 1.0
```
@ -204,7 +204,7 @@ Alternatively, you can pass some options to cerbero-uninstalled so some
of these attributes are already set for you. For
example:
```
./cerbero-uninstalled add-recipe --licenses "LGPL" --deps "glib,gtk+" --origin "git://git.my-app.com" --commit "git-commit-to-use" my-app 1.0
```
@ -212,7 +212,7 @@ See `./cerbero-uninstalled add-recipe -h` for help.
As an example, this is the recipe used to build the Snappy media player:
```
class Recipe(recipe.Recipe):
name = 'snappy'
version = '0.2+git'
@ -223,11 +223,9 @@ class Recipe(recipe.Recipe):
use_system_libs = True
remotes = {'upstream': 'git://git.gnome.org/snappy'}
files_bins = ['snappy']
files_data = ['share/snappy']
def prepare(self):
if self.config.target_platform == Platform.LINUX:
self.configure_options += ' --enable-dbus' 
@ -244,7 +242,7 @@ Snappy.
Once the recipe is ready, instruct Cerbero to build it:
``` lang=bash
./cerbero-uninstalled build my-app
```
@ -259,7 +257,7 @@ files in `cerbero/packages`.
Now, to create an empty package, do:
``` lang=bash
./cerbero-uninstalled add-package my-app 1.0
```
@ -412,7 +410,7 @@ Alternatively you can also pass some options to `cerbero-uninstalled`,
for
example:
``` lang=bash
./cerbero-uninstalled add-package my-app 1.0 --license "LGPL" --codename MyApp --vendor MyAppVendor --url "http://www.my-app.com" --files=my-app:bins:libs --files-devel=my-app:devel --platform-files=linux:my-app:linux_specific --platform-files-devel=linux:my-app:linux_specific_devel,windows:my-app:windows_specific_devel --deps base-system --includes gstreamer-core
```
@ -421,7 +419,7 @@ See `./cerbero-uninstalled add-package -h` for help.
As an example, this is the package file that is used for packaging the
`gstreamer-core` package:
```
class Package(package.Package):
name = 'gstreamer-codecs'
shortdesc = 'GStreamer codecs'
@ -433,7 +431,6 @@ class Package(package.Package):
uuid = '6cd161c2-4535-411f-8287-e8f6a892f853'
deps = ['gstreamer-core']
files = ['flac:libs',
'jasper:libs', 'libkate:libs',
'libogg:libs', 'schroedinger:libs', 'speex:libs',
@ -475,7 +472,7 @@ packages\_prefix as the ones in your Cerbero configuration file.
Finally, build your package by using:
``` lang=bash
./cerbero-uninstalled package your-package 
```
@ -486,4 +483,3 @@ the dependencies and your software). The resulting files will be in the
current working directory.
Document generated by Confluence on Oct 08, 2015 10:27

View file

@ -1,6 +1,4 @@
# Playback tutorial 1: Playbin usage

This page last changed on Jun 26, 2012 by xartigas.
# Goal
@ -43,15 +41,14 @@ Finally, multiple video streams can also be found in a single file, for
example, in DVD with multiple angles of the same scene, but they are
somewhat rare.
> ![](images/icons/emoticons/information.png) Embedding multiple streams
> inside a single file is called “multiplexing” or “muxing”, and such a file
> is then known as a “container”. Common container formats are Matroska
> (.mkv), Quicktime (.qt, .mov, .mp4), Ogg (.ogg) or Webm (.webm).
>
>
> Retrieving the individual streams from within the container is called
> “demultiplexing” or “demuxing”.
The following code recovers the amount of streams in the file, their
associated metadata, and allows switching the audio stream while the
@ -64,69 +61,69 @@ it in the SDK installation).
**playback-tutorial-1.c**
``` lang=c
#include <gst/gst.h>
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstElement *playbin2; /* Our one and only element */
gint n_video; /* Number of embedded video streams */
gint n_audio; /* Number of embedded audio streams */
gint n_text; /* Number of embedded subtitle streams */
gint current_video; /* Currently playing video stream */
gint current_audio; /* Currently playing audio stream */
gint current_text; /* Currently playing subtitle stream */
GMainLoop *main_loop; /* GLib's Main Loop */
} CustomData;
/* playbin2 flags */
typedef enum {
GST_PLAY_FLAG_VIDEO = (1 << 0), /* We want video output */
GST_PLAY_FLAG_AUDIO = (1 << 1), /* We want audio output */
GST_PLAY_FLAG_TEXT = (1 << 2) /* We want subtitle output */
} GstPlayFlags;
/* Forward definition for the message and keyboard processing functions */
static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data);
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data);
int main(int argc, char *argv[]) {
CustomData data;
GstBus *bus;
GstStateChangeReturn ret;
gint flags;
GIOChannel *io_stdin;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Create the elements */
data.playbin2 = gst_element_factory_make ("playbin2", "playbin2");
if (!data.playbin2) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Set the URI to play */
g_object_set (data.playbin2, "uri", "http://docs.gstreamer.com/media/sintel_cropped_multilingual.webm", NULL);
/* Set flags to show Audio and Video but ignore Subtitles */
g_object_get (data.playbin2, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO;
flags &= ~GST_PLAY_FLAG_TEXT;
g_object_set (data.playbin2, "flags", flags, NULL);
/* Set connection speed. This will affect some internal decisions of playbin2 */
g_object_set (data.playbin2, "connection-speed", 56, NULL);
/* Add a bus watch, so we get notified when a message arrives */
bus = gst_element_get_bus (data.playbin2);
gst_bus_add_watch (bus, (GstBusFunc)handle_message, &data);
/* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
@ -134,7 +131,7 @@ int main(int argc, char *argv[]) {
io_stdin = g_io_channel_unix_new (fileno (stdin));
#endif
g_io_add_watch (io_stdin, G_IO_IN, (GIOFunc)handle_keyboard, &data);
/* Start playing */
ret = gst_element_set_state (data.playbin2, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
@ -142,11 +139,11 @@ int main(int argc, char *argv[]) {
gst_object_unref (data.playbin2);
return -1;
}
/* Create a GLib Main Loop and set it to run */
data.main_loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.main_loop);
/* Free resources */
g_main_loop_unref (data.main_loop);
g_io_channel_unref (io_stdin);
@ -155,22 +152,22 @@ int main(int argc, char *argv[]) {
gst_object_unref (data.playbin2);
return 0;
}
/* Extract some metadata from the streams and print it on the screen */
static void analyze_streams (CustomData *data) {
gint i;
GstTagList *tags;
gchar *str;
guint rate;
/* Read some properties */
g_object_get (data->playbin2, "n-video", &data->n_video, NULL);
g_object_get (data->playbin2, "n-audio", &data->n_audio, NULL);
g_object_get (data->playbin2, "n-text", &data->n_text, NULL);
g_print ("%d video stream(s), %d audio stream(s), %d text stream(s)\n",
data->n_video, data->n_audio, data->n_text);
g_print ("\n");
for (i = 0; i < data->n_video; i++) {
tags = NULL;
@ -184,7 +181,7 @@ static void analyze_streams (CustomData *data) {
gst_tag_list_free (tags);
}
}
g_print ("\n");
for (i = 0; i < data->n_audio; i++) {
tags = NULL;
@ -206,7 +203,7 @@ static void analyze_streams (CustomData *data) {
gst_tag_list_free (tags);
}
}
g_print ("\n");
for (i = 0; i < data->n_text; i++) {
tags = NULL;
@ -221,22 +218,22 @@ static void analyze_streams (CustomData *data) {
gst_tag_list_free (tags);
}
}
g_object_get (data->playbin2, "current-video", &data->current_video, NULL);
g_object_get (data->playbin2, "current-audio", &data->current_audio, NULL);
g_object_get (data->playbin2, "current-text", &data->current_text, NULL);
g_print ("\n");
g_print ("Currently playing video stream %d, audio stream %d and text stream %d\n",
data->current_video, data->current_audio, data->current_text);
g_print ("Type any number and hit ENTER to select a different audio stream\n");
}
/* Process messages from GStreamer */
static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data) {
GError *err;
gchar *debug_info;
switch (GST_MESSAGE_TYPE (msg)) {
case GST_MESSAGE_ERROR:
gst_message_parse_error (msg, &err, &debug_info);
@ -261,15 +258,15 @@ static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data)
}
} break;
}
/* We want to keep receiving messages */
return TRUE;
}
/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
gchar *str = NULL;
if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) == G_IO_STATUS_NORMAL) {
int index = atoi (str);
if (index < 0 || index >= data->n_audio) {
@ -285,48 +282,69 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
}
```
> ![](images/icons/emoticons/information.png) If you need help to compile this code, refer to the **Building the
> tutorials** section for your platform: [Mac](Installing+on+Mac+OS+X.markdown) or [Windows](Installing+on+Windows.markdown),
> or use this specific command on Linux:
> ```gcc playback-tutorial-1.c -o playback-tutorial-1 `pkg-config --cflags --libs gstreamer-1.0` ```

If you need help to run this code, refer to the **Running the
tutorials** section for your platform:
[Mac OS X](Installing+on+Mac+OS+X.markdown#building-the-tutorials),
[Windows](Installing+on+Windows.markdown#running-the-tutorials), for
[iOS](Installing+for+iOS+development.markdown#building-the-tutorials) or for
[Android](Installing+for+Android+development.markdown#building-the-tutorials).

This tutorial opens a window and displays a movie, with accompanying
audio. The media is fetched from the Internet, so the window might take
a few seconds to appear, depending on your connection speed. The number
of audio streams is shown in the terminal, and the user can switch from
one to another by entering a number and pressing enter. A small delay is
to be expected.

Bear in mind that there is no latency management (buffering), so on slow
connections, the movie might stop after a few seconds. See how
[Tutorial 12: Live streaming](http://docs.gstreamer.com/display/GstSDK/Tutorial+12%3A+Live+streaming)
solves this issue.

Required libraries: `gstreamer-0.10`
# Walkthrough
``` lang=c
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstElement *playbin2; /* Our one and only element */
gint n_video; /* Number of embedded video streams */
gint n_audio; /* Number of embedded audio streams */
gint n_text; /* Number of embedded subtitle streams */
gint current_video; /* Currently playing video stream */
gint current_audio; /* Currently playing audio stream */
gint current_text; /* Currently playing subtitle stream */
GMainLoop *main_loop; /* GLib's Main Loop */
} CustomData;
```
@ -337,7 +355,7 @@ streams of each type, and the currently playing one. Also, we are going
to use a different mechanism to wait for messages that allows
interactivity, so we need a GLib's main loop object.
``` lang=c
/* playbin2 flags */
typedef enum {
GST_PLAY_FLAG_VIDEO = (1 << 0), /* We want video output */
@ -356,7 +374,7 @@ be retrieved at runtime without using this trick, but in a far more
cumbersome
way.
``` lang=c
/* Forward definition for the message and keyboard processing functions */
static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data);
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data);
@ -375,7 +393,7 @@ the pipeline, and use directly the  `playbin2` element.
We focus on some of the other properties of `playbin2`, though:
``` lang=c
/* Set flags to show Audio and Video but ignore Subtitles */
g_object_get (data.playbin2, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO;
@ -387,31 +405,15 @@ g_object_set (data.playbin2, "flags", flags, NULL);
can have any combination of `GstPlayFlags`. The most interesting values
are:
| | |
|---------------------------|------------------------------------------------------------------------------------------------------------------------------------|
| GST_PLAY_FLAG_VIDEO | Enable video rendering. If this flag is not set, there will be no video output. |
| GST_PLAY_FLAG_AUDIO | Enable audio rendering. If this flag is not set, there will be no audio output. |
| GST_PLAY_FLAG_TEXT | Enable subtitle rendering. If this flag is not set, subtitles will not be shown in the video output. |
| GST_PLAY_FLAG_VIS | Enable rendering of visualisations when there is no video stream. Playback tutorial 6: Audio visualization goes into more details. |
| GST_PLAY_FLAG_DOWNLOAD | See Basic tutorial 12: Streaming and Playback tutorial 4: Progressive streaming. |
| GST_PLAY_FLAG_BUFFERING | See Basic tutorial 12: Streaming and Playback tutorial 4: Progressive streaming. |
| GST_PLAY_FLAG_DEINTERLACE | If the video content was interlaced, this flag instructs playbin2 to deinterlace it before displaying it. |
In our case, for demonstration purposes, we are enabling audio and video
and disabling subtitles, leaving the rest of flags to their default
@ -419,7 +421,7 @@ values (this is why we read the current value of the flags with
`g_object_get()` before overwriting it with
`g_object_set()`).
``` lang=c
/* Set connection speed. This will affect some internal decisions of playbin2 */
g_object_set (data.playbin2, "connection-speed", 56, NULL);
```
@ -435,13 +437,13 @@ We have set all these properties one by one, but we could have all of
them with a single call to
`g_object_set()`:
``` lang=c
g_object_set (data.playbin2, "uri", "http://docs.gstreamer.com/media/sintel_cropped_multilingual.webm", "flags", flags, "connection-speed", 56, NULL);
```
This is why `g_object_set()` requires a NULL as the last parameter.
``` lang=c
/* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
@ -459,7 +461,7 @@ GStreamer has little to do with it besides the Navigation interface
discussed briefly in [Tutorial 17: DVD
playback](http://docs.gstreamer.com/display/GstSDK/Tutorial+17%3A+DVD+playback).
``` lang=c
/* Create a GLib Main Loop and set it to run */
data.main_loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.main_loop);
@ -476,14 +478,14 @@ times: `handle_message` when a message appears on the bus, and
There is nothing new in handle\_message, except that when the pipeline
moves to the PLAYING state, it will call the `analyze_streams` function:
``` lang=c
/* Extract some metadata from the streams and print it on the screen */
static void analyze_streams (CustomData *data) {
gint i;
GstTagList *tags;
gchar *str;
guint rate;
/* Read some properties */
g_object_get (data->playbin2, "n-video", &data->n_video, NULL);
g_object_get (data->playbin2, "n-audio", &data->n_audio, NULL);
@ -495,7 +497,7 @@ media and prints it on the screen. The number of video, audio and
subtitle streams is directly available through the `n-video`,
`n-audio` and `n-text` properties.
``` lang=c
for (i = 0; i < data->n_video; i++) {
tags = NULL;
/* Retrieve the stream's video tags */
@ -515,15 +517,42 @@ stored as tags in a `GstTagList` structure, which is a list of data
pieces identified by a name. The `GstTagList` associated with a stream
can be recovered with `g_signal_emit_by_name()`, and then individual
tags are extracted with the `gst_tag_list_get_*` functions
like `gst_tag_list_get_string()` for example.
<table>
<tbody>
<tr class="odd">
<td>
<img src="images/icons/emoticons/information.png" width="16" height="16" />
</td>
<td>
<p>
This rather unintuitive way of retrieving the tag list is called an
Action Signal. Action signals are emitted by the application to a
specific element, which then performs an action and returns a result.
They behave like a dynamic function call, in which methods of a class
are identified by their name (the signal's name) instead of their memory
address. These signals are listed in the documentation along with the
regular signals, and are tagged “Action”. See <code>playbin2</code>, for
example.
</p>
</td>
</tr>
</tbody>
</table>
`playbin2` defines 3 action signals to retrieve
@ -534,7 +563,7 @@ name if the tags is standardized, and the list can be found in the
`GST_TAG_*_CODEC` (audio, video or
text).
``` lang=c
g_object_get (data->playbin2, "current-video", &data->current_video, NULL);
g_object_get (data->playbin2, "current-audio", &data->current_audio, NULL);
g_object_get (data->playbin2, "current-text", &data->current_text, NULL);
@ -550,11 +579,11 @@ never make any assumption. Multiple internal conditions can make
in which the streams are listed can change from one run to another, so
checking the metadata to identify one particular stream becomes crucial.
``` lang=c
/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
gchar *str = NULL;
if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) == G_IO_STATUS_NORMAL) {
int index = atoi (str);
if (index < 0 || index >= data->n_audio) {
@ -611,6 +640,3 @@ Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon\!
Document generated by Confluence on Oct 08, 2015 10:27

View file

@ -1,4 +1,4 @@
# Playback tutorial 2: Subtitle management
This page last changed on May 16, 2012 by xartigas.
@ -41,69 +41,69 @@ it in the SDK installation).
**playback-tutorial-2.c**
``` lang=c
#include <gst/gst.h>
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstElement *playbin2; /* Our one and only element */
gint n_video; /* Number of embedded video streams */
gint n_audio; /* Number of embedded audio streams */
gint n_text; /* Number of embedded subtitle streams */
gint current_video; /* Currently playing video stream */
gint current_audio; /* Currently playing audio stream */
gint current_text; /* Currently playing subtitle stream */
GMainLoop *main_loop; /* GLib's Main Loop */
} CustomData;
/* playbin2 flags */
typedef enum {
GST_PLAY_FLAG_VIDEO = (1 << 0), /* We want video output */
GST_PLAY_FLAG_AUDIO = (1 << 1), /* We want audio output */
GST_PLAY_FLAG_TEXT = (1 << 2) /* We want subtitle output */
} GstPlayFlags;
/* Forward definition for the message and keyboard processing functions */
static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data);
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data);
int main(int argc, char *argv[]) {
CustomData data;
GstBus *bus;
GstStateChangeReturn ret;
gint flags;
GIOChannel *io_stdin;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Create the elements */
data.playbin2 = gst_element_factory_make ("playbin2", "playbin2");
if (!data.playbin2) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Set the URI to play */
g_object_set (data.playbin2, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.ogv", NULL);
/* Set the subtitle URI to play and some font description */
g_object_set (data.playbin2, "suburi", "http://docs.gstreamer.com/media/sintel_trailer_gr.srt", NULL);
g_object_set (data.playbin2, "subtitle-font-desc", "Sans, 18", NULL);
/* Set flags to show Audio, Video and Subtitles */
g_object_get (data.playbin2, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO | GST_PLAY_FLAG_TEXT;
g_object_set (data.playbin2, "flags", flags, NULL);
/* Add a bus watch, so we get notified when a message arrives */
bus = gst_element_get_bus (data.playbin2);
gst_bus_add_watch (bus, (GstBusFunc)handle_message, &data);
/* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
@ -111,7 +111,7 @@ int main(int argc, char *argv[]) {
io_stdin = g_io_channel_unix_new (fileno (stdin));
#endif
g_io_add_watch (io_stdin, G_IO_IN, (GIOFunc)handle_keyboard, &data);
/* Start playing */
ret = gst_element_set_state (data.playbin2, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
@ -119,11 +119,11 @@ int main(int argc, char *argv[]) {
gst_object_unref (data.playbin2);
return -1;
}
/* Create a GLib Main Loop and set it to run */
data.main_loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.main_loop);
/* Free resources */
g_main_loop_unref (data.main_loop);
g_io_channel_unref (io_stdin);
@ -132,22 +132,22 @@ int main(int argc, char *argv[]) {
gst_object_unref (data.playbin2);
return 0;
}
/* Extract some metadata from the streams and print it on the screen */
static void analyze_streams (CustomData *data) {
gint i;
GstTagList *tags;
gchar *str;
guint rate;
/* Read some properties */
g_object_get (data->playbin2, "n-video", &data->n_video, NULL);
g_object_get (data->playbin2, "n-audio", &data->n_audio, NULL);
g_object_get (data->playbin2, "n-text", &data->n_text, NULL);
g_print ("%d video stream(s), %d audio stream(s), %d text stream(s)\n",
data->n_video, data->n_audio, data->n_text);
g_print ("\n");
for (i = 0; i < data->n_video; i++) {
tags = NULL;
@ -161,7 +161,7 @@ static void analyze_streams (CustomData *data) {
gst_tag_list_free (tags);
}
}
g_print ("\n");
for (i = 0; i < data->n_audio; i++) {
tags = NULL;
@ -183,7 +183,7 @@ static void analyze_streams (CustomData *data) {
gst_tag_list_free (tags);
}
}
g_print ("\n");
for (i = 0; i < data->n_text; i++) {
tags = NULL;
@ -200,22 +200,22 @@ static void analyze_streams (CustomData *data) {
g_print (" no tags found\n");
}
}
g_object_get (data->playbin2, "current-video", &data->current_video, NULL);
g_object_get (data->playbin2, "current-audio", &data->current_audio, NULL);
g_object_get (data->playbin2, "current-text", &data->current_text, NULL);
g_print ("\n");
g_print ("Currently playing video stream %d, audio stream %d and subtitle stream %d\n",
data->current_video, data->current_audio, data->current_text);
g_print ("Type any number and hit ENTER to select a different subtitle stream\n");
}
/* Process messages from GStreamer */
static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data) {
GError *err;
gchar *debug_info;
switch (GST_MESSAGE_TYPE (msg)) {
case GST_MESSAGE_ERROR:
gst_message_parse_error (msg, &err, &debug_info);
@ -240,15 +240,15 @@ static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data)
}
} break;
}
/* We want to keep receiving messages */
return TRUE;
}
/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
gchar *str = NULL;
if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) == G_IO_STATUS_NORMAL) {
int index = atoi (str);
if (index < 0 || index >= data->n_text) {
@ -297,7 +297,7 @@ This tutorial is copied from [Playback tutorial 1: Playbin2
usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html) with some
changes, so let's review only the changes.
``` first-line: 50; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Set the subtitle URI to play and some font description */
g_object_set (data.playbin2, "suburi", "http://docs.gstreamer.com/media/sintel_trailer_gr.srt", NULL);
g_object_set (data.playbin2, "subtitle-font-desc", "Sans, 18", NULL);
@ -351,7 +351,7 @@ Extra-Expanded, Ultra-Expanded
 
``` first-line: 54; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Set flags to show Audio, Video and Subtitles */
g_object_get (data.playbin2, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO | GST_PLAY_FLAG_TEXT;
@ -384,7 +384,7 @@ they are embedded in the container or in a different file:
The next playback tutorial shows how to change the playback speed.
Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon\!
<table>
@ -400,4 +400,3 @@ It has been a pleasure having you here, and see you soon\!
</table>
Document generated by Confluence on Oct 08, 2015 10:27

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Playback tutorial 3: Short-cutting the pipeline
# Playback tutorial 3: Short-cutting the pipeline
This page last changed on Jun 26, 2012 by xartigas.
@ -32,27 +32,27 @@ Copy this code into a text file named `playback-tutorial-3.c`.
**playback-tutorial-3.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
#include <string.h>
#define CHUNK_SIZE 1024 /* Amount of bytes we are sending in each buffer */
#define SAMPLE_RATE 44100 /* Samples per second we are sending */
#define AUDIO_CAPS "audio/x-raw-int,channels=1,rate=%d,signed=(boolean)true,width=16,depth=16,endianness=BYTE_ORDER"
/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
GstElement *pipeline;
GstElement *app_source;
guint64 num_samples; /* Number of samples generated so far (for timestamp generation) */
gfloat a, b, c, d; /* For waveform generation */
guint sourceid; /* To control the GSource */
GMainLoop *main_loop; /* GLib's Main Loop */
} CustomData;
/* This method is called by the idle GSource in the mainloop, to feed CHUNK_SIZE bytes into appsrc.
 * The idle handler is added to the mainloop when appsrc requests us to start sending data (need-data signal)
* and is removed when appsrc has enough data (enough-data signal).
@ -64,14 +64,14 @@ static gboolean push_data (CustomData *data) {
gint16 *raw;
gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
gfloat freq;
/* Create a new empty buffer */
buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);
/* Set its timestamp and duration */
GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (data->num_samples, GST_SECOND, SAMPLE_RATE);
GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (CHUNK_SIZE, GST_SECOND, SAMPLE_RATE);
/* Generate some psychedelic waveforms */
raw = (gint16 *)GST_BUFFER_DATA (buffer);
data->c += data->d;
@ -83,21 +83,21 @@ static gboolean push_data (CustomData *data) {
raw[i] = (gint16)(500 * data->a);
}
data->num_samples += num_samples;
/* Push the buffer into the appsrc */
g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);
/* Free the buffer now that we are done with it */
gst_buffer_unref (buffer);
if (ret != GST_FLOW_OK) {
/* We got some error, stop sending data */
return FALSE;
}
return TRUE;
}
/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
* to the mainloop to start pushing data into the appsrc */
static void start_feed (GstElement *source, guint size, CustomData *data) {
@ -106,7 +106,7 @@ static void start_feed (GstElement *source, guint size, CustomData *data) {
data->sourceid = g_idle_add ((GSourceFunc) push_data, data);
}
}
/* This callback triggers when appsrc has enough data and we can stop sending.
* We remove the idle handler from the mainloop */
static void stop_feed (GstElement *source, CustomData *data) {
@ -116,31 +116,31 @@ static void stop_feed (GstElement *source, CustomData *data) {
data->sourceid = 0;
}
}
/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
GError *err;
gchar *debug_info;
/* Print error details on the screen */
gst_message_parse_error (msg, &err, &debug_info);
g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
g_clear_error (&err);
g_free (debug_info);
g_main_loop_quit (data->main_loop);
}
/* This function is called when playbin2 has created the appsrc element, so we have
* a chance to configure it. */
static void source_setup (GstElement *pipeline, GstElement *source, CustomData *data) {
gchar *audio_caps_text;
GstCaps *audio_caps;
g_print ("Source has been created. Configuring.\n");
data->app_source = source;
/* Configure appsrc */
audio_caps_text = g_strdup_printf (AUDIO_CAPS, SAMPLE_RATE);
audio_caps = gst_caps_from_string (audio_caps_text);
@ -150,36 +150,36 @@ static void source_setup (GstElement *pipeline, GstElement *source, CustomData *
gst_caps_unref (audio_caps);
g_free (audio_caps_text);
}
int main(int argc, char *argv[]) {
CustomData data;
GstBus *bus;
/* Initialize custom data structure */
memset (&data, 0, sizeof (data));
data.b = 1; /* For waveform generation */
data.d = 1;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Create the playbin2 element */
data.pipeline = gst_parse_launch ("playbin2 uri=appsrc://", NULL);
g_signal_connect (data.pipeline, "source-setup", G_CALLBACK (source_setup), &data);
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
bus = gst_element_get_bus (data.pipeline);
gst_bus_add_signal_watch (bus);
g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
gst_object_unref (bus);
/* Start playing the pipeline */
gst_element_set_state (data.pipeline, GST_STATE_PLAYING);
/* Create a GLib Main Loop and set it to run */
data.main_loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.main_loop);
/* Free resources */
gst_element_set_state (data.pipeline, GST_STATE_NULL);
gst_object_unref (data.pipeline);
@ -190,7 +190,7 @@ int main(int argc, char *argv[]) {
To use an `appsrc` as the source for the pipeline, simply instantiate a
`playbin2` and set its URI to `appsrc://`
``` first-line: 131; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Create the playbin2 element */
data.pipeline = gst_parse_launch ("playbin2 uri=appsrc://", NULL);
```
@ -199,7 +199,7 @@ data.pipeline = gst_parse_launch ("playbin2 uri=appsrc://", NULL);
`source-setup` signal to allow the application to configure
it:
``` first-line: 133; theme: Default; brush: cpp; gutter: true
``` lang=c
g_signal_connect (data.pipeline, "source-setup", G_CALLBACK (source_setup), &data);
```
@ -208,16 +208,16 @@ since, once the signal handler returns, `playbin2` will instantiate the
next element in the pipeline according to these
caps:
``` first-line: 100; theme: Default; brush: cpp; gutter: true
``` lang=c
/* This function is called when playbin2 has created the appsrc element, so we have
* a chance to configure it. */
static void source_setup (GstElement *pipeline, GstElement *source, CustomData *data) {
gchar *audio_caps_text;
GstCaps *audio_caps;
g_print ("Source has been created. Configuring.\n");
data->app_source = source;
/* Configure appsrc */
audio_caps_text = g_strdup_printf (AUDIO_CAPS, SAMPLE_RATE);
audio_caps = gst_caps_from_string (audio_caps_text);
@ -262,11 +262,10 @@ It has been a pleasure having you here, and see you soon\!
## Attachments:
![](images/icons/bullet_blue.gif)
[playback-tutorial-3.c](attachments/1442200/2424850.c) (text/plain)
[playback-tutorial-3.c](attachments/1442200/2424850.c) (text/plain)
![](images/icons/bullet_blue.gif)
[vs2010.zip](attachments/1442200/2424849.zip) (application/zip)
[vs2010.zip](attachments/1442200/2424849.zip) (application/zip)
![](images/icons/bullet_blue.gif)
[playback-tutorial-3.c](attachments/1442200/2424848.c) (text/plain)
[playback-tutorial-3.c](attachments/1442200/2424848.c) (text/plain)
Document generated by Confluence on Oct 08, 2015 10:27

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Playback tutorial 4: Progressive streaming
# Playback tutorial 4: Progressive streaming
This page last changed on Sep 13, 2012 by xartigas.
@ -54,24 +54,24 @@ Copy this code into a text file named `playback-tutorial-4.c`.
**playback-tutorial-4.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
#include <string.h>
#define GRAPH_LENGTH 80
/* playbin2 flags */
typedef enum {
GST_PLAY_FLAG_DOWNLOAD = (1 << 7) /* Enable progressive download (on selected formats) */
} GstPlayFlags;
typedef struct _CustomData {
gboolean is_live;
GstElement *pipeline;
GMainLoop *loop;
gint buffering_level;
} CustomData;
static void got_location (GstObject *gstobject, GstObject *prop_object, GParamSpec *prop, gpointer data) {
gchar *location;
g_object_get (G_OBJECT (prop_object), "temp-location", &location, NULL);
@ -79,19 +79,19 @@ static void got_location (GstObject *gstobject, GstObject *prop_object, GParamSp
/* Uncomment this line to keep the temporary file after the program exits */
/* g_object_set (G_OBJECT (prop_object), "temp-remove", FALSE, NULL); */
}
static void cb_message (GstBus *bus, GstMessage *msg, CustomData *data) {
switch (GST_MESSAGE_TYPE (msg)) {
case GST_MESSAGE_ERROR: {
GError *err;
gchar *debug;
gst_message_parse_error (msg, &err, &debug);
g_print ("Error: %s\n", err->message);
g_error_free (err);
g_free (debug);
gst_element_set_state (data->pipeline, GST_STATE_READY);
g_main_loop_quit (data->loop);
break;
@ -104,9 +104,9 @@ static void cb_message (GstBus *bus, GstMessage *msg, CustomData *data) {
case GST_MESSAGE_BUFFERING:
/* If the stream is live, we do not care about buffering. */
if (data->is_live) break;
gst_message_parse_buffering (msg, &data->buffering_level);
/* Wait until buffering is complete before start/resume playing */
if (data->buffering_level < 100)
gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
@ -123,11 +123,11 @@ static void cb_message (GstBus *bus, GstMessage *msg, CustomData *data) {
break;
}
}
static gboolean refresh_ui (CustomData *data) {
GstQuery *query;
gboolean result;
query = gst_query_new_buffering (GST_FORMAT_PERCENT);
result = gst_element_query (data->pipeline, query);
if (result) {
@ -135,10 +135,10 @@ static gboolean refresh_ui (CustomData *data) {
gchar graph[GRAPH_LENGTH + 1];
GstFormat format = GST_FORMAT_TIME;
gint64 position = 0, duration = 0;
memset (graph, ' ', GRAPH_LENGTH);
graph[GRAPH_LENGTH] = '\0';
n_ranges = gst_query_get_n_buffering_ranges (query);
for (range = 0; range < n_ranges; range++) {
gint64 start, stop;
@ -163,11 +163,11 @@ static gboolean refresh_ui (CustomData *data) {
}
g_print ("\r");
}
return TRUE;
}
int main(int argc, char *argv[]) {
GstElement *pipeline;
GstBus *bus;
@ -175,26 +175,26 @@ int main(int argc, char *argv[]) {
GMainLoop *main_loop;
CustomData data;
guint flags;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Initialize our data structure */
memset (&data, 0, sizeof (data));
data.buffering_level = 100;
/* Build the pipeline */
pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
bus = gst_element_get_bus (pipeline);
/* Set the download flag */
g_object_get (pipeline, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_DOWNLOAD;
g_object_set (pipeline, "flags", flags, NULL);
/* Uncomment this line to limit the amount of downloaded data */
/* g_object_set (pipeline, "ring-buffer-max-size", (guint64)4000000, NULL); */
/* Start playing */
ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
@ -204,20 +204,20 @@ int main(int argc, char *argv[]) {
} else if (ret == GST_STATE_CHANGE_NO_PREROLL) {
data.is_live = TRUE;
}
main_loop = g_main_loop_new (NULL, FALSE);
data.loop = main_loop;
data.pipeline = pipeline;
gst_bus_add_signal_watch (bus);
g_signal_connect (bus, "message", G_CALLBACK (cb_message), &data);
g_signal_connect (pipeline, "deep-notify::temp-location", G_CALLBACK (got_location), NULL);
/* Register a function that GLib will call every second */
g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);
g_main_loop_run (main_loop);
/* Free resources */
g_main_loop_unref (main_loop);
gst_object_unref (bus);
@ -260,7 +260,7 @@ only the differences.
#### Setup
``` first-line: 133; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Set the download flag */
g_object_get (pipeline, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_DOWNLOAD;
@ -271,7 +271,7 @@ By setting this flag, `playbin2` instructs its internal queue (a
`queue2` element, actually) to store all downloaded
data.
``` first-line: 157; theme: Default; brush: cpp; gutter: true
``` lang=c
g_signal_connect (pipeline, "deep-notify::temp-location", G_CALLBACK (got_location), NULL);
```
@ -282,7 +282,7 @@ changes, indicating that the `queue2` has decided where to store the
downloaded
data.
``` first-line: 18; theme: Default; brush: cpp; gutter: true
``` lang=c
static void got_location (GstObject *gstobject, GstObject *prop_object, GParamSpec *prop, gpointer data) {
gchar *location;
g_object_get (G_OBJECT (prop_object), "temp-location", &location, NULL);
@ -313,7 +313,7 @@ removed. As the comment reads, you can keep it by setting the
In `main` we also install a timer which we use to refresh the UI every
second.
``` first-line: 159; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Register a function that GLib will call every second */
g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);
```
@ -332,7 +332,7 @@ pipeline is paused). Keep in mind that if your network is fast enough,
you will not see the download bar (the dashes) advance at all; it will
be completely full from the beginning.
``` first-line: 70; theme: Default; brush: cpp; gutter: true
``` lang=c
static gboolean refresh_ui (CustomData *data) {
GstQuery *query;
gboolean result;
@ -356,7 +356,7 @@ succeeded. The answer to the query is contained in the same
`GstQuery` structure we created, and can be retrieved using multiple
parse methods:
``` first-line: 85; theme: Default; brush: cpp; gutter: true
``` lang=c
n_ranges = gst_query_get_n_buffering_ranges (query);
for (range = 0; range < n_ranges; range++) {
gint64 start, stop;
@ -380,7 +380,7 @@ range) depends on what we requested in the
`gst_query_new_buffering()` call. In this case, PERCENTAGE. These
values are used to generate the graph.
``` first-line: 94; theme: Default; brush: cpp; gutter: true
``` lang=c
if (gst_element_query_position (data->pipeline, &format, &position) &&
GST_CLOCK_TIME_IS_VALID (position) &&
gst_element_query_duration (data->pipeline, &format, &duration) &&
@ -402,7 +402,7 @@ depending on the buffering level. If it is below 100%, the code in the
an `X`. If the buffering level is 100% the pipeline is in the
`PLAYING` state and we print a `>`.
``` first-line: 102; theme: Default; brush: cpp; gutter: true
``` lang=c
if (data->buffering_level < 100) {
g_print (" Buffering: %3d%%", data->buffering_level);
} else {
@ -415,7 +415,7 @@ information (and delete it otherwise).
#### Limiting the size of the downloaded file
``` first-line: 138; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Uncomment this line to limit the amount of downloaded data */
/* g_object_set (pipeline, "ring-buffer-max-size", (guint64)4000000, NULL); */
```
@ -442,9 +442,8 @@ It has been a pleasure having you here, and see you soon\!
## Attachments:
![](images/icons/bullet_blue.gif)
[playback-tutorial-4.c](attachments/327808/2424846.c) (text/plain)
[playback-tutorial-4.c](attachments/327808/2424846.c) (text/plain)
![](images/icons/bullet_blue.gif)
[vs2010.zip](attachments/327808/2424847.zip) (application/zip)
[vs2010.zip](attachments/327808/2424847.zip) (application/zip)
Document generated by Confluence on Oct 08, 2015 10:27

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Playback tutorial 5: Color Balance
# Playback tutorial 5: Color Balance
This page last changed on Jun 25, 2012 by xartigas.
@ -44,28 +44,28 @@ Copy this code into a text file named `playback-tutorial-5.c`.
**playback-tutorial-5.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <string.h>
#include <gst/gst.h>
#include <gst/interfaces/colorbalance.h>
typedef struct _CustomData {
GstElement *pipeline;
GMainLoop *loop;
} CustomData;
/* Process a color balance command */
static void update_color_channel (const gchar *channel_name, gboolean increase, GstColorBalance *cb) {
gdouble step;
gint value;
GstColorBalanceChannel *channel = NULL;
const GList *channels, *l;
/* Retrieve the list of channels and locate the requested one */
channels = gst_color_balance_list_channels (cb);
for (l = channels; l != NULL; l = l->next) {
GstColorBalanceChannel *tmp = (GstColorBalanceChannel *)l->data;
if (g_strrstr (tmp->label, channel_name)) {
channel = tmp;
break;
@ -73,7 +73,7 @@ static void update_color_channel (const gchar *channel_name, gboolean increase,
}
if (!channel)
return;
/* Change the channel's value */
step = 0.1 * (channel->max_value - channel->min_value);
value = gst_color_balance_get_value (cb, channel);
@ -88,11 +88,11 @@ static void update_color_channel (const gchar *channel_name, gboolean increase,
}
gst_color_balance_set_value (cb, channel, value);
}
/* Output the current values of all Color Balance channels */
static void print_current_values (GstElement *pipeline) {
const GList *channels, *l;
/* Output Color Balance values */
channels = gst_color_balance_list_channels (GST_COLOR_BALANCE (pipeline));
for (l = channels; l != NULL; l = l->next) {
@ -103,15 +103,15 @@ static void print_current_values (GstElement *pipeline) {
}
g_print ("\n");
}
/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
gchar *str = NULL;
if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) != G_IO_STATUS_NORMAL) {
return TRUE;
}
switch (g_ascii_tolower (str[0])) {
case 'c':
update_color_channel ("CONTRAST", g_ascii_isupper (str[0]), GST_COLOR_BALANCE (data->pipeline));
@ -131,25 +131,25 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
default:
break;
}
g_free (str);
print_current_values (data->pipeline);
return TRUE;
}
int main(int argc, char *argv[]) {
CustomData data;
GstStateChangeReturn ret;
GIOChannel *io_stdin;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Initialize our data structure */
memset (&data, 0, sizeof (data));
/* Print usage map */
g_print (
"USAGE: Choose one of the following options, then press enter:\n"
@ -158,10 +158,10 @@ int main(int argc, char *argv[]) {
" 'H' to increase hue, 'h' to decrease hue\n"
" 'S' to increase saturation, 's' to decrease saturation\n"
" 'Q' to quit\n");
/* Build the pipeline */
data.pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
@ -169,7 +169,7 @@ int main(int argc, char *argv[]) {
io_stdin = g_io_channel_unix_new (fileno (stdin));
#endif
g_io_add_watch (io_stdin, G_IO_IN, (GIOFunc)handle_keyboard, &data);
/* Start playing */
ret = gst_element_set_state (data.pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
@ -178,11 +178,11 @@ int main(int argc, char *argv[]) {
return -1;
}
print_current_values (data.pipeline);
/* Create a GLib Main Loop and set it to run */
data.loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.loop);
/* Free resources */
g_main_loop_unref (data.loop);
g_io_channel_unref (io_stdin);
@ -225,11 +225,11 @@ The `main()` function is fairly simple. A `playbin2` pipeline is
instantiated and set to run, and a keyboard watch is installed so
keystrokes can be monitored.
``` first-line: 45; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Output the current values of all Color Balance channels */
static void print_current_values (GstElement *pipeline) {
const GList *channels, *l;
/* Output Color Balance values */
channels = gst_color_balance_list_channels (GST_COLOR_BALANCE (pipeline));
for (l = channels; l != NULL; l = l->next) {
@ -255,19 +255,19 @@ retrieve the current value.
In this example, the minimum and maximum values are used to output the
current value as a percentage.
``` first-line: 10; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Process a color balance command */
static void update_color_channel (const gchar *channel_name, gboolean increase, GstColorBalance *cb) {
gdouble step;
gint value;
GstColorBalanceChannel *channel = NULL;
const GList *channels, *l;
/* Retrieve the list of channels and locate the requested one */
channels = gst_color_balance_list_channels (cb);
for (l = channels; l != NULL; l = l->next) {
GstColorBalanceChannel *tmp = (GstColorBalanceChannel *)l->data;
if (g_strrstr (tmp->label, channel_name)) {
channel = tmp;
break;
@ -283,7 +283,7 @@ parsed looking for the channel with the specified name. Obviously, this
list could be parsed only once, and the channel pointers stored and
indexed by something more efficient than a string.
``` first-line: 30; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Change the channel's value */
step = 0.1 * (channel->max_value - channel->min_value);
value = gst_color_balance_get_value (cb, channel);
@ -322,9 +322,8 @@ It has been a pleasure having you here, and see you soon\!
## Attachments:
![](images/icons/bullet_blue.gif)
[playback-tutorial-5.c](attachments/327804/2424874.c) (text/plain)
[playback-tutorial-5.c](attachments/327804/2424874.c) (text/plain)
![](images/icons/bullet_blue.gif)
[vs2010.zip](attachments/327804/2424875.zip) (application/zip)
[vs2010.zip](attachments/327804/2424875.zip) (application/zip)
Document generated by Confluence on Oct 08, 2015 10:27

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Playback tutorial 6: Audio visualization
# Playback tutorial 6: Audio visualization
This page last changed on Jun 26, 2012 by xartigas.
@ -41,27 +41,27 @@ Copy this code into a text file named `playback-tutorial-6.c`.
**playback-tutorial-6.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
/* playbin2 flags */
typedef enum {
GST_PLAY_FLAG_VIS = (1 << 3) /* Enable rendering of visualizations when there is no video stream. */
} GstPlayFlags;
/* Return TRUE if this is a Visualization element */
static gboolean filter_vis_features (GstPluginFeature *feature, gpointer data) {
GstElementFactory *factory;
if (!GST_IS_ELEMENT_FACTORY (feature))
return FALSE;
factory = GST_ELEMENT_FACTORY (feature);
if (!g_strrstr (gst_element_factory_get_klass (factory), "Visualization"))
return FALSE;
return TRUE;
}
int main(int argc, char *argv[]) {
GstElement *pipeline, *vis_plugin;
GstBus *bus;
@ -69,59 +69,59 @@ int main(int argc, char *argv[]) {
GList *list, *walk;
GstElementFactory *selected_factory = NULL;
guint flags;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Get a list of all visualization plugins */
list = gst_registry_feature_filter (gst_registry_get_default (), filter_vis_features, FALSE, NULL);
/* Print their names */
g_print("Available visualization plugins:\n");
for (walk = list; walk != NULL; walk = g_list_next (walk)) {
const gchar *name;
GstElementFactory *factory;
factory = GST_ELEMENT_FACTORY (walk->data);
name = gst_element_factory_get_longname (factory);
g_print(" %s\n", name);
if (selected_factory == NULL || g_str_has_prefix (name, "GOOM")) {
selected_factory = factory;
}
}
/* Don't use the factory if it's still empty */
/* e.g. no visualization plugins found */
if (!selected_factory) {
g_print ("No visualization plugins found!\n");
return -1;
}
/* We have now selected a factory for the visualization element */
g_print ("Selected '%s'\n", gst_element_factory_get_longname (selected_factory));
vis_plugin = gst_element_factory_create (selected_factory, NULL);
if (!vis_plugin)
return -1;
/* Build the pipeline */
pipeline = gst_parse_launch ("playbin2 uri=http://radio.hbr1.com:19800/ambient.ogg", NULL);
/* Set the visualization flag */
g_object_get (pipeline, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIS;
g_object_set (pipeline, "flags", flags, NULL);
/* set vis plugin for playbin2 */
g_object_set (pipeline, "vis-plugin", vis_plugin, NULL);
/* Start playing */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
/* Wait until error or EOS */
bus = gst_element_get_bus (pipeline);
msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
/* Free resources */
if (msg != NULL)
gst_message_unref (msg);
@ -163,7 +163,7 @@ First off, we indicate `playbin2` that we want an audio visualization by
setting the `GST_PLAY_FLAG_VIS` flag. If the media already contains
video, this flag has no effect.
``` first-line: 66; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Set the visualization flag */
g_object_get (pipeline, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIS;
@ -175,7 +175,7 @@ If no visualization plugin is enforced by the user, `playbin2` will use
available). The rest of the tutorial shows how to find out the available
visualization elements and enforce one to `playbin2`.
``` first-line: 32; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Get a list of all visualization plugins */
list = gst_registry_feature_filter (gst_registry_get_default (), filter_vis_features, FALSE, NULL);
```
@ -185,17 +185,17 @@ GStreamer registry and selects those for which
the `filter_vis_features` function returns TRUE. This function selects
only the Visualization plugins:
``` first-line: 8; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Return TRUE if this is a Visualization element */
static gboolean filter_vis_features (GstPluginFeature *feature, gpointer data) {
GstElementFactory *factory;
if (!GST_IS_ELEMENT_FACTORY (feature))
return FALSE;
factory = GST_ELEMENT_FACTORY (feature);
if (!g_strrstr (gst_element_factory_get_klass (factory), "Visualization"))
return FALSE;
return TRUE;
}
```
@ -215,17 +215,17 @@ is a “string describing the type of element, as an unordered list
separated with slashes (/)”. Examples of classes are “Source/Network”,
“Codec/Decoder/Video”, “Codec/Encoder/Audio” or “Visualization”.
``` first-line: 35; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Print their names */
g_print("Available visualization plugins:\n");
for (walk = list; walk != NULL; walk = g_list_next (walk)) {
const gchar *name;
GstElementFactory *factory;
factory = GST_ELEMENT_FACTORY (walk->data);
name = gst_element_factory_get_longname (factory);
g_print(" %s\n", name);
if (selected_factory == NULL || g_str_has_prefix (name, "GOOM")) {
selected_factory = factory;
}
@ -236,7 +236,7 @@ Once we have the list of Visualization plugins, we print their names
(`gst_element_factory_get_longname()`) and choose one (in this case,
GOOM).
``` first-line: 57; theme: Default; brush: cpp; gutter: true
``` lang=c
/* We have now selected a factory for the visualization element */
g_print ("Selected '%s'\n", gst_element_factory_get_longname (selected_factory));
vis_plugin = gst_element_factory_create (selected_factory, NULL);
@ -247,7 +247,7 @@ if (!vis_plugin)
The selected factory is used to instantiate an actual `GstElement` which
is then passed to `playbin2` through the `vis-plugin` property:
``` first-line: 71; theme: Default; brush: cpp; gutter: true
``` lang=c
/* set vis plugin for playbin2 */
g_object_set (pipeline, "vis-plugin", vis_plugin, NULL);
```
@ -268,9 +268,8 @@ It has been a pleasure having you here, and see you soon\!
## Attachments:
![](images/icons/bullet_blue.gif)
[vs2010.zip](attachments/327802/2424878.zip) (application/zip)
![](images/icons/bullet_blue.gif)
[playback-tutorial-6.c](attachments/327802/2424879.c) (text/plain)
Document generated by Confluence on Oct 08, 2015 10:27

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Playback tutorial 7: Custom playbin2 sinks
# Playback tutorial 7: Custom playbin2 sinks
This page last changed on Dec 03, 2012 by xartigas.
@ -55,21 +55,21 @@ Copy this code into a text file named `playback-tutorial-7.c`.
**playback-tutorial7.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
int main(int argc, char *argv[]) {
GstElement *pipeline, *bin, *equalizer, *convert, *sink;
GstPad *pad, *ghost_pad;
GstBus *bus;
GstMessage *msg;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Build the pipeline */
pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Create the elements inside the sink bin */
equalizer = gst_element_factory_make ("equalizer-3bands", "equalizer");
convert = gst_element_factory_make ("audioconvert", "convert");
@ -78,7 +78,7 @@ int main(int argc, char *argv[]) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Create the sink bin, add the elements and link them */
bin = gst_bin_new ("audio_sink_bin");
gst_bin_add_many (GST_BIN (bin), equalizer, convert, sink, NULL);
@ -88,21 +88,21 @@ int main(int argc, char *argv[]) {
gst_pad_set_active (ghost_pad, TRUE);
gst_element_add_pad (bin, ghost_pad);
gst_object_unref (pad);
/* Configure the equalizer */
g_object_set (G_OBJECT (equalizer), "band1", (gdouble)-24.0, NULL);
g_object_set (G_OBJECT (equalizer), "band2", (gdouble)-24.0, NULL);
/* Set playbin2's audio sink to be our sink bin */
g_object_set (GST_OBJECT (pipeline), "audio-sink", bin, NULL);
/* Start playing */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
/* Wait until error or EOS */
bus = gst_element_get_bus (pipeline);
msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
/* Free resources */
if (msg != NULL)
gst_message_unref (msg);
@ -139,7 +139,7 @@ int main(int argc, char *argv[]) {
# Walkthrough
``` first-line: 15; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Create the elements inside the sink bin */
equalizer = gst_element_factory_make ("equalizer-3bands", "equalizer");
convert = gst_element_factory_make ("audioconvert", "convert");
@ -155,7 +155,7 @@ All the Elements that compose our sink-bin are instantiated. We use an
between, because we are not sure of the capabilities of the audio sink
(since they are hardware-dependent).
``` first-line: 24; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Create the sink bin, add the elements and link them */
bin = gst_bin_new ("audio_sink_bin");
gst_bin_add_many (GST_BIN (bin), equalizer, convert, sink, NULL);
@ -165,7 +165,7 @@ gst_element_link_many (equalizer, convert, sink, NULL);
This adds the new Elements to the Bin and links them just as we would do
if this was a pipeline.
``` first-line: 28; theme: Default; brush: cpp; gutter: true
``` lang=c
pad = gst_element_get_static_pad (equalizer, "sink");
ghost_pad = gst_ghost_pad_new ("sink", pad);
gst_pad_set_active (ghost_pad, TRUE);
@ -194,7 +194,7 @@ with `gst_object_unref()`.
At this point, we have a functional sink-bin, which we can use as the
audio sink in `playbin2`. We just need to instruct `playbin2` to use it:
``` first-line: 38; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Set playbin2's audio sink to be our sink bin */
g_object_set (GST_OBJECT (pipeline), "audio-sink", bin, NULL);
```
@ -202,7 +202,7 @@ g_object_set (GST_OBJECT (pipeline), "audio-sink", bin, NULL);
It is as simple as setting the `audio-sink` property on `playbin2` to
the newly created sink.
``` first-line: 34; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Configure the equalizer */
g_object_set (G_OBJECT (equalizer), "band1", (gdouble)-24.0, NULL);
g_object_set (G_OBJECT (equalizer), "band2", (gdouble)-24.0, NULL);
@ -236,11 +236,10 @@ It has been a pleasure having you here, and see you soon\!
## Attachments:
![](images/icons/bullet_blue.gif)
[bin-element-ghost.png](attachments/1441842/2424880.png) (image/png)
![](images/icons/bullet_blue.gif)
[playback-tutorial-7.c](attachments/1441842/2424881.c) (text/plain)
![](images/icons/bullet_blue.gif)
[vs2010.zip](attachments/1441842/2424882.zip) (application/zip)

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Playback tutorial 8: Hardware-accelerated video decoding
# Playback tutorial 8: Hardware-accelerated video decoding
This page last changed on Jul 24, 2012 by xartigas.
@ -173,24 +173,24 @@ type. Therefore, the easiest way to make sure hardware acceleration is
enabled or disabled is by changing the rank of the associated element,
as shown in this code:
``` theme: Default; brush: cpp; gutter: true
``` lang=c
static void enable_factory (const gchar *name, gboolean enable) {
GstRegistry *registry = NULL;
GstElementFactory *factory = NULL;
registry = gst_registry_get_default ();
if (!registry) return;
factory = gst_element_factory_find (name);
if (!factory) return;
if (enable) {
gst_plugin_feature_set_rank (GST_PLUGIN_FEATURE (factory), GST_RANK_PRIMARY + 1);
}
else {
gst_plugin_feature_set_rank (GST_PLUGIN_FEATURE (factory), GST_RANK_NONE);
}
gst_registry_add_feature (registry, GST_PLUGIN_FEATURE (factory));
return;
}
@ -348,4 +348,3 @@ accelerated video decoding. Particularly,
It has been a pleasure having you here, and see you soon\!

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Playback tutorial 9: Digital audio pass-through
# Playback tutorial 9: Digital audio pass-through
This page last changed on Jul 24, 2012 by xartigas.
@ -106,4 +106,3 @@ In particular, it has shown that:
It has been a pleasure having you here, and see you soon\!

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Playback tutorials
# Playback tutorials
This page last changed on Mar 28, 2012 by xartigas.
@ -8,4 +8,3 @@ These tutorials explain everything you need to know to produce a media
playback application using GStreamer.

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Qt tutorials
# Qt tutorials
This page last changed on May 02, 2013 by tdfischer.
@ -16,4 +16,3 @@ previous one and adds progressively more functionality, until a working
media player application is obtained in \#FIXME\#

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : QtGStreamer vs C GStreamer
# QtGStreamer vs C GStreamer
This page last changed on May 24, 2013 by xartigas.
@ -65,13 +65,13 @@ with the g\[st\]\_\<class\> prefix removed and converted to camel case.
For example,
``` theme: Default; brush: cpp; gutter: false
``` lang=c
gboolean gst_caps_is_empty(const GstCaps *caps);
```
becomes:
``` theme: Default; brush: cpp; gutter: false
``` lang=c
namespace QGst {
class Caps {
bool isEmpty() const;
@ -104,10 +104,9 @@ to call `g_object_ref()` and `g_object_unref()`.
QtGStreamer provides access to the underlying C objects, in case you
need them. This is accessible with a simple cast:
``` theme: Default; brush: cpp; gutter: false
``` lang=c
ElementPtr qgstElement = QGst::ElementFactory::make("playbin2");
GstElement* gstElement = GST_ELEMENT(qgstElement);
```

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Releases
# Releases
This page last changed on Jun 12, 2013 by xartigas.
@ -23,4 +23,3 @@ bottom):
### [2012.5 Amazon](2012.5%2BAmazon.html)

77
TODO.markdown Normal file
View file

@ -0,0 +1,77 @@
# Todo
This is just a simple TODO list to follow progress of the port from
gstreamer.com content to hotdoc.
Pages to review:
- Basic+tutorial+6+Media+formats+and+Pad+Capabilities.markdown
- iOS+tutorials.markdown
- 2012.11+Brahmaputra.markdown
- 2012.5+Amazon.markdown
- 2012.7+Amazon+%28Bugfix+Release+1%29.markdown
- 2012.9+Amazon+%28Bugfix+Release+2%29.markdown
- 2013.6+Congo.markdown
- Basic+tutorial+7+Multithreading+and+Pad+Availability.markdown
- Legal+information.markdown
- Basic+tutorial+8+Short-cutting+the+pipeline.markdown
- Mac+OS+X+deployment.markdown
- Basic+tutorial+9+Media+information+gathering.markdown
- Multiplatform+deployment+using+Cerbero.markdown
- Basic+tutorials.markdown
- Playback+tutorial+1+Playbin2+usage.markdown
- Android+tutorial+1+Link+against+GStreamer.markdown
- Playback+tutorial+2+Subtitle+management.markdown
- Android+tutorial+2+A+running+pipeline.markdown
- Contact.markdown
- Playback+tutorial+3+Short-cutting+the+pipeline.markdown
- Android+tutorial+3+Video.markdown
- Deploying+your+application.markdown
- Playback+tutorial+4+Progressive+streaming.markdown
- Android+tutorial+4+A+basic+media+player.markdown
- Frequently+Asked+Questions.markdown
- Playback+tutorial+5+Color+Balance.markdown
- Android+tutorial+5+A+Complete+media+player.markdown
- gst-inspect.markdown
- Playback+tutorial+6+Audio+visualization.markdown
- Android+tutorials.markdown
- gst-launch.markdown
- Playback+tutorial+7+Custom+playbin2+sinks.markdown
- Basic+Media+Player.markdown
- GStreamer+reference.markdown
- Playback+tutorial+8+Hardware-accelerated+video+decoding.markdown
- Basic+tutorial+10+GStreamer+tools.markdown
- Playback+tutorial+9+Digital+audio+pass-through.markdown
- Basic+tutorial+11+Debugging+tools.markdown
- Playback+tutorials.markdown
- Basic+tutorial+12+Streaming.markdown
- Installing+for+iOS+development.markdown
- Installing+on+Linux.markdown
- Installing+on+Mac+OS+X.markdown
- Installing+on+Windows.markdown
- QtGStreamer+vs+C+GStreamer.markdown
- Basic+tutorial+13+Playback+speed.markdown
- Qt+tutorials.markdown
- Basic+tutorial+14+Handy+elements.markdown
- Releases.markdown
- Basic+tutorial+15+Clutter+integration.markdown
- Basic+tutorial+16+Platform-specific+elements.markdown
- Basic+tutorial+1+Hello+world.markdown
- iOS+tutorial+1+Link+against+GStreamer.markdown
- Basic+tutorial+2+GStreamer+concepts.markdown
- iOS+tutorial+2+A+running+pipeline.markdown
- Upcoming+tutorials.markdown
- Basic+tutorial+3+Dynamic+pipelines.markdown
- iOS+tutorial+3+Video.markdown
- Using+appsink%2Fappsrc+in+Qt.markdown
- Basic+tutorial+4+Time+management.markdown
- iOS+tutorial+4+A+basic+media+player.markdown
- Windows+deployment.markdown
Reviewed pages:
- Home.markdown
- Installing+the+SDK.markdown
- Installing+for+Android+development.markdown
- Building+from+source+using+Cerbero.markdown
- Table+of+Concepts.markdown
- Tutorials.markdown

View file

@ -1,58 +1,46 @@
# GStreamer SDK documentation : Table of Concepts
# Table of Concepts
This page last changed on Jun 06, 2012 by xartigas.
This table shows in which tutorial each of the following key GStreamer
concepts is discussed.
- Action signals: [Playback tutorial 1: Playbin2
usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html)
- Audio switching: [Playback tutorial 1: Playbin2
usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html)
- Buffers: [Basic tutorial 8: Short-cutting the
pipeline](Basic%2Btutorial%2B8%253A%2BShort-cutting%2Bthe%2Bpipeline.html)
- Bus: [Basic tutorial 2: GStreamer
concepts](Basic%2Btutorial%2B2%253A%2BGStreamer%2Bconcepts.html)
- Capabilities: [Basic tutorial 6: Media formats and Pad
Capabilities](Basic%2Btutorial%2B6%253A%2BMedia%2Bformats%2Band%2BPad%2BCapabilities.html)
- Debugging: [Basic tutorial 11: Debugging
tools](Basic%2Btutorial%2B11%253A%2BDebugging%2Btools.html)
- Discoverer: [Basic tutorial 9: Media information
gathering](Basic%2Btutorial%2B9%253A%2BMedia%2Binformation%2Bgathering.html)
- Elements: [Basic tutorial 2: GStreamer
concepts](Basic%2Btutorial%2B2%253A%2BGStreamer%2Bconcepts.html)
- gst-discoverer: [Basic tutorial 10: GStreamer
tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html)
- gst-inspect: [Basic tutorial 10: GStreamer
tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html), [gst-inspect](gst-inspect.html)
- gst-launch: [Basic tutorial 10: GStreamer
tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html), [gst-launch](gst-launch.html)
- GUI: [Basic tutorial 5: GUI toolkit
integration](Basic%2Btutorial%2B5%253A%2BGUI%2Btoolkit%2Bintegration.html)
- Links: [Basic tutorial 2: GStreamer
concepts](Basic%2Btutorial%2B2%253A%2BGStreamer%2Bconcepts.html)
- Pads: [Basic tutorial 3: Dynamic
pipelines](Basic%2Btutorial%2B3%253A%2BDynamic%2Bpipelines.html)
- Pad Availability: [Basic tutorial 7: Multithreading and Pad
Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html)
- Pipelines: [Basic tutorial 2: GStreamer
concepts](Basic%2Btutorial%2B2%253A%2BGStreamer%2Bconcepts.html)
- Queries: [Basic tutorial 4: Time
management](Basic%2Btutorial%2B4%253A%2BTime%2Bmanagement.html)
- Seeks: [Basic tutorial 4: Time
management](Basic%2Btutorial%2B4%253A%2BTime%2Bmanagement.html)
- Signals: [Basic tutorial 3: Dynamic
pipelines](Basic%2Btutorial%2B3%253A%2BDynamic%2Bpipelines.html)
- States: [Basic tutorial 3: Dynamic
pipelines](Basic%2Btutorial%2B3%253A%2BDynamic%2Bpipelines.html)
- Subtitles: [Playback tutorial 2: Subtitle
management](Playback%2Btutorial%2B2%253A%2BSubtitle%2Bmanagement.html)
- Tags: [Playback tutorial 1: Playbin2
usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html)
- Tools: [Basic tutorial 10: GStreamer
tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html)
- Threads: [Basic tutorial 7: Multithreading and Pad
Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html)
- Action signals: [Playback tutorial 1: Playbin usage]
- Audio switching: [Playback tutorial 1: Playbin usage]
- Buffers: [Basic tutorial 8: Short-cutting the pipeline]
- Bus: [Basic tutorial 2: GStreamer concepts]
- Capabilities: [Basic tutorial 6: Media formats and Pad Capabilities]
- Debugging: [Basic tutorial 11: Debugging tools]
- Discoverer: [Basic tutorial 9: Media information gathering]
- Elements: [Basic tutorial 2: GStreamer concepts]
- gst-discoverer: [Basic tutorial 10: GStreamer tools]
- gst-inspect: [Basic tutorial 10: GStreamer tools], [gst-inspect]
- gst-launch: [Basic tutorial 10: GStreamer tools], [gst-launch]
- GUI: [Basic tutorial 5: GUI toolkit integration]
- Links: [Basic tutorial 2: GStreamer concepts]
- Pads: [Basic tutorial 3: Dynamic pipelines]
- Pad Availability: [Basic tutorial 7: Multithreading and Pad
Availability]
- Pipelines: [Basic tutorial 2: GStreamer concepts]
- Queries: [Basic tutorial 4: Time management]
- Seeks: [Basic tutorial 4: Time management]
- Signals: [Basic tutorial 3: Dynamic pipelines]
- States: [Basic tutorial 3: Dynamic pipelines]
- Subtitles: [Playback tutorial 2: Subtitle management]
- Tags: [Playback tutorial 1: Playbin usage]
- Tools: [Basic tutorial 10: GStreamer tools]
- Threads: [Basic tutorial 7: Multithreading and Pad Availability]
[Playback tutorial 1: Playbin usage]: Playback+tutorial+1+Playbin2+usage.markdown
[Basic tutorial 8: Short-cutting the pipeline]: Basic+tutorial+8+Short-cutting+the+pipeline.markdown
[Basic tutorial 2: GStreamer concepts]: Basic+tutorial+2+GStreamer+concepts.markdown
[Basic tutorial 6: Media formats and Pad Capabilities]: Basic+tutorial+6+Media+formats+and+Pad+Capabilities.markdown
[Basic tutorial 11: Debugging tools]: Basic+tutorial+11+Debugging+tools.markdown
[Basic tutorial 9: Media information gathering]: Basic+tutorial+9+Media+information+gathering.markdown
[Basic tutorial 10: GStreamer tools]: Basic+tutorial+10+GStreamer+tools.markdown
[gst-inspect]: gst-inspect.markdown
[gst-launch]: gst-launch.markdown
[Basic tutorial 5: GUI toolkit integration]: Basic+tutorial+5+GUI+toolkit+integration.markdown
[Basic tutorial 3: Dynamic pipelines]: Basic+tutorial+3+Dynamic+pipelines.markdown
[Basic tutorial 7: Multithreading and Pad Availability]: Basic+tutorial+7+Multithreading+and+Pad+Availability.markdown
[Basic tutorial 4: Time management]: Basic+tutorial+4+Time+management.markdown
[Playback tutorial 2: Subtitle management]: Playback+tutorial+2+Subtitle+management.markdown

View file

@ -1,4 +1,4 @@
# Tutorials
## Welcome to the GStreamer SDK Tutorials!
@ -9,11 +9,8 @@ open-source, media streaming framework.
### Prerequisites
Before following these tutorials, you need to set up your development
environment according to your platform. If you have not done so yet,
follow the appropriate link for [Linux](Installing+on+Linux.markdown),
[Mac OS X](Installing+on+Mac+OS+X.markdown) or
[Windows](Installing+on+Windows.markdown) and come back here
afterwards.
environment according to your platform. If you have not done so yet, go
to the [installing the SDK] page and come back here afterwards.
The tutorials are currently written only in the C programming language,
so you need to be comfortable with it. Even though C is not an
@ -46,27 +43,25 @@ GObject use `g_`.
### Sources of documentation
You have the `GObject` and `GLib` reference guides, and, of course the
upstream [GStreamer
documentation](http://gstreamer.freedesktop.org/documentation/).
upstream [GStreamer documentation].
### Structure
The tutorials are organized in sections, revolving about a common theme:
- [Basic tutorials](Basic+tutorials.markdown): Describe general topics
required to understand the rest of tutorials in the GStreamer SDK.
- [Playback tutorials](Playback+tutorials.markdown): Explain everything
you need to know to produce a media playback application using
GStreamer.
- [Android tutorials](Android+tutorials.markdown): Tutorials dealing
with the few Android-specific topics you need to know.
- [iOS tutorials](iOS+tutorials.markdown): Tutorials dealing with the
few iOS-specific topics you need to know.
- [Basic tutorials]: Describe general topics required to understand
the rest of tutorials in the GStreamer SDK.
- [Playback tutorials]: Explain everything you need to know to produce
a media playback application using GStreamer.
- [Android tutorials]: Tutorials dealing with the few Android-specific
topics you need to know.
- [iOS tutorials]: Tutorials dealing with the few iOS-specific topics
you need to know.
If you cannot remember in which tutorial a certain GStreamer concept is
explained, use the following:
- [Table of Concepts](Table+of+Concepts.markdown)
- [Table of Concepts]
### Sample media
@ -75,4 +70,13 @@ publicly available and the copyright remains with their respective
authors. In some cases they have been re-encoded for demonstration
purposes.
- [Sintel, the Durian Open Movie Project](http://www.sintel.org/)
- [Sintel, the Durian Open Movie Project]
[installing the SDK]: Installing+the+SDK.markdown
[GStreamer documentation]: http://gstreamer.freedesktop.org/documentation/
[Basic tutorials]: Basic+tutorials.markdown
[Playback tutorials]: Playback+tutorials.markdown
[Android tutorials]: Android+tutorials.markdown
[iOS tutorials]: iOS+tutorials.markdown
[Table of Concepts]: Table+of+Concepts.markdown
[Sintel, the Durian Open Movie Project]: http://www.sintel.org/

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Upcoming tutorials
# Upcoming tutorials
This page last changed on May 24, 2013 by xartigas.
@ -18,4 +18,3 @@ Playback tutorials:
- DVD playback

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Using appsink/appsrc in Qt
# Using appsink/appsrc in Qt
This page last changed on May 24, 2013 by xartigas.
@ -24,7 +24,7 @@ First, the files. These are also available in the
**CMakeLists.txt**
``` theme: Default; brush: plain; gutter: true
```
project(qtgst-example-appsink-src)
find_package(QtGStreamer REQUIRED)
find_package(Qt4 REQUIRED)
@ -37,7 +37,7 @@ target_link_libraries(appsink-src ${QTGSTREAMER_UTILS_LIBRARIES} ${QT_QTCORE_LIB
**main.cpp**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <iostream>
#include <QtCore/QCoreApplication>
#include <QGlib/Error>
@ -146,7 +146,7 @@ As this is a very simple example, most of the action happens in the
**GStreamer Initialization**
``` theme: Default; brush: cpp; gutter: false
``` lang=c
QGst::init(&argc, &argv);
```
@ -154,7 +154,7 @@ Now we can construct the first half of the pipeline:
**Pipeline Setup**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
const char *caps = "audio/x-raw-int,channels=1,rate=8000,"
"signed=(boolean)true,width=16,depth=16,endianness=1234";
 
@ -188,7 +188,7 @@ The second half of the pipeline is created similarly:
**Second Pipeline**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
/* sink pipeline */
QString pipe2Descr = QString("appsrc name=\"mysrc\" caps=\"%1\" ! autoaudiosink").arg(caps);
pipeline2 = QGst::Parse::launch(pipe2Descr).dynamicCast<QGst::Pipeline>();
@ -201,7 +201,7 @@ Finally, the pipeline is started:
**Starting the pipeline**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
/* start playing */
pipeline1->setState(QGst::StatePlaying);
pipeline2->setState(QGst::StatePlaying);
@ -214,7 +214,7 @@ ready for processing:
**MySink::newBuffer()**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
virtual QGst::FlowReturn newBuffer()
{
m_src->pushBuffer(pullBuffer());
@ -227,7 +227,7 @@ Our implementation takes the new buffer and pushes it into the
**Player::Player()**
``` theme: Default; brush: cpp; gutter: false
``` lang=c
Player::Player(int argc, char **argv)
: QCoreApplication(argc, argv), m_sink(&m_src)
```
@ -243,4 +243,3 @@ data into and out of a GStreamer pipeline.
It has been a pleasure having you here, and see you soon\!

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : Windows deployment
# Windows deployment
This page last changed on Nov 28, 2012 by xartigas.
@ -24,7 +24,7 @@ the Windows Installer functionality and offers a number of options to
suit your needs. You can review these options by
executing `msiexec` without parameters. For example:
``` theme: Default; brush: plain; gutter: false
```
msiexec /i gstreamer-sdk-2012.9-x86.msi
```
@ -40,7 +40,7 @@ installer to deploy to your applications folder (or a
subfolder). Again, use the `msiexec` parameters that suit you best. For
example:
``` theme: Default; brush: plain; gutter: false
```
msiexec /passive INSTALLDIR=C:\Desired\Folder /i gstreamer-sdk-2012.9-x86.msi
```
@ -258,4 +258,3 @@ Get the ZIP file with all Merge Modules for your architecture:
</table>

3
build.sh Executable file
View file

@ -0,0 +1,3 @@
hotdoc run
cp -R attachments built_doc/html/
cp -R images built_doc/html/

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : gst-inspect
# gst-inspect
This page last changed on May 30, 2012 by xartigas.
@ -81,13 +81,13 @@ Add directories separated with ':' to the plugin search path
## Example
``` theme: Default; brush: plain; gutter: false
```
gst-inspect-0.10 audiotestsrc
```
should produce:
``` theme: Default; brush: plain; gutter: false
```
Factory Details:
Long name: Audio test source
Class: Source/Audio
@ -210,4 +210,3 @@ Element Properties:
```

View file

@ -1,4 +1,4 @@
# GStreamer SDK documentation : gst-launch
# gst-launch
This page last changed on May 30, 2012 by xartigas.
@ -97,7 +97,7 @@ Creates an element of type ELEMENTTYPE and sets the PROPERTIES.
PROPERTY=VALUE ...
Sets the property to the specified value. You can use **gst-inspect**(1)
to find out about properties and allowed values of different elements.
Enumeration properties can be set by name, nick or value.
**Bins**
@ -126,7 +126,7 @@ used. This works across bins. If a padname is given, the link is done
with these pads. If no pad names are given all possibilities are tried
and a matching pad is used. If multiple padnames are given, both sides
must have the same number of pads specified and multiple links are done
in the given order.
So the simplest link is a simple exclamation mark, that links the
element to the left of it to the element to the right of it.
@ -140,25 +140,25 @@ chain caps, you can add more caps in the same format afterwards.
**Properties**
NAME=*\[(TYPE)\]*VALUE
in lists and ranges: *\[(TYPE)\]*VALUE
Sets the requested property in capabilities. The name is an alphanumeric
value and the type can have the following case-insensitive values:
\- **i** or **int** for integer values or ranges
\- **f** or **float** for float values or ranges
\- **4** or **fourcc** for FOURCC values
\- **b**, **bool** or **boolean** for boolean values
\- **s**, **str** or **string** for strings
\- **fraction** for fractions (framerate, pixel-aspect-ratio)
\- **l** or **list** for lists
If no type was given, the following order is tried: integer, float,
boolean, string.
Integer values must be parsable by **strtol()**, floats by **strtod()**.
FOURCC values may either be integers or strings. Boolean values are
(case insensitive) *yes*, *no*, *true* or *false* and may like strings
be escaped with " or '.
Ranges are in this format: \[ VALUE, VALUE \]
Lists use this format: ( VALUE *\[, VALUE ...\]* )
## Pipeline Control
@ -166,7 +166,7 @@ Lists use this format: ( VALUE *\[, VALUE ...\]* )
A pipeline can be controlled by signals. SIGUSR2 will stop the pipeline
(GST\_STATE\_NULL); SIGUSR1 will put it back to play
(GST\_STATE\_PLAYING). By default, the pipeline will start in the
playing state.
There are currently no signals defined to go into the ready or pause
(GST\_STATE\_READY and GST\_STATE\_PAUSED) state explicitly.
@ -186,53 +186,53 @@ ffmpegcolorspace (for video) in front of the sink to make things work.
**Audio playback**
**gst-launch filesrc location=music.mp3 \! mad \! audioconvert \!
audioresample \! osssink**
Play the mp3 music file "music.mp3" using a libmad-based plug-in and
output to an OSS device
**gst-launch filesrc location=music.ogg \! oggdemux \! vorbisdec \!
audioconvert \! audioresample \! osssink**
Play an Ogg Vorbis format file
**gst-launch gnomevfssrc location=music.mp3 \! mad \! osssink
gst-launch gnomevfssrc location=<http://domain.com/music.mp3> \! mad \!
audioconvert \! audioresample \! osssink**
Play an mp3 file or an http stream using GNOME-VFS
**gst-launch gnomevfssrc location=<smb://computer/music.mp3> \! mad \!
audioconvert \! audioresample \! osssink**
Use GNOME-VFS to play an mp3 file located on an SMB server
**Format conversion**
**gst-launch filesrc location=music.mp3 \! mad \! audioconvert \!
vorbisenc \! oggmux \! filesink location=music.ogg**
Convert an mp3 music file to an Ogg Vorbis file
**gst-launch filesrc location=music.mp3 \! mad \! audioconvert \!
flacenc \! filesink location=test.flac**
Convert to the FLAC format
**Other**
**gst-launch filesrc location=music.wav \! wavparse \! audioconvert \!
audioresample \! osssink**
Plays a .WAV file that contains raw audio data (PCM).
**gst-launch filesrc location=music.wav \! wavparse \! audioconvert \!
vorbisenc \! oggmux \! filesink location=music.ogg
gst-launch filesrc location=music.wav \! wavparse \! audioconvert \!
lame \! filesink location=music.mp3**
Convert a .WAV file containing raw audio data into an Ogg Vorbis or mp3
file
**gst-launch cdparanoiasrc mode=continuous \! audioconvert \! lame \!
id3v2mux \! filesink location=cd.mp3**
rips all tracks from compact disc and convert them into a single mp3
file
**gst-launch cdparanoiasrc track=5 \! audioconvert \! lame \! id3v2mux
\! filesink location=track5.mp3**
rips track 5 from the CD and converts it into a single mp3 file
Using **gst-inspect**(1), it is possible to discover settings like the
@ -243,29 +243,29 @@ you, e.g.: **gst-launch cdda://5 \! lame vbr=new vbr-quality=6 \!
filesink location=track5.mp3**
**gst-launch osssrc \! audioconvert \! vorbisenc \! oggmux \! filesink
location=input.ogg**
records sound from your audio input and encodes it into an ogg file
**Video**
**gst-launch filesrc location=JB\_FF9\_TheGravityOfLove.mpg \! dvddemux
\! mpeg2dec \! xvimagesink**
Display only the video portion of an MPEG-1 video file, outputting to an
X display window
**gst-launch filesrc location=/flflfj.vob \! dvddemux \! mpeg2dec \!
sdlvideosink**
Display the video portion of a .vob file (used on DVDs), outputting to
an SDL window
**gst-launch filesrc location=movie.mpg \! dvddemux name=demuxer
demuxer. \! queue \! mpeg2dec \! sdlvideosink demuxer. \! queue \! mad
\! audioconvert \! audioresample \! osssink**
Play both video and audio portions of an MPEG movie
**gst-launch filesrc location=movie.mpg \! mpegdemux name=demuxer
demuxer. \! queue \! mpeg2dec \! ffmpegcolorspace \! sdlvideosink
demuxer. \! queue \! mad \! audioconvert \! audioresample \! osssink**
Play an AVI movie with an external text subtitle stream
This example also shows how to refer to specific pads by name if an
@ -288,25 +288,25 @@ Stream video using RTP and network elements.
**gst-launch v4l2src \!
video/x-raw-yuv,width=128,height=96,format='(fourcc)'UYVY \!
ffmpegcolorspace \! ffenc\_h263 \! video/x-h263 \! rtph263ppay pt=96 \!
udpsink host=192.168.1.1 port=5000 sync=false**
Use this command on the receiver
**gst-launch udpsrc port=5000 \! application/x-rtp,
clock-rate=90000,payload=96 \! rtph263pdepay queue-delay=0 \!
ffdec\_h263 \! xvimagesink**
This command would be run on the transmitter
**Diagnostic**
**gst-launch -v fakesrc num-buffers=16 \! fakesink**
Generate a null stream and ignore it (and print out details).
**gst-launch audiotestsrc \! audioconvert \! audioresample \!
osssink**
Generate a pure sine tone to test the audio output
**gst-launch videotestsrc \! xvimagesink
gst-launch videotestsrc \! ximagesink**
Generate a familiar test pattern to test the video output
**Automatic linking**
You can use the decodebin element to automatically select the right
elements to get a working pipeline.
**gst-launch filesrc location=musicfile \! decodebin \! audioconvert \!
audioresample \! osssink**
Play any supported audio format
**gst-launch filesrc location=videofile \! decodebin name=decoder
decoder. \! queue \! audioconvert \! audioresample \! osssink decoder.
\! ffmpegcolorspace \! xvimagesink**
Play any supported video format with video and audio output. Threads are
used automatically. To make this even easier, you can use the playbin
element:
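As a hedged sketch (the file path is a placeholder, and on the 0.10-series SDK the element may be named `playbin2` instead of `playbin`):

```
gst-launch playbin uri=file:///path/to/song.ogg
```

playbin constructs the complete decoding pipeline internally, so no explicit demuxer, decoder or sink elements need to be specified.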
These examples show you how to use filtered caps.
**gst-launch videotestsrc \!
'video/x-raw-yuv,format=(fourcc)YUY2;video/x-raw-yuv,format=(fourcc)YV12'
\! xvimagesink**
Show a test image and use the YUY2 or YV12 video format for this.
**gst-launch osssrc \!
'audio/x-raw-int,rate=\[32000,64000\],width=\[16,32\],depth={16,24,32},signed=(boolean)true'
\! wavenc \! filesink location=recording.wav**
Record audio and write it to a .wav file, forcing signed 16 to 32
bit samples and a sample rate between 32 kHz and 64 kHz.
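A variant of the same idea, pinning the caps to one exact format instead of ranges (a sketch with hypothetical fixed values; negotiation will fail, rather than silently convert, if the source cannot produce them):

```
gst-launch osssrc ! 'audio/x-raw-int,rate=44100,width=16,depth=16,signed=(boolean)true' ! wavenc ! filesink location=recording.wav
```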
let it core dump). Then get a stack trace in the usual way
Document generated by Confluence on Oct 08, 2015 10:28

# iOS tutorial 1: Link against GStreamer
This page last changed on May 06, 2013 by xartigas.
The UI uses storyboards and contains a single `View` with a centered
**ViewController.h**
```
#import <UIKit/UIKit.h>
 
@interface ViewController : UIViewController {
**GStreamerBackend.m**
```
#import "GStreamerBackend.h"
#include <gst/gst.h>
GStreamer version to display at the label. That's it\!
**ViewController.m**
```
#import "ViewController.h"
#import "GStreamerBackend.h"
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
gst_backend = [[GStreamerBackend alloc] init];
label.text = [NSString stringWithFormat:@"Welcome to %@!", [gst_backend getGStreamerVersion]];
}
It has been a pleasure having you here, and see you soon\!
![](images/icons/bullet_blue.gif)
[ios-tutorial1-screenshot.png](attachments/3014792/3113602.png)
(image/png)
![](images/icons/bullet_blue.gif)
[ios-tutorial1-screenshot.png](attachments/3014792/3113603.png)
(image/png)
![](images/icons/bullet_blue.gif)
[ios-tutorial1-screenshot.png](attachments/3014792/3113601.png)
(image/png)
# iOS tutorial 2: A running pipeline
This page last changed on May 13, 2013 by xartigas.
**ViewController.m**
```
#import "ViewController.h"
#import "GStreamerBackend.h"
#import <UIKit/UIKit.h>
- (void)viewDidLoad
{
[super viewDidLoad];
play_button.enabled = FALSE;
pause_button.enabled = FALSE;
An instance of the `GStreamerBackend` is stored inside the class:
```
@interface ViewController () {
GStreamerBackend *gst_backend;
}
This instance is created in the `viewDidLoad` function through a custom
`init:` method in the `GStreamerBackend`:
```
- (void)viewDidLoad
{
[super viewDidLoad];
play_button.enabled = FALSE;
pause_button.enabled = FALSE;
The Play and Pause buttons are also disabled in the
`viewDidLoad` function, and they are not re-enabled until the
`GStreamerBackend` reports that it is initialized and ready.
```
/* Called when the Play button is pressed */
-(IBAction) play:(id)sender
{
These two methods are called when the user presses the Play or Pause
buttons, and simply forward the call to the appropriate method in the
`GStreamerBackend`.
```
-(void) gstreamerInitialized
{
dispatch_async(dispatch_get_main_queue(), ^{
the
[dispatch\_async()](https://developer.apple.com/library/mac/documentation/Darwin/Reference/ManPages/man3/dispatch_async.3.html) call
wrapping all UI code.
```
-(void) gstreamerSetUIMessage:(NSString *)message
{
dispatch_async(dispatch_get_main_queue(), ^{
the `GStreamerBackendDelegate` protocol:
**GStreamerBackend.m**
```
#import "GStreamerBackend.h"
#include <gst/gst.h>
static void error_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self)
GError *err;
gchar *debug_info;
gchar *message_string;
gst_message_parse_error (msg, &err, &debug_info);
message_string = g_strdup_printf ("Error received from element %s: %s", GST_OBJECT_NAME (msg->src), err->message);
g_clear_error (&err);
/* Create our own GLib Main Context and make it the default one */
context = g_main_context_new ();
g_main_context_push_thread_default(context);
/* Build pipeline */
pipeline = gst_parse_launch("audiotestsrc ! audioconvert ! audioresample ! autoaudiosink", &error);
if (error) {
g_free (message);
return;
}
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
bus = gst_element_get_bus (pipeline);
bus_source = gst_bus_create_watch (bus);
g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, (__bridge void *)self);
g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, (__bridge void *)self);
gst_object_unref (bus);
/* Create a GLib Main Loop and set it to run */
GST_DEBUG ("Entering main loop...");
main_loop = g_main_loop_new (context, FALSE);
GST_DEBUG ("Exited main loop");
g_main_loop_unref (main_loop);
main_loop = NULL;
/* Free resources */
g_main_context_pop_thread_default(context);
g_main_context_unref (context);
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (pipeline);
return;
}
#### Interface methods:
```
-(id) init:(id) uiDelegate
{
if (self = [super init])
warns the application when interesting things happen.
threshold, so we can see the debug output from within Xcode and keep
track of our application progress.
```
-(void) dealloc
{
if (pipeline) {
The `dealloc` method takes care of bringing the pipeline to the NULL
state and releasing it.
```
-(void) play
{
if(gst_element_set_state(pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
desired state and warn the application if something fails.
#### Private methods:
```
/* Change the message on the UI through the UI delegate */
-(void)setUIMessage:(gchar*) message
{
implementation of this method is marked as `@optional`, and hence the
check for its existence in the delegate with `respondsToSelector:`
```
/* Retrieve errors from the bus and show them on the UI */
static void error_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self)
{
GError *err;
gchar *debug_info;
gchar *message_string;
gst_message_parse_error (msg, &err, &debug_info);
message_string = g_strdup_printf ("Error received from element %s: %s", GST_OBJECT_NAME (msg->src), err->message);
g_clear_error (&err);
through the `userdata` pointer of the callbacks (the `self` pointer in
these implementations). This is discussed below when registering the
callbacks in the `app_function`.
```
/* Check if all conditions are met to report GStreamer as initialized.
* These conditions will change depending on the application */
-(void) check_initialization_complete
It exists with almost identical content in the Android tutorial, which
exemplifies how the same code can run on both platforms with little
change.
```
/* Create our own GLib Main Context and make it the default one */
context = g_main_context_new ();
g_main_context_push_thread_default(context);
libraries which might not have been properly disposed of. A new context
is created with `g_main_context_new()` and then it is made the default
one for the thread with `g_main_context_push_thread_default()`.
```
/* Build pipeline */
pipeline = gst_parse_launch("audiotestsrc ! audioconvert ! audioresample ! autoaudiosink", &error);
if (error) {
this case, it is simply an `audiotestsrc` (which produces a continuous
tone) and an `autoaudiosink`, with accompanying adapter
elements.
```
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
bus = gst_element_get_bus (pipeline);
bus_source = gst_bus_create_watch (bus);
because it travels through C-land untouched. It re-emerges at the
different callbacks through the userdata pointer and cast again to a
`GStreamerBackend *`.
```
/* Create a GLib Main Loop and set it to run */
GST_DEBUG ("Entering main loop...");
main_loop = g_main_loop_new (context, FALSE);
It has been a pleasure having you here, and see you soon\!
![](images/icons/bullet_blue.gif)
[ios-tutorial2-screenshot.png](attachments/3571718/3538954.png)
(image/png)
![](images/icons/bullet_blue.gif)
[ios-tutorial2-screenshot.png](attachments/3571718/3538953.png)
(image/png)
# iOS tutorial 3: Video
This page last changed on May 13, 2013 by xartigas.
**ViewController.h**
```
#import <UIKit/UIKit.h>
#import "GStreamerBackendDelegate.h"
**ViewController.m**
```
#import "ViewController.h"
#import "GStreamerBackend.h"
#import <UIKit/UIKit.h>
- (void)viewDidLoad
{
[super viewDidLoad];
play_button.enabled = FALSE;
pause_button.enabled = FALSE;
/* Make these constant for now, later tutorials will change them */
media_width = 320;
media_height = 240;
We expand the class to remember the width and height of the media we are
We expand the class to remember the width and height of the media we are
currently playing:
```
@interface ViewController () {
GStreamerBackend *gst_backend;
int media_width;
In later tutorials this data is retrieved from the GStreamer pipeline,
but in this tutorial, for simplicity's sake, the width and height of the
media is constant and initialized in `viewDidLoad`:
```
- (void)viewDidLoad
{
[super viewDidLoad];
play_button.enabled = FALSE;
pause_button.enabled = FALSE;
/* Make these constant for now, later tutorials will change them */
media_width = 320;
media_height = 240;
The rest of the `ViewController` code is the same as the previous
tutorial, except for the code that adapts the `video_view` size to the
media size, respecting its aspect ratio:
```
- (void)viewDidLayoutSubviews
{
CGFloat view_width = video_container_view.bounds.size.width;
the `GStreamerBackendDelegate` protocol:
**GStreamerBackend.m**
```
#import "GStreamerBackend.h"
#include <gst/gst.h>
static void error_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self)
GError *err;
gchar *debug_info;
gchar *message_string;
gst_message_parse_error (msg, &err, &debug_info);
message_string = g_strdup_printf ("Error received from element %s: %s", GST_OBJECT_NAME (msg->src), err->message);
g_clear_error (&err);
/* Create our own GLib Main Context and make it the default one */
context = g_main_context_new ();
g_main_context_push_thread_default(context);
/* Build pipeline */
pipeline = gst_parse_launch("videotestsrc ! warptv ! ffmpegcolorspace ! autovideosink", &error);
if (error) {
/* Set the pipeline to READY, so it can already accept a window handle */
gst_element_set_state(pipeline, GST_STATE_READY);
video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_X_OVERLAY);
if (!video_sink) {
GST_ERROR ("Could not retrieve video sink");
g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, (__bridge void *)self);
g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, (__bridge void *)self);
gst_object_unref (bus);
/* Create a GLib Main Loop and set it to run */
GST_DEBUG ("Entering main loop...");
main_loop = g_main_loop_new (context, FALSE);
GST_DEBUG ("Exited main loop");
g_main_loop_unref (main_loop);
main_loop = NULL;
/* Free resources */
g_main_context_pop_thread_default(context);
g_main_context_unref (context);
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (pipeline);
return;
}
The main differences with the previous tutorial are related to the
handling of the `XOverlay` interface:
```
@implementation GStreamerBackend {
id ui_delegate; /* Class that we use to interact with the user interface */
GstElement *pipeline; /* The running pipeline */
The class is expanded to keep track of the video sink element in the
pipeline and the `UIView *` onto which rendering is to occur.
```
-(id) init:(id) uiDelegate videoView:(UIView *)video_view
{
if (self = [super init])
The constructor accepts the `UIView *` as a new parameter, which, at
this point, is simply remembered in `ui_video_view`.
```
/* Build pipeline */
pipeline = gst_parse_launch("videotestsrc ! warptv ! ffmpegcolorspace ! autovideosink", &error);
```
chooses the appropriate sink for the platform (currently,
`eglglessink` is the only option for
iOS).
```
/* Set the pipeline to READY, so it can already accept a window handle */
gst_element_set_state(pipeline, GST_STATE_READY);
To this end, we create the `EaglUIView` class, derived from
**EaglUIView.m**
```
#import "EaglUIVIew.h"
#import <QuartzCore/QuartzCore.h>
@implementation EaglUIView
+ (Class) layerClass
{
return [CAEAGLLayer class];
It has been a pleasure having you here, and see you soon\!
![](images/icons/bullet_blue.gif)
[ios-tutorial3-screenshot.png](attachments/3571736/3538955.png)
(image/png)
# iOS tutorial 4: A basic media player
This page last changed on May 21, 2013 by xartigas.
**VideoViewController.h**
```
#import <UIKit/UIKit.h>
#import "GStreamerBackendDelegate.h"
**VideoViewController.m**
```
#import "VideoViewController.h"
#import "GStreamerBackend.h"
#import <UIKit/UIKit.h>
- (void)viewDidLoad
{
[super viewDidLoad];
play_button.enabled = FALSE;
pause_button.enabled = FALSE;
/* As soon as the GStreamer backend knows the real values, these ones will be replaced */
media_width = 320;
media_height = 240;
because we will not offer the same functionalities. We keep track of
this in the `is_local_media` variable, which is set when the URI is set,
in the `gstreamerInitialized` method:
```
-(void) gstreamerInitialized
{
dispatch_async(dispatch_get_main_queue(), ^{
Every time the size of the media changes (which could happen mid-stream,
for some kind of streams), or when it is first detected,
`GStreamerBackend`  calls our `mediaSizeChanged()` callback:
```
-(void) mediaSizeChanged:(NSInteger)width height:(NSInteger)height
{
media_width = width;
call our `setCurrentPosition` method so we can update the position of
the thumb in the Seek Bar. Again we do so from the UI thread, using
`dispatch_async()`.
```
-(void) setCurrentPosition:(NSInteger)position duration:(NSInteger)duration
{
    /* Ignore messages from the pipeline if the time slider is being dragged */
which we will use to display the current position and duration in
takes care of it, and must be called every time the Seek Bar is
updated:
```
/* The text widget acts as a slave for the seek bar, so it reflects what the seek bar shows, whether
* it is an actual pipeline position or the position the user is currently dragging to. */
- (void) updateTimeWidget
outlets are connected. We will be notified when the user starts dragging
the Slider, when the Slider position changes and when the user releases
the Slider.
```
/* Called when the user starts to drag the time slider */
- (IBAction)sliderTouchDown:(id)sender {
[gst_backend pause];
do not want it to keep moving. We also mark that a drag operation is in
progress in the
`dragging_slider` variable.
```
/* Called when the time slider position has changed, either because the user dragged it or
* we programmatically changed its position. dragging_slider tells us which one happened */
- (IBAction)sliderValueChanged:(id)sender {
Otherwise, the seek operation will be performed when the thumb is
released, and the only thing we do here is update the textual time
widget.
```
/* Called when the user stops dragging the time slider */
- (IBAction)sliderTouchUp:(id)sender {
dragging_slider = NO;
**GStreamerBackend.m**
```
#import "GStreamerBackend.h"
#include <gst/gst.h>
static void error_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self)
GError *err;
gchar *debug_info;
gchar *message_string;
gst_message_parse_error (msg, &err, &debug_info);
message_string = g_strdup_printf ("Error received from element %s: %s", GST_OBJECT_NAME (msg->src), err->message);
g_clear_error (&err);
/* Create our own GLib Main Context and make it the default one */
context = g_main_context_new ();
g_main_context_push_thread_default(context);
/* Build pipeline */
pipeline = gst_parse_launch("playbin2", &error);
if (error) {
/* Set the pipeline to READY, so it can already accept a window handle */
gst_element_set_state(pipeline, GST_STATE_READY);
video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_X_OVERLAY);
if (!video_sink) {
GST_ERROR ("Could not retrieve video sink");
GST_DEBUG ("Exited main loop");
g_main_loop_unref (main_loop);
main_loop = NULL;
/* Free resources */
g_main_context_pop_thread_default(context);
g_main_context_unref (context);
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (pipeline);
pipeline = NULL;
ui_delegate = NULL;
ui_video_view = NULL;
The UI code will call `setUri` whenever it wants to change the playing
URI (in this tutorial the URI never changes, but it does in the next
one):
```
-(void) setUri:(NSString*)uri
{
const char *char_uri = [uri UTF8String];
do not. Therefore, in the READY to PAUSED state change, once the Caps of
the decoded media are known, we inspect them
in `check_media_size()`:
```
/* Retrieve the video sink's Caps and tell the application about the media size */
static void check_media_size (GStreamerBackend *self) {
GstElement *video_sink;
To keep the UI updated, a GLib timer is installed in
the `app_function` that fires 4 times per second (or every 250ms),
right before entering the main loop:
```
/* Register a function that GLib will call 4 times per second */
timeout_source = g_timeout_source_new (250);
g_source_set_callback (timeout_source, (GSourceFunc)refresh_ui, (__bridge void *)self, NULL);
Then, in the refresh\_ui
method:
```
/* If we have pipeline and it is running, query the current position and clip duration and inform
* the application */
static gboolean refresh_ui (GStreamerBackend *self) {
see how to overcome these problems.
In `setPosition`:
```
-(void) setPosition:(NSInteger)milliseconds
{
gint64 position = (gint64)(milliseconds * GST_MSECOND);
away; otherwise, store the desired position in
the `desired_position` variable. Then, in
the `state_changed_cb()` callback:
```
if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED)
{
check_media_size(self);
once this period elapses.
To achieve this, all seek requests are routed through
the `execute_seek()` method:
```
/* Perform seek, if we are not too close to the previous seek. Otherwise, schedule the seek for
* some time in the future. */
static void execute_seek (gint64 position, GStreamerBackend *self) {
using buffering. The same procedure is used here, by listening to the
buffering
messages:
```
g_signal_connect (G_OBJECT (bus), "message::buffering", (GCallback)buffering_cb, (__bridge void *)self);
```
 
```
/* Called when buffering messages are received. We inform the UI about the current buffering level and
* keep the pipeline paused until 100% buffering is reached. At that point, set the desired state. */
static void buffering_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self) {
here into an acceptable iOS media player.
![](images/icons/bullet_blue.gif)
[ios-tutorial4-screenshot.png](attachments/3571758/3539044.png)
(image/png)
# iOS tutorial 5: A Complete media player
This page last changed on May 22, 2013 by
xartigas.
It has been a pleasure having you here, and see you soon\!
![](images/icons/bullet_blue.gif)
[ios-tutorial5-screenshot0.png](attachments/3571769/3539071.png)
(image/png)
![](images/icons/bullet_blue.gif)
[ios-tutorial5-screenshot1.png](attachments/3571769/3539046.png)
(image/png)
![](images/icons/bullet_blue.gif)
[ios-tutorial5-screenshot0.png](attachments/3571769/3539045.png)
(image/png)
# iOS tutorials
This page last changed on May 07, 2013 by xartigas.
All iOS tutorials are split into the following classes:
`GStreamerBackend`.