Fix comment block metadata

This commit is contained in:
Thibault Saunier 2016-05-26 22:48:36 -04:00
parent a230b60399
commit e8395b01f5
34 changed files with 308 additions and 308 deletions

View file

@ -26,7 +26,7 @@ makefile that allows GStreamer integration.
**src/com/gst\_sdk\_tutorials/tutorial\_1/Tutorial1.java**
``` theme: Default; brush: java; gutter: true
``` lang=java
package com.gst_sdk_tutorials.tutorial_1;
import android.app.Activity;
@ -70,7 +70,7 @@ public class Tutorial1 extends Activity {
Calls from Java to C happen through native methods, like the one
declared here:
``` first-line: 11; theme: Default; brush: java; gutter: true
``` lang=java
private native String nativeGetGStreamerInfo();
```
@ -82,7 +82,7 @@ shown later.
The first bit of code that actually gets executed is the static
initializer of the class:
``` first-line: 33; theme: Default; brush: java; gutter: true
``` lang=java
static {
System.loadLibrary("gstreamer_android");
System.loadLibrary("tutorial-1");
@ -99,7 +99,7 @@ expose. The GStreamer library only exposes an `init()` method, which
initializes GStreamer and registers all plugins (the tutorial library is
explained below).
``` first-line: 19; theme: Default; brush: java; gutter: true
``` lang=java
try {
GStreamer.init(this);
} catch (Exception e) {
@ -122,7 +122,7 @@ Should initialization fail, the `init()` method would throw an
[Exception](http://developer.android.com/reference/java/lang/Exception.html)
with the details provided by the GStreamer library.
``` first-line: 29; theme: Default; brush: java; gutter: true
``` lang=java
TextView tv = (TextView)findViewById(R.id.textview_info);
tv.setText("Welcome to " + nativeGetGStreamerInfo() + " !");
```
@ -139,7 +139,7 @@ code:
**jni/tutorial-1.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <string.h>
#include <jni.h>
#include <android/log.h>
@ -179,7 +179,7 @@ Machine (VM) loads a library.
Here, we retrieve the JNI environment needed to make calls that interact
with Java:
``` first-line: 21; theme: Default; brush: cpp; gutter: true
``` lang=c
JNIEnv *env = NULL;
if ((*vm)->GetEnv(vm, (void**) &env, JNI_VERSION_1_4) != JNI_OK) {
@ -192,7 +192,7 @@ And then locate the class containing the UI part of this tutorial using
`FindClass()`:
``` first-line: 27; theme: Default; brush: cpp; gutter: true
``` lang=c
jclass klass = (*env)->FindClass (env, "com/gst_sdk_tutorials/tutorial_1/Tutorial1");
```
@ -201,7 +201,7 @@ is, we provide the code for the methods we advertised in Java using the
**`native`** keyword:
``` first-line: 28; theme: Default; brush: cpp; gutter: true
``` lang=c
(*env)->RegisterNatives (env, klass, native_methods, G_N_ELEMENTS(native_methods));
```
@ -211,7 +211,7 @@ name, its [type
signature](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/types.html#wp276)
and a pointer to the C function implementing it:
``` first-line: 16; theme: Default; brush: cpp; gutter: true
``` lang=c
static JNINativeMethod native_methods[] = {
{ "nativeGetGStreamerInfo", "()Ljava/lang/String;", (void *) gst_native_get_gstreamer_info}
};
@ -220,7 +220,7 @@ static JNINativeMethod native_methods[] = {
The only native method used in this tutorial
is `nativeGetGStreamerInfo()`:
``` first-line: 9; theme: Default; brush: cpp; gutter: true
``` lang=c
jstring gst_native_get_gstreamer_info (JNIEnv* env, jobject thiz) {
char *version_utf8 = gst_version_string();
jstring *version_jstring = (*env)->NewStringUTF(env, version_utf8);
@ -241,7 +241,7 @@ must free the `char *` returned by `gst_version_string()`.
**jni/Android.mk**
``` theme: Default; brush: ruby; gutter: true
``` lang=ruby
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)

View file

@ -56,7 +56,7 @@ messages sent from the C code (for errors and state changes).
**src/com/gst\_sdk\_tutorials/tutorial\_2/Tutorial2.java**
``` theme: Default; brush: java; gutter: true
``` lang=java
package com.gst_sdk_tutorials.tutorial_2;
import android.app.Activity;
@ -181,7 +181,7 @@ public class Tutorial2 extends Activity {
As usual, the first bit that gets executed is the static initializer of
the class:
``` first-line: 113; theme: Default; brush: java; gutter: true
``` lang=java
static {
System.loadLibrary("gstreamer_android");
System.loadLibrary("tutorial-2");
@ -198,7 +198,7 @@ In the `onCreate()` method GStreamer is initialized as in the previous
tutorial with `GStreamer.init(this)`, and then the layout is inflated
and listeners are set up for the two UI buttons:
``` first-line: 41; theme: Default; brush: java; gutter: true
``` lang=java
ImageButton play = (ImageButton) this.findViewById(R.id.button_play);
play.setOnClickListener(new OnClickListener() {
public void onClick(View v) {
@ -224,7 +224,7 @@ and safer than tracking the actual pipeline state, because orientation
changes can happen before the pipeline has moved to the desired state,
for example.
``` first-line: 57; theme: Default; brush: java; gutter: true
``` lang=java
if (savedInstanceState != null) {
is_playing_desired = savedInstanceState.getBoolean("playing");
Log.i ("GStreamer", "Activity created. Saved state is playing:" + is_playing_desired);
@ -239,7 +239,7 @@ We will first build the GStreamer pipeline (below); only when the
native code reports itself as initialized will we use
`is_playing_desired`.
``` first-line: 69; theme: Default; brush: java; gutter: true
``` lang=java
nativeInit();
```
@ -252,7 +252,7 @@ This finishes the `onCreate()` method and the Java initialization. The
UI buttons are disabled, so nothing will happen until native code is
ready and `onGStreamerInitialized()` is called:
``` first-line: 94; theme: Default; brush: java; gutter: true
``` lang=java
private void onGStreamerInitialized () {
Log.i ("GStreamer", "Gst initialized. Restoring state, playing:" + is_playing_desired);
```
@ -261,7 +261,7 @@ This is called by the native code when its main loop is finally running.
We first retrieve the desired playing state from `is_playing_desired`,
and then set that state:
``` first-line: 96; theme: Default; brush: java; gutter: true
``` lang=java
// Restore previous playing state
if (is_playing_desired) {
nativePlay();
@ -272,7 +272,7 @@ if (is_playing_desired) {
Here comes the first caveat, when re-enabling the UI buttons:
``` first-line: 103; theme: Default; brush: java; gutter: true
``` lang=java
// Re-enable buttons, now that GStreamer is initialized
final Activity activity = this;
runOnUiThread(new Runnable() {
@ -300,7 +300,7 @@ The same problem exists when the native code wants to output a string in
our TextView using the `setMessage()` method: it has to be done from the
UI thread. The solution is the same:
``` first-line: 83; theme: Default; brush: java; gutter: true
``` lang=java
private void setMessage(final String message) {
final TextView tv = (TextView) this.findViewById(R.id.textview_message);
runOnUiThread (new Runnable() {
@ -313,7 +313,7 @@ private void setMessage(final String message) {
Finally, a few remaining bits:
``` first-line: 72; theme: Default; brush: java; gutter: true
``` lang=java
protected void onSaveInstanceState (Bundle outState) {
Log.d ("GStreamer", "Saving state, playing:" + is_playing_desired);
outState.putBoolean("playing", is_playing_desired);
@ -324,7 +324,7 @@ This method stores the currently desired playing state when Android is
about to shut us down, so next time it restarts (after an orientation
change, for example), it can restore the same state.
``` first-line: 77; theme: Default; brush: java; gutter: true
``` lang=java
protected void onDestroy() {
nativeFinalize();
super.onDestroy();
@ -341,7 +341,7 @@ This concludes the UI part of the tutorial.
**jni/tutorial-2.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <string.h>
#include <jni.h>
#include <android/log.h>
@ -624,7 +624,7 @@ the basic tutorials, and it is used to hold all our information in one
place, so we can easily pass it around to
callbacks:
``` first-line: 22; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
jobject app; /* Application instance, used to call its methods. A global reference is kept. */
@ -651,7 +651,7 @@ the `long` type used in Java is always 64 bits wide, but the pointer
used in C can be either 32 or 64 bits wide. The macros take care of the
conversion without warnings.
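The macros themselves are not shown in this excerpt; a minimal sketch of how they are typically defined (round-tripping the pointer through the Java `long` field, with casts through `gintptr` to silence 32/64-bit warnings) could look like this:
``` lang=c
/* Sketch of the helper macros described above. The exact definitions may
 * differ slightly in the tutorial sources. */
#define GET_CUSTOM_DATA(env, thiz, fieldID) \
  (CustomData *)(gintptr)(*env)->GetLongField (env, thiz, fieldID)
#define SET_CUSTOM_DATA(env, thiz, fieldID, data) \
  (*env)->SetLongField (env, thiz, fieldID, (jlong)(gintptr)data)
```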
``` first-line: 259; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Library initializer */
jint JNI_OnLoad(JavaVM *vm, void *reserved) {
JNIEnv *env = NULL;
@ -678,7 +678,7 @@ uses [pthread\_key\_create()](http://pubs.opengroup.org/onlinepubs/9699919799/f
to be able to store per-thread information, which is crucial to properly
manage the JNI Environment, as shown later.
``` first-line: 234; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Static class initializer: retrieve method and field IDs */
static jboolean gst_native_class_init (JNIEnv* env, jclass klass) {
custom_data_field_id = (*env)->GetFieldID (env, klass, "native_custom_data", "J");
@ -716,7 +716,7 @@ from Java:
This method is called at the end of Java's `onCreate()`.
``` first-line: 191; theme: Default; brush: cpp; gutter: true
``` lang=c
static void gst_native_init (JNIEnv* env, jobject thiz) {
CustomData *data = g_new0 (CustomData, 1);
SET_CUSTOM_DATA (env, thiz, custom_data_field_id, data);
@ -725,7 +725,7 @@ static void gst_native_init (JNIEnv* env, jobject thiz) {
It first allocates memory for the `CustomData` structure and passes the
pointer to the Java class with `SET_CUSTOM_DATA`, so it is remembered.
``` first-line: 197; theme: Default; brush: cpp; gutter: true
``` lang=c
data->app = (*env)->NewGlobalRef (env, thiz);
```
@ -734,7 +734,7 @@ in `CustomData` (a [Global
Reference](http://developer.android.com/guide/practices/jni.html#local_and_global_references)
is used) so its methods can be called later.
``` first-line: 199; theme: Default; brush: cpp; gutter: true
``` lang=c
pthread_create (&gst_app_thread, NULL, &app_function, data);
```
@ -743,7 +743,7 @@ Finally, a thread is created and it starts running the
### `app_function()`
``` first-line: 134; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Main method for the native code. This is executed on its own thread. */
static void *app_function (void *userdata) {
JavaVMAttachArgs args;
@ -766,7 +766,7 @@ is created with `g_main_context_new()` and then it is made the default
one for the thread with
`g_main_context_push_thread_default()`.
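Those two calls are cut off in this excerpt; a sketch of that step, using the functions named above:
``` lang=c
/* Create our own GLib Main Context and make it the default one
 * for this thread */
data->context = g_main_context_new ();
g_main_context_push_thread_default (data->context);
```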
``` first-line: 149; theme: Default; brush: cpp; gutter: true
``` lang=c
data->pipeline = gst_parse_launch("audiotestsrc ! audioconvert ! audioresample ! autoaudiosink", &error);
if (error) {
gchar *message = g_strdup_printf("Unable to build pipeline: %s", error->message);
@ -781,7 +781,7 @@ It then creates a pipeline the easy way, with `gst_parse_launch()`. In
this case, it is simply an `audiotestsrc` (which produces a continuous
tone) and an `autoaudiosink`, with accompanying adapter elements.
``` first-line: 159; theme: Default; brush: cpp; gutter: true
``` lang=c
bus = gst_element_get_bus (data->pipeline);
bus_source = gst_bus_create_watch (bus);
g_source_set_callback (bus_source, (GSourceFunc) gst_bus_async_signal_func, NULL, NULL);
@ -798,7 +798,7 @@ creation of the watch is done step by step instead of using
`gst_bus_add_signal_watch()` to exemplify how to use a custom GLib
context.
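The remaining steps of that manual setup are trimmed from the snippet above; they attach the source to the custom context and connect the per-message signals. A sketch, where `error_cb` and `state_changed_cb` stand for the error and state-changed handlers defined elsewhere in the tutorial (names assumed here):
``` lang=c
/* Attach the bus watch to our custom context instead of the default one */
g_source_attach (bus_source, data->context);
g_source_unref (bus_source);
/* Dispatch error and state-changed messages to our callbacks */
g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, data);
g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, data);
gst_object_unref (bus);
```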
``` first-line: 169; theme: Default; brush: cpp; gutter: true
``` lang=c
GST_DEBUG ("Entering main loop... (CustomData:%p)", data);
data->main_loop = g_main_loop_new (data->context, FALSE);
check_initialization_complete (data);
@ -822,7 +822,7 @@ Once the main loop has quit, all resources are freed in lines 178 to
### `check_initialization_complete()`
``` first-line: 121; theme: Default; brush: cpp; gutter: true
``` lang=c
static void check_initialization_complete (CustomData *data) {
JNIEnv *env = get_jni_env ();
if (!data->initialized && data->main_loop) {
@ -866,7 +866,7 @@ see how it works, step by step:
### `get_jni_env()`
``` first-line: 68; theme: Default; brush: cpp; gutter: true
``` lang=c
static JNIEnv *get_jni_env (void) {
JNIEnv *env;
if ((env = pthread_getspecific (current_jni_env)) == NULL) {
@ -903,7 +903,7 @@ Let's now review the rest of the native methods accessible from Java:
### `gst_native_finalize()` (`nativeFinalize()` from Java)
``` first-line: 203; theme: Default; brush: cpp; gutter: true
``` lang=c
static void gst_native_finalize (JNIEnv* env, jobject thiz) {
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
if (!data) return;
@ -952,7 +952,7 @@ error or state changed message and display a message in the UI using the
### `set_ui_message()`
``` first-line: 80; theme: Default; brush: cpp; gutter: true
``` lang=c
static void set_ui_message (const gchar *message, CustomData *data) {
JNIEnv *env = get_jni_env ();
GST_DEBUG ("Setting message to: %s", message);
@ -997,7 +997,7 @@ method and free the UTF16 message with
**jni/Android.mk**
``` theme: Default; brush: ruby; gutter: true
``` lang=ruby
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)

View file

@ -38,7 +38,7 @@ until a main loop is running and a drawing surface has been received.
**src/com/gst\_sdk\_tutorials/tutorial\_3/Tutorial3.java**
``` theme: Default; brush: java; gutter: true
``` lang=java
package com.gst_sdk_tutorials.tutorial_3;
import android.app.Activity;
@ -189,7 +189,7 @@ surface to the layout and changing the GStreamer pipeline to produce
video instead of audio. Only the parts of the code that are new will be
discussed.
``` first-line: 22; theme: Default; brush: java; gutter: true
``` lang=java
private native void nativeSurfaceInit(Object surface);
private native void nativeSurfaceFinalize();
```
@ -199,7 +199,7 @@ Two new entry points to the C code are defined,
when the video surface becomes available and when it is about to be
destroyed, respectively.
``` first-line: 61; theme: Default; brush: java; gutter: true
``` lang=java
SurfaceView sv = (SurfaceView) this.findViewById(R.id.surface_video);
SurfaceHolder sh = sv.getHolder();
sh.addCallback(this);
@ -214,7 +214,7 @@ interface. This is why we declared this Activity as implementing the
[SurfaceHolder.Callback](http://developer.android.com/reference/android/view/SurfaceHolder.Callback.html)
interface in line 16.
``` first-line: 127; theme: Default; brush: java; gutter: true
``` lang=java
public void surfaceChanged(SurfaceHolder holder, int format, int width,
int height) {
Log.d("GStreamer", "Surface changed to format " + format + " width "
@ -245,7 +245,7 @@ Let's review the C code to see what these functions do.
**jni/tutorial-3.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <string.h>
#include <jni.h>
#include <android/log.h>
@ -589,7 +589,7 @@ First, our `CustomData` structure is augmented to keep a pointer to the
video sink element and the native window
handle:
``` first-line: 33; theme: Default; brush: cpp; gutter: true
``` lang=c
GstElement *video_sink; /* The video sink element which receives XOverlay commands */
ANativeWindow *native_window; /* The Android native window where video will be rendered */
```
@ -598,7 +598,7 @@ The `check_initialization_complete()` method is also augmented so that
it requires a native window before considering GStreamer to be
initialized:
``` first-line: 127; theme: Default; brush: cpp; gutter: true
``` lang=c
static void check_initialization_complete (CustomData *data) {
JNIEnv *env = get_jni_env ();
if (!data->initialized && data->native_window && data->main_loop) {
@ -627,14 +627,14 @@ effects in the `GSTREAMER_PLUGINS_EFFECTS` package), and an
`autovideosink` which will instantiate the appropriate video sink for the
platform:
``` first-line: 159; theme: Default; brush: cpp; gutter: true
``` lang=c
data->pipeline = gst_parse_launch("videotestsrc ! warptv ! ffmpegcolorspace ! autovideosink ", &error);
```
Here things start to get more
interesting:
``` first-line: 168; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Set the pipeline to READY, so it can already accept a window handle, if we have one */
gst_element_set_state(data->pipeline, GST_STATE_READY);
@ -662,7 +662,7 @@ Now we will implement the two native functions called by the Java code
when the drawing surface becomes available or is about to be
destroyed:
``` first-line: 270; theme: Default; brush: cpp; gutter: true
``` lang=c
static void gst_native_surface_init (JNIEnv *env, jobject thiz, jobject surface) {
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
if (!data) return;
@ -719,7 +719,7 @@ We finally store the new window handle and call
`check_initialization_complete()` to inform the Java code that
everything is set up, if that is the case.
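The body of `gst_native_surface_init()` is cut off above; the part this paragraph refers to would look roughly like this (a simplified sketch using the NDK's `ANativeWindow` API):
``` lang=c
/* Obtain a native window from the Java surface and remember it */
data->native_window = ANativeWindow_fromSurface (env, surface);
/* Notify Java if everything is now ready */
check_initialization_complete (data);
```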
``` first-line: 295; theme: Default; brush: cpp; gutter: true
``` lang=c
static void gst_native_surface_finalize (JNIEnv *env, jobject thiz) {
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
if (!data) return;
@ -772,7 +772,7 @@ surface.
**src/com/gst\_sdk\_tutorials/tutorial\_3/GStreamerSurfaceView.java**
``` theme: Default; brush: java; gutter: true
``` lang=java
package com.gst_sdk_tutorials.tutorial_3;
import android.content.Context;
@ -864,7 +864,7 @@ public class GStreamerSurfaceView extends SurfaceView {
**/jni/Android.mk**
``` theme: Default; brush: ruby; gutter: true
``` lang=ruby
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)

View file

@ -49,7 +49,7 @@ this view is collapsed by default. Click here to expand…
**src/com/gst\_sdk\_tutorials/tutorial\_4/Tutorial4.java**
``` theme: Default; brush: java; gutter: true
``` lang=java
package com.gst_sdk_tutorials.tutorial_4;
import java.text.SimpleDateFormat;
@ -313,7 +313,7 @@ offer the same functionalities. We keep track of this in the
`is_local_media` variable, and update it every time we change the media
URI:
``` first-line: 132; theme: Default; brush: java; gutter: true
``` lang=java
private void setMediaUri() {
nativeSetUri (mediaUri);
is_local_media = mediaUri.startsWith("file://");
@ -329,7 +329,7 @@ Every time the size of the media changes (which could happen mid-stream,
for some kinds of streams), or when it is first detected, the C code calls
our `onMediaSizeChanged()` callback:
``` first-line: 217; theme: Default; brush: java; gutter: true
``` lang=java
private void onMediaSizeChanged (int width, int height) {
Log.i ("GStreamer", "Media size changed to " + width + "x" + height);
final GStreamerSurfaceView gsv = (GStreamerSurfaceView) this.findViewById(R.id.surface_video);
@ -371,7 +371,7 @@ To realize the first function, C code will periodically call our
in the Seek Bar. Again we do so from the UI thread, using
`runOnUiThread()`.
``` first-line: 176; theme: Default; brush: java; gutter: true
``` lang=java
private void setCurrentPosition(final int position, final int duration) {
final SeekBar sb = (SeekBar) this.findViewById(R.id.seek_bar);
@ -397,7 +397,7 @@ widget which we will use to display the current position and duration in
`HH:mm:ss / HH:mm:ss` textual format. The `updateTimeWidget()` method
takes care of it, and must be called every time the Seek Bar is updated:
``` first-line: 164; theme: Default; brush: java; gutter: true
``` lang=java
private void updateTimeWidget () {
final TextView tv = (TextView) this.findViewById(R.id.textview_time);
final SeekBar sb = (SeekBar) this.findViewById(R.id.seek_bar);
@ -419,7 +419,7 @@ the user to seek by dragging the thumb), we implement the
interface in the
Activity:
``` first-line: 22; theme: Default; brush: java; gutter: true
``` lang=java
public class Tutorial4 extends Activity implements SurfaceHolder.Callback, OnSeekBarChangeListener {
```
@ -427,7 +427,7 @@ And we register the Activity as the listener for the [Seek
Bar](http://developer.android.com/reference/android/widget/SeekBar.html)'s
events in the `onCreate()` method:
``` first-line: 80; theme: Default; brush: java; gutter: true
``` lang=java
SeekBar sb = (SeekBar) this.findViewById(R.id.seek_bar);
sb.setOnSeekBarChangeListener(this);
```
@ -436,7 +436,7 @@ We will now be notified of three events: when the user starts dragging
the thumb, every time the thumb moves, and when the thumb is released by
the user:
``` first-line: 239; theme: Default; brush: java; gutter: true
``` lang=java
public void onStartTrackingTouch(SeekBar sb) {
nativePause();
} 
@ -448,7 +448,7 @@ pause the pipeline. If the user is searching for a particular scene, we
do not want it to keep
moving.
``` first-line: 230; theme: Default; brush: java; gutter: true
``` lang=java
public void onProgressChanged(SeekBar sb, int progress, boolean fromUser) {
if (fromUser == false) return;
desired_position = progress;
@ -468,7 +468,7 @@ that is, we jump to the indicated position as soon as the thumb moves.
Otherwise, the seek will be performed when the thumb is released, and
the only thing we do here is update the textual time widget.
``` first-line: 244; theme: Default; brush: java; gutter: true
``` lang=java
public void onStopTrackingTouch(SeekBar sb) {
// If this is a remote file, scrub seeking is probably not going to work smoothly enough.
// Therefore, perform only the seek when the slider is released.
@ -492,7 +492,7 @@ this view is collapsed by default. Click here to expand…
**jni/tutorial-4.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <string.h>
#include <jni.h>
#include <android/log.h>
@ -1068,7 +1068,7 @@ jint JNI_OnLoad(JavaVM *vm, void *reserved) {
Java code will call `gst_native_set_uri()` whenever it wants to change
the playing URI (in this tutorial the URI never changes, but it could):
``` first-line: 436; theme: Default; brush: cpp; gutter: true
``` lang=c
void gst_native_set_uri (JNIEnv* env, jobject thiz, jstring uri) {
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
if (!data || !data->pipeline) return;
@ -1116,7 +1116,7 @@ change during playback. For simplicity, this tutorial assumes that they
do not. Therefore, in the READY to PAUSED state change, once the Caps of
the decoded media are known, we inspect them in `check_media_size()`:
``` first-line: 252; theme: Default; brush: cpp; gutter: true
``` lang=c
static void check_media_size (CustomData *data) {
JNIEnv *env = get_jni_env ();
GstElement *video_sink;
@ -1167,7 +1167,7 @@ To keep the UI updated, a GLib timer is installed in the
`app_function()` that fires 4 times per second (or every 250ms), right
before entering the main loop:
``` first-line: 377; theme: Default; brush: cpp; gutter: true
``` lang=c
timeout_source = g_timeout_source_new (250);
g_source_set_callback (timeout_source, (GSourceFunc)refresh_ui, data, NULL);
g_source_attach (timeout_source, data->context);
@ -1176,7 +1176,7 @@ g_source_unref (timeout_source); 
Then, in the `refresh_ui()` method:
``` first-line: 126; theme: Default; brush: cpp; gutter: true
``` lang=c
static gboolean refresh_ui (CustomData *data) {
GstFormat fmt = GST_FORMAT_TIME;
gint64 current = -1;
@ -1230,7 +1230,7 @@ see how to overcome these problems.
In
`gst_native_set_position()`:
``` first-line: 468; theme: Default; brush: cpp; gutter: true
``` lang=c
void gst_native_set_position (JNIEnv* env, jobject thiz, int milliseconds) {
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
if (!data) return;
@ -1249,7 +1249,7 @@ away; otherwise, store the desired position in the
`desired_position` variable. Then, in the
`state_changed_cb()` callback:
``` first-line: 297; theme: Default; brush: cpp; gutter: true
``` lang=c
if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED) {
/* By now the sink already knows the media size */
check_media_size(data);
@ -1286,7 +1286,7 @@ once this period elapses.
To achieve this, all seek requests are routed through the
`execute_seek()` method:
``` first-line: 154; theme: Default; brush: cpp; gutter: true
``` lang=c
static void execute_seek (gint64 desired_position, CustomData *data) {
gint64 diff;
@ -1355,7 +1355,7 @@ using buffering. The same procedure is used here, by listening to the
buffering
messages:
``` first-line: 372; theme: Default; brush: cpp; gutter: true
``` lang=c
g_signal_connect (G_OBJECT (bus), "message::buffering", (GCallback)buffering_cb, data);
```
@ -1363,7 +1363,7 @@ And pausing the pipeline until buffering is complete (unless this is a
live
source):
``` first-line: 224; theme: Default; brush: cpp; gutter: true
``` lang=c
static void buffering_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
gint percent;
@ -1396,7 +1396,7 @@ is `GSTREAMER_PLUGINS`:
**jni/Android.mk**
``` first-line: 19; theme: Default; brush: plain; gutter: true
```
GSTREAMER_PLUGINS := $(GSTREAMER_PLUGINS_CORE) $(GSTREAMER_PLUGINS_PLAYBACK) $(GSTREAMER_PLUGINS_CODECS) $(GSTREAMER_PLUGINS_NET) $(GSTREAMER_PLUGINS_SYS)
```

View file

@ -28,7 +28,7 @@ each file to expand.
**CMakeLists.txt**
``` theme: Default; brush: plain; gutter: true
```
project(qtgst-example-player)
find_package(QtGStreamer REQUIRED)
# automoc is now a built-in tool since CMake 2.8.6.
@ -52,7 +52,7 @@ target_link_libraries(player ${QTGSTREAMER_UI_LIBRARIES} ${QT_QTOPENGL_LIBRARIES
**main.cpp**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include "mediaapp.h"
#include <QtWidgets/QApplication>
#include <QGst/Init>
@ -73,7 +73,7 @@ int main(int argc, char *argv[])
**mediaapp.h**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#ifndef MEDIAAPP_H
#define MEDIAAPP_H
#include <QtCore/QTimer>
@ -126,7 +126,7 @@ private:
**mediaapp.cpp**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include "mediaapp.h"
#include "player.h"
#if (QT_VERSION >= QT_VERSION_CHECK(5, 0, 0))
@ -326,7 +326,7 @@ void MediaApp::createUI(QBoxLayout *appLayout)
**player.h**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#ifndef PLAYER_H
#define PLAYER_H
#include <QtCore/QTimer>
@ -374,7 +374,7 @@ private:
**player.cpp**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include "player.h"
#include <QtCore/QDir>
#include <QtCore/QUrl>
@ -555,7 +555,7 @@ We begin by looking at `main()`:
**main.cpp**
``` first-line: 4; theme: Default; brush: cpp; gutter: true
``` lang=c
int main(int argc, char *argv[])
{
QApplication app(argc, argv);
@ -584,7 +584,7 @@ the UI:
**MediaApp::MediaApp()**
``` first-line: 20; theme: Default; brush: cpp; gutter: true
``` lang=c
//create the player
m_player = new Player(this);
connect(m_player, SIGNAL(positionChanged()), this, SLOT(onPositionChanged()));
@ -596,7 +596,7 @@ line, if any:
**MediaApp::openFile()**
``` first-line: 43; theme: Default; brush: cpp; gutter: true
``` lang=c
void MediaApp::openFile(const QString & fileName)
{
m_baseDir = QFileInfo(fileName).path();
@ -610,7 +610,7 @@ This in turn instructs the `Player` to construct our GStreamer pipeline:
**Player::setUri()**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
void Player::setUri(const QString & uri)
{
QString realUri = uri;
@ -650,7 +650,7 @@ rendering. For clarity, here is a portion of the implementation:
**prepare-xwindow-id handling**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
QGlib::connect(pipeline->bus(), "sync-message",
this, &PipelineWatch::onBusSyncMessage);
...
@ -666,7 +666,7 @@ void PipelineWatch::onBusSyncMessage(const MessagePtr & msg)
Once the pipeline is created, we connect to the bus' message signal (via
`QGlib::connect()`) to dispatch state change signals:
``` theme: Default; brush: cpp; gutter: true
``` lang=c
void Player::onBusMessage(const QGst::MessagePtr & message)
{
switch (message->type()) {
@ -708,7 +708,7 @@ void Player::handlePipelineStateChange(const QGst::StateChangedMessagePtr & scm)
Finally, we tell `playbin2` what to play by setting the `uri` property:
``` theme: Default; brush: cpp; gutter: false
``` lang=c
m_pipeline->setProperty("uri", realUri);
```
@ -719,7 +719,7 @@ After `Player::setUri()` is called, `MediaApp::openFile()` calls
**Player::play()**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
void Player::play()
{
if (m_pipeline) {
@ -732,7 +732,7 @@ The other state control methods are equally simple:
**Player state functions**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
void Player::pause()
{
if (m_pipeline) {
@ -756,7 +756,7 @@ is emitted on the GStreamer bus which gets picked up by the `Player`:
**Player::onBusMessage()**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
void Player::onBusMessage(const QGst::MessagePtr & message)
{
switch (message->type()) {
@ -783,7 +783,7 @@ handled:
**MediaApp::onStateChanged()**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
void MediaApp::onStateChanged()
{
QGst::State newState = m_player->state();
@ -812,7 +812,7 @@ UI to handle:
**MediaApp::onPositionChanged()**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
void MediaApp::onPositionChanged()
{
QTime length(0,0);
@ -844,7 +844,7 @@ to `gst_element_query_position()`:
**Player::position()**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
QTime Player::position() const
{
if (m_pipeline) {

View file

@ -74,7 +74,7 @@ In simple form, a PIPELINE-DESCRIPTION is a list of element types
separated by exclamation marks (\!). Go ahead and type in the following
command:
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! autovideosink
```
@ -98,7 +98,7 @@ spaces). Use the `gst-inspect` tool (explained next) to find out the
available properties for an
element.
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 videotestsrc pattern=11 ! ffmpegcolorspace ! autovideosink
```
@ -115,7 +115,7 @@ example.
Named elements are referred to using their name followed by a
dot.
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! tee name=t ! queue ! autovideosink t. ! queue ! autovideosink
```
@ -149,7 +149,7 @@ This is useful, for example, when you want to retrieve one particular
stream out of a
demuxer:
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10.exe souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_00 ! matroskamux ! filesink location=sintel_video.mkv
```
@ -169,7 +169,7 @@ All in all, we took a webm file, stripped it of audio, and generated a
new matroska file with the video. If we wanted to keep only the
audio:
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10.exe souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d d.audio_00 ! vorbisparse ! matroskamux ! filesink location=sintel_audio.mka
```
@ -195,7 +195,7 @@ saying that GStreamer will choose one output pad at random.
Consider the following
pipeline:
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux ! filesink location=test
```
@ -209,7 +209,7 @@ You can remove this ambiguity, though, by using named pads, as in the
previous sub-section, or by using **Caps
Filters**:
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux ! video/x-vp8 ! matroskamux ! filesink location=sintel_video.mkv
```
@ -230,7 +230,7 @@ producing for a particular pipeline, run `gst-launch` as usual, with the
Play a media file using `playbin2` (as in [Basic tutorial 1: Hello
world\!](Basic%2Btutorial%2B1%253A%2BHello%2Bworld%2521.html)):
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm
```
@ -238,7 +238,7 @@ A fully operational playback pipeline, with audio and video (more or less
the same pipeline that `playbin2` will create
internally):
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d ! queue ! vp8dec ! ffmpegcolorspace ! autovideosink d. ! queue ! vorbisdec ! audioconvert ! audioresample ! autoaudiosink
```
@ -248,7 +248,7 @@ with a different codec, and puts them back together in an Ogg container
(just for the sake of
it).
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm name=d ! queue ! theoraenc ! oggmux name=m ! filesink location=sintel.ogg d. ! queue ! audioconvert ! audioresample ! flacenc ! m.
```
@ -257,7 +257,7 @@ operation whenever the frame size is different in the input and the
output caps. The output caps are set by the Caps Filter to
320x200.
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! queue ! videoscale ! video/x-raw-yuv,width=320,height=200 ! ffmpegcolorspace ! autovideosink
```
@ -279,7 +279,7 @@ This tool has three modes of operation:
Let's see an example of the third mode:
``` theme: Default; brush: plain; gutter: true
```
gst-inspect-0.10 vp8dec
 
Factory Details:
@ -400,7 +400,7 @@ which basically control the amount of verbosity of the output.
Let's see an
example:
``` theme: Default; brush: plain; gutter: false
```
gst-discoverer-0.10 http://docs.gstreamer.com/media/sintel_trailer-480p.webm -v
Analyzing http://docs.gstreamer.com/media/sintel_trailer-480p.webm

View file

@ -28,7 +28,7 @@ The debug output is controlled with the `GST_DEBUG` environment
variable. Here's an example with
`GST_DEBUG=2`:
``` theme: Default; brush: plain; gutter: false
```
0:00:00.868050000 1592 09F62420 WARN filesrc gstfilesrc.c:1044:gst_file_src_start:<filesrc0> error: No such file "non-existing-file.webm"
```
@ -97,7 +97,7 @@ specific messages.
The content of each line in the debug output
is:
``` theme: Default; brush: plain; gutter: false
```
0:00:00.868050000 1592 09F62420 WARN filesrc gstfilesrc.c:1044:gst_file_src_start:<filesrc0> error: No such file "non-existing-file.webm"
```
@ -159,7 +159,7 @@ as the Debug category in the output log).
To change the category to something more meaningful, add these two lines
at the top of your code:
``` theme: Default; brush: cpp; gutter: true
``` lang=c
GST_DEBUG_CATEGORY_STATIC (my_category);
#define GST_CAT_DEFAULT my_category
```
@ -167,7 +167,7 @@ GST_DEBUG_CATEGORY_STATIC (my_category);
And then this one after you have initialized GStreamer with
`gst_init()`:
``` theme: Default; brush: cpp; gutter: false
``` lang=c
GST_DEBUG_CATEGORY_INIT (my_category, "my category", 0, "This is my very own");
```
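Once the category is registered, the standard GStreamer logging macros pick it up through `GST_CAT_DEFAULT`; for example:
``` lang=c
/* These messages are now logged under "my category" instead of "default" */
GST_WARNING ("Something unexpected happened");
GST_DEBUG ("Just a debug message");
```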

View file

@ -62,7 +62,7 @@ Copy this code into a text file named `basic-tutorial-12.c`.
**basic-tutorial-12.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
#include <string.h>
@ -195,7 +195,7 @@ therefore, the initialization code is very simple and should be
self-explanatory by now. The only new bit is the detection of live
streams:
``` first-line: 74; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Start playing */
ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
@ -221,7 +221,7 @@ them, so we take note of the result of `gst_element_set_state()` in the
Let's now review the interesting parts of the message parsing callback:
``` first-line: 31; theme: Default; brush: cpp; gutter: true
``` lang=c
case GST_MESSAGE_BUFFERING: {
gint percent = 0;
@ -254,7 +254,7 @@ network becomes slow or unresponsive and our buffer depletes, we will
receive new buffering messages with levels below 100% so we will pause
the pipeline again until enough buffer has been built up.
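The rest of the buffering handler is cut off above; a sketch of what it amounts to (the `msg` parameter name and the `is_live` flag are assumed from the surrounding tutorial):
``` lang=c
/* Live sources must not be paused for buffering (assumed flag) */
if (data->is_live) break;

gst_message_parse_buffering (msg, &percent);
g_print ("Buffering (%3d%%)\r", percent);
/* Wait until buffering is complete before starting/resuming playback */
if (percent < 100)
  gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
else
  gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
break;
```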
``` theme: Default; brush: cpp; gutter: false
``` lang=c
case GST_MESSAGE_CLOCK_LOST:
/* Get a new clock */
gst_element_set_state (data->pipeline, GST_STATE_PAUSED);

View file

@ -69,7 +69,7 @@ Copy this code into a text file named `basic-tutorial-13.c`.
**basic-tutorial-13.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <string.h>
#include <gst/gst.h>
@ -250,7 +250,7 @@ keystrokes and a GLib main loop is executed.
Then, in the keyboard handler function:
``` first-line: 45; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
gchar *str = NULL;
@ -270,7 +270,7 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
Pause / Playing toggle is handled with `gst_element_set_state()` as in
previous tutorials.
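That toggle is not visible in this excerpt; it follows the familiar pattern (a sketch, assuming a `playing` flag kept in `CustomData`):
``` lang=c
case 'p':
  /* Toggle between PLAYING and PAUSED ("playing" flag assumed) */
  data->playing = !data->playing;
  gst_element_set_state (data->pipeline,
      data->playing ? GST_STATE_PLAYING : GST_STATE_PAUSED);
  break;
```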
``` first-line: 59; theme: Default; brush: cpp; gutter: true
``` lang=c
case 's':
if (g_ascii_isupper (str[0])) {
data->rate *= 2.0;
@ -290,7 +290,7 @@ reverse the current playback direction. In both cases, the
`rate` variable is updated and `send_seek_event()` is called. Let's
review this function.
``` first-line: 13; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Send seek event to change rate */
static void send_seek_event (CustomData *data) {
gint64 position;
@ -312,7 +312,7 @@ want to move, we jump to the current position. Using a Step Event would
be simpler, but this event is not currently fully functional, as
explained in the Introduction.
``` first-line: 25; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Create the seek event */
if (data->rate > 0) {
seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
@ -329,7 +329,7 @@ position. Regardless of the playback direction, the start position must
be smaller than the stop position, so the two playback directions are
treated differently.
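The reverse-playback branch is not shown above; it mirrors the forward case with the start and stop boundaries swapped. A sketch:
``` lang=c
/* Negative rates: the segment runs from the start of the stream (0)
 * up to the current position, so start < stop still holds */
} else {
  seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME,
      GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
      GST_SEEK_TYPE_SET, 0, GST_SEEK_TYPE_SET, position);
}
```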
``` first-line: 34; theme: Default; brush: cpp; gutter: true
``` lang=c
if (data->video_sink == NULL) {
/* If we have not done so, obtain the sink through which we will send the seek events */
g_object_get (data->pipeline, "video-sink", &data->video_sink, NULL);
@ -343,7 +343,7 @@ at this time instead of at initialization time because the actual sink may
change depending on the media contents, and this won't be known until
the pipeline is PLAYING and some media has been read.
``` first-line: 39; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Send the event */
gst_element_send_event (data->video_sink, seek_event);
```
@ -354,7 +354,7 @@ The new Event is finally sent to the selected sink with
Back to the keyboard handler, we are still missing the frame stepping code,
which is really simple:
``` first-line: 71; theme: Default; brush: cpp; gutter: true
``` lang=c
case 'n':
if (data->video_sink == NULL) {
/* If we have not done so, obtain the sink through which we will send the step events */
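The snippet above is truncated; after obtaining the video sink, the step itself is a single event, roughly like this (a sketch based on the `gst_event_new_step()` API):
``` lang=c
/* Ask the sink to step one frame forward at the current rate */
gst_element_send_event (data->video_sink,
    gst_event_new_step (GST_FORMAT_BUFFERS, 1, ABS (data->rate), TRUE, FALSE));
break;
```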

View file

@ -37,11 +37,11 @@ a `decodebin2` element. It acts like a demuxer, so it offers as many
source pads as streams are found in the
media.
``` theme: Default; brush: bash; gutter: false
``` lang=bash
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! ffmpegcolorspace ! autovideosink
```
``` theme: Default; brush: bash; gutter: false
``` lang=bash
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! audioconvert ! autoaudiosink
```
@ -55,7 +55,7 @@ replaces the old `decodebin` element. It acts like a demuxer, so it
offers as many source pads as streams are found in the
media.
``` theme: Default; brush: bash; gutter: false
``` lang=bash
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! decodebin2 ! autovideosink
```
@ -69,7 +69,7 @@ using a `typefind` element or by setting the `typefind` property
of `filesrc` to
`TRUE`.
``` theme: Default; brush: cpp; gutter: false
``` lang=c
gst-launch-0.10 filesrc location=f:\\media\\sintel\\sintel_trailer-480p.webm ! decodebin2 ! autovideosink
```
@ -79,7 +79,7 @@ This element writes to a file all the media it receives. Use the
`location` property to specify the file
name.
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 audiotestsrc ! vorbisenc ! oggmux ! filesink location=test.ogg
```
@ -91,7 +91,7 @@ This element receives data as a client over the network via HTTP using
the SOUP library. Set the URL to retrieve through the `location`
property.
``` theme: Default; brush: bash; gutter: false
``` lang=bash
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! decodebin2 ! autovideosink
```
@ -106,7 +106,7 @@ are “guaranteed” to work.
This element produces a video pattern (selectable among many different
options with the `pattern` property). Use it to test video pipelines.
``` theme: Default; brush: bash; gutter: false
``` lang=bash
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! autovideosink
```
@ -115,7 +115,7 @@ gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! autovideosink
This element produces an audio wave (selectable among many different
options with the `wave` property). Use it to test audio pipelines.
``` theme: Default; brush: bash; gutter: false
``` lang=bash
gst-launch-0.10 audiotestsrc ! audioconvert ! autoaudiosink
```
@ -137,7 +137,7 @@ elements whose Caps are unknown at design time, like `autovideosink`, or
that can vary depending on external factors, like decoding a
user-provided file.
``` theme: Default; brush: bash; gutter: false
``` lang=bash
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! autovideosink
```
@ -157,7 +157,7 @@ It is therefore a good idea to always use it whenever the actual frame
rate is unknown at design time, just in
case.
``` theme: Default; brush: cpp; gutter: false
``` lang=c
gst-launch-0.10 videotestsrc ! video/x-raw-rgb,framerate=30/1 ! videorate ! video/x-raw-rgb,framerate=1/1 ! ffmpegcolorspace ! autovideosink
```
@ -178,7 +178,7 @@ user, it is a good idea to use a `videoscale` element, since not all
video sinks are capable of performing scaling
operations.
``` theme: Default; brush: bash; gutter: false
``` lang=bash
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! videoscale ! video/x-raw-yuv,width=178,height=100 ! ffmpegcolorspace ! autovideosink
```
@ -195,7 +195,7 @@ Like `ffmpegcolorspace` does for video, you use this to solve
negotiation problems with audio, and it is generally safe to use it
liberally, since this element does nothing if it is not needed.
``` theme: Default; brush: bash; gutter: false
``` lang=bash
gst-launch-0.10 audiotestsrc ! audioconvert ! autoaudiosink
```
@ -208,7 +208,7 @@ Again, use it to solve negotiation problems regarding sampling rates and
do not be afraid to use it
generously.
``` theme: Default; brush: bash; gutter: false
``` lang=bash
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! audioresample ! audio/x-raw-float,rate=4000 ! audioconvert ! autoaudiosink
```
@ -295,7 +295,7 @@ separate threads for each branch. Otherwise a blocked dataflow in one
branch would stall the other
branches.
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 audiotestsrc ! tee name=t ! queue ! audioconvert ! autoaudiosink t. ! queue ! wavescope ! ffmpegcolorspace ! autovideosink
```
@ -311,7 +311,7 @@ the `capsfilter` element. This element does not modify data as such,
but enforces limitations on the data
format.
``` theme: Default; brush: bash; gutter: false
``` lang=bash
gst-launch-0.10 videotestsrc ! video/x-raw-gray ! ffmpegcolorspace ! autovideosink
```
@ -338,7 +338,7 @@ equation. It can be very verbose when combined with the `-v` switch
of `gst-launch`, so use the `silent` property to remove any unwanted
noise.
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 audiotestsrc num-buffers=1000 ! fakesink sync=false
```
@ -350,7 +350,7 @@ checking, or buffer dropping. Read its documentation to learn all the
things this seemingly harmless element can
do.
``` theme: Default; brush: plain; gutter: false
```
gst-launch-0.10 audiotestsrc ! identity drop-probability=0.1 ! audioconvert ! autoaudiosink
```

View file

@ -36,7 +36,7 @@ Copy this code into a text file named `basic-tutorial-15.c`.
**basic-tutorial-15.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <clutter-gst/clutter-gst.h>
/* Setup the video texture once its size is known */
@ -165,7 +165,7 @@ how to integrate GStreamer with it. This is accomplished through the
clutter-gst library, so its header must be included (and the program
must link against it):
``` first-line: 1; theme: Default; brush: cpp; gutter: true
``` lang=c
#include <clutter-gst/clutter-gst.h>
```
@ -173,7 +173,7 @@ The first thing this library does is initialize both GStreamer and
Clutter, so you must call `clutter_gst_init()` instead of initializing
these libraries yourself.
``` first-line: 43; theme: Default; brush: cpp; gutter: true
``` lang=c
/* clutter-gst takes care of initializing Clutter and GStreamer */
if (clutter_gst_init (&argc, &argv) != CLUTTER_INIT_SUCCESS) {
g_error ("Failed to initialize clutter\n");
@ -186,7 +186,7 @@ create a texture. Just remember to disable texture slicing to allow for
proper
integration:
``` first-line: 55; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Create new texture and disable slicing so the video is properly mapped onto it */
texture = CLUTTER_ACTOR (g_object_new (CLUTTER_TYPE_TEXTURE, "disable-slicing", TRUE, NULL));
g_signal_connect (texture, "size-change", G_CALLBACK (size_change), NULL);
@ -195,7 +195,7 @@ g_signal_connect (texture, "size-change", G_CALLBACK (size_change), NULL);
We connect to the size-change signal so we can perform final setup once
the video size is known.
``` theme: Default; brush: cpp; gutter: true
``` lang=c
/* Instantiate the Clutter sink */
sink = gst_element_factory_make ("autocluttersink", NULL);
if (sink == NULL) {
@ -216,14 +216,14 @@ release of the SDK, so, if it cannot be found, the
simpler `cluttersink` element is created
instead.
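The fallback itself is cut off in the excerpt above; it would look roughly like this (assuming this runs inside `main()`):
``` lang=c
if (sink == NULL) {
  /* autocluttersink was not found: fall back to the plain cluttersink element */
  g_printerr ("Could not create autocluttersink, trying cluttersink.\n");
  sink = gst_element_factory_make ("cluttersink", NULL);
}
if (sink == NULL) {
  g_printerr ("Unable to create a Clutter video sink.\n");
  return -1;
}
```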
``` first-line: 73; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Link GStreamer with Clutter by passing the Clutter texture to the Clutter sink*/
g_object_set (sink, "texture", texture, NULL);
```
This texture is everything GStreamer needs to know about Clutter.
``` first-line: 76; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Add the Clutter sink to the pipeline */
g_object_set (pipeline, "video-sink", sink, NULL);
```

View file

@ -35,7 +35,7 @@ in the SDK installation).
**basic-tutorial-4.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
/* Structure to contain all our information, so we can pass it around */

View file

@ -84,7 +84,7 @@ in the SDK installation).
**basic-tutorial-7.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
int main(int argc, char *argv[]) {

View file

@ -366,7 +366,7 @@ Always Pads, and manually link the Request Pads of the `tee` element.
Regarding the configuration of the `appsrc` and `appsink` elements:
``` first-line: 159; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Configure appsrc */
audio_caps_text = g_strdup_printf (AUDIO_CAPS, SAMPLE_RATE);
audio_caps = gst_caps_from_string (audio_caps_text);
@ -387,7 +387,7 @@ fired by `appsrc` when its internal queue of data is running low or
almost full, respectively. We will use these signals to start and stop
(respectively) our signal generation process.
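The corresponding signal connections are trimmed from the snippet above; they would be along these lines:
``` lang=c
/* appsrc emits "need-data" when its queue runs low and "enough-data" when it
 * is almost full; start_feed() and stop_feed() are the callbacks discussed below */
g_signal_connect (data.app_source, "need-data", G_CALLBACK (start_feed), &data);
g_signal_connect (data.app_source, "enough-data", G_CALLBACK (stop_feed), &data);
```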
``` first-line: 166; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Configure appsink */
g_object_set (data.app_sink, "emit-signals", TRUE, "caps", audio_caps, NULL);
g_signal_connect (data.app_sink, "new-sample", G_CALLBACK (new_sample), &data);
@ -404,7 +404,7 @@ Starting the pipeline, waiting for messages and final cleanup is done as
usual. Let's review the callbacks we have just
registered:
``` first-line: 67; theme: Default; brush: cpp; gutter: true
``` lang=c
/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
* to the mainloop to start pushing data into the appsrc */
static void start_feed (GstElement *source, guint size, CustomData *data) {
@ -433,7 +433,7 @@ We take note of the sourceid that `g_idle_add()` returns, so we can
disable it
later.
``` first-line: 76; theme: Default; brush: cpp; gutter: true
``` lang=c
/* This callback triggers when appsrc has enough data and we can stop sending.
* We remove the idle handler from the mainloop */
static void stop_feed (GstElement *source, CustomData *data) {
@ -450,7 +450,7 @@ enough so we stop pushing data. Here we simply remove the idle function
by using `g_source_remove()` (The idle function is implemented as a
`GSource`).
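The part of `stop_feed()` that does this is cut off above; it would be just a few lines (the `sourceid` field name is assumed from the description earlier):
``` lang=c
if (data->sourceid != 0) {
  g_print ("Stop feeding\n");
  /* The idle handler registered with g_idle_add() is a GSource,
   * so it is removed through its source id */
  g_source_remove (data->sourceid);
  data->sourceid = 0;
}
```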
``` first-line: 22; theme: Default; brush: cpp; gutter: true
``` lang=c
/* This method is called by the idle GSource in the mainloop, to feed CHUNK_SIZE bytes into appsrc.
* The idle handler is added to the mainloop when appsrc requests us to start sending data (need-data signal)
* and is removed when appsrc has enough data (enough-data signal).
@ -500,7 +500,7 @@ We will skip over the waveform generation, since it is outside the scope
of this tutorial (it is simply a funny way of generating a pretty
psychedelic wave).
``` first-line: 53; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Push the buffer into the appsrc */
g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);
@ -514,7 +514,7 @@ tutorial 1: Playbin2
usage](Playback+tutorial+1+Playbin2+usage.markdown)), and then
`gst_buffer_unref()` it since we no longer need it.
``` first-line: 86; theme: Default; brush: cpp; gutter: true
``` lang=c
/* The appsink has received a buffer */
static void new_buffer (GstElement *sink, CustomData *data) {
GstBuffer *buffer;

View file

@ -81,7 +81,7 @@ in the SDK installation).
**basic-tutorial-9.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <string.h>
#include <gst/gst.h>
#include <gst/pbutils/pbutils.h>
@ -328,7 +328,7 @@ int main (int argc, char **argv) {
These are the main steps to use the `GstDiscoverer`:
``` first-line: 182; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Instantiate the Discoverer */
data.discoverer = gst_discoverer_new (5 * GST_SECOND, &err);
if (!data.discoverer) {
@ -342,7 +342,7 @@ if (!data.discoverer) {
parameter is the timeout per file, in nanoseconds (use the
`GST_SECOND` macro for simplicity).
``` first-line: 190; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Connect to the interesting signals */
g_signal_connect (data.discoverer, "discovered", G_CALLBACK (on_discovered_cb), &data);
g_signal_connect (data.discoverer, "finished", G_CALLBACK (on_finished_cb), &data);
@ -351,7 +351,7 @@ g_signal_connect (data.discoverer, "finished", G_CALLBACK (on_finished_cb), &dat
Connect to the interesting signals, as usual. We discuss them in the
snippet for their callbacks.
``` first-line: 194; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Start the discoverer process (nothing to do yet) */
gst_discoverer_start (data.discoverer);
```
@ -360,7 +360,7 @@ gst_discoverer_start (data.discoverer);
not provided any URI to discover yet. This is done
next:
``` first-line: 197; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Add a request to process asynchronously the URI passed through the command line */
if (!gst_discoverer_discover_uri_async (data.discoverer, uri)) {
g_print ("Failed to start discovering URI '%s'\n", uri);
@ -375,7 +375,7 @@ discovery process for each of them finishes, the registered callback
functions will be fired
up.
``` first-line: 204; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Create a GLib Main Loop and set it to run, so we can wait for the signals */
data.loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.loop);
@ -385,7 +385,7 @@ The usual GLib main loop is instantiated and executed. We will get out
of it when `g_main_loop_quit()` is called from the
`on_finished_cb` callback.
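`on_finished_cb` itself is not shown in this excerpt; it essentially just quits the loop, along these lines:
``` lang=c
/* This function is called when the discoverer has finished examining
 * all the URIs we provided */
static void on_finished_cb (GstDiscoverer *discoverer, CustomData *data) {
  g_print ("Finished discovering\n");
  g_main_loop_quit (data->loop);
}
```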
``` first-line: 208; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Stop the discoverer process */
gst_discoverer_stop (data.discoverer);
```
@ -396,7 +396,7 @@ Once we are done with the discoverer, we stop it with
Let's review now the callbacks we have
registered:
``` first-line: 85; theme: Default; brush: cpp; gutter: true
``` lang=c
/* This function is called every time the discoverer has information regarding
* one of the URIs we provided.*/
static void on_discovered_cb (GstDiscoverer *discoverer, GstDiscovererInfo *info, GError *err, CustomData *data) {
@ -417,7 +417,7 @@ case we had multiple discovery processes running, which is not the case in
this example) with `gst_discoverer_info_get_uri()` and the discovery
result with `gst_discoverer_info_get_result()`.
``` first-line: 95; theme: Default; brush: cpp; gutter: true
``` lang=c
switch (result) {
case GST_DISCOVERER_URI_INVALID:
g_print ("Invalid URI '%s'\n", uri);
@ -467,7 +467,7 @@ If no error happened, information can be retrieved from the
Bits of information which are made of lists, like tags and stream info,
need some extra parsing:
``` first-line: 133; theme: Default; brush: cpp; gutter: true
``` lang=c
tags = gst_discoverer_info_get_tags (info);
if (tags) {
g_print ("Tags:\n");
@ -482,7 +482,7 @@ or a specific tag could be searched for with
`gst_tag_list_get_string()`). The code for `print_tag_foreach()` is pretty
much self-explanatory.
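For reference, a sketch of what a `print_tag_foreach()` of this kind looks like (it serializes each tag value and prints it indented by the depth passed through `user_data`):
``` lang=c
/* GstTagForeachFunc: print a single tag, indented according to depth */
static void print_tag_foreach (const GstTagList *tags, const gchar *tag, gpointer user_data) {
  GValue val = { 0, };
  gchar *str;
  gint depth = GPOINTER_TO_INT (user_data);

  gst_tag_list_copy_value (&val, tags, tag);

  if (G_VALUE_HOLDS_STRING (&val))
    str = g_value_dup_string (&val);
  else
    str = gst_value_serialize (&val);

  g_print ("%*s%s: %s\n", 2 * depth, " ", gst_tag_get_nick (tag), str);
  g_free (str);

  g_value_unset (&val);
}
```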
``` first-line: 143; theme: Default; brush: cpp; gutter: false
``` lang=c
sinfo = gst_discoverer_info_get_stream_info (info);
if (!sinfo)
return;
@ -499,7 +499,7 @@ a `GstDiscovererStreamInfo` structure that is parsed in
the `print_topology` function, and then discarded
with `gst_discoverer_stream_info_unref()`.
``` first-line: 60; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Print information regarding a stream and its substreams, if any */
static void print_topology (GstDiscovererStreamInfo *info, gint depth) {
GstDiscovererStreamInfo *next;

View file

@ -61,7 +61,7 @@ distribution:
And copy it to the `/etc/apt/sources.list.d/` directory on your system:
``` theme: Default; brush: plain; gutter: false
```
sudo cp gstreamer-sdk.list /etc/apt/sources.list.d/
```
@ -70,7 +70,7 @@ be added and the apt repository list needs to be refreshed. This can be
done by
running:
``` theme: Default; brush: plain; gutter: false
```
wget -q -O - http://www.freedesktop.org/software/gstreamer-sdk/sdk.gpg | sudo apt-key add -
sudo apt-get update
```
@ -78,7 +78,7 @@ sudo apt-get update
Now that the new repositories are available, install the SDK with the
following command:
``` theme: Default; brush: plain; gutter: false
```
sudo apt-get install gstreamer-sdk-dev
```
@ -103,7 +103,7 @@ distribution:
And copy it to the `/etc/apt/sources.list.d/` directory on your system:
``` theme: Default; brush: plain; gutter: false
```
su -c 'cp gstreamer-sdk.list /etc/apt/sources.list.d/'
```
@ -112,7 +112,7 @@ be added and the apt repository list needs to be refreshed. This can be
done by
running:
``` theme: Default; brush: plain; gutter: false
```
su -c 'wget -q -O - http://www.freedesktop.org/software/gstreamer-sdk/sdk.gpg | apt-key add -'
su -c 'apt-get update'
```
@ -120,7 +120,7 @@ su -c 'apt-get update'
Now that the new repositories are available, install the SDK with the
following command:
``` theme: Default; brush: plain; gutter: false
```
su -c 'apt-get install gstreamer-sdk-dev'
```
@ -141,21 +141,21 @@ distribution:
And copy it to the `/etc/yum.repos.d/` directory on your system:
``` theme: Default; brush: plain; gutter: false
```
su -c 'cp gstreamer-sdk.repo /etc/yum.repos.d/'
```
After adding the repositories, the yum repository list needs to be
refreshed. This can be done by running:
``` theme: Default; brush: plain; gutter: false
```
su -c 'yum update'
```
Now that the new repositories are available, install the SDK with the
following command:
``` theme: Default; brush: plain; gutter: false
```
su -c 'yum install gstreamer-sdk-devel'
```
@ -170,7 +170,7 @@ installed in a non-standard location `/opt/gstreamer-sdk`. The shell
script `gst-sdk-shell` sets the required environment variables for
building applications with the GStreamer SDK:
``` theme: Default; brush: bash; gutter: false
``` lang=bash
/opt/gstreamer-sdk/bin/gst-sdk-shell
```
@ -179,7 +179,7 @@ the `gcc` compiler and a text editor. In order to compile code that
requires the GStreamer SDK and uses the GStreamer core library, remember
to add this string to your `gcc` command:
``` theme: Default; brush: plain; gutter: false
```
`pkg-config --cflags --libs gstreamer-0.10`
```
@ -214,7 +214,7 @@ available in a GIT repository and distributed with the SDK.
The GIT repository can be cloned with:
``` theme: Default; brush: plain; gutter: false
```
git clone git://anongit.freedesktop.org/gstreamer-sdk/gst-sdk-tutorials
```
@ -231,7 +231,7 @@ Run `/opt/gstreamer-sdk/bin/gst-sdk-shell` to enter this shell.
Then go to the folder where you copied/cloned the tutorials and
write:
``` theme: Default; brush: plain; gutter: false
```
gcc basic-tutorial-1.c -o basic-tutorial-1 `pkg-config --cflags --libs gstreamer-0.10`
```
@ -254,7 +254,7 @@ Using the file name of the tutorial you are interested in
To run the tutorials, simply execute the desired tutorial (**from within
the `gst-sdk-shell`**):
``` theme: Default; brush: cpp; gutter: false
``` lang=c
./basic-tutorial-1
```

View file

@ -31,7 +31,7 @@ should replace `$INSTALL_PATH` with the path where your installer copied
the SDK's disk image files (the `/tmp` directory is a good place to
install it as it will be removed at the end of the installation):
``` theme: Default; brush: bash; gutter: false
``` lang=bash
hdiutil attach $INSTALL_PATH/gstreamer-sdk-2012.7-x86.dmg
cd /Volumes/gstreamer-sdk-2012.7-x86/
installer -pkg gstreamer-sdk-2012.7-x86.pkg -target "/"
@ -47,7 +47,7 @@ simply copy the framework to the application's Frameworks folder as
defined in the [bundle programming
guide](https://developer.apple.com/library/mac/documentation/CoreFoundation/Conceptual/CFBundles/BundleTypes/BundleTypes.html#//apple_ref/doc/uid/10000123i-CH101-SW19):
``` theme: Default; brush: bash; gutter: false
``` lang=bash
cp -r /Library/Frameworks/GStreamer.framework ~/MyApp.app/Contents/Frameworks
```
@ -56,7 +56,7 @@ different architectures, installed in the system. Make sure you only
copy the version you need and that you update the link
`GStreamer.framework/Version/Current` accordingly:
``` theme: Default; brush: bash; gutter: false
``` lang=bash
$ ls -l Frameworks/GStreamer.framework/Version/Current
lrwxr-xr-x 1 fluendo staff 21 Jun 5 18:46 Frameworks/GStreamer.framework/Versions/Current -> ../Versions/0.10/x86
```
@ -274,7 +274,7 @@ We can get the list of paths used by an object file to locate its
dependent dynamic libraries
using [otool](https://developer.apple.com/library/mac/#documentation/darwin/reference/manpages/man1/otool.1.html):
``` theme: Default; brush: bash; gutter: false
``` lang=bash
$ otool -L /Library/Frameworks/GStreamer.framework/Commands/gst-launch-0.10
/Library/Frameworks/GStreamer.framework/Commands/gst-launch-0.10:
/System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation (compatibility version 150.0.0, current version 550.43.0)
@ -293,7 +293,7 @@ This full path is extracted from the dynamic library  ***install name***
install name of a library can be retrieved with
[otool](https://developer.apple.com/library/mac/#documentation/darwin/reference/manpages/man1/otool.1.html) too:
``` theme: Default; brush: bash; gutter: false
``` lang=bash
$ otool -D /Library/Frameworks/GStreamer.framework/Libraries/libgstreamer-0.10.dylib
/Library/Frameworks/GStreamer.framework/Libraries/libgstreamer-0.10.dylib:
/Library/Frameworks/GStreamer.framework/Versions/0.10/x86/lib/libgstreamer-0.10.0.dylib
@ -348,7 +348,7 @@ When looking for binaries to fix, we will run the script in the
following
directories:
``` theme: Default; brush: bash; gutter: false
``` lang=bash
$ osxrelocator.py MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current/lib /Library/Frameworks/GStreamer.framework/ @executable_path/../Frameworks/GStreamer.framework/ -r
$ osxrelocator.py MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current/libexec /Library/Frameworks/GStreamer.framework/ @executable_path/../Frameworks/GStreamer.framework/ -r
$ osxrelocator.py MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current/bin /Library/Frameworks/GStreamer.framework/ @executable_path/../Frameworks/GStreamer.framework/ -r

View file

@ -28,7 +28,7 @@ In the Cerbero installation directory you will find the
`cerbero-uninstalled` script. Execute it without parameters to see the
list of commands it accepts:
``` theme: Default; brush: bash; gutter: false
``` lang=bash
./cerbero-uninstalled
```
@ -37,7 +37,7 @@ list of commands it accepts:
The first step is to create an empty recipe that you can then tailor to
your needs:
``` theme: Default; brush: bash; gutter: false
``` lang=bash
./cerbero-uninstalled add-recipe my-app 1.0
```
@ -204,7 +204,7 @@ Alternatively, you can pass some options to cerbero-uninstalled so some
of these attributes are already set for you. For
example:
``` theme: Default; brush: python; gutter: false
```
./cerbero-uninstalled add-recipe --licenses "LGPL" --deps "glib,gtk+" --origin "git://git.my-app.com" --commit "git-commit-to-use" my-app 1.0
```
@ -212,7 +212,7 @@ See `./cerbero-uninstalled add-recipe -h` for help.
As an example, this is the recipe used to build the Snappy media player:
``` theme: Default; brush: python; gutter: false
```
class Recipe(recipe.Recipe):
name = 'snappy'
version = '0.2+git'
@ -242,7 +242,7 @@ Snappy.
Once the recipe is ready, instruct Cerbero to build it:
``` theme: Default; brush: bash; gutter: false
``` lang=bash
./cerbero-uninstalled build my-app
```
@ -257,7 +257,7 @@ files in `cerbero/packages`.
Now, to create an empty package, do:
``` theme: Default; brush: bash; gutter: false
``` lang=bash
./cerbero-uninstalled add-package my-app 1.0
```
@ -410,7 +410,7 @@ Alternatively you can also pass some options to `cerbero-uninstalled`,
for
example:
``` theme: Default; brush: bash; gutter: false
``` lang=bash
./cerbero-uninstalled add-package my-app 1.0 --license "LGPL" --codename MyApp --vendor MyAppVendor --url "http://www.my-app.com" --files=my-app:bins:libs --files-devel=my-app:devel --platform-files=linux:my-app:linux_specific --platform-files-devel=linux:my-app:linux_specific_devel,windows:my-app:windows_specific_devel --deps base-system --includes gstreamer-core
```
@ -419,7 +419,7 @@ See `./cerbero-uninstalled add-package -h` for help.
As an example, this is the package file that is used for packaging the
`gstreamer-core` package:
``` theme: Default; brush: python; gutter: false
```
class Package(package.Package):
name = 'gstreamer-codecs'
shortdesc = 'GStreamer codecs'
@ -472,7 +472,7 @@ packages\_prefix as the ones in your Cerbero configuration file.
Finally, build your package by using:
``` theme: Default; brush: bash; gutter: false
``` lang=bash
./cerbero-uninstalled package your-package 
```

View file

@ -64,7 +64,7 @@ it in the SDK installation).
**playback-tutorial-1.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
/* Structure to contain all our information, so we can pass it around */
@ -314,7 +314,7 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
# Walkthrough
``` first-line: 3; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstElement *playbin2; /* Our one and only element */
@ -337,7 +337,7 @@ streams of each type, and the currently playing one. Also, we are going
to use a different mechanism to wait for messages that allows
interactivity, so we need a GLib main loop object.
``` first-line: 18; theme: Default; brush: cpp; gutter: true
``` lang=c
/* playbin2 flags */
typedef enum {
GST_PLAY_FLAG_VIDEO = (1 << 0), /* We want video output */
@ -356,7 +356,7 @@ be retrieved at runtime without using this trick, but in a far more
cumbersome
way.
``` first-line: 25; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Forward definition for the message and keyboard processing functions */
static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data);
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data);
@ -375,7 +375,7 @@ the pipeline, and use directly the  `playbin2` element.
We focus on some of the other properties of `playbin2`, though:
``` first-line: 50; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Set flags to show Audio and Video but ignore Subtitles */
g_object_get (data.playbin2, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO;
@ -419,7 +419,7 @@ values (this is why we read the current value of the flags with
`g_object_get()` before overwriting it with
`g_object_set()`).
``` first-line: 56; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Set connection speed. This will affect some internal decisions of playbin2 */
g_object_set (data.playbin2, "connection-speed", 56, NULL);
```
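For reference, the complete read-modify-write of the flags looks roughly like
this (a sketch based on the snippet above; the exact flags the tutorial
manipulates may differ slightly):
``` lang=c
/* Sketch: read the current flags, tweak them, and write them back.
 * The GST_PLAY_FLAG_* values come from the local enum shown earlier. */
gint flags;
g_object_get (data.playbin2, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO;  /* enable audio and video */
flags &= ~GST_PLAY_FLAG_TEXT;                        /* ignore subtitles */
g_object_set (data.playbin2, "flags", flags, NULL);
```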
@ -435,13 +435,13 @@ We have set all these properties one by one, but we could have all of
them with a single call to
`g_object_set()`:
``` theme: Default; brush: cpp; gutter: false
``` lang=c
g_object_set (data.playbin2, "uri", "http://docs.gstreamer.com/media/sintel_cropped_multilingual.webm", "flags", flags, "connection-speed", 56, NULL);
```
This is why `g_object_set()` requires a NULL as the last parameter.
``` first-line: 63; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
@ -459,7 +459,7 @@ GStreamer has little to do with it besides the Navigation interface
discussed briefly in [Tutorial 17: DVD
playback](http://docs.gstreamer.com/display/GstSDK/Tutorial+17%3A+DVD+playback).
``` first-line: 79; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Create a GLib Main Loop and set it to run */
data.main_loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.main_loop);
@ -476,7 +476,7 @@ times: `handle_message` when a message appears on the bus, and
There is nothing new in handle\_message, except that when the pipeline
moves to the PLAYING state, it will call the `analyze_streams` function:
``` first-line: 92; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Extract some metadata from the streams and print it on the screen */
static void analyze_streams (CustomData *data) {
gint i;
@ -495,7 +495,7 @@ media and prints it on the screen. The number of video, audio and
subtitle streams is directly available through the `n-video`,
`n-audio` and `n-text` properties.
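Reading those counters is a plain property query; a minimal sketch, using the
`CustomData` fields from this walkthrough:
``` lang=c
/* Sketch: ask playbin2 how many streams of each kind it found */
g_object_get (data->playbin2, "n-video", &data->n_video, NULL);
g_object_get (data->playbin2, "n-audio", &data->n_audio, NULL);
g_object_get (data->playbin2, "n-text", &data->n_text, NULL);
```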
``` first-line: 108; theme: Default; brush: cpp; gutter: true
``` lang=c
for (i = 0; i < data->n_video; i++) {
tags = NULL;
/* Retrieve the stream's video tags */
@ -534,7 +534,7 @@ name if the tags is standardized, and the list can be found in the
`GST_TAG_*_CODEC` (audio, video or
text).
``` first-line: 158; theme: Default; brush: cpp; gutter: true
``` lang=c
g_object_get (data->playbin2, "current-video", &data->current_video, NULL);
g_object_get (data->playbin2, "current-audio", &data->current_audio, NULL);
g_object_get (data->playbin2, "current-text", &data->current_text, NULL);
@ -550,7 +550,7 @@ never make any assumption. Multiple internal conditions can make
in which the streams are listed can change from one run to another, so
checking the metadata to identify one particular stream becomes crucial.
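The keyboard handler shown next ends up switching streams simply by writing
the corresponding `current-*` property; a hedged sketch of that final step
(`index` is an illustrative variable holding the user's choice):
``` lang=c
/* Sketch: make playbin2 switch to audio stream number 'index' (0-based) */
g_object_set (data->playbin2, "current-audio", index, NULL);
```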
``` first-line: 202; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
gchar *str = NULL;

View file

@ -41,7 +41,7 @@ it in the SDK installation).
**playback-tutorial-2.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
/* Structure to contain all our information, so we can pass it around */
@ -297,7 +297,7 @@ This tutorial is copied from [Playback tutorial 1: Playbin2
usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html) with some
changes, so let's review only the changes.
``` first-line: 50; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Set the subtitle URI to play and some font description */
g_object_set (data.playbin2, "suburi", "http://docs.gstreamer.com/media/sintel_trailer_gr.srt", NULL);
g_object_set (data.playbin2, "subtitle-font-desc", "Sans, 18", NULL);
@ -351,7 +351,7 @@ Extra-Expanded, Ultra-Expanded
 
``` first-line: 54; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Set flags to show Audio, Video and Subtitles */
g_object_get (data.playbin2, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO | GST_PLAY_FLAG_TEXT;

View file

@ -32,7 +32,7 @@ Copy this code into a text file named `playback-tutorial-3.c`.
**playback-tutorial-3.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
#include <string.h>
@ -190,7 +190,7 @@ int main(int argc, char *argv[]) {
To use an `appsrc` as the source for the pipeline, simply instantiate a
`playbin2` and set its URI to `appsrc://`
``` first-line: 131; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Create the playbin2 element */
data.pipeline = gst_parse_launch ("playbin2 uri=appsrc://", NULL);
```
@ -199,7 +199,7 @@ data.pipeline = gst_parse_launch ("playbin2 uri=appsrc://", NULL);
`source-setup` signal to allow the application to configure
it:
``` first-line: 133; theme: Default; brush: cpp; gutter: true
``` lang=c
g_signal_connect (data.pipeline, "source-setup", G_CALLBACK (source_setup), &data);
```
@ -208,7 +208,7 @@ since, once the signal handler returns, `playbin2` will instantiate the
next element in the pipeline according to these
caps:
``` first-line: 100; theme: Default; brush: cpp; gutter: true
``` lang=c
/* This function is called when playbin2 has created the appsrc element, so we have
* a chance to configure it. */
static void source_setup (GstElement *pipeline, GstElement *source, CustomData *data) {
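/* (Sketch of how this handler typically continues; 'audio_caps' and
 * 'start_feed' are illustrative names, not taken from the listing above:
 * the appsrc is given the caps the rest of the pipeline expects and its
 * data-flow signals are connected.) */
g_object_set (source, "caps", audio_caps, NULL);
g_signal_connect (source, "need-data", G_CALLBACK (start_feed), data);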

View file

@ -54,7 +54,7 @@ Copy this code into a text file named `playback-tutorial-4.c`.
**playback-tutorial-4.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
#include <string.h>
@ -260,7 +260,7 @@ only the differences.
#### Setup
``` first-line: 133; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Set the download flag */
g_object_get (pipeline, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_DOWNLOAD;
@ -271,7 +271,7 @@ By setting this flag, `playbin2` instructs its internal queue (a
`queue2` element, actually) to store all downloaded
data.
``` first-line: 157; theme: Default; brush: cpp; gutter: true
``` lang=c
g_signal_connect (pipeline, "deep-notify::temp-location", G_CALLBACK (got_location), NULL);
```
@ -282,7 +282,7 @@ changes, indicating that the `queue2` has decided where to store the
downloaded
data.
``` first-line: 18; theme: Default; brush: cpp; gutter: true
``` lang=c
static void got_location (GstObject *gstobject, GstObject *prop_object, GParamSpec *prop, gpointer data) {
gchar *location;
g_object_get (G_OBJECT (prop_object), "temp-location", &location, NULL);
@ -313,7 +313,7 @@ removed. As the comment reads, you can keep it by setting the
In `main` we also install a timer which we use to refresh the UI every
second.
``` first-line: 159; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Register a function that GLib will call every second */
g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);
```
@ -332,7 +332,7 @@ pipeline is paused). Keep in mind that if your network is fast enough,
you will not see the download bar (the dashes) advance at all; it will
be completely full from the beginning.
``` first-line: 70; theme: Default; brush: cpp; gutter: true
``` lang=c
static gboolean refresh_ui (CustomData *data) {
GstQuery *query;
gboolean result;
@ -356,7 +356,7 @@ succeeded. The answer to the query is contained in the same
`GstQuery` structure we created, and can be retrieved using multiple
parse methods:
``` first-line: 85; theme: Default; brush: cpp; gutter: true
``` lang=c
n_ranges = gst_query_get_n_buffering_ranges (query);
for (range = 0; range < n_ranges; range++) {
gint64 start, stop;
@ -380,7 +380,7 @@ range) depends on what we requested in the
`gst_query_new_buffering()` call. In this case, PERCENTAGE. These
values are used to generate the graph.
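For completeness, the query being parsed here was created and issued a few
lines earlier in `refresh_ui`, roughly as follows (a sketch; see the full
listing for the exact code):
``` lang=c
/* Sketch: ask the pipeline about its buffered ranges, expressed in percent */
query = gst_query_new_buffering (GST_FORMAT_PERCENT);
result = gst_element_query (data->pipeline, query);
if (!result) {
  g_printerr ("Buffering query failed.\n");
  return TRUE; /* keep the refresh timer running */
}
```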
``` first-line: 94; theme: Default; brush: cpp; gutter: true
``` lang=c
if (gst_element_query_position (data->pipeline, &format, &position) &&
GST_CLOCK_TIME_IS_VALID (position) &&
gst_element_query_duration (data->pipeline, &format, &duration) &&
@ -402,7 +402,7 @@ depending on the buffering level. If it is below 100%, the code in the
an `X`. If the buffering level is 100%, the pipeline is in the
`PLAYING` state and we print a `>`.
``` first-line: 102; theme: Default; brush: cpp; gutter: true
``` lang=c
if (data->buffering_level < 100) {
g_print (" Buffering: %3d%%", data->buffering_level);
} else {
@ -415,7 +415,7 @@ information (and delete it otherwise).
#### Limiting the size of the downloaded file
``` first-line: 138; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Uncomment this line to limit the amount of downloaded data */
/* g_object_set (pipeline, "ring-buffer-max-size", (guint64)4000000, NULL); */
```

View file

@ -44,7 +44,7 @@ Copy this code into a text file named `playback-tutorial-5.c`.
**playback-tutorial-5.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <string.h>
#include <gst/gst.h>
#include <gst/interfaces/colorbalance.h>
@ -225,7 +225,7 @@ The `main()` function is fairly simple. A `playbin2` pipeline is
instantiated and set to run, and a keyboard watch is installed so
keystrokes can be monitored.
``` first-line: 45; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Output the current values of all Color Balance channels */
static void print_current_values (GstElement *pipeline) {
const GList *channels, *l;
@ -255,7 +255,7 @@ retrieve the current value.
In this example, the minimum and maximum values are used to output the
current value as a percentage.
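The percentage itself can be derived directly from the channel's range; a
minimal sketch, assuming a `GstColorBalanceChannel *channel` and the
`GstColorBalance *cb` handle used throughout the tutorial:
``` lang=c
/* Sketch: express the channel's current value as a percentage of its range */
gint value = gst_color_balance_get_value (cb, channel);
gdouble percent = 100.0 * (value - channel->min_value) /
    (channel->max_value - channel->min_value);
g_print ("%s: %3.0f%%\n", channel->label, percent);
```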
``` first-line: 10; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Process a color balance command */
static void update_color_channel (const gchar *channel_name, gboolean increase, GstColorBalance *cb) {
gdouble step;
@ -283,7 +283,7 @@ parsed looking for the channel with the specified name. Obviously, this
list could be parsed only once and the pointers to the channels be
stored and indexed by something more efficient than a string.
``` first-line: 30; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Change the channel's value */
step = 0.1 * (channel->max_value - channel->min_value);
value = gst_color_balance_get_value (cb, channel);

View file

@ -41,7 +41,7 @@ Copy this code into a text file named `playback-tutorial-6.c`.
**playback-tutorial-6.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
/* playbin2 flags */
@ -163,7 +163,7 @@ First off, we indicate `playbin2` that we want an audio visualization by
setting the `GST_PLAY_FLAG_VIS` flag. If the media already contains
video, this flag has no effect.
``` first-line: 66; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Set the visualization flag */
g_object_get (pipeline, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIS;
@ -175,7 +175,7 @@ If no visualization plugin is enforced by the user, `playbin2` will use
available). The rest of the tutorial shows how to list the available
visualization elements and make `playbin2` use a particular one.
``` first-line: 32; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Get a list of all visualization plugins */
list = gst_registry_feature_filter (gst_registry_get_default (), filter_vis_features, FALSE, NULL);
```
@ -185,7 +185,7 @@ GStreamer registry and selects those for which
the `filter_vis_features` function returns TRUE. This function selects
only the Visualization plugins:
``` first-line: 8; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Return TRUE if this is a Visualization element */
static gboolean filter_vis_features (GstPluginFeature *feature, gpointer data) {
GstElementFactory *factory;
@ -215,7 +215,7 @@ is a “string describing the type of element, as an unordered list
separated with slashes (/)”. Examples of classes are “Source/Network”,
“Codec/Decoder/Video”, “Codec/Encoder/Audio” or “Visualization”.
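In practice, the test inside `filter_vis_features` boils down to checking that
class string; a hedged sketch, assuming `factory` has already been obtained as
shown above:
``` lang=c
/* Sketch: keep only factories whose class string mentions "Visualization" */
const gchar *klass = gst_element_factory_get_klass (factory);
return (g_strrstr (klass, "Visualization") != NULL);
```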
``` first-line: 35; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Print their names */
g_print("Available visualization plugins:\n");
for (walk = list; walk != NULL; walk = g_list_next (walk)) {
@ -236,7 +236,7 @@ Once we have the list of Visualization plugins, we print their names
(`gst_element_factory_get_longname()`) and choose one (in this case,
GOOM).
``` first-line: 57; theme: Default; brush: cpp; gutter: true
``` lang=c
/* We have now selected a factory for the visualization element */
g_print ("Selected '%s'\n", gst_element_factory_get_longname (selected_factory));
vis_plugin = gst_element_factory_create (selected_factory, NULL);
@ -247,7 +247,7 @@ if (!vis_plugin)
The selected factory is used to instantiate an actual `GstElement` which
is then passed to `playbin2` through the `vis-plugin` property:
``` first-line: 71; theme: Default; brush: cpp; gutter: true
``` lang=c
/* set vis plugin for playbin2 */
g_object_set (pipeline, "vis-plugin", vis_plugin, NULL);
```

View file

@ -55,7 +55,7 @@ Copy this code into a text file named `playback-tutorial-7.c`.
**playback-tutorial7.c**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <gst/gst.h>
int main(int argc, char *argv[]) {
@ -139,7 +139,7 @@ int main(int argc, char *argv[]) {
# Walkthrough
``` first-line: 15; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Create the elements inside the sink bin */
equalizer = gst_element_factory_make ("equalizer-3bands", "equalizer");
convert = gst_element_factory_make ("audioconvert", "convert");
@ -155,7 +155,7 @@ All the Elements that compose our sink-bin are instantiated. We use an
between, because we are not sure of the capabilities of the audio sink
(since they are hardware-dependent).
``` first-line: 24; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Create the sink bin, add the elements and link them */
bin = gst_bin_new ("audio_sink_bin");
gst_bin_add_many (GST_BIN (bin), equalizer, convert, sink, NULL);
@ -165,7 +165,7 @@ gst_element_link_many (equalizer, convert, sink, NULL);
This adds the new Elements to the Bin and links them just as we would do
if this were a pipeline.
``` first-line: 28; theme: Default; brush: cpp; gutter: true
``` lang=c
pad = gst_element_get_static_pad (equalizer, "sink");
ghost_pad = gst_ghost_pad_new ("sink", pad);
gst_pad_set_active (ghost_pad, TRUE);
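/* (Sketch of the lines that follow in the full listing: attach the ghost pad
 * to the bin and release the reference to the static pad.) */
gst_element_add_pad (bin, ghost_pad);
gst_object_unref (pad);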
@ -194,7 +194,7 @@ with `gst_object_unref()`.
At this point, we have a functional sink-bin, which we can use as the
audio sink in `playbin2`. We just need to instruct `playbin2` to use it:
``` first-line: 38; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Set playbin2's audio sink to be our sink bin */
g_object_set (GST_OBJECT (pipeline), "audio-sink", bin, NULL);
```
@ -202,7 +202,7 @@ g_object_set (GST_OBJECT (pipeline), "audio-sink", bin, NULL);
It is as simple as setting the `audio-sink` property on `playbin2` to
the newly created sink.
``` first-line: 34; theme: Default; brush: cpp; gutter: true
``` lang=c
/* Configure the equalizer */
g_object_set (G_OBJECT (equalizer), "band1", (gdouble)-24.0, NULL);
g_object_set (G_OBJECT (equalizer), "band2", (gdouble)-24.0, NULL);

View file

@ -173,7 +173,7 @@ type. Therefore, the easiest way to make sure hardware acceleration is
enabled or disabled is by changing the rank of the associated element,
as shown in this code:
``` theme: Default; brush: cpp; gutter: true
``` lang=c
static void enable_factory (const gchar *name, gboolean enable) {
GstRegistry *registry = NULL;
GstElementFactory *factory = NULL;
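/* (Sketch of how the body typically continues: look the factory up in the
 * default registry and raise or lower its rank so auto-pluggers either pick
 * it up or skip it. The real implementation may differ in detail.) */
registry = gst_registry_get_default ();
factory = gst_element_factory_find (name);
if (!registry || !factory)
  return;
gst_plugin_feature_set_rank (GST_PLUGIN_FEATURE (factory),
    enable ? GST_RANK_PRIMARY + 1 : GST_RANK_NONE);
gst_registry_add_feature (registry, GST_PLUGIN_FEATURE (factory));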

View file

@ -65,13 +65,13 @@ with the g\[st\]\_\<class\> prefix removed and converted to camel case.
For example,
``` theme: Default; brush: cpp; gutter: false
``` lang=c
gboolean gst_caps_is_empty(const GstCaps *caps);
```
becomes:
``` theme: Default; brush: cpp; gutter: false
``` lang=c
namespace QGst {
class Caps {
bool isEmpty() const;
@ -104,7 +104,7 @@ to call `g_object_ref()`` and g_object_unref()`.
QtGStreamer provides access to the underlying C objects, in case you
need them. This is accessible with a simple cast:
``` theme: Default; brush: cpp; gutter: false
``` lang=c
ElementPtr qgstElement = QGst::ElementFactory::make("playbin2");
GstElement* gstElement = GST_ELEMENT(qgstElement);
```

View file

@ -24,7 +24,7 @@ First, the files. These are also available in the
**CMakeLists.txt**
``` theme: Default; brush: plain; gutter: true
```
project(qtgst-example-appsink-src)
find_package(QtGStreamer REQUIRED)
find_package(Qt4 REQUIRED)
@ -37,7 +37,7 @@ target_link_libraries(appsink-src ${QTGSTREAMER_UTILS_LIBRARIES} ${QT_QTCORE_LIB
**main.cpp**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
#include <iostream>
#include <QtCore/QCoreApplication>
#include <QGlib/Error>
@ -146,7 +146,7 @@ As this is a very simple example, most of the action happens in the
**GStreamer Initialization**
``` theme: Default; brush: cpp; gutter: false
``` lang=c
QGst::init(&argc, &argv);
```
@ -154,7 +154,7 @@ Now we can construct the first half of the pipeline:
**Pipeline Setup**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
const char *caps = "audio/x-raw-int,channels=1,rate=8000,"
"signed=(boolean)true,width=16,depth=16,endianness=1234";
 
@ -188,7 +188,7 @@ The second half of the pipeline is created similarly:
**Second Pipeline**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
/* sink pipeline */
QString pipe2Descr = QString("appsrc name=\"mysrc\" caps=\"%1\" ! autoaudiosink").arg(caps);
pipeline2 = QGst::Parse::launch(pipe2Descr).dynamicCast<QGst::Pipeline>();
@ -201,7 +201,7 @@ Finally, the pipeline is started:
**Starting the pipeline**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
/* start playing */
pipeline1->setState(QGst::StatePlaying);
pipeline2->setState(QGst::StatePlaying);
@ -214,7 +214,7 @@ ready for processing:
**MySink::newBuffer()**
``` theme: Default; brush: cpp; gutter: true
``` lang=c
virtual QGst::FlowReturn newBuffer()
{
m_src->pushBuffer(pullBuffer());
@ -227,7 +227,7 @@ Our implementation takes the new buffer and pushes it into the
**Player::Player()**
``` theme: Default; brush: cpp; gutter: false
``` lang=c
Player::Player(int argc, char **argv)
: QCoreApplication(argc, argv), m_sink(&m_src)
```

View file

@ -24,7 +24,7 @@ the Windows Installer functionality and offers a number of options to
suit your needs. You can review these options by
executing `msiexec` without parameters. For example:
``` theme: Default; brush: plain; gutter: false
```
msiexec /i gstreamer-sdk-2012.9-x86.msi
```
@ -40,7 +40,7 @@ installer to deploy to your applications folder (or a
subfolder). Again, use the `msiexec` parameters that suit you best. For
example:
``` theme: Default; brush: plain; gutter: false
```
msiexec /passive INSTALLDIR=C:\Desired\Folder /i gstreamer-sdk-2012.9-x86.msi
```

View file

@ -81,13 +81,13 @@ Add directories separated with ':' to the plugin search path
## Example
``` theme: Default; brush: plain; gutter: false
```
gst-inspect-0.10 audiotestsrc
```
should produce:
``` theme: Default; brush: plain; gutter: false
```
Factory Details:
Long name: Audio test source
Class: Source/Audio

View file

@ -24,7 +24,7 @@ The UI uses storyboards and contains a single `View` with a centered
**ViewController.h**
``` theme: Default; brush: plain; gutter: true
```
#import <UIKit/UIKit.h>
 
@interface ViewController : UIViewController {
@ -51,7 +51,7 @@ the [Android tutorials](Android%2Btutorials.html).
**GStreamerBackend.m**
``` theme: Default; brush: plain; gutter: true
```
#import "GStreamerBackend.h"
#include <gst/gst.h>
@ -85,7 +85,7 @@ GStreamer version to display at the label. That's it\!
**ViewController.m**
``` theme: Default; brush: plain; gutter: true
```
#import "ViewController.h"
#import "GStreamerBackend.h"

View file

@ -60,7 +60,7 @@ behalf:
**ViewController.m**
``` theme: Default; brush: plain; gutter: true
```
#import "ViewController.h"
#import "GStreamerBackend.h"
#import <UIKit/UIKit.h>
@ -130,7 +130,7 @@ behalf:
An instance of the `GStreamerBackend` is stored inside the class:
``` first-line: 5; theme: Default; brush: plain; gutter: true
```
@interface ViewController () {
GStreamerBackend *gst_backend;
}
@ -139,7 +139,7 @@ An instance of the `GStreamerBackend` in stored inside the class:
This instance is created in the `viewDidLoad` function through a custom
`init:` method in the `GStreamerBackend`:
``` first-line: 17; theme: Default; brush: plain; gutter: true
```
- (void)viewDidLoad
{
[super viewDidLoad];
@ -158,7 +158,7 @@ The Play and Pause buttons are also disabled in the
`viewDidLoad` function, and they are not re-enabled until the
`GStreamerBackend` reports that it is initialized and ready.
``` first-line: 33; theme: Default; brush: plain; gutter: true
```
/* Called when the Play button is pressed */
-(IBAction) play:(id)sender
{
@ -176,7 +176,7 @@ These two methods are called when the user presses the Play or Pause
buttons, and simply forward the call to the appropriate method in the
`GStreamerBackend`.
``` first-line: 49; theme: Default; brush: plain; gutter: true
```
-(void) gstreamerInitialized
{
dispatch_async(dispatch_get_main_queue(), ^{
@ -196,7 +196,7 @@ the
[dispatch\_async()](https://developer.apple.com/library/mac/documentation/Darwin/Reference/ManPages/man3/dispatch_async.3.html) call
wrapping all UI code.
``` first-line: 58; theme: Default; brush: plain; gutter: true
```
-(void) gstreamerSetUIMessage:(NSString *)message
{
dispatch_async(dispatch_get_main_queue(), ^{
@ -221,7 +221,7 @@ the `GStreamerBackendDelegate` protocol:
**GStreamerBackend.m**
``` theme: Default; brush: plain; gutter: true
```
#import "GStreamerBackend.h"
#include <gst/gst.h>
@ -404,7 +404,7 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
#### Interface methods:
``` first-line: 26; theme: Default; brush: plain; gutter: true
```
-(id) init:(id) uiDelegate
{
if (self = [super init])
@ -434,7 +434,7 @@ warns the application when interesting things happen.
threshold, so we can see the debug output from within Xcode and keep
track of our application progress.
``` first-line: 44; theme: Default; brush: plain; gutter: true
```
-(void) dealloc
{
if (pipeline) {
@ -449,7 +449,7 @@ track of our application progress.
The `dealloc` method takes care of bringing the pipeline to the NULL
state and releasing it.
``` first-line: 54; theme: Default; brush: plain; gutter: true
```
-(void) play
{
if(gst_element_set_state(pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
@ -470,7 +470,7 @@ desired state and warn the application if something fails.
#### Private methods:
``` first-line: 72; theme: Default; brush: plain; gutter: true
```
/* Change the message on the UI through the UI delegate */
-(void)setUIMessage:(gchar*) message
{
@ -488,7 +488,7 @@ into `NSString *` and displays them through the
implementation of this method is marked as `@optional`, and hence the
check for its existence in the delegate with `respondsToSelector:`
``` first-line: 82; theme: Default; brush: plain; gutter: true
```
/* Retrieve errors from the bus and show them on the UI */
static void error_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self)
{
@ -534,7 +534,7 @@ through the `userdata` pointer of the callbacks (the `self` pointer in
these implementations). This is discussed below when registering the
callbacks in the `app_function`.
``` first-line: 111; theme: Default; brush: plain; gutter: true
```
/* Check if all conditions are met to report GStreamer as initialized.
* These conditions will change depending on the application */
-(void) check_initialization_complete
@ -562,7 +562,7 @@ It exists with almost identical content in the Android tutorial, which
exemplifies how the same code can run on both platforms with little
change.
``` first-line: 134; theme: Default; brush: plain; gutter: true
```
/* Create our own GLib Main Context and make it the default one */
context = g_main_context_new ();
g_main_context_push_thread_default(context);
@ -574,7 +574,7 @@ libraries which might not have been properly disposed of. A new context
is created with `g_main_context_new()` and then it is made the default
one for the thread with `g_main_context_push_thread_default()`.
``` first-line: 138; theme: Default; brush: plain; gutter: true
```
/* Build pipeline */
pipeline = gst_parse_launch("audiotestsrc ! audioconvert ! audioresample ! autoaudiosink", &error);
if (error) {
@ -591,7 +591,7 @@ this case, it is simply an  `audiotestsrc` (which produces a continuous
tone) and an `autoaudiosink`, with accompanying adapter
elements.
``` first-line: 148; theme: Default; brush: plain; gutter: true
```
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
bus = gst_element_get_bus (pipeline);
bus_source = gst_bus_create_watch (bus);
@ -616,7 +616,7 @@ because it travels through C-land untouched. It re-emerges at the
different callbacks through the userdata pointer and is cast back to a
`GStreamerBackend *`.
``` first-line: 158; theme: Default; brush: plain; gutter: true
```
/* Create a GLib Main Loop and set it to run */
GST_DEBUG ("Entering main loop...");
main_loop = g_main_loop_new (context, FALSE);

View file

@ -41,7 +41,7 @@ outlets):
**ViewController.h**
``` theme: Default; brush: plain; gutter: true
```
#import <UIKit/UIKit.h>
#import "GStreamerBackendDelegate.h"
@ -73,7 +73,7 @@ behalf:
**ViewController.m**
``` theme: Default; brush: plain; gutter: true
```
#import "ViewController.h"
#import "GStreamerBackend.h"
#import <UIKit/UIKit.h>
@ -167,7 +167,7 @@ behalf:
We expand the class to remember the width and height of the media we are
currently playing:
``` first-line: 5; theme: Default; brush: plain; gutter: true
```
@interface ViewController () {
GStreamerBackend *gst_backend;
int media_width;
@ -179,7 +179,7 @@ In later tutorials this data is retrieved from the GStreamer pipeline,
but in this tutorial, for simplicity's sake, the width and height of the
media are constant and initialized in `viewDidLoad`:
``` first-line: 19; theme: Default; brush: plain; gutter: true
```
- (void)viewDidLoad
{
[super viewDidLoad];
@ -203,7 +203,7 @@ The rest of the `ViewController `code is the same as the previous
tutorial, except for the code that adapts the `video_view` size to the
media size, respecting its aspect ratio:
``` first-line: 51; theme: Default; brush: plain; gutter: true
```
- (void)viewDidLayoutSubviews
{
CGFloat view_width = video_container_view.bounds.size.width;
@ -250,7 +250,7 @@ the `GStreamerBackendDelegate` protocol:
**GStreamerBackend.m**
``` theme: Default; brush: plain; gutter: true
```
#import "GStreamerBackend.h"
#include <gst/gst.h>
@ -446,7 +446,7 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
The main differences with the previous tutorial are related to the
handling of the `XOverlay` interface:
``` first-line: 15; theme: Default; brush: plain; gutter: true
```
@implementation GStreamerBackend {
id ui_delegate; /* Class that we use to interact with the user interface */
GstElement *pipeline; /* The running pipeline */
@ -461,7 +461,7 @@ handling of the `XOverlay` interface:
The class is expanded to keep track of the video sink element in the
pipeline and the `UIView *` onto which rendering is to occur.
``` first-line: 29; theme: Default; brush: plain; gutter: true
```
-(id) init:(id) uiDelegate videoView:(UIView *)video_view
{
if (self = [super init])
@ -485,7 +485,7 @@ pipeline and the `UIView *` onto which rendering is to occur.
The constructor accepts the `UIView *` as a new parameter, which, at
this point, is simply remembered in `ui_video_view`.
``` first-line: 142; theme: Default; brush: plain; gutter: true
```
/* Build pipeline */
pipeline = gst_parse_launch("videotestsrc ! warptv ! ffmpegcolorspace ! autovideosink", &error);
```
@ -497,7 +497,7 @@ choses the appropriate sink for the platform (currently,
`eglglessink` is the only option for
iOS).
``` first-line: 152; theme: Default; brush: plain; gutter: true
```
/* Set the pipeline to READY, so it can already accept a window handle */
gst_element_set_state(pipeline, GST_STATE_READY);
@ -541,7 +541,7 @@ To this avail, we create the `EaglUIView` class, derived from
**EaglUIView.m**
``` theme: Default; brush: plain; gutter: true
```
#import "EaglUIVIew.h"
#import <QuartzCore/QuartzCore.h>

View file

@ -52,7 +52,7 @@ duration.
**VideoViewController.h**
``` theme: Default; brush: plain; gutter: true
```
#import <UIKit/UIKit.h>
#import "GStreamerBackendDelegate.h"
@ -101,7 +101,7 @@ this view is collapsed by default. Click here to expand…
**VideoViewController.m**
``` theme: Default; brush: plain; gutter: true
```
#import "VideoViewController.h"
#import "GStreamerBackend.h"
#import <UIKit/UIKit.h>
@ -311,7 +311,7 @@ because we will not offer the same functionalities. We keep track of
this in the `is_local_media` variable, which is set when the URI is set,
in the `gstreamerInitialized` method:
``` first-line: 154; theme: Default; brush: plain; gutter: true
```
-(void) gstreamerInitialized
{
dispatch_async(dispatch_get_main_queue(), ^{
@ -331,7 +331,7 @@ Every time the size of the media changes (which could happen mid-stream,
for some kinds of streams), or when it is first detected,
`GStreamerBackend` calls our `mediaSizeChanged()` callback:
``` first-line: 173; theme: Default; brush: plain; gutter: true
```
-(void) mediaSizeChanged:(NSInteger)width height:(NSInteger)height
{
media_width = width;
@ -370,7 +370,7 @@ call our `setCurrentPosition` method so we can update the position of
the thumb in the Seek Bar. Again we do so from the UI thread, using
`dispatch_async()`.
``` first-line: 184; theme: Default; brush: plain; gutter: true
```
-(void) setCurrentPosition:(NSInteger)position duration:(NSInteger)duration
{
/* Ignore messages from the pipeline if the time slider is being dragged */
@ -397,7 +397,7 @@ which we will use to display the current position and duration in
takes care of it, and must be called every time the Seek Bar is
updated:
``` first-line: 24; theme: Default; brush: plain; gutter: true
```
/* The text widget acts as a slave for the seek bar, so it reflects what the seek bar shows, whether
* it is an actual pipeline position or the position the user is currently dragging to. */
- (void) updateTimeWidget
@ -438,7 +438,7 @@ outlets are connected. We will be notified when the user starts dragging
the Slider, when the Slider position changes, and when the user releases
the Slider.
``` first-line: 112; theme: Default; brush: plain; gutter: true
```
/* Called when the user starts to drag the time slider */
- (IBAction)sliderTouchDown:(id)sender {
[gst_backend pause];
@ -452,7 +452,7 @@ do not want it to keep moving. We also mark that a drag operation is in
progress in the
`dragging_slider` variable.
``` first-line: 102; theme: Default; brush: plain; gutter: true
```
/* Called when the time slider position has changed, either because the user dragged it or
* we programmatically changed its position. dragging_slider tells us which one happened */
- (IBAction)sliderValueChanged:(id)sender {
@ -475,7 +475,7 @@ Otherwise, the seek operation will be performed when the thumb is
released, and the only thing we do here is update the textual time
widget.
``` first-line: 118; theme: Default; brush: plain; gutter: true
```
/* Called when the user stops dragging the time slider */
- (IBAction)sliderTouchUp:(id)sender {
dragging_slider = NO;
@ -509,7 +509,7 @@ this view is collapsed by default. Click here to expand…
**GStreamerBackend.m**
``` theme: Default; brush: plain; gutter: true
```
#import "GStreamerBackend.h"
#include <gst/gst.h>
@ -911,7 +911,7 @@ The UI code will call `setUri` whenever it wants to change the playing
URI (in this tutorial the URI never changes, but it does in the next
one):
``` first-line: 79; theme: Default; brush: plain; gutter: true
```
-(void) setUri:(NSString*)uri
{
const char *char_uri = [uri UTF8String];
@ -934,7 +934,7 @@ do not. Therefore, in the READY to PAUSED state change, once the Caps of
the decoded media are known, we inspect them
in `check_media_size()`:
``` first-line: 244; theme: Default; brush: plain; gutter: true
```
/* Retrieve the video sink's Caps and tell the application about the media size */
static void check_media_size (GStreamerBackend *self) {
GstElement *video_sink;
@ -988,7 +988,7 @@ To keep the UI updated, a GLib timer is installed in
the `app_function` that fires 4 times per second (or every 250ms),
right before entering the main loop:
``` first-line: 365; theme: Default; brush: plain; gutter: true
```
/* Register a function that GLib will call 4 times per second */
timeout_source = g_timeout_source_new (250);
g_source_set_callback (timeout_source, (GSourceFunc)refresh_ui, (__bridge void *)self, NULL);
@ -999,7 +999,7 @@ right before entering the main loop:
Then, in the refresh\_ui
method:
``` first-line: 120; theme: Default; brush: plain; gutter: true
```
/* If we have pipeline and it is running, query the current position and clip duration and inform
* the application */
static gboolean refresh_ui (GStreamerBackend *self) {
@ -1051,7 +1051,7 @@ see how to overcome these problems.
In `setPosition`:
``` first-line: 86; theme: Default; brush: plain; gutter: true
```
-(void) setPosition:(NSInteger)milliseconds
{
gint64 position = (gint64)(milliseconds * GST_MSECOND);
@ -1069,7 +1069,7 @@ away; otherwise, store the desired position in
the `desired_position` variable. Then, in
the `state_changed_cb()` callback:
``` first-line: 292; theme: Default; brush: plain; gutter: true
```
if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED)
{
check_media_size(self);
@ -1105,7 +1105,7 @@ once this period elapses.
To achieve this, all seek requests are routed through
the `execute_seek()` method:
``` first-line: 145; theme: Default; brush: plain; gutter: true
```
/* Perform seek, if we are not too close to the previous seek. Otherwise, schedule the seek for
* some time in the future. */
static void execute_seek (gint64 position, GStreamerBackend *self) {
@ -1176,7 +1176,7 @@ using buffering. The same procedure is used here, by listening to the
buffering
messages:
``` first-line: 361; theme: Default; brush: plain; gutter: true
```
g_signal_connect (G_OBJECT (bus), "message::buffering", (GCallback)buffering_cb, (__bridge void *)self);
```
@ -1186,7 +1186,7 @@ source):
 
``` first-line: 215; theme: Default; brush: plain; gutter: true
```
/* Called when buffering messages are received. We inform the UI about the current buffering level and
* keep the pipeline paused until 100% buffering is reached. At that point, set the desired state. */
static void buffering_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *self) {