# GStreamer SDK documentation : Playback tutorial 1: Playbin2 usage

# Goal

We have already worked with the `playbin2` element, which is capable of building a complete playback pipeline without much work on our side. This tutorial shows how to further customize `playbin2` in case its default values do not suit our particular needs. We will learn:

- How to find out how many streams a file contains, and how to switch among them.
- How to gather information regarding each stream.

As a side note, even though its name is `playbin2`, you can pronounce it “playbin”, since the original `playbin` element is deprecated and nobody should be using it.

# Introduction

More often than not, multiple audio, video and subtitle streams can be found embedded in a single file. The most common case is that of regular movies, which contain one video and one audio stream (stereo or 5.1 audio tracks are considered a single stream). It is also increasingly common to find movies with one video and multiple audio streams, to account for different languages. In this case, the user selects one audio stream, and the application plays only that one, ignoring the others.

To be able to select the appropriate stream, the user needs to know certain information about them, for example, their language. This information is embedded in the streams in the form of “metadata” (annexed data), and this tutorial shows how to retrieve it.

Subtitles can also be embedded in a file, along with audio and video, but they are dealt with in more detail in [Playback tutorial 2: Subtitle management](Playback%2Btutorial%2B2%253A%2BSubtitle%2Bmanagement.html). Finally, multiple video streams can also be found in a single file, for example, in DVDs with multiple angles of the same scene, but they are somewhat rare.
Embedding multiple streams inside a single file is called “multiplexing” or “muxing”, and such a file is then known as a “container”. Common container formats are Matroska (.mkv), Quicktime (.qt, .mov, .mp4), Ogg (.ogg) or Webm (.webm). Retrieving the individual streams from within the container is called “demultiplexing” or “demuxing”.
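To make the goals above concrete, here is a minimal sketch (not taken from the tutorial's own code) of how `playbin2` exposes the embedded streams through its `n-video`, `n-audio`, `n-text` and `current-audio` properties in GStreamer 0.10. The URI is only a placeholder and error handling is omitted:

```c
#include <gst/gst.h>

int main (int argc, char *argv[]) {
  GstElement *playbin2;
  gint n_video, n_audio, n_text;

  gst_init (&argc, &argv);

  playbin2 = gst_element_factory_make ("playbin2", "playbin2");
  /* Placeholder URI: replace with a real media location */
  g_object_set (playbin2, "uri", "http://example.com/media/movie.webm", NULL);

  /* The streams are only known once the pipeline has prerolled,
   * so move to PLAYING and wait for the state change to complete */
  gst_element_set_state (playbin2, GST_STATE_PLAYING);
  gst_element_get_state (playbin2, NULL, NULL, GST_CLOCK_TIME_NONE);

  /* Read the number of embedded video, audio and subtitle streams */
  g_object_get (playbin2, "n-video", &n_video, NULL);
  g_object_get (playbin2, "n-audio", &n_audio, NULL);
  g_object_get (playbin2, "n-text", &n_text, NULL);
  g_print ("%d video, %d audio, %d subtitle stream(s)\n",
      n_video, n_audio, n_text);

  /* Switch to the second audio stream, if there is one */
  if (n_audio > 1)
    g_object_set (playbin2, "current-audio", 1, NULL);

  /* ... normally a main loop would run here while the media plays ... */
  gst_element_set_state (playbin2, GST_STATE_NULL);
  gst_object_unref (playbin2);
  return 0;
}
```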
This rather unintuitive way of retrieving the tag list is called an Action Signal. Action signals are emitted by the application to a specific element, which then performs an action and returns a result. They behave like a dynamic function call, in which methods of a class are identified by their name (the signal's name) instead of their memory address. These signals are listed in the documentation along with the regular signals, and are tagged “Action”. See `playbin2`, for example.
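For illustration, here is a small sketch of what such an action-signal call looks like: it emits `playbin2`'s `get-audio-tags` signal for each audio stream and prints the language code, if present. The helper name `print_audio_languages` is hypothetical, and the element is assumed to already be in the PAUSED or PLAYING state:

```c
#include <gst/gst.h>

/* Hypothetical helper: print the language of every audio stream
 * found by an existing playbin2 element (already prerolled). */
static void
print_audio_languages (GstElement *playbin2)
{
  gint i, n_audio;

  g_object_get (playbin2, "n-audio", &n_audio, NULL);

  for (i = 0; i < n_audio; i++) {
    GstTagList *tags = NULL;
    gchar *str = NULL;

    /* The action signal: the application emits it on the element,
     * which performs the action and returns the tag list */
    g_signal_emit_by_name (playbin2, "get-audio-tags", i, &tags);

    if (tags) {
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
        g_print ("audio stream %d language: %s\n", i, str);
        g_free (str);
      }
      gst_tag_list_free (tags);
    }
  }
}
```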