# GStreamer SDK documentation : 2012.11 Brahmaputra

This page last changed on Nov 28, 2012 by slomo.

# Release – GStreamer SDK 2012.11 Brahmaputra

**2012-11-28 // [http://www.gstreamer.com](http://www.gstreamer.com/)**

This release is targeted at media playback applications for desktop
systems.

For more information about the GStreamer SDK and the latest versions
please visit [http://www.gstreamer.com](http://www.gstreamer.com/)

## System Requirements

The GStreamer SDK currently supports Microsoft Windows, Mac OS X,
different Linux distributions and Android.

Future releases of the GStreamer SDK will add support for iOS and
possibly other platforms.

### Linux

The supported Linux distributions are currently

- Ubuntu 12.04 (Precise Pangolin)
- Ubuntu 12.10 (Quantal Quetzal)
- Debian 6.0 (Squeeze)
- Debian 7.0 (Wheezy)
- Fedora 16
- Fedora 17

for x86 (32 bit) and x86-64 (64 bit).

Support for more Linux distributions will be added on demand later.

For installation instructions and development environment setup
instructions see [Installing on Linux](Installing%2Bon%2BLinux.html)

### Mac OS X

The supported Mac OS X versions are currently

- Snow Leopard (10.6)
- Lion (10.7)
- Mountain Lion (10.8)

for x86 (32 bit) and x86-64 (64 bit) with universal binaries.

For installation instructions and development environment setup
instructions see [Installing on Mac OS
X](Installing%2Bon%2BMac%2BOS%2BX.html)

### Microsoft Windows

The supported Windows versions are

- Windows XP
- Windows Vista
- Windows 7
- Windows 8

for x86 (32 bit) and x86-64 (64 bit).

Developing applications with the GStreamer SDK is supported with
the following development environments

- Microsoft Visual Studio 2010 or 2012 (including the free Visual C++
  Express edition)

  <http://www.microsoft.com/visualstudio/eng/products/visual-studio-overview>

- MinGW/MSYS

  [http://mingw.org](http://mingw.org/)

For installation instructions and development environment setup
instructions see [Installing on Windows](Installing%2Bon%2BWindows.html)

### Android

The supported Android versions are

- 2.3 (Gingerbread, API level 9/10)
- 3.1/3.2 (Honeycomb, API level 12/13)
- 4.0 (Ice Cream Sandwich, API level 15)
- 4.1/4.2 (Jelly Bean, API level 16)

Developing applications with the GStreamer SDK for Android is supported
from Linux, Mac OS X and Windows systems using the Android SDK and NDK.

For installation instructions and development environment setup
instructions see [Installing for Android
development](Installing%2Bfor%2BAndroid%2Bdevelopment.html).

## Changes since 2012.9 Amazon

- Support for Android platforms
- Support for Windows 8
- Support for 10-bit YUV color formats
- Improvements and bugfixes to the SDK build process on all platforms
- Lots of other, smaller bugfixes to GStreamer and other software
- Closed [bugreports](https://bugs.freedesktop.org/buglist.cgi?resolution=---&resolution=FIXED&query_format=advanced&bug_status=RESOLVED&bug_status=VERIFIED&bug_status=CLOSED&version=2012.11&product=GStreamer%20SDK&list_id=131136)

## Compatibility

The GStreamer SDK Brahmaputra is compatible with the 0.10 release series
of GStreamer and fully compatible with the GStreamer SDK Amazon release.

## Features

The GStreamer SDK Brahmaputra is targeted at media playback applications
for desktop systems. It contains the required components and plugins for
media playback.

- Local media playback, live streaming, progressive streaming and DVD
  playback
- Supported video codecs: Theora, VP8, Dirac, MJPEG, h.264\*,
  h.263\*, MPEG2\*, MPEG4\*, WMV/VC1\*, DV, ...
- Supported audio codecs: Vorbis, FLAC, Opus, Speex, WavPack,
  AAC\*, MP3\*, WMA\*, Dolby Digital (AC3)\*, DTS/DCA\*, AMR
  NB/WB\*, ...
- Supported container formats: Ogg, WebM, Matroska, MP4,
  QuickTime, AVI, FLV, 3GPP, WAV, DV, Real Media\*, ASF\*, MPEG
  PS/TS\*, ...
- Supported protocols: local files, HTTP, Shoutcast/Icecast, HLS,
  RTSP, RTP and MMS\*
- Application and GUI toolkit integration
- Automatic container/codec discovery
- Metadata extraction
- Subtitle support
- Audio visualization
- On-the-fly stream switching between different audio/subtitle streams
- Absolute position seeking, including remote seeking
- Fast/slow forward/reverse playback and frame stepping
- Automatic video deinterlacing, scaling and color balance post
  processing
- Compressed audio passthrough
- Clutter texture rendering

> \* May require additional licenses from third parties in some
> countries and is not installed by default with the GStreamer SDK.
> Properly licensed plugins can be obtained from different companies, or
> licenses can be obtained directly from the relevant licensors.

Although this release is targeted at playback applications only, it
also contains encoders for some codecs, muxers for some container
formats and some other plugins that are not strictly playback related.
These use cases are currently not officially supported by the GStreamer
SDK but will usually work, and they will be officially supported in
future releases of the GStreamer SDK.
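
As a quick check of the playback features listed above, a pipeline can
be launched from the command line. This is only a sketch:
`gst-launch-0.10` and the `playbin2` playback element are part of the
GStreamer 0.10 series this SDK ships, but the media URI below is a
placeholder you must replace with a real file.

```shell
# Play a local file, letting playbin2 discover the container and codecs
# automatically (replace the URI with an actual media file on your system)
gst-launch-0.10 playbin2 uri=file:///path/to/media.ogv
```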

The GStreamer SDK Brahmaputra contains the following major components,
some of them being optional or not used on some platforms.

- GLib 2.34.2
- GStreamer core and base 0.10.36
- GStreamer good plugins 0.10.31
- GStreamer bad plugins 0.10.23
- GStreamer ugly plugins 0.10.19
- GStreamer Python bindings 0.10.22\*
- GTK+ 2.24.11 and Python bindings\*
- clutter 1.8.4 and clutter-gst 1.6.0\*

> \* Not available on Android platforms.
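
The component versions listed above can be verified on an installed
system with pkg-config. A sketch, with two assumptions: that the SDK's
pkg-config metadata lives under `/opt/gstreamer-sdk` (its usual Linux
prefix) and that `pkg-config` is on your PATH; adjust the path for your
installation.

```shell
# Point pkg-config at the SDK's metadata (path is an assumption;
# adjust for your installation prefix)
export PKG_CONFIG_PATH=/opt/gstreamer-sdk/lib/pkgconfig

# Report the installed GStreamer 0.10 core version
pkg-config --modversion gstreamer-0.10

# Compiler and linker flags for building against the SDK
pkg-config --cflags --libs gstreamer-0.10
```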

## Known Issues

- Switching between different audio streams can take some time until
  the switch takes effect
- Using the native decoders (e.g. h.264) on OS X Lion (10.7) does not
  work currently
- [Other known
  issues](https://bugs.freedesktop.org/buglist.cgi?resolution=---&resolution=FIXED&query_format=advanced&bug_status=NEW&bug_status=ASSIGNED&bug_status=RESOLVED&bug_status=VERIFIED&bug_status=CLOSED&version=2012.11&product=GStreamer%20SDK&list_id=85256)

## Legal Information

### Installer, default installation

The installer (Microsoft Windows) and the default installation
(GNU/Linux) contain and install the minimal default installation. At
install time or later, the downloading of optional components is also
possible, but read on for certain legal cautions you might want to take.
All downloads are from
the [freedesktop.org](http://www.freedesktop.org/) website, for
registered/approved users only.

### Licensing of SDK

The GStreamer SDK minimal default installation only contains packages
which are licensed under the GNU LGPL license v.2.1. This license gives
you the freedom to use, modify and make copies of the software, either
in the original or in a modified form, provided that the software you
redistribute is licensed under the same licensing terms. This only
extends to the software itself and modified versions of it, but you are
free to link the LGPL software as a library used by other software under
whichever license. In other words, it is a weak copyleft license.

Therefore, it is possible to use the SDK to build applications that are
then distributed under a different license, including a proprietary one,
provided that reverse engineering is not prohibited for debugging and
modification purposes. Only the pieces of the SDK that are under the
LGPL need to be kept under the LGPL, and the corresponding source code
must be distributed along with the application (or an irrevocable offer
to do so for at least three years from distribution). Please consult
section 6 of the LGPL for further details as to what the corresponding
source code must contain. Some portions of the minimal default
installation may be under different licenses, which are both more
liberal than the LGPL (they have less strict conditions for granting the
license) and compatible with the LGPL. This is advised locally.

### Optional packages

There are two types of optional packages (GPL and patented), which are
under a different license or have other issues concerning patentability
(or both).

#### GPL code

Part of the optional packages are under the GNU GPL v.2 or v.3. This
means that you cannot link the GPL software into a program unless the
same program is also under the GPL, but you are invited to seek
competent advice on how this works in your precise case and design
choices. The GPL is called “strong copyleft” because the condition to
distribute under the same license has the largest possible scope and
extends to all derivative works.

#### Patents

Certain software, and in particular software that implements multimedia
standard formats such as MP3, MPEG 2 video and audio, h.264, MPEG 4
audio and video, AC3, etc., can have patent issues. In certain countries
patents are granted on software, and even software-only solutions are by
and large considered patentable and are patented (such as in the United
States). In certain others, patents on pure software solutions are
formally prohibited, but granted anyway (this is the case in Europe),
and in others again they are neither allowed nor granted.

It is up to you to determine whether, in the countries where the SDK is
used, where products are made using it and where products are
distributed, a license from the applicable patent holders is required.
Receiving the SDK – or links to other downloadable software – does not
provide any license, expressed or implied, over these patents, except in
very limited conditions where the license so provides. No representation
is made.

In certain cases, the optional packages are distributed only as source
code. It is up to the receiver to make sure that in the applicable
circumstances compiling the same code for a given platform or
distributing the object code is not an act that infringes one or more
patents.

### Software is as-is

All software and the entire SDK is provided as-is, without any
warranty whatsoever. The individual licenses have particular language
disclaiming liability: we invite you to read all of them. Should you
need a warranty that the software works as intended, or any kind of
indemnification, you have the option to subscribe to a software
maintenance agreement with a company or entity that is in that business.
Fluendo and Collabora, as well as some other companies, provide software
maintenance agreements under certain conditions; you are invited to
contact them to receive further details and discuss the commercial
terms.

## Contact

Web: [http://www.gstreamer.com](http://www.gstreamer.com/)

Documentation: [http://docs.gstreamer.com](http://docs.gstreamer.com/)

Commercial support: <http://gstreamer.com/contact>

Bug tracker: <https://bugs.freedesktop.org/enter_bug.cgi?product=GStreamer%20SDK>

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : 2012.5 Amazon

This page last changed on Jun 15, 2012 by slomo.

# Release – GStreamer SDK 2012.5 Amazon

**2012-06-07 // <http://www.gstreamer.com>**

This release is targeted at media playback applications for desktop
systems.

For more information about the GStreamer SDK and the latest versions
please visit <http://www.gstreamer.com>

## System Requirements

The GStreamer SDK currently supports Microsoft Windows, Mac OS X and
different Linux distributions. Future releases of the GStreamer SDK will
add support for Android, iOS and possibly other platforms.

### Linux

The supported Linux distributions are currently

- Ubuntu 11.10 (Oneiric Ocelot)
- Ubuntu 12.04 (Precise Pangolin)
- Debian 6.0 (Squeeze)
- Fedora 16
- Fedora 17

for x86 (32 bit) and x86-64 (64 bit).

Support for more Linux distributions will be added on demand later.

For installation instructions and development environment setup
instructions see [Installing on Linux](Installing%2Bon%2BLinux.html)

### Mac OS X

The supported Mac OS X versions are currently

- Snow Leopard (10.6)
- Lion (10.7)
- Mountain Lion (10.8) (experimental)

for x86 (32 bit) and x86-64 (64 bit).

For installation instructions and development environment setup
instructions see [Installing on Mac OS
X](Installing%2Bon%2BMac%2BOS%2BX.html)

### Microsoft Windows

The supported Windows versions are

- Windows XP
- Windows Vista
- Windows 7

for x86 (32 bit) and x86-64 (64 bit).

Developing applications with the GStreamer SDK is supported with
the following development environments

- Microsoft Visual Studio 2010 (including the free Visual C++ Express
  edition)

  <http://www.microsoft.com/visualstudio/en-us/products/2010-editions>

- MinGW/MSYS

  [http://mingw.org](http://mingw.org/)

For installation instructions and development environment setup
instructions see [Installing on Windows](Installing%2Bon%2BWindows.html)

## Compatibility

The GStreamer SDK Amazon is compatible with the 0.10 release series of
GStreamer.

## Features

The GStreamer SDK Amazon is targeted at media playback applications for
desktop systems. It contains the required components and plugins for
media playback.

- Local media playback, live streaming, progressive streaming and DVD
  playback
- Supported video codecs: Theora, VP8, Dirac, MJPEG, h.264\*,
  h.263\*, MPEG2\*, MPEG4\*, WMV/VC1\*, ...
- Supported audio codecs: Vorbis, FLAC, Speex, WavPack, AAC\*,
  MP3\*, WMA\*, Dolby Digital (AC3)\*, DTS/DCA\*, AMR NB/WB\*, ...
- Supported container formats: Ogg, WebM, Matroska, MP4,
  QuickTime, AVI, FLV, 3GPP, WAV, Real Media\*, ASF\*, MPEG
  PS/TS\*, ...
- Supported protocols: local files, HTTP, Shoutcast/Icecast, HLS,
  RTSP, RTP and MMS\*
- Application and GUI toolkit integration
- Automatic container/codec discovery
- Metadata extraction
- Subtitle support
- Audio visualization
- On-the-fly stream switching between different audio/subtitle streams
- Absolute position seeking, including remote seeking
- Fast/slow forward/reverse playback and frame stepping
- Automatic video deinterlacing, scaling and color balance post
  processing
- Compressed audio passthrough
- Clutter texture rendering

> \* May require additional licenses from third parties in some
> countries and is not installed by default with the GStreamer SDK.
> Properly licensed plugins can be obtained from different companies, or
> licenses can be obtained directly from the relevant licensors.

Although this release is targeted at playback applications only, it
also contains encoders for some codecs, muxers for some container
formats and some other plugins that are not strictly playback related.
These use cases are currently not officially supported by the GStreamer
SDK but will usually work, and they will be officially supported in
future releases of the GStreamer SDK.

The GStreamer SDK Amazon contains the following major components, some
of them being optional or not used on some platforms.

- GLib 2.32.1
- GStreamer core and base 0.10.36
- GStreamer good plugins 0.10.31
- GStreamer bad plugins 0.10.23
- GStreamer ugly plugins 0.10.19
- GStreamer Python bindings 0.10.22
- GTK+ 2.24.10 and Python bindings
- clutter 1.8.4 and clutter-gst 1.4.6

## Known Issues

- Switching between different audio streams can take some time until
  the switch takes effect
- Using the native decoders (e.g. h.264) on OS X Lion (10.7) does not
  work currently
- [Other known
  issues](https://bugs.freedesktop.org/buglist.cgi?resolution=---&resolution=FIXED&query_format=advanced&bug_status=NEW&bug_status=ASSIGNED&bug_status=RESOLVED&bug_status=VERIFIED&bug_status=CLOSED&version=2012.5&product=GStreamer%20SDK&list_id=85256)

## Legal Information

### Installer, default installation

The installer (Microsoft Windows) and the default installation
(GNU/Linux) contain and install the minimal default installation. At
install time or later, the downloading of optional components is also
possible, but read on for certain legal cautions you might want to take.
All downloads are from the [freedesktop.org](http://www.freedesktop.org)
website, for registered/approved users only.

### Licensing of SDK

The GStreamer SDK minimal default installation only contains packages
which are licensed under the GNU LGPL license v.2.1. This license gives
you the freedom to use, modify and make copies of the software, either
in the original or in a modified form, provided that the software you
redistribute is licensed under the same licensing terms. This only
extends to the software itself and modified versions of it, but you are
free to link the LGPL software as a library used by other software under
whichever license. In other words, it is a weak copyleft license.

Therefore, it is possible to use the SDK to build applications that are
then distributed under a different license, including a proprietary one,
provided that reverse engineering is not prohibited for debugging and
modification purposes. Only the pieces of the SDK that are under the
LGPL need to be kept under the LGPL, and the corresponding source code
must be distributed along with the application (or an irrevocable offer
to do so for at least three years from distribution). Please consult
section 6 of the LGPL for further details as to what the corresponding
source code must contain. Some portions of the minimal default
installation may be under different licenses, which are both more
liberal than the LGPL (they have less strict conditions for granting the
license) and compatible with the LGPL. This is advised locally.

### Optional packages

There are two types of optional packages (GPL and patented), which are
under a different license or have other issues concerning patentability
(or both).

#### GPL code

Part of the optional packages are under the GNU GPL v.2 or v.3. This
means that you cannot link the GPL software into a program unless the
same program is also under the GPL, but you are invited to seek
competent advice on how this works in your precise case and design
choices. The GPL is called “strong copyleft” because the condition to
distribute under the same license has the largest possible scope and
extends to all derivative works.

#### Patents

Certain software, and in particular software that implements multimedia
standard formats such as MP3, MPEG 2 video and audio, h.264, MPEG 4
audio and video, AC3, etc., can have patent issues. In certain countries
patents are granted on software, and even software-only solutions are by
and large considered patentable and are patented (such as in the United
States). In certain others, patents on pure software solutions are
formally prohibited, but granted anyway (this is the case in Europe),
and in others again they are neither allowed nor granted.

It is up to you to determine whether, in the countries where the SDK is
used, where products are made using it and where products are
distributed, a license from the applicable patent holders is required.
Receiving the SDK – or links to other downloadable software – does not
provide any license, expressed or implied, over these patents, except in
very limited conditions where the license so provides. No representation
is made.

In certain cases, the optional packages are distributed only as source
code. It is up to the receiver to make sure that in the applicable
circumstances compiling the same code for a given platform or
distributing the object code is not an act that infringes one or more
patents.

### Software is as-is

All software and the entire SDK is provided as-is, without any
warranty whatsoever. The individual licenses have particular language
disclaiming liability: we invite you to read all of them. Should you
need a warranty that the software works as intended, or any kind of
indemnification, you have the option to subscribe to a software
maintenance agreement with a company or entity that is in that business.
Fluendo and Collabora, as well as some other companies, provide software
maintenance agreements under certain conditions; you are invited to
contact them to receive further details and discuss the commercial
terms.

## Contact

Web: <http://www.gstreamer.com>

Documentation: <http://docs.gstreamer.com>

Commercial support: <http://gstreamer.com/contact>

Bug tracker: <https://bugs.freedesktop.org/enter_bug.cgi?product=GStreamer%20SDK>

Document generated by Confluence on Oct 08, 2015 10:28
# GStreamer SDK documentation : 2012.7 Amazon (Bugfix Release 1)
|
||||
|
||||
This page last changed on Jul 11, 2012 by slomo.
|
||||
|
||||
# Release – GStreamer SDK 2012.7 Amazon – Bugfix Release 1
|
||||
|
||||
**2012-07-06 // [http://www.gstreamer.com](http://www.gstreamer.com/)**
|
||||
|
||||
This release is targeted at media playback applications for desktop
|
||||
systems.
|
||||
|
||||
For more information about the GStreamer SDK and the latest versions
|
||||
please visit [http://www.gstreamer.com](http://www.gstreamer.com/)
|
||||
|
||||
## System Requirements
|
||||
|
||||
The GStreamer SDK currently supports Microsoft Windows, Mac OS X and
|
||||
different Linux distributions. Future releases of the GStreamer SDK will
|
||||
add support for Android, iOS and possibly other platforms.
|
||||
|
||||
### Linux
|
||||
|
||||
The supported Linux distributions are currently
|
||||
|
||||
- Ubuntu 11.10 (Oneiric Ocelot)
|
||||
- Ubuntu 12.04 (Precise Pangolin)
|
||||
- Debian 6.0 (Squeeze)
|
||||
- Fedora 16
|
||||
- Fedora 17
|
||||
|
||||
for x86 (32 bit) and x86-64 (64 bit).
|
||||
|
||||
Support for more Linux distributions will be added on demand later.
|
||||
|
||||
For installation instructions and development environment setup
|
||||
instructions see [Installing on Linux](Installing%2Bon%2BLinux.html)
|
||||
|
||||
### Mac OS X
|
||||
|
||||
The supported Mac OS X versions are currently
|
||||
|
||||
- Snow Leopard (10.6)
|
||||
- Lion (10.7)
|
||||
- Mountain Lion (10.8) (experimental)
|
||||
|
||||
for x86 (32 bit) and x86-64 (64 bit).
|
||||
|
||||
For installation instructions and development environment setup
|
||||
instructions see [Installing on Mac OS
|
||||
X](Installing%2Bon%2BMac%2BOS%2BX.html)
|
||||
|
||||
### Microsoft Windows
|
||||
|
||||
The supported Windows versions are
|
||||
|
||||
- Windows XP
|
||||
- Windows Vista
|
||||
- Windows 7
|
||||
|
||||
for x86 (32 bit) and x86-64 (64 bit).
|
||||
|
||||
Developing applications with the GStreamer SDK is supported with
|
||||
the following development environments
|
||||
|
||||
- Microsoft Visual Studio 2010 (including the free Visual C++ Express
|
||||
edition)
|
||||
|
||||
<http://www.microsoft.com/visualstudio/en-us/products/2010-editions>
|
||||
|
||||
- MinGW/MSYS
|
||||
|
||||
[http://mingw.org](http://mingw.org/)
|
||||
|
||||
For installation instructions and development environment setup
|
||||
instructions see [Installing on Windows](Installing%2Bon%2BWindows.html)
|
||||
|
||||
## Changes since 2012.5 Amazon
|
||||
|
||||
- The audio sink on OSX (osxaudiosink) supports compressed audio
|
||||
passthrough (SPDIF) and takes the user's speaker configuration into
|
||||
account
|
||||
- The Windows Direct3D video sink (d3dvideosink) got many bugfixes for
|
||||
memory/resource leaks fixes
|
||||
- clutter-gst was updated to 1.6.0 to correctly work with the Fluendo
|
||||
VAAPI plugins
|
||||
- Support for static linking of GStreamer was improved
|
||||
- Various improvements to the Mac OS X framework
|
||||
- Many new tutorials were added
|
||||
- Lots of other, smaller bugfixes to GStreamer and other software
|
||||
- Closed
|
||||
[bugreports](https://bugs.freedesktop.org/buglist.cgi?resolution=---&resolution=FIXED&query_format=advanced&bug_status=RESOLVED&bug_status=VERIFIED&bug_status=CLOSED&version=2012.5&product=GStreamer%20SDK&list_id=85256)
|
||||
|
||||
## Compatibility
|
||||
|
||||
The GStreamer SDK Amazon (Bugfix Release 1) is compatible with the 0.10
|
||||
release series of GStreamer and fully compatible with GStreamer SDK
|
||||
Amazon Release.
|
||||
|
||||
## Features
|
||||
|
||||
The GStreamer SDK Amazon is targeted at media playback applications for
|
||||
desktop systems. It contains the required components and plugins for
|
||||
media playback.
|
||||
|
||||
- Local media playback, live streaming, progressive streaming and DVD
|
||||
playback
|
||||
- Supported video codecs: Theora, VP8, Dirac, MJPEG, h.264\*,
|
||||
h.263\*, MPEG2\*, MPEG4\*, WMV/VC1\*, ...
|
||||
- Supported audio codecs: Vorbis, FLAC, Speex, WavPack, AAC\*,
|
||||
MP3\*, WMA\*, Dolby Digital (AC3)\*, DTS/DCA\*, AMR NB/WB\*, ...
|
||||
- Supported container formats: Ogg, WebM, Matroska, MP4,
|
||||
Quicktime, AVI, FLV, 3GPP, WAV, Real Media\*, ASF\*, MPEG
|
||||
PS/TS\*, ...
|
||||
- Supported protocols: local files, HTTP, Shoutcast/Icecast, HLS,
|
||||
RTSP, RTP and MMS\*
|
||||
- Application and GUI toolkit integration
|
||||
- Automatic container/codecs discovery
|
||||
- Metadata extraction
|
||||
- Subtitle support
|
||||
- Audio visualization
|
||||
- On the fly stream switching between different audio/subtitle streams
|
||||
- Absolute position seeking, including remote seeking
|
||||
- Fast/slow forward/reverse playback and frame stepping
|
||||
- Automatic video deinterlacing, scaling and color balance post
|
||||
processing
|
||||
- Compressed audio passthrough
|
||||
- Clutter texture rendering
|
||||
|
||||
> \* May require additional licenses from third parties in some
|
||||
> countries and not installed by default with the GStreamer SDK.
|
||||
> Properly licensed plugins can be obtained from different companies or
|
||||
> licenses can be directly obtained from the relevant licensors.
|
||||
|
||||
Although this release is targeted at playback applications only it
|
||||
also contains encoders for some codecs, muxers for some container
|
||||
formats and some other plugins that are not strictly playback related.
|
||||
These use-cases are currently not officially supported by the GStreamer
|
||||
SDK but will usually work and will be officially supported in future
|
||||
releases of the GStreamer SDK.
|
||||
|
||||
|
||||
|
||||
The GStreamer SDK Amazon contains the following major components, some
|
||||
of them being optional or not used on some platforms.
|
||||
|
||||
- GLib 2.32.1
|
||||
- GStreamer core and base 0.10.36
|
||||
- GStreamer good plugins 0.10.31
|
||||
- GStreamer bad plugins 0.10.23
|
||||
- GStreamer ugly plugins 0.10.19
|
||||
- GStreamer Python bindings 0.10.22
|
||||
- GTK+ 2.24.10 and Python bindings
|
||||
- clutter 1.8.4 and clutter-gst 1.6.0
|
||||
|
||||
## Known Issues
|
||||
|
||||
- Switching between different audio streams can take some time until
|
||||
the switch takes effect
|
||||
- Using the native decoders (e.g. h.264) on OS X Lion (10.7) does not
|
||||
work currently
|
||||
- [Other known
|
||||
issues](https://bugs.freedesktop.org/buglist.cgi?resolution=---&resolution=FIXED&query_format=advanced&bug_status=NEW&bug_status=ASSIGNED&bug_status=RESOLVED&bug_status=VERIFIED&bug_status=CLOSED&version=2012.7&product=GStreamer%20SDK&list_id=85256)
|
||||
|
||||
## Legal Information
|
||||
|
||||
### Installer, default installation
|
||||
|
||||
The installer (Microsoft Windows) and the default installation
|
||||
(GNU/Linux) contain and install the minimal default installation. At
|
||||
install time or later, the downloading of optional components is also
|
||||
possible, but read on for certain legal cautions you might want to take.
|
||||
All downloads are from
|
||||
the [freedesktop.org](http://www.freedesktop.org/) website, for
|
||||
registered/approved users only.
|
||||
|
||||
### Licensing of SDK

The GStreamer SDK minimal default installation only contains packages
which are licensed under the GNU LGPL license v.2.1. This license gives
you the freedom to use, modify and make copies of the software, either
in its original or in a modified form, provided that the software you
redistribute is licensed under the same licensing terms. This only
extends to the software itself and modified versions of it; you remain
free to link the LGPL software as a library into other software under
whichever license. In other words, it is a weak copyleft license.

Therefore, it is possible to use the SDK to build applications that are
then distributed under a different license, including a proprietary one,
provided that reverse engineering for the purpose of debugging such
modifications is not prohibited. Only the pieces of the SDK that are
under the LGPL need to be kept under the LGPL, and the corresponding
source code must be distributed along with the application (or an
irrevocable offer to do so for at least three years from distribution).
Please consult section 6 of the LGPL for further details as to what the
corresponding source code must contain. Some portions of the minimal
default installation may be under different licenses, which are both
more liberal than the LGPL (their conditions for granting the license
are less strict) and compatible with the LGPL. Where this applies, it is
indicated locally.

### Optional packages

There are two types of optional packages (GPL and Patented), which are
under a different license or have other issues concerning patentability
(or both).

#### GPL code

Part of the optional packages are under the GNU GPL v.2 or v.3. This
means that you cannot link the GPL software into a program unless the
same program is also under the GPL, but you are invited to seek
competent advice on how this works in your precise case and design
choices. The GPL is called a “strong copyleft” license because the
condition to distribute under the same license has the largest possible
scope and extends to all derivative works.

#### Patents

Certain software, and in particular software that implements multimedia
standard formats such as MP3, MPEG 2 video and audio, h.264, MPEG 4
audio and video, AC3, etc., can have patent issues. In certain countries
patents are granted on software, and even software-only solutions are by
and large considered patentable and are patented (such as in the United
States). In certain others, patents on pure software solutions are
formally prohibited, but nevertheless granted (this is the case in
Europe), and in others again they are neither allowed nor granted.

It is up to you to determine whether, in the countries where the SDK is
used, where products are made using it and where those products are
distributed, a license from the applicable patent holders is required.
Receiving the SDK – or links to other downloadable software – does not
provide any license, expressed or implied, over these patents, except in
very limited conditions where the license so provides. No representation
is made.

In certain cases, the optional packages are distributed only as source
code. It is up to the receiver to make sure that in the applicable
circumstances compiling the same code for a given platform or
distributing the object code is not an act that infringes one or more
patents.

### Software is as-is

All software and the entire SDK is provided as-is, without any
warranty whatsoever. The individual licenses have particular language
disclaiming liability: we invite you to read all of them. Should you
need a warranty on the fact that the software works as intended, or any
kind of indemnification, you have the option to subscribe to a software
maintenance agreement with a company or entity that is in that business.
Fluendo and Collabora, as well as some other companies, provide software
maintenance agreements under certain conditions; you are invited to
contact them in order to receive further details and discuss the
commercial terms.

## Contact

Web: [http://www.gstreamer.com](http://www.gstreamer.com/)

Documentation: <http://docs.gstreamer.com>

Commercial support: <http://gstreamer.com/contact>

Bug tracker: <https://bugs.freedesktop.org/enter_bug.cgi?product=GStreamer%20SDK>

Document generated by Confluence on Oct 08, 2015 10:28

# GStreamer SDK documentation : 2012.9 Amazon (Bugfix Release 2)

This page last changed on Sep 18, 2012 by ylatuya.

# Release – GStreamer SDK 2012.9 Amazon – Bugfix Release 2

**2012-09-18 // [http://www.gstreamer.com](http://www.gstreamer.com/)**

This release is targeted at media playback applications for desktop
systems.

For more information about the GStreamer SDK and the latest versions
please visit [http://www.gstreamer.com](http://www.gstreamer.com/)

## System Requirements

The GStreamer SDK currently supports Microsoft Windows, Mac OS X and
different Linux distributions. Future releases of the GStreamer SDK will
add support for Android, iOS and possibly other platforms.

### Linux

The supported Linux distributions are currently

- Ubuntu 11.10 (Oneiric Ocelot)
- Ubuntu 12.04 (Precise Pangolin)
- Debian 6.0 (Squeeze)
- Fedora 16
- Fedora 17

for x86 (32 bit) and x86-64 (64 bit).

Support for more Linux distributions will be added on demand later.

For installation instructions and development environment setup
instructions see [Installing on Linux](Installing%2Bon%2BLinux.html)

### Mac OS X

The supported Mac OS X versions are currently

- Snow Leopard (10.6)
- Lion (10.7)
- Mountain Lion (10.8) (experimental)

for x86 (32 bit) and x86-64 (64 bit) with universal binaries.

For installation instructions and development environment setup
instructions see [Installing on Mac OS
X](Installing%2Bon%2BMac%2BOS%2BX.html)

### Microsoft Windows

The supported Windows versions are

- Windows XP
- Windows Vista
- Windows 7

for x86 (32 bit) and x86-64 (64 bit).

Developing applications with the GStreamer SDK is supported with
the following development environments

- Microsoft Visual Studio 2010 (including the free Visual C++ Express
  edition)
  <http://www.microsoft.com/visualstudio/en-us/products/2010-editions>
- MinGW/MSYS
  [http://mingw.org](http://mingw.org/)

For installation instructions and development environment setup
instructions see [Installing on Windows](Installing%2Bon%2BWindows.html)

## Changes since 2012.7 Amazon

- Universal binaries were added for Mac OS X
- A new gstreamer-editing package was added with GES and GNonlin
- A new gstreamer-capture package was added with several source
  plugins
- A scaling bug of the video window was fixed for GTK+ applications on
  Mac OS X
- The Python bindings for Windows now include the missing
  libpyglib-2.0-python.pyd file
- Lots of other, smaller bugfixes to GStreamer and other software
- Closed
  [bugreports](https://bugs.freedesktop.org/buglist.cgi?resolution=---&resolution=FIXED&query_format=advanced&bug_status=RESOLVED&bug_status=VERIFIED&bug_status=CLOSED&version=2012.7&product=GStreamer%20SDK&list_id=131136)

## Compatibility

The GStreamer SDK Amazon (Bugfix Release 2) is compatible with the 0.10
release series of GStreamer and fully compatible with GStreamer SDK
Amazon Release.

## Features

The GStreamer SDK Amazon is targeted at media playback applications for
desktop systems. It contains the required components and plugins for
media playback.

- Local media playback, live streaming, progressive streaming and DVD
  playback
- Supported video codecs: Theora, VP8, Dirac, MJPEG, h.264\*,
  h.263\*, MPEG2\*, MPEG4\*, WMV/VC1\*, ...
- Supported audio codecs: Vorbis, FLAC, Speex, WavPack, AAC\*,
  MP3\*, WMA\*, Dolby Digital (AC3)\*, DTS/DCA\*, AMR NB/WB\*, ...
- Supported container formats: Ogg, WebM, Matroska, MP4,
  Quicktime, AVI, FLV, 3GPP, WAV, Real Media\*, ASF\*, MPEG
  PS/TS\*, ...
- Supported protocols: local files, HTTP, Shoutcast/Icecast, HLS,
  RTSP, RTP and MMS\*
- Application and GUI toolkit integration
- Automatic container/codecs discovery
- Metadata extraction
- Subtitle support
- Audio visualization
- On the fly stream switching between different audio/subtitle streams
- Absolute position seeking, including remote seeking
- Fast/slow forward/reverse playback and frame stepping
- Automatic video deinterlacing, scaling and color balance post
  processing
- Compressed audio passthrough
- Clutter texture rendering

> \* May require additional licenses from third parties in some
> countries and not installed by default with the GStreamer SDK.
> Properly licensed plugins can be obtained from different companies or
> licenses can be directly obtained from the relevant licensors.

Although this release is targeted at playback applications only, it
also contains encoders for some codecs, muxers for some container
formats and some other plugins that are not strictly playback related.
These use cases are currently not officially supported by the GStreamer
SDK but will usually work and will be officially supported in future
releases of the GStreamer SDK.

The GStreamer SDK Amazon contains the following major components, some
of them being optional or not used on some platforms.

- GLib 2.32.1
- GStreamer core and base 0.10.36
- GStreamer good plugins 0.10.31
- GStreamer bad plugins 0.10.23
- GStreamer ugly plugins 0.10.19
- GStreamer Python bindings 0.10.22
- GTK+ 2.24.10 and Python bindings
- clutter 1.8.4 and clutter-gst 1.6.0

## Known Issues

- Switching between different audio streams can take some time until
  the switch takes effect
- Using the native decoders (e.g. h.264) on OS X Lion (10.7) does not
  work currently
- [Other known
  issues](https://bugs.freedesktop.org/buglist.cgi?resolution=---&resolution=FIXED&query_format=advanced&bug_status=NEW&bug_status=ASSIGNED&bug_status=RESOLVED&bug_status=VERIFIED&bug_status=CLOSED&version=2012.7&product=GStreamer%20SDK&list_id=85256)

## Legal Information

### Installer, default installation

The installer (Microsoft Windows) and the default installation
(GNU/Linux) contain and install the minimal default installation. At
install time or later, the downloading of optional components is also
possible, but read on for certain legal cautions you might want to take.
All downloads are from
the [freedesktop.org](http://www.freedesktop.org/) website, for
registered/approved users only.

### Licensing of SDK

The GStreamer SDK minimal default installation only contains packages
which are licensed under the GNU LGPL license v.2.1. This license gives
you the freedom to use, modify and make copies of the software, either
in its original or in a modified form, provided that the software you
redistribute is licensed under the same licensing terms. This only
extends to the software itself and modified versions of it; you remain
free to link the LGPL software as a library into other software under
whichever license. In other words, it is a weak copyleft license.

Therefore, it is possible to use the SDK to build applications that are
then distributed under a different license, including a proprietary one,
provided that reverse engineering for the purpose of debugging such
modifications is not prohibited. Only the pieces of the SDK that are
under the LGPL need to be kept under the LGPL, and the corresponding
source code must be distributed along with the application (or an
irrevocable offer to do so for at least three years from distribution).
Please consult section 6 of the LGPL for further details as to what the
corresponding source code must contain. Some portions of the minimal
default installation may be under different licenses, which are both
more liberal than the LGPL (their conditions for granting the license
are less strict) and compatible with the LGPL. Where this applies, it is
indicated locally.

### Optional packages

There are two types of optional packages (GPL and Patented), which are
under a different license or have other issues concerning patentability
(or both).

#### GPL code

Part of the optional packages are under the GNU GPL v.2 or v.3. This
means that you cannot link the GPL software into a program unless the
same program is also under the GPL, but you are invited to seek
competent advice on how this works in your precise case and design
choices. The GPL is called a “strong copyleft” license because the
condition to distribute under the same license has the largest possible
scope and extends to all derivative works.

#### Patents

Certain software, and in particular software that implements multimedia
standard formats such as MP3, MPEG 2 video and audio, h.264, MPEG 4
audio and video, AC3, etc., can have patent issues. In certain countries
patents are granted on software, and even software-only solutions are by
and large considered patentable and are patented (such as in the United
States). In certain others, patents on pure software solutions are
formally prohibited, but nevertheless granted (this is the case in
Europe), and in others again they are neither allowed nor granted.

It is up to you to determine whether, in the countries where the SDK is
used, where products are made using it and where those products are
distributed, a license from the applicable patent holders is required.
Receiving the SDK – or links to other downloadable software – does not
provide any license, expressed or implied, over these patents, except in
very limited conditions where the license so provides. No representation
is made.

In certain cases, the optional packages are distributed only as source
code. It is up to the receiver to make sure that in the applicable
circumstances compiling the same code for a given platform or
distributing the object code is not an act that infringes one or more
patents.

### Software is as-is

All software and the entire SDK is provided as-is, without any
warranty whatsoever. The individual licenses have particular language
disclaiming liability: we invite you to read all of them. Should you
need a warranty on the fact that the software works as intended, or any
kind of indemnification, you have the option to subscribe to a software
maintenance agreement with a company or entity that is in that business.
Fluendo and Collabora, as well as some other companies, provide software
maintenance agreements under certain conditions; you are invited to
contact them in order to receive further details and discuss the
commercial terms.

## Contact

Web: [http://www.gstreamer.com](http://www.gstreamer.com/)

Documentation: <http://docs.gstreamer.com>

Commercial support: <http://gstreamer.com/contact>

Bug tracker: <https://bugs.freedesktop.org/enter_bug.cgi?product=GStreamer%20SDK>

# GStreamer SDK documentation : 2013.6 Congo

This page last changed on Jun 11, 2013 by ylatuya.

# Release – GStreamer SDK 2013.6 Congo

**2013-06-12 // [http://www.gstreamer.com](http://www.gstreamer.com/)**

This release is targeted at media playback applications for desktop and
mobile systems.

For more information about the GStreamer SDK and the latest versions
please visit [http://www.gstreamer.com](http://www.gstreamer.com/)

## System Requirements

The GStreamer SDK currently supports Microsoft Windows, Mac OS X,
different Linux distributions, Android and iOS. Future releases of the
GStreamer SDK may add support for other platforms.

### Linux

The supported Linux distributions are currently

- Ubuntu 12.04 (Precise Pangolin)
- Ubuntu 12.10 (Quantal Quetzal)
- Ubuntu 13.04 (Raring Ringtail)
- Debian 6.0 (Squeeze)
- Debian 7.0 (Wheezy)
- Fedora 17
- Fedora 18

for x86 (32 bit) and x86-64 (64 bit).

Support for more Linux distributions will be added on demand later.

For installation instructions and development environment setup
instructions see [Installing on Linux](Installing%2Bon%2BLinux.html)

### Mac OS X

The supported Mac OS X versions are currently

- Snow Leopard (10.6)
- Lion (10.7)
- Mountain Lion (10.8)

for x86 (32 bit) and x86-64 (64 bit) with universal binaries.

For installation instructions and development environment setup
instructions see [Installing on Mac OS
X](Installing%2Bon%2BMac%2BOS%2BX.html)

### Microsoft Windows

The supported Windows versions are

- Windows XP
- Windows Vista
- Windows 7
- Windows 8

for x86 (32 bit) and x86-64 (64 bit).

Developing applications with the GStreamer SDK is supported with
the following development environments

- Microsoft Visual Studio 2010 or 2012 (including the free Visual C++
  Express edition)
  <http://www.microsoft.com/visualstudio/eng/products/visual-studio-overview>
- MinGW/MSYS
  [http://mingw.org](http://mingw.org/)

For installation instructions and development environment setup
instructions see [Installing on Windows](Installing%2Bon%2BWindows.html)

### Android

The supported Android versions are

- 2.3 (Gingerbread, API level 9/10)
- 3.1/3.2 (Honeycomb, API level 12/13)
- 4.0 (Ice Cream Sandwich, API level 15)
- 4.1/4.2 (Jelly Bean, API level 16/17)

for ARM.

Developing applications with the GStreamer SDK for Android is supported
from Linux, Mac OS X and Windows systems using the Android SDK and NDK.

For installation instructions and development environment setup
instructions see [Installing for Android
development](Installing%2Bfor%2BAndroid%2Bdevelopment.html).

### iOS

The supported iOS versions are 6.0 and newer for ARM and x86 (iOS
simulator).

Developing applications with the GStreamer SDK for iOS is supported from
Mac OS X only and requires a recent Xcode version.

For installation instructions and development environment setup
instructions see [Installing for iOS
development](Installing%2Bfor%2BiOS%2Bdevelopment.html).

## Changes since 2012.11 Brahmaputra

- Support for iOS platforms
- Support for Ubuntu 13.04 and Fedora 18
- Support for Android NDK r8e and newer
- Update to gcc 4.7.3 and use MSVC lib.exe for generating .lib files
  for the Windows builds
- The system audio/video codecs on Mac OS X 10.8 can be used from
  GStreamer now
- Several RTP/RTSP and MPEG TS improvements
- Fixed audio capture on Windows
- Improvements and bugfixes to the SDK build process on all platforms
- Lots of other, smaller bugfixes to GStreamer and other software
- Closed [bugreports](https://bugs.freedesktop.org/buglist.cgi?list_id=310239&resolution=FIXED&chfieldto=2013-06-12&query_format=advanced&chfield=resolution&chfieldfrom=2012-11-28&chfieldvalue=FIXED&bug_status=RESOLVED&bug_status=VERIFIED&bug_status=CLOSED&product=GStreamer%20SDK)

## Compatibility

The GStreamer SDK Congo is compatible with the 0.10 release series of
GStreamer and fully compatible with GStreamer SDK Amazon and Brahmaputra
Releases.

## Features

The GStreamer SDK Congo is targeted at media playback applications for
desktop and mobile systems. It contains the required components
and plugins for media playback.

- Local media playback, live streaming, progressive streaming and DVD
  playback
- Supported video codecs: Theora, VP8, Dirac, MJPEG, JPEG2000,
  h.264\*, h.263\*, MPEG2\*, MPEG4\*, WMV/VC1\*, DV, ...
- Supported audio codecs: Vorbis, FLAC, Opus, Speex, WavPack,
  AAC\*, MP3\*, WMA\*, Dolby Digital (AC3)\*, DTS/DCA\*, AMR
  NB/WB\*, ...
- Supported container formats: Ogg, WebM, Matroska, MP4,
  Quicktime, AVI, FLV, 3GPP, WAV, DV, Real Media\*, ASF\*, MPEG
  PS/TS\*, ...
- Supported protocols: local files, HTTP, Shoutcast/Icecast, HLS,
  RTSP, RTP and MMS\*
- Application and GUI toolkit integration
- Automatic container/codecs discovery
- Metadata extraction
- Subtitle support
- Audio visualization
- On the fly stream switching between different audio/subtitle streams
- Absolute position seeking, including remote seeking
- Fast/slow forward/reverse playback and frame stepping
- Automatic video deinterlacing, scaling and color balance post
  processing
- Compressed audio passthrough
- Clutter texture rendering

> \* May require additional licenses from third parties in some
> countries and not installed by default with the GStreamer SDK.
> Properly licensed plugins can be obtained from different companies or
> licenses can be directly obtained from the relevant licensors.

Although this release is targeted at playback applications only, it
also contains encoders for some codecs, muxers for some container
formats and some other plugins that are not strictly playback related.
These use cases are currently not officially supported by the GStreamer
SDK but will usually work and will be officially supported in future
releases of the GStreamer SDK.

The GStreamer SDK Congo contains the following major components, some of
them being optional or not used on some platforms.

- GLib 2.36.1
- GStreamer core and base 0.10.36
- GStreamer good plugins 0.10.31
- GStreamer bad plugins 0.10.23
- GStreamer ugly plugins 0.10.19
- GStreamer Python bindings 0.10.22\*
- GTK+ 2.24.11 and Python bindings\*
- clutter 1.8.4 and clutter-gst 1.6.0\*

> \* Not available on Android and iOS platforms.

## Known Issues

- Switching between different audio streams can take some time until
  the switch takes effect
- [Other known
  issues](https://bugs.freedesktop.org/buglist.cgi?resolution=---&resolution=FIXED&query_format=advanced&bug_status=NEW&bug_status=ASSIGNED&bug_status=RESOLVED&bug_status=VERIFIED&bug_status=CLOSED&version=2013.6&product=GStreamer%20SDK&list_id=85256)

## Legal Information

### Installer, default installation

The installer (Microsoft Windows) and the default installation
(GNU/Linux) contain and install the minimal default installation. At
install time or later, the downloading of optional components is also
possible, but read on for certain legal cautions you might want to take.
All downloads are from
the [freedesktop.org](http://www.freedesktop.org/) website, for
registered/approved users only.

### Licensing of SDK

The GStreamer SDK minimal default installation only contains packages
which are licensed under the GNU LGPL license v.2.1. This license gives
you the freedom to use, modify and make copies of the software, either
in its original or in a modified form, provided that the software you
redistribute is licensed under the same licensing terms. This only
extends to the software itself and modified versions of it; you remain
free to link the LGPL software as a library into other software under
whichever license. In other words, it is a weak copyleft license.

Therefore, it is possible to use the SDK to build applications that are
then distributed under a different license, including a proprietary one,
provided that reverse engineering for the purpose of debugging such
modifications is not prohibited. Only the pieces of the SDK that are
under the LGPL need to be kept under the LGPL, and the corresponding
source code must be distributed along with the application (or an
irrevocable offer to do so for at least three years from distribution).
Please consult section 6 of the LGPL for further details as to what the
corresponding source code must contain. Some portions of the minimal
default installation may be under different licenses, which are both
more liberal than the LGPL (their conditions for granting the license
are less strict) and compatible with the LGPL. Where this applies, it is
indicated locally.

### Optional packages

There are two types of optional packages (GPL and Patented), which are
under a different license or have other issues concerning patentability
(or both).

#### GPL code

Part of the optional packages are under the GNU GPL v.2 or v.3. This
means that you cannot link the GPL software into a program unless the
same program is also under the GPL, but you are invited to seek
competent advice on how this works in your precise case and design
choices. The GPL is called a “strong copyleft” license because the
condition to distribute under the same license has the largest possible
scope and extends to all derivative works.

#### Patents

Certain software, and in particular software that implements multimedia
standard formats such as MP3, MPEG 2 video and audio, h.264, MPEG 4
audio and video, AC3, etc., can have patent issues. In certain countries
patents are granted on software, and even software-only solutions are by
and large considered patentable and are patented (such as in the United
States). In certain others, patents on pure software solutions are
formally prohibited, but nevertheless granted (this is the case in
Europe), and in others again they are neither allowed nor granted.

It is up to you to determine whether, in the countries where the SDK is
used, where products are made using it and where those products are
distributed, a license from the applicable patent holders is required.
Receiving the SDK – or links to other downloadable software – does not
provide any license, expressed or implied, over these patents, except in
very limited conditions where the license so provides. No representation
is made.

In certain cases, the optional packages are distributed only as source
code. It is up to the receiver to make sure that in the applicable
circumstances compiling the same code for a given platform or
distributing the object code is not an act that infringes one or more
patents.

### Software is as-is

All software and the entire SDK is provided as-is, without any
warranty whatsoever. The individual licenses have particular language
disclaiming liability: we invite you to read all of them. Should you
need a warranty on the fact that the software works as intended, or any
kind of indemnification, you have the option to subscribe to a software
maintenance agreement with a company or entity that is in that business.
Fluendo and Collabora, as well as some other companies, provide software
maintenance agreements under certain conditions; you are invited to
contact them in order to receive further details and discuss the
commercial terms.

## Contact

Web: [http://www.gstreamer.com](http://www.gstreamer.com/)

Documentation: [http://docs.gstreamer.com](http://docs.gstreamer.com/)

Commercial support: <http://gstreamer.com/contact>

Bug tracker: <https://bugs.freedesktop.org/enter_bug.cgi?product=GStreamer%20SDK>

Document generated by Confluence on Oct 08, 2015 10:27

# GStreamer SDK documentation : Android tutorial 1: Link against GStreamer

This page last changed on May 02, 2013 by xartigas.

# Goal

This first Android tutorial is extremely simple: it just retrieves the
GStreamer version and displays it on the screen. It exemplifies how to
access GStreamer C code from Java and verifies that there have been no
linkage problems.

# Hello GStreamer \[Java code\]

In the `share/gst-sdk/tutorials` folder of your GStreamer SDK
installation path you should find an `android-tutorial-1` directory,
with the usual Android NDK structure: a `src` folder for the Java code,
a `jni` folder for the C code and a `res` folder for UI resources.

We recommend that you open this project in Eclipse (as explained
in [Installing for Android
development](Installing%2Bfor%2BAndroid%2Bdevelopment.html)) so you can
easily see how all the pieces fit together.

Let’s first introduce the Java code, then the C code and finally the
makefile that allows GStreamer integration.

**src/com/gst\_sdk\_tutorials/tutorial\_1/Tutorial1.java**

``` java
package com.gst_sdk_tutorials.tutorial_1;

import android.app.Activity;
import android.os.Bundle;
import android.widget.TextView;
import android.widget.Toast;

import com.gstreamer.GStreamer;

public class Tutorial1 extends Activity {
    private native String nativeGetGStreamerInfo();

    // Called when the activity is first created.
    @Override
    public void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);

        try {
            GStreamer.init(this);
        } catch (Exception e) {
            Toast.makeText(this, e.getMessage(), Toast.LENGTH_LONG).show();
            finish();
            return;
        }

        setContentView(R.layout.main);

        TextView tv = (TextView)findViewById(R.id.textview_info);
        tv.setText("Welcome to " + nativeGetGStreamerInfo() + " !");
    }

    static {
        System.loadLibrary("gstreamer_android");
        System.loadLibrary("tutorial-1");
    }
}
```

Calls from Java to C happen through native methods, like the one
declared here:

``` first-line: 11; theme: Default; brush: java; gutter: true
private native String nativeGetGStreamerInfo();
```

This tells Java that there exists a method with this signature somewhere
so it compiles happily. It is your responsibility to ensure that, **at
runtime**, this method is accessible. This is accomplished by the C code
shown later.

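If the implementation is missing or never registered, the failure surfaces only when the method is first called, not at compile time. This stand-alone sketch (a hypothetical class, not part of the tutorial sources) shows what happens then:

```java
// Demonstrates that a declared native method which is never registered
// (no library loaded, no RegisterNatives call) fails only at call time,
// with an UnsatisfiedLinkError. Hypothetical demo class.
public class NativeLinkDemo {
    private static native String missingNative();

    public static void main(String[] args) {
        try {
            missingNative(); // no implementation was ever provided
        } catch (UnsatisfiedLinkError e) {
            System.out.println("not registered: UnsatisfiedLinkError");
        }
    }
}
```

This is the error you would see in a real application if `System.loadLibrary()` succeeded but the library did not register the expected method.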
The first bit of code that gets actually executed is the static
initializer of the class:

``` first-line: 33; theme: Default; brush: java; gutter: true
static {
    System.loadLibrary("gstreamer_android");
    System.loadLibrary("tutorial-1");
}
```

It loads `libgstreamer_android.so`, which contains all GStreamer
methods, and `libtutorial-1.so`, which contains the C part of this
tutorial, explained below.

Upon loading, each of these libraries’ `JNI_OnLoad()` method is
executed. It basically registers the native methods that these libraries
expose. The GStreamer library only exposes an `init()` method, which
initializes GStreamer and registers all plugins (the tutorial library is
explained later below).

``` first-line: 19; theme: Default; brush: java; gutter: true
try {
    GStreamer.init(this);
} catch (Exception e) {
    Toast.makeText(this, e.getMessage(), Toast.LENGTH_LONG).show();
    finish();
    return;
}
```

Next, in the `onCreate()` method of the
[Activity](http://developer.android.com/reference/android/app/Activity.html)
we actually initialize GStreamer by calling `GStreamer.init()`. This
method requires a
[Context](http://developer.android.com/reference/android/content/Context.html),
so it cannot be called from the static initializer, but there is no
danger in calling it multiple times, as all but the first call will be
ignored.

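Such an idempotent guard can be sketched as follows (a hypothetical, minimal version; the real `com.gstreamer.GStreamer` class ships with the SDK and additionally calls into native code):

```java
// Minimal sketch of an init() that ignores all but the first call.
// The "synchronized" keyword makes the guard safe across threads.
public class InitGuardDemo {
    private static boolean initialized = false;

    public static synchronized void init(Object context) {
        if (initialized) return; // all but the first call are ignored
        System.out.println("initializing");
        initialized = true;
    }

    public static void main(String[] args) {
        init(null);
        init(null); // no-op
        init(null); // no-op
    }
}
```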
Should initialization fail, the `init()` method would throw an
[Exception](http://developer.android.com/reference/java/lang/Exception.html)
with the details provided by the GStreamer library.

``` first-line: 29; theme: Default; brush: java; gutter: true
TextView tv = (TextView)findViewById(R.id.textview_info);
tv.setText("Welcome to " + nativeGetGStreamerInfo() + " !");
```

Then, the native method `nativeGetGStreamerInfo()` is called and a
string is retrieved, which is used to format the content of the
[TextView](http://developer.android.com/reference/android/widget/TextView.html)
in the UI.

This finishes the UI part of this tutorial. Let’s take a look at the C
code:

# Hello GStreamer \[C code\]

**jni/tutorial-1.c**

``` theme: Default; brush: cpp; gutter: true
#include <string.h>
#include <jni.h>
#include <android/log.h>
#include <gst/gst.h>

/*
 * Java Bindings
 */
jstring gst_native_get_gstreamer_info (JNIEnv* env, jobject thiz) {
  char *version_utf8 = gst_version_string();
  jstring version_jstring = (*env)->NewStringUTF(env, version_utf8);
  g_free (version_utf8);
  return version_jstring;
}

static JNINativeMethod native_methods[] = {
  { "nativeGetGStreamerInfo", "()Ljava/lang/String;", (void *) gst_native_get_gstreamer_info}
};

jint JNI_OnLoad(JavaVM *vm, void *reserved) {
  JNIEnv *env = NULL;

  if ((*vm)->GetEnv(vm, (void**) &env, JNI_VERSION_1_4) != JNI_OK) {
    __android_log_print (ANDROID_LOG_ERROR, "tutorial-1", "Could not retrieve JNIEnv");
    return 0;
  }
  jclass klass = (*env)->FindClass (env, "com/gst_sdk_tutorials/tutorial_1/Tutorial1");
  (*env)->RegisterNatives (env, klass, native_methods, G_N_ELEMENTS(native_methods));

  return JNI_VERSION_1_4;
}
```

The `JNI_OnLoad()` method is executed every time the Java Virtual
Machine (VM) loads a library.

Here, we retrieve the JNI environment needed to make calls that interact
with Java:

``` first-line: 21; theme: Default; brush: cpp; gutter: true
JNIEnv *env = NULL;

if ((*vm)->GetEnv(vm, (void**) &env, JNI_VERSION_1_4) != JNI_OK) {
  __android_log_print (ANDROID_LOG_ERROR, "tutorial-1", "Could not retrieve JNIEnv");
  return 0;
}
```

And then locate the class containing the UI part of this tutorial using
`FindClass()`:

``` first-line: 27; theme: Default; brush: cpp; gutter: true
jclass klass = (*env)->FindClass (env, "com/gst_sdk_tutorials/tutorial_1/Tutorial1");
```

Finally, we register our native methods with `RegisterNatives()`; that
is, we provide the code for the methods we advertised in Java using the
**`native`** keyword:

``` first-line: 28; theme: Default; brush: cpp; gutter: true
(*env)->RegisterNatives (env, klass, native_methods, G_N_ELEMENTS(native_methods));
```

The `native_methods` array describes each of the methods to register
(only one in this tutorial). For each method, it provides its Java
name, its [type
signature](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/types.html#wp276)
and a pointer to the C function implementing it:

``` first-line: 16; theme: Default; brush: cpp; gutter: true
static JNINativeMethod native_methods[] = {
  { "nativeGetGStreamerInfo", "()Ljava/lang/String;", (void *) gst_native_get_gstreamer_info}
};
```

The only native method used in this tutorial
is `nativeGetGStreamerInfo()`:

``` first-line: 9; theme: Default; brush: cpp; gutter: true
jstring gst_native_get_gstreamer_info (JNIEnv* env, jobject thiz) {
  char *version_utf8 = gst_version_string();
  jstring version_jstring = (*env)->NewStringUTF(env, version_utf8);
  g_free (version_utf8);
  return version_jstring;
}
```

It simply calls `gst_version_string()` to obtain a string describing
this version of GStreamer. This [Modified
UTF8](http://en.wikipedia.org/wiki/UTF-8#Modified_UTF-8) string is then
converted to [UTF16](http://en.wikipedia.org/wiki/UTF-16) by
`NewStringUTF()` as required by Java and returned. Java will be
responsible for freeing the memory used by the new UTF16 String, but we
must free the `char *` returned by `gst_version_string()`.

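The difference between standard UTF-8 and the Modified UTF-8 that `NewStringUTF()` expects is easy to see from Java itself: U+0000 is encoded as the two bytes `0xC0 0x80` instead of a single `0x00`. `DataOutputStream.writeUTF()` also uses Modified UTF-8, so it makes a convenient stand-alone demonstration (this class is illustrative and not part of the tutorial):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

// Compare standard UTF-8 and Modified UTF-8 encodings of U+0000.
public class ModifiedUtf8Demo {
    public static void main(String[] args) throws IOException {
        byte[] standard = "\u0000".getBytes(StandardCharsets.UTF_8);
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new DataOutputStream(bos).writeUTF("\u0000"); // Modified UTF-8
        byte[] modified = bos.toByteArray(); // 2-byte length prefix, then data
        System.out.printf("standard: %d byte(s), first 0x%02X%n",
                standard.length, standard[0]);
        System.out.printf("modified: 0x%02X 0x%02X%n",
                modified[2] & 0xFF, modified[3] & 0xFF);
    }
}
```

Strings produced by GLib, like the one from `gst_version_string()`, never contain embedded NULs, so passing them to `NewStringUTF()` is safe in practice.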
# Hello GStreamer \[Android.mk\]

**jni/Android.mk**

``` theme: Default; brush: ruby; gutter: true
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_MODULE    := tutorial-1
LOCAL_SRC_FILES := tutorial-1.c
LOCAL_SHARED_LIBRARIES := gstreamer_android
LOCAL_LDLIBS := -llog
include $(BUILD_SHARED_LIBRARY)

ifndef GSTREAMER_SDK_ROOT
ifndef GSTREAMER_SDK_ROOT_ANDROID
$(error GSTREAMER_SDK_ROOT_ANDROID is not defined!)
endif
GSTREAMER_SDK_ROOT := $(GSTREAMER_SDK_ROOT_ANDROID)
endif
GSTREAMER_NDK_BUILD_PATH := $(GSTREAMER_SDK_ROOT)/share/gst-android/ndk-build/
GSTREAMER_PLUGINS := coreelements
include $(GSTREAMER_NDK_BUILD_PATH)/gstreamer.mk
```

This is a barebones makefile for a project with GStreamer support. It
simply states that it depends on the `libgstreamer_android.so` library
(line 7) and requires the `coreelements` plugin (line 18). More complex
applications will probably add more libraries and plugins
to `Android.mk`.

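For instance, a hypothetical application that also plays Ogg/Vorbis files might extend line 18 to pull in extra plugins (the plugin names below are illustrative; consult the plugin list shipped with your SDK version for the exact names):

```makefile
GSTREAMER_PLUGINS := coreelements ogg vorbis audioconvert autodetect
```

The rest of the makefile stays unchanged: `gstreamer.mk` takes care of linking the selected plugins into `libgstreamer_android.so`.
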
# Conclusion

This ends the first Android tutorial. It has shown that, besides the
interconnection between Java and C (which abides by the standard JNI
procedure), adding GStreamer support to an Android application is no
more complicated than adding it to a desktop application.

The following tutorials detail the few places in which care has to be
taken when developing specifically for the Android platform.

As usual, it has been a pleasure having you here, and see you soon!

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Android tutorial 3: Video

This page last changed on Nov 05, 2012 by xartigas.

# Goal

![](attachments/thumbnails/2687065/2654413)

Except for [Basic tutorial 5: GUI toolkit
integration](Basic%2Btutorial%2B5%253A%2BGUI%2Btoolkit%2Bintegration.html),
which embedded a video window on a GTK application, all tutorials so far
relied on GStreamer video sinks to create a window to display their
contents. The video sink on Android is not capable of creating its own
window, so a drawing surface always needs to be provided. This tutorial
shows:

- How to allocate a drawing surface on the Android layout and pass it
  to GStreamer
- How to keep GStreamer posted on changes to the surface

# Introduction

Since Android does not provide a windowing system, a GStreamer video
sink cannot create pop-up windows as it would do on a Desktop platform.
Fortunately, the `XOverlay` interface allows providing video sinks with
an already created window onto which they can draw, as we have seen in
[Basic tutorial 5: GUI toolkit
integration](Basic%2Btutorial%2B5%253A%2BGUI%2Btoolkit%2Bintegration.html).

In this tutorial, a
[SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html)
widget (actually, a subclass of it) is placed on the main layout. When
Android informs the application that a surface has been created for this
widget, we pass it to the C code, which stores it. The
`check_initialization_complete()` method explained in the previous
tutorial is extended so that GStreamer is not considered initialized
until a main loop is running and a drawing surface has been received.

# A video surface on Android \[Java code\]

**src/com/gst\_sdk\_tutorials/tutorial\_3/Tutorial3.java**

``` theme: Default; brush: java; gutter: true
package com.gst_sdk_tutorials.tutorial_3;

import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.ImageButton;
import android.widget.TextView;
import android.widget.Toast;

import com.gstreamer.GStreamer;

public class Tutorial3 extends Activity implements SurfaceHolder.Callback {
    private native void nativeInit();     // Initialize native code, build pipeline, etc
    private native void nativeFinalize(); // Destroy pipeline and shutdown native code
    private native void nativePlay();     // Set pipeline to PLAYING
    private native void nativePause();    // Set pipeline to PAUSED
    private static native boolean nativeClassInit(); // Initialize native class: cache Method IDs for callbacks
    private native void nativeSurfaceInit(Object surface);
    private native void nativeSurfaceFinalize();
    private long native_custom_data;      // Native code will use this to keep private data

    private boolean is_playing_desired;   // Whether the user asked to go to PLAYING

    // Called when the activity is first created.
    @Override
    public void onCreate(Bundle savedInstanceState)
    {
        super.onCreate(savedInstanceState);

        // Initialize GStreamer and warn if it fails
        try {
            GStreamer.init(this);
        } catch (Exception e) {
            Toast.makeText(this, e.getMessage(), Toast.LENGTH_LONG).show();
            finish();
            return;
        }

        setContentView(R.layout.main);

        ImageButton play = (ImageButton) this.findViewById(R.id.button_play);
        play.setOnClickListener(new OnClickListener() {
            public void onClick(View v) {
                is_playing_desired = true;
                nativePlay();
            }
        });

        ImageButton pause = (ImageButton) this.findViewById(R.id.button_stop);
        pause.setOnClickListener(new OnClickListener() {
            public void onClick(View v) {
                is_playing_desired = false;
                nativePause();
            }
        });

        SurfaceView sv = (SurfaceView) this.findViewById(R.id.surface_video);
        SurfaceHolder sh = sv.getHolder();
        sh.addCallback(this);

        if (savedInstanceState != null) {
            is_playing_desired = savedInstanceState.getBoolean("playing");
            Log.i ("GStreamer", "Activity created. Saved state is playing:" + is_playing_desired);
        } else {
            is_playing_desired = false;
            Log.i ("GStreamer", "Activity created. There is no saved state, playing: false");
        }

        // Start with disabled buttons, until native code is initialized
        this.findViewById(R.id.button_play).setEnabled(false);
        this.findViewById(R.id.button_stop).setEnabled(false);

        nativeInit();
    }

    protected void onSaveInstanceState (Bundle outState) {
        Log.d ("GStreamer", "Saving state, playing:" + is_playing_desired);
        outState.putBoolean("playing", is_playing_desired);
    }

    protected void onDestroy() {
        nativeFinalize();
        super.onDestroy();
    }

    // Called from native code. This sets the content of the TextView from the UI thread.
    private void setMessage(final String message) {
        final TextView tv = (TextView) this.findViewById(R.id.textview_message);
        runOnUiThread (new Runnable() {
            public void run() {
                tv.setText(message);
            }
        });
    }

    // Called from native code. Native code calls this once it has created its pipeline and
    // the main loop is running, so it is ready to accept commands.
    private void onGStreamerInitialized () {
        Log.i ("GStreamer", "Gst initialized. Restoring state, playing:" + is_playing_desired);
        // Restore previous playing state
        if (is_playing_desired) {
            nativePlay();
        } else {
            nativePause();
        }

        // Re-enable buttons, now that GStreamer is initialized
        final Activity activity = this;
        runOnUiThread(new Runnable() {
            public void run() {
                activity.findViewById(R.id.button_play).setEnabled(true);
                activity.findViewById(R.id.button_stop).setEnabled(true);
            }
        });
    }

    static {
        System.loadLibrary("gstreamer_android");
        System.loadLibrary("tutorial-3");
        nativeClassInit();
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int width,
            int height) {
        Log.d("GStreamer", "Surface changed to format " + format + " width "
                + width + " height " + height);
        nativeSurfaceInit (holder.getSurface());
    }

    public void surfaceCreated(SurfaceHolder holder) {
        Log.d("GStreamer", "Surface created: " + holder.getSurface());
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.d("GStreamer", "Surface destroyed");
        nativeSurfaceFinalize ();
    }

}
```

This tutorial continues where the previous one left off, adding a video
surface to the layout and changing the GStreamer pipeline to produce
video instead of audio. Only the parts of the code that are new will be
discussed.

``` first-line: 22; theme: Default; brush: java; gutter: true
private native void nativeSurfaceInit(Object surface);
private native void nativeSurfaceFinalize();
```

Two new entry points to the C code are defined,
`nativeSurfaceInit()` and `nativeSurfaceFinalize()`, which we will call
when the video surface becomes available and when it is about to be
destroyed, respectively.

``` first-line: 61; theme: Default; brush: java; gutter: true
SurfaceView sv = (SurfaceView) this.findViewById(R.id.surface_video);
SurfaceHolder sh = sv.getHolder();
sh.addCallback(this);
```

In `onCreate()`, we retrieve the
[SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html),
and then register ourselves to receive notifications about the surface
state through the
[SurfaceHolder](http://developer.android.com/reference/android/view/SurfaceHolder.html)
interface. This is why we declared this Activity as implementing the
[SurfaceHolder.Callback](http://developer.android.com/reference/android/view/SurfaceHolder.Callback.html)
interface in line 16.

``` first-line: 127; theme: Default; brush: java; gutter: true
public void surfaceChanged(SurfaceHolder holder, int format, int width,
        int height) {
    Log.d("GStreamer", "Surface changed to format " + format + " width "
            + width + " height " + height);
    nativeSurfaceInit (holder.getSurface());
}

public void surfaceCreated(SurfaceHolder holder) {
    Log.d("GStreamer", "Surface created: " + holder.getSurface());
}

public void surfaceDestroyed(SurfaceHolder holder) {
    Log.d("GStreamer", "Surface destroyed");
    nativeSurfaceFinalize ();
}
```

This interface is composed of the three methods above, which get called
when the geometry of the surface changes, when the surface is created
and when it is about to be destroyed. `surfaceChanged()` always gets
called at least once, right after `surfaceCreated()`, so we will use it
to notify GStreamer about the new surface. We use
`surfaceDestroyed()` to tell GStreamer to stop using this surface.

Let’s review the C code to see what these functions do.

# A video surface on Android \[C code\]

**jni/tutorial-3.c**

``` theme: Default; brush: cpp; gutter: true
|
||||
#include <string.h>
|
||||
#include <jni.h>
|
||||
#include <android/log.h>
|
||||
#include <android/native_window.h>
|
||||
#include <android/native_window_jni.h>
|
||||
#include <gst/gst.h>
|
||||
#include <gst/interfaces/xoverlay.h>
|
||||
#include <gst/video/video.h>
|
||||
#include <pthread.h>
|
||||
|
||||
GST_DEBUG_CATEGORY_STATIC (debug_category);
|
||||
#define GST_CAT_DEFAULT debug_category
|
||||
|
||||
/*
|
||||
* These macros provide a way to store the native pointer to CustomData, which might be 32 or 64 bits, into
|
||||
* a jlong, which is always 64 bits, without warnings.
|
||||
*/
|
||||
#if GLIB_SIZEOF_VOID_P == 8
|
||||
# define GET_CUSTOM_DATA(env, thiz, fieldID) (CustomData *)(*env)->GetLongField (env, thiz, fieldID)
|
||||
# define SET_CUSTOM_DATA(env, thiz, fieldID, data) (*env)->SetLongField (env, thiz, fieldID, (jlong)data)
|
||||
#else
|
||||
# define GET_CUSTOM_DATA(env, thiz, fieldID) (CustomData *)(jint)(*env)->GetLongField (env, thiz, fieldID)
|
||||
# define SET_CUSTOM_DATA(env, thiz, fieldID, data) (*env)->SetLongField (env, thiz, fieldID, (jlong)(jint)data)
|
||||
#endif
|
||||
|
||||
/* Structure to contain all our information, so we can pass it to callbacks */
|
||||
typedef struct _CustomData {
|
||||
jobject app; /* Application instance, used to call its methods. A global reference is kept. */
|
||||
GstElement *pipeline; /* The running pipeline */
|
||||
GMainContext *context; /* GLib context used to run the main loop */
|
||||
GMainLoop *main_loop; /* GLib main loop */
|
||||
gboolean initialized; /* To avoid informing the UI multiple times about the initialization */
|
||||
GstElement *video_sink; /* The video sink element which receives XOverlay commands */
|
||||
ANativeWindow *native_window; /* The Android native window where video will be rendered */
|
||||
} CustomData;
|
||||
|
||||
/* These global variables cache values which are not changing during execution */
|
||||
static pthread_t gst_app_thread;
|
||||
static pthread_key_t current_jni_env;
|
||||
static JavaVM *java_vm;
|
||||
static jfieldID custom_data_field_id;
|
||||
static jmethodID set_message_method_id;
|
||||
static jmethodID on_gstreamer_initialized_method_id;
|
||||
|
||||
/*
|
||||
* Private methods
|
||||
*/
|
||||
|
||||
/* Register this thread with the VM */
|
||||
static JNIEnv *attach_current_thread (void) {
|
||||
JNIEnv *env;
|
||||
JavaVMAttachArgs args;
|
||||
|
||||
GST_DEBUG ("Attaching thread %p", g_thread_self ());
|
||||
args.version = JNI_VERSION_1_4;
|
||||
args.name = NULL;
|
||||
args.group = NULL;
|
||||
|
||||
if ((*java_vm)->AttachCurrentThread (java_vm, &env, &args) < 0) {
|
||||
GST_ERROR ("Failed to attach current thread");
|
||||
return NULL;
|
||||
}
|
||||
|
||||
return env;
|
||||
}
|
||||
|
||||
/* Unregister this thread from the VM */
|
||||
static void detach_current_thread (void *env) {
|
||||
GST_DEBUG ("Detaching thread %p", g_thread_self ());
|
||||
(*java_vm)->DetachCurrentThread (java_vm);
|
||||
}
|
||||
|
||||
/* Retrieve the JNI environment for this thread */
|
||||
static JNIEnv *get_jni_env (void) {
|
||||
JNIEnv *env;
|
||||
|
||||
if ((env = pthread_getspecific (current_jni_env)) == NULL) {
|
||||
env = attach_current_thread ();
|
||||
pthread_setspecific (current_jni_env, env);
|
||||
}
|
||||
|
||||
return env;
|
||||
}
|
||||
|
||||
/* Change the content of the UI's TextView */
|
||||
static void set_ui_message (const gchar *message, CustomData *data) {
|
||||
JNIEnv *env = get_jni_env ();
|
||||
GST_DEBUG ("Setting message to: %s", message);
|
||||
jstring jmessage = (*env)->NewStringUTF(env, message);
|
||||
(*env)->CallVoidMethod (env, data->app, set_message_method_id, jmessage);
|
||||
if ((*env)->ExceptionCheck (env)) {
|
||||
GST_ERROR ("Failed to call Java method");
|
||||
(*env)->ExceptionClear (env);
|
||||
}
|
||||
(*env)->DeleteLocalRef (env, jmessage);
|
||||
}
|
||||
|
||||
/* Retrieve errors from the bus and show them on the UI */
|
||||
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
|
||||
GError *err;
|
||||
gchar *debug_info;
|
||||
gchar *message_string;
|
||||
|
||||
gst_message_parse_error (msg, &err, &debug_info);
|
||||
message_string = g_strdup_printf ("Error received from element %s: %s", GST_OBJECT_NAME (msg->src), err->message);
|
||||
g_clear_error (&err);
|
||||
g_free (debug_info);
|
||||
set_ui_message (message_string, data);
|
||||
g_free (message_string);
|
||||
gst_element_set_state (data->pipeline, GST_STATE_NULL);
|
||||
}
|
||||
|
||||
/* Notify UI about pipeline state changes */
|
||||
static void state_changed_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
|
||||
GstState old_state, new_state, pending_state;
|
||||
gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
|
||||
/* Only pay attention to messages coming from the pipeline, not its children */
|
||||
if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->pipeline)) {
|
||||
gchar *message = g_strdup_printf("State changed to %s", gst_element_state_get_name(new_state));
|
||||
set_ui_message(message, data);
|
||||
g_free (message);
|
||||
}
|
||||
}
|
||||
|
||||
/* Check if all conditions are met to report GStreamer as initialized.
|
||||
* These conditions will change depending on the application */
|
||||
static void check_initialization_complete (CustomData *data) {
|
||||
JNIEnv *env = get_jni_env ();
|
||||
if (!data->initialized && data->native_window && data->main_loop) {
|
||||
GST_DEBUG ("Initialization complete, notifying application. native_window:%p main_loop:%p", data->native_window, data->main_loop);
|
||||
|
||||
/* The main loop is running and we received a native window, inform the sink about it */
|
||||
gst_x_overlay_set_window_handle (GST_X_OVERLAY (data->video_sink), (guintptr)data->native_window);
|
||||
|
||||
(*env)->CallVoidMethod (env, data->app, on_gstreamer_initialized_method_id);
|
||||
if ((*env)->ExceptionCheck (env)) {
|
||||
GST_ERROR ("Failed to call Java method");
|
||||
(*env)->ExceptionClear (env);
|
||||
}
|
||||
data->initialized = TRUE;
|
||||
}
|
||||
}
|
||||
|
||||
/* Main method for the native code. This is executed on its own thread. */
|
||||
static void *app_function (void *userdata) {
|
||||
JavaVMAttachArgs args;
|
||||
GstBus *bus;
|
||||
CustomData *data = (CustomData *)userdata;
|
||||
GSource *bus_source;
|
||||
GError *error = NULL;
|
||||
|
||||
GST_DEBUG ("Creating pipeline in CustomData at %p", data);
|
||||
|
||||
/* Create our own GLib Main Context and make it the default one */
|
||||
data->context = g_main_context_new ();
|
||||
g_main_context_push_thread_default(data->context);
|
||||
|
||||
/* Build pipeline */
|
||||
data->pipeline = gst_parse_launch("videotestsrc ! warptv ! ffmpegcolorspace ! autovideosink", &error);
|
||||
if (error) {
|
||||
gchar *message = g_strdup_printf("Unable to build pipeline: %s", error->message);
|
||||
g_clear_error (&error);
|
||||
set_ui_message(message, data);
|
||||
g_free (message);
|
||||
return NULL;
|
||||
}
|
||||
|
||||
/* Set the pipeline to READY, so it can already accept a window handle, if we have one */
|
||||
gst_element_set_state(data->pipeline, GST_STATE_READY);
|
||||
|
||||
data->video_sink = gst_bin_get_by_interface(GST_BIN(data->pipeline), GST_TYPE_X_OVERLAY);
|
||||
if (!data->video_sink) {
|
||||
GST_ERROR ("Could not retrieve video sink");
|
||||
return NULL;
|
||||
}
|
||||
|
||||
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
|
||||
bus = gst_element_get_bus (data->pipeline);
|
||||
bus_source = gst_bus_create_watch (bus);
|
||||
g_source_set_callback (bus_source, (GSourceFunc) gst_bus_async_signal_func, NULL, NULL);
|
||||
g_source_attach (bus_source, data->context);
|
||||
g_source_unref (bus_source);
|
||||
g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, data);
|
||||
g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, data);
|
||||
gst_object_unref (bus);
|
||||
|
||||
/* Create a GLib Main Loop and set it to run */
|
||||
GST_DEBUG ("Entering main loop... (CustomData:%p)", data);
|
||||
data->main_loop = g_main_loop_new (data->context, FALSE);
|
||||
check_initialization_complete (data);
|
||||
g_main_loop_run (data->main_loop);
|
||||
GST_DEBUG ("Exited main loop");
|
||||
g_main_loop_unref (data->main_loop);
|
||||
data->main_loop = NULL;
|
||||
|
||||
/* Free resources */
|
||||
g_main_context_pop_thread_default(data->context);
|
||||
g_main_context_unref (data->context);
|
||||
gst_element_set_state (data->pipeline, GST_STATE_NULL);
|
||||
gst_object_unref (data->video_sink);
|
||||
gst_object_unref (data->pipeline);
|
||||
|
||||
return NULL;
|
||||
}
|
||||
|
||||
/*
|
||||
* Java Bindings
|
||||
*/
|
||||
|
||||
/* Instruct the native code to create its internal data structure, pipeline and thread */
|
||||
static void gst_native_init (JNIEnv* env, jobject thiz) {
|
||||
CustomData *data = g_new0 (CustomData, 1);
|
||||
SET_CUSTOM_DATA (env, thiz, custom_data_field_id, data);
|
||||
GST_DEBUG_CATEGORY_INIT (debug_category, "tutorial-3", 0, "Android tutorial 3");
|
||||
gst_debug_set_threshold_for_name("tutorial-3", GST_LEVEL_DEBUG);
|
||||
GST_DEBUG ("Created CustomData at %p", data);
|
||||
data->app = (*env)->NewGlobalRef (env, thiz);
|
||||
GST_DEBUG ("Created GlobalRef for app object at %p", data->app);
|
||||
pthread_create (&gst_app_thread, NULL, &app_function, data);
|
||||
}
|
||||
|
||||
/* Quit the main loop, remove the native thread and free resources */
|
||||
static void gst_native_finalize (JNIEnv* env, jobject thiz) {
|
||||
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
|
||||
if (!data) return;
|
||||
GST_DEBUG ("Quitting main loop...");
|
||||
g_main_loop_quit (data->main_loop);
|
||||
GST_DEBUG ("Waiting for thread to finish...");
|
||||
pthread_join (gst_app_thread, NULL);
|
||||
GST_DEBUG ("Deleting GlobalRef for app object at %p", data->app);
|
||||
(*env)->DeleteGlobalRef (env, data->app);
|
||||
GST_DEBUG ("Freeing CustomData at %p", data);
|
||||
g_free (data);
|
||||
SET_CUSTOM_DATA (env, thiz, custom_data_field_id, NULL);
|
||||
GST_DEBUG ("Done finalizing");
|
||||
}
|
||||
|
||||
/* Set pipeline to PLAYING state */
|
||||
static void gst_native_play (JNIEnv* env, jobject thiz) {
|
||||
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
|
||||
if (!data) return;
|
||||
GST_DEBUG ("Setting state to PLAYING");
|
||||
gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
|
||||
}
|
||||
|
||||
/* Set pipeline to PAUSED state */
|
||||
static void gst_native_pause (JNIEnv* env, jobject thiz) {
|
||||
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
|
||||
if (!data) return;
|
||||
GST_DEBUG ("Setting state to PAUSED");
|
||||
gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
|
||||
}
|
||||
|
||||
/* Static class initializer: retrieve method and field IDs */
|
||||
static jboolean gst_native_class_init (JNIEnv* env, jclass klass) {
|
||||
custom_data_field_id = (*env)->GetFieldID (env, klass, "native_custom_data", "J");
|
||||
set_message_method_id = (*env)->GetMethodID (env, klass, "setMessage", "(Ljava/lang/String;)V");
|
||||
on_gstreamer_initialized_method_id = (*env)->GetMethodID (env, klass, "onGStreamerInitialized", "()V");
|
||||
|
||||
if (!custom_data_field_id || !set_message_method_id || !on_gstreamer_initialized_method_id) {
|
||||
/* We emit this message through the Android log instead of the GStreamer log because the later
|
||||
* has not been initialized yet.
|
||||
*/
|
||||
__android_log_print (ANDROID_LOG_ERROR, "tutorial-3", "The calling class does not implement all necessary interface methods");
|
||||
return JNI_FALSE;
|
||||
}
|
||||
return JNI_TRUE;
|
||||
}
|
||||
|
||||
static void gst_native_surface_init (JNIEnv *env, jobject thiz, jobject surface) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;
  ANativeWindow *new_native_window = ANativeWindow_fromSurface(env, surface);
  GST_DEBUG ("Received surface %p (native window %p)", surface, new_native_window);

  if (data->native_window) {
    ANativeWindow_release (data->native_window);
    if (data->native_window == new_native_window) {
      GST_DEBUG ("New native window is the same as the previous one %p", data->native_window);
      if (data->video_sink) {
        gst_x_overlay_expose(GST_X_OVERLAY (data->video_sink));
        gst_x_overlay_expose(GST_X_OVERLAY (data->video_sink));
      }
      return;
    } else {
      GST_DEBUG ("Released previous native window %p", data->native_window);
      data->initialized = FALSE;
    }
  }
  data->native_window = new_native_window;

  check_initialization_complete (data);
}

static void gst_native_surface_finalize (JNIEnv *env, jobject thiz) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;
  GST_DEBUG ("Releasing Native Window %p", data->native_window);

  if (data->video_sink) {
    gst_x_overlay_set_window_handle (GST_X_OVERLAY (data->video_sink), (guintptr)NULL);
    gst_element_set_state (data->pipeline, GST_STATE_READY);
  }

  ANativeWindow_release (data->native_window);
  data->native_window = NULL;
  data->initialized = FALSE;
}

/* List of implemented native methods */
static JNINativeMethod native_methods[] = {
  { "nativeInit", "()V", (void *) gst_native_init},
  { "nativeFinalize", "()V", (void *) gst_native_finalize},
  { "nativePlay", "()V", (void *) gst_native_play},
  { "nativePause", "()V", (void *) gst_native_pause},
  { "nativeSurfaceInit", "(Ljava/lang/Object;)V", (void *) gst_native_surface_init},
  { "nativeSurfaceFinalize", "()V", (void *) gst_native_surface_finalize},
  { "nativeClassInit", "()Z", (void *) gst_native_class_init}
};

/* Library initializer */
jint JNI_OnLoad(JavaVM *vm, void *reserved) {
  JNIEnv *env = NULL;

  java_vm = vm;

  if ((*vm)->GetEnv(vm, (void**) &env, JNI_VERSION_1_4) != JNI_OK) {
    __android_log_print (ANDROID_LOG_ERROR, "tutorial-3", "Could not retrieve JNIEnv");
    return 0;
  }
  jclass klass = (*env)->FindClass (env, "com/gst_sdk_tutorials/tutorial_3/Tutorial3");
  (*env)->RegisterNatives (env, klass, native_methods, G_N_ELEMENTS(native_methods));

  pthread_key_create (&current_jni_env, detach_current_thread);

  return JNI_VERSION_1_4;
}
```

First, our `CustomData` structure is augmented to keep a pointer to the video sink element and the native window handle:

``` first-line: 33; theme: Default; brush: cpp; gutter: true
GstElement *video_sink;       /* The video sink element which receives XOverlay commands */
ANativeWindow *native_window; /* The Android native window where video will be rendered */
```

The `check_initialization_complete()` method is also augmented so that it requires a native window before considering GStreamer to be initialized:

``` first-line: 127; theme: Default; brush: cpp; gutter: true
static void check_initialization_complete (CustomData *data) {
  JNIEnv *env = get_jni_env ();
  if (!data->initialized && data->native_window && data->main_loop) {
    GST_DEBUG ("Initialization complete, notifying application. native_window:%p main_loop:%p", data->native_window, data->main_loop);

    /* The main loop is running and we received a native window, inform the sink about it */
    gst_x_overlay_set_window_handle (GST_X_OVERLAY (data->video_sink), (guintptr)data->native_window);

    (*env)->CallVoidMethod (env, data->app, on_gstreamer_initialized_method_id);
    if ((*env)->ExceptionCheck (env)) {
      GST_ERROR ("Failed to call Java method");
      (*env)->ExceptionClear (env);
    }
    data->initialized = TRUE;
  }
}
```

Also, once the pipeline has been built and a native window has been received, we inform the video sink of the window handle to use via the `gst_x_overlay_set_window_handle()` method.

The GStreamer pipeline for this tutorial involves a `videotestsrc`, a `warptv` psychedelic distorter effect (check out other cool video effects in the `GSTREAMER_PLUGINS_EFFECTS` package), and an `autovideosink`, which will instantiate the adequate video sink for the platform:

``` first-line: 159; theme: Default; brush: cpp; gutter: true
data->pipeline = gst_parse_launch("videotestsrc ! warptv ! ffmpegcolorspace ! autovideosink ", &error);
```

Here things start to get more interesting:

``` first-line: 168; theme: Default; brush: cpp; gutter: true
/* Set the pipeline to READY, so it can already accept a window handle, if we have one */
gst_element_set_state(data->pipeline, GST_STATE_READY);

data->video_sink = gst_bin_get_by_interface(GST_BIN(data->pipeline), GST_TYPE_X_OVERLAY);
if (!data->video_sink) {
  GST_ERROR ("Could not retrieve video sink");
  return NULL;
}
```

We start by setting the pipeline to the READY state. No data flow occurs yet, but the `autovideosink` will instantiate the actual sink, so we can ask for it immediately.

The `gst_bin_get_by_interface()` method examines the whole pipeline and returns a pointer to an element which supports the requested interface. We are asking for the `XOverlay` interface, explained in [Basic tutorial 5: GUI toolkit integration](Basic%2Btutorial%2B5%253A%2BGUI%2Btoolkit%2Bintegration.html), which controls how to perform rendering into foreign (non-GStreamer) windows. The internal video sink instantiated by `autovideosink` is the only element in this pipeline implementing it, so it will be returned.

Now we implement the two native functions called by the Java code when the drawing surface becomes available or is about to be destroyed:

``` first-line: 270; theme: Default; brush: cpp; gutter: true
static void gst_native_surface_init (JNIEnv *env, jobject thiz, jobject surface) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;
  ANativeWindow *new_native_window = ANativeWindow_fromSurface(env, surface);
  GST_DEBUG ("Received surface %p (native window %p)", surface, new_native_window);

  if (data->native_window) {
    ANativeWindow_release (data->native_window);
    if (data->native_window == new_native_window) {
      GST_DEBUG ("New native window is the same as the previous one %p", data->native_window);
      if (data->video_sink) {
        gst_x_overlay_expose(GST_X_OVERLAY (data->video_sink));
        gst_x_overlay_expose(GST_X_OVERLAY (data->video_sink));
      }
      return;
    } else {
      GST_DEBUG ("Released previous native window %p", data->native_window);
      data->initialized = FALSE;
    }
  }
  data->native_window = new_native_window;

  check_initialization_complete (data);
}
```

This method is responsible for providing the video sink with the window handle coming from the Java code. We are passed a [Surface](http://developer.android.com/reference/android/view/Surface.html) object, and we use `ANativeWindow_fromSurface()` to obtain the underlying native window pointer. There is no official online documentation for the NDK but, fortunately, the header files are well commented. Native window management functions can be found in `$(ANDROID_NDK_ROOT)\platforms\android-9\arch-arm\usr\include\android\native_window.h` and `native_window_jni.h`.

If we had already stored a native window, the one we just received can either be a new one, or just an update of the one we have. If the pointers are the same, we assume the geometry of the surface has changed, and simply instruct the video sink to redraw itself via the `gst_x_overlay_expose()` method. The video sink will recover the new size from the surface itself, so we do not need to bother about it here. We need to call `gst_x_overlay_expose()` twice because of the way surface changes propagate down the OpenGL ES / EGL pipeline (the only video sink available for Android in the GStreamer SDK uses OpenGL ES): by the time we make the first call, the surface that the sink picks up still has the old size.

On the other hand, if the pointers are different, we mark GStreamer as not being initialized. The next time `check_initialization_complete()` is called, the video sink will be informed of the new window handle.

Finally, we store the new window handle and call `check_initialization_complete()` to inform the Java code that everything is set up, if that is the case.

``` first-line: 295; theme: Default; brush: cpp; gutter: true
static void gst_native_surface_finalize (JNIEnv *env, jobject thiz) {
  CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
  if (!data) return;
  GST_DEBUG ("Releasing Native Window %p", data->native_window);

  if (data->video_sink) {
    gst_x_overlay_set_window_handle (GST_X_OVERLAY (data->video_sink), (guintptr)NULL);
    gst_element_set_state (data->pipeline, GST_STATE_READY);
  }

  ANativeWindow_release (data->native_window);
  data->native_window = NULL;
  data->initialized = FALSE;
}
```

The complementary function, `gst_native_surface_finalize()`, is called when a surface is about to be destroyed and should not be used anymore. Here, we simply instruct the video sink to stop using the window handle and set the pipeline to READY so no rendering occurs. We release the window pointer we had stored with `ANativeWindow_release()`, and mark GStreamer as not being initialized anymore.

And this is all there is to it, regarding the main code. Only a couple of details remain: the subclass we made for SurfaceView and the `Android.mk` file.

# GStreamerSurfaceView, a convenient SurfaceView wrapper \[Java code\]

By default, [SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html) does not have any particular size, so it expands to use all the space the layout can give it. While this might be convenient sometimes, it does not allow a great deal of control. In particular, when the surface does not have the same aspect ratio as the media, the sink will add black borders (the well-known “letterbox” or “pillarbox” effect), which is unnecessary work (and a waste of battery).

The subclass of [SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html) presented here overrides the [onMeasure()](http://developer.android.com/reference/android/view/SurfaceView.html#onMeasure\(int,%20int\)) method to report the actual media size, so the surface can adapt to any layout while preserving the media aspect ratio.

Since in this tutorial the media size is known beforehand, it is hardcoded in the GStreamerSurfaceView class for simplicity. The next tutorial shows how it can be recovered at runtime and passed onto the surface.

**src/com/gst\_sdk\_tutorials/tutorial\_3/GStreamerSurfaceView.java**

``` theme: Default; brush: java; gutter: true
package com.gst_sdk_tutorials.tutorial_3;

import android.content.Context;
import android.util.AttributeSet;
import android.util.Log;
import android.view.SurfaceView;
import android.view.View;

// A simple SurfaceView whose width and height can be set from the outside
public class GStreamerSurfaceView extends SurfaceView {
    public int media_width = 320;
    public int media_height = 240;

    // Mandatory constructors, they do not do much
    public GStreamerSurfaceView(Context context, AttributeSet attrs,
            int defStyle) {
        super(context, attrs, defStyle);
    }

    public GStreamerSurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    public GStreamerSurfaceView (Context context) {
        super(context);
    }

    // Called by the layout manager to find out our size and give us some rules.
    // We will try to maximize our size, and preserve the media's aspect ratio if
    // we are given the freedom to do so.
    @Override
    protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
        int width = 0, height = 0;
        int wmode = View.MeasureSpec.getMode(widthMeasureSpec);
        int hmode = View.MeasureSpec.getMode(heightMeasureSpec);
        int wsize = View.MeasureSpec.getSize(widthMeasureSpec);
        int hsize = View.MeasureSpec.getSize(heightMeasureSpec);

        Log.i ("GStreamer", "onMeasure called with " + media_width + "x" + media_height);
        // Obey width rules
        switch (wmode) {
        case View.MeasureSpec.AT_MOST:
            if (hmode == View.MeasureSpec.EXACTLY) {
                width = Math.min(hsize * media_width / media_height, wsize);
                break;
            }
            // Fall through: behave like EXACTLY when the height is not fixed
        case View.MeasureSpec.EXACTLY:
            width = wsize;
            break;
        case View.MeasureSpec.UNSPECIFIED:
            width = media_width;
        }

        // Obey height rules
        switch (hmode) {
        case View.MeasureSpec.AT_MOST:
            if (wmode == View.MeasureSpec.EXACTLY) {
                height = Math.min(wsize * media_height / media_width, hsize);
                break;
            }
            // Fall through: behave like EXACTLY when the width is not fixed
        case View.MeasureSpec.EXACTLY:
            height = hsize;
            break;
        case View.MeasureSpec.UNSPECIFIED:
            height = media_height;
        }

        // Finally, calculate best size when both axes are free
        if (hmode == View.MeasureSpec.AT_MOST && wmode == View.MeasureSpec.AT_MOST) {
            int correct_height = width * media_height / media_width;
            int correct_width = height * media_width / media_height;

            if (correct_height < height)
                height = correct_height;
            else
                width = correct_width;
        }

        // Obey minimum size
        width = Math.max (getSuggestedMinimumWidth(), width);
        height = Math.max (getSuggestedMinimumHeight(), height);
        setMeasuredDimension(width, height);
    }

}
```

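The last branch of `onMeasure()` (both axes free) is the part that actually preserves the aspect ratio, and its arithmetic can be checked in isolation. The following is a standalone sketch of that branch; the `AspectFit` class and `fit()` method are our own illustrative names, not part of the tutorial sources:

```java
// Standalone sketch of the "both axes free" branch of onMeasure():
// given maximum dimensions, shrink one axis so the media aspect ratio is kept.
public class AspectFit {
    public static int[] fit(int maxWidth, int maxHeight, int mediaWidth, int mediaHeight) {
        int width = maxWidth, height = maxHeight;
        int correctHeight = width * mediaHeight / mediaWidth;
        int correctWidth = height * mediaWidth / mediaHeight;
        // Shrink whichever axis overflows the media aspect ratio
        if (correctHeight < height)
            height = correctHeight;
        else
            width = correctWidth;
        return new int[] { width, height };
    }

    public static void main(String[] args) {
        int[] d = fit(640, 640, 320, 240);
        System.out.println(d[0] + "x" + d[1]); // 640x480: the height shrinks
    }
}
```

For 320×240 media in a 640×640 box, the height is the axis that shrinks, yielding a 640×480 surface; in a 480×640 box the same logic yields 480×360.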
# A video surface on Android \[Android.mk\]

**/jni/Android.mk**

``` theme: Default; brush: ruby; gutter: true
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_MODULE := tutorial-3
LOCAL_SRC_FILES := tutorial-3.c
LOCAL_SHARED_LIBRARIES := gstreamer_android
LOCAL_LDLIBS := -llog -landroid
include $(BUILD_SHARED_LIBRARY)

ifndef GSTREAMER_SDK_ROOT
ifndef GSTREAMER_SDK_ROOT_ANDROID
$(error GSTREAMER_SDK_ROOT_ANDROID is not defined!)
endif
GSTREAMER_SDK_ROOT := $(GSTREAMER_SDK_ROOT_ANDROID)
endif
GSTREAMER_NDK_BUILD_PATH := $(GSTREAMER_SDK_ROOT)/share/gst-android/ndk-build/
include $(GSTREAMER_NDK_BUILD_PATH)/plugins.mk
GSTREAMER_PLUGINS := $(GSTREAMER_PLUGINS_CORE) $(GSTREAMER_PLUGINS_SYS) $(GSTREAMER_PLUGINS_EFFECTS)
GSTREAMER_EXTRA_DEPS := gstreamer-interfaces-0.10 gstreamer-video-0.10
include $(GSTREAMER_NDK_BUILD_PATH)/gstreamer.mk
```

Worth mentioning are the `-landroid` library, used to allow interaction with the native windows, and the different plugin packages: `GSTREAMER_PLUGINS_SYS` for the system-dependent video sink and `GSTREAMER_PLUGINS_EFFECTS` for the `warptv` element. This tutorial requires the `gstreamer-interfaces` library to use the `XOverlay` interface, and the `gstreamer-video` library to use the video helper methods.

# Conclusion

This tutorial has shown:

- How to display video on Android using a [SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html) and the `XOverlay` interface.
- How to be aware of changes in the surface’s size using [SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html)’s callbacks.
- How to report the media size to the Android layout engine.

The following tutorial plays an actual clip and adds a few more controls to this tutorial in order to build a simple media player.

It has been a pleasure having you here, and see you soon!

Document generated by Confluence on Oct 08, 2015 10:27

# GStreamer SDK documentation : Android tutorial 5: A Complete media player

This page last changed on Nov 28, 2012 by xartigas.

# Goal

This tutorial is meant to be the “demo application” that showcases what can be done with GStreamer on the Android platform.

It is intended to be downloaded in final, compiled form rather than analyzed for its pedagogical value, since it adds very little GStreamer knowledge over what has already been shown in [Android tutorial 4: A basic media player](Android%2Btutorial%2B4%253A%2BA%2Bbasic%2Bmedia%2Bplayer.html).

<table>
<thead>
<tr class="header">
<th>Tutorial 5</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><a href="http://cdn.gstreamer.com/android/arm/com.gst_sdk_tutorials.tutorial_5.Tutorial5-2012.11.apk" class="external-link">GStreamer SDK 2013.6 (Congo) for Android ARM (Tutorial 5 Installable APK)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/android/arm/com.gst_sdk_tutorials.tutorial_5.Tutorial5-2012.11.apk" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/android/arm/com.gst_sdk_tutorials.tutorial_5.Tutorial5-2012.11.apk.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/android/arm/com.gst_sdk_tutorials.tutorial_5.Tutorial5-2012.11.apk.sha1" class="external-link">sha1</a></td>
</tr>
</tbody>
</table>

# Introduction

The previous tutorial already implemented a basic media player. This one simply adds a few finishing touches. In particular, it adds the capability to choose the media to play, and disables the screensaver during media playback.

These are not features directly related to GStreamer, and are therefore outside the scope of these tutorials. Only a few implementation pointers are given here.

# Registering as a media player

The `AndroidManifest.xml` file tells the Android system about the capabilities of the application. By specifying in the `intent-filter` of the activity that it understands the `audio/*`, `video/*` and `image/*` MIME types, the tutorial will be offered as an option whenever an application requires such media to be viewed.

“Unfortunately”, GStreamer knows more file formats than Android does, so, for some files, Android will not provide a MIME type. For these cases, a new `intent-filter` has to be provided which ignores MIME types and matches only on the filename extension. This is inconvenient because the list of extensions can be large, but there does not seem to be another option. In this tutorial, only a very short list of extensions is provided, for simplicity.

Finally, GStreamer can also play back remote files, so URI schemes like `http` are supported in another `intent-filter`. Android does not provide MIME types for remote files, so the filename extension list has to be provided again.

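As a rough illustration of the three `intent-filter`s described above, the relevant part of a manifest could look like the sketch below. The activity name and the extension list are illustrative only, not the tutorial's actual `AndroidManifest.xml`:

```xml
<!-- Illustrative sketch; activity name and extension list are hypothetical -->
<activity android:name=".Tutorial5" android:label="Tutorial 5">
  <!-- Local media whose MIME type Android recognizes -->
  <intent-filter>
    <action android:name="android.intent.action.VIEW"/>
    <category android:name="android.intent.category.DEFAULT"/>
    <data android:mimeType="audio/*"/>
    <data android:mimeType="video/*"/>
    <data android:mimeType="image/*"/>
  </intent-filter>
  <!-- Files with no recognized MIME type: match on the extension instead -->
  <intent-filter>
    <action android:name="android.intent.action.VIEW"/>
    <category android:name="android.intent.category.DEFAULT"/>
    <data android:scheme="file" android:pathPattern=".*\\.mkv"/>
    <data android:scheme="file" android:pathPattern=".*\\.webm"/>
  </intent-filter>
  <!-- Remote files: accept the http scheme (extension list repeated in practice) -->
  <intent-filter>
    <action android:name="android.intent.action.VIEW"/>
    <category android:name="android.intent.category.DEFAULT"/>
    <data android:scheme="http"/>
  </intent-filter>
</activity>
```
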
Once we have informed the system of our capabilities, it will start sending [Intents](http://developer.android.com/reference/android/content/Intent.html) to invoke our activity, which will contain the desired URI to play. In the `onCreate()` method the intent that invoked the activity is retrieved and checked for such a URI.

# Implementing a file chooser dialog

The UI includes a new button ![](attachments/2687069/2654437.png) which was not present in [Android tutorial 4: A basic media player](Android%2Btutorial%2B4%253A%2BA%2Bbasic%2Bmedia%2Bplayer.html). It invokes a file chooser dialog (based on the [Android File Dialog](http://code.google.com/p/android-file-dialog/) project) that allows you to choose a local media file, no matter what extension or MIME type it has.

If a new media file is selected, it is passed on to the native code (which sets the pipeline to READY, passes the URI on to `playbin2`, and brings the pipeline back to the previous state). The current position is also reset, so the new clip does not start at the previous position.

# Preventing the screen from turning off

While watching a movie, there is typically no user activity. After a short period of such inactivity, Android will dim the screen, and then turn it off completely. To prevent this, a [Wake Lock](http://developer.android.com/reference/android/os/PowerManager.WakeLock.html) is used. The application acquires the lock when the Play button is pressed, so the screen is never turned off, and releases it when the Pause button is pressed.

# Conclusion

This finishes the series of Android tutorials. Each of the preceding tutorials has evolved on top of the previous one, showing how to implement a particular set of features, and concluding in this tutorial 5. The goal of tutorial 5 is to build a complete media player which can already be used to showcase the integration of GStreamer and Android.

It has been a pleasure having you here, and see you soon!

# GStreamer SDK documentation : Android tutorials

This page last changed on May 02, 2013 by xartigas.

# Welcome to the GStreamer SDK Android tutorials

These tutorials describe Android-specific topics. General GStreamer concepts will not be explained in these tutorials, so the [Basic tutorials](Basic%2Btutorials.html) should be reviewed first. The reader should also be familiar with basic Android programming techniques.

Each Android tutorial builds on top of the previous one and adds progressively more functionality, until a working media player application is obtained in [Android tutorial 5: A Complete media player](Android%2Btutorial%2B5%253A%2BA%2BComplete%2Bmedia%2Bplayer.html). This is the same media player application used to advertise the GStreamer SDK on Android, and the download link can be found on the [Android tutorial 5: A Complete media player](Android%2Btutorial%2B5%253A%2BA%2BComplete%2Bmedia%2Bplayer.html) page.

Make sure to have read the instructions in [Installing for Android development](Installing%2Bfor%2BAndroid%2Bdevelopment.html) before jumping into the Android tutorials.

### A note on the documentation

All Java methods, both Android-specific and generic, are documented in the [Android reference site](http://developer.android.com/reference/packages.html).

Unfortunately, there is no official online documentation for the NDK. The header files, though, are well commented. If you installed the Android NDK in the `$(ANDROID_NDK_ROOT)` folder, you can find the header files in `$(ANDROID_NDK_ROOT)\platforms\android-9\arch-arm\usr\include\android`.

# GStreamer SDK documentation : Basic Media Player

This page last changed on May 24, 2013 by xartigas.

# Goal

This tutorial shows how to create a basic media player with [Qt](http://qt-project.org/) and [QtGStreamer](http://gstreamer.freedesktop.org/data/doc/gstreamer/head/qt-gstreamer/html/index.html). It assumes that you are already familiar with the basics of Qt and GStreamer. If not, please refer to the other tutorials in this documentation.

In particular, you will learn:

- How to create a basic pipeline
- How to create a video output
- How to update the GUI based on playback time

# A media player with Qt

These files are located in the qt-gstreamer SDK's `examples/` directory.

**CMakeLists.txt**

``` theme: Default; brush: plain; gutter: true
project(qtgst-example-player)
find_package(QtGStreamer REQUIRED)

# automoc is now a built-in tool since CMake 2.8.6.
if (${CMAKE_VERSION} VERSION_LESS "2.8.6")
    find_package(Automoc4 REQUIRED)
else()
    set(CMAKE_AUTOMOC TRUE)
    macro(automoc4_add_executable)
        add_executable(${ARGV})
    endmacro()
endif()

include_directories(${QTGSTREAMER_INCLUDES} ${CMAKE_CURRENT_BINARY_DIR} ${QT_QTWIDGETS_INCLUDE_DIRS})
add_definitions(${QTGSTREAMER_DEFINITIONS})
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${QTGSTREAMER_FLAGS}")
set(player_SOURCES main.cpp player.cpp mediaapp.cpp)
automoc4_add_executable(player ${player_SOURCES})
target_link_libraries(player ${QTGSTREAMER_UI_LIBRARIES} ${QT_QTOPENGL_LIBRARIES} ${QT_QTWIDGETS_LIBRARIES})
```

**main.cpp**

``` theme: Default; brush: cpp; gutter: true
#include "mediaapp.h"
#include <QtWidgets/QApplication>
#include <QGst/Init>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QGst::init(&argc, &argv);

    MediaApp media;
    media.show();

    if (argc == 2) {
        media.openFile(argv[1]);
    }

    return app.exec();
}
```

**mediaapp.h**

``` theme: Default; brush: cpp; gutter: true
#ifndef MEDIAAPP_H
#define MEDIAAPP_H

#include <QtCore/QTimer>
#include <QtWidgets/QWidget>
#include <QtWidgets/QStyle>

class Player;
class QBoxLayout;
class QLabel;
class QSlider;
class QToolButton;
class QTimer;

class MediaApp : public QWidget
{
    Q_OBJECT
public:
    MediaApp(QWidget *parent = 0);
    ~MediaApp();
    void openFile(const QString & fileName);

private Q_SLOTS:
    void open();
    void toggleFullScreen();
    void onStateChanged();
    void onPositionChanged();
    void setPosition(int position);
    void showControls(bool show = true);
    void hideControls() { showControls(false); }

protected:
    void mouseMoveEvent(QMouseEvent *event);

private:
    QToolButton *initButton(QStyle::StandardPixmap icon, const QString & tip,
                            QObject *dstobj, const char *slot_method, QLayout *layout);
    void createUI(QBoxLayout *appLayout);

    QString m_baseDir;
    Player *m_player;
    QToolButton *m_openButton;
    QToolButton *m_fullScreenButton;
    QToolButton *m_playButton;
    QToolButton *m_pauseButton;
    QToolButton *m_stopButton;
    QSlider *m_positionSlider;
    QSlider *m_volumeSlider;
    QLabel *m_positionLabel;
    QLabel *m_volumeLabel;
    QTimer m_fullScreenTimer;
};

#endif
```

**mediaapp.cpp**

``` theme: Default; brush: cpp; gutter: true
#include "mediaapp.h"
#include "player.h"

#if (QT_VERSION >= QT_VERSION_CHECK(5, 0, 0))
#include <QtWidgets/QBoxLayout>
#include <QtWidgets/QFileDialog>
#include <QtWidgets/QToolButton>
#include <QtWidgets/QLabel>
#include <QtWidgets/QSlider>
#else
#include <QtGui/QBoxLayout>
#include <QtGui/QFileDialog>
#include <QtGui/QToolButton>
#include <QtGui/QLabel>
#include <QtGui/QSlider>
#include <QtGui/QMouseEvent>
#endif

MediaApp::MediaApp(QWidget *parent)
    : QWidget(parent)
{
    //create the player
    m_player = new Player(this);
    connect(m_player, SIGNAL(positionChanged()), this, SLOT(onPositionChanged()));
    connect(m_player, SIGNAL(stateChanged()), this, SLOT(onStateChanged()));

    //m_baseDir is used to remember the last directory that was used.
    //defaults to the current working directory
    m_baseDir = QLatin1String(".");

    //this timer (re-)hides the controls after a few seconds when we are in fullscreen mode
    m_fullScreenTimer.setSingleShot(true);
    connect(&m_fullScreenTimer, SIGNAL(timeout()), this, SLOT(hideControls()));

    //create the UI
    QVBoxLayout *appLayout = new QVBoxLayout;
    appLayout->setContentsMargins(0, 0, 0, 0);
    createUI(appLayout);
    setLayout(appLayout);

    onStateChanged(); //set the controls to their default state
    setWindowTitle(tr("QtGStreamer example player"));
    resize(400, 400);
}

MediaApp::~MediaApp()
{
    delete m_player;
}

void MediaApp::openFile(const QString & fileName)
{
    m_baseDir = QFileInfo(fileName).path();
    m_player->stop();
    m_player->setUri(fileName);
    m_player->play();
}

void MediaApp::open()
{
    QString fileName = QFileDialog::getOpenFileName(this, tr("Open a Movie"), m_baseDir);
    if (!fileName.isEmpty()) {
        openFile(fileName);
    }
}

void MediaApp::toggleFullScreen()
{
    if (isFullScreen()) {
        setMouseTracking(false);
        m_player->setMouseTracking(false);
        m_fullScreenTimer.stop();
        showControls();
        showNormal();
    } else {
        setMouseTracking(true);
        m_player->setMouseTracking(true);
        hideControls();
        showFullScreen();
    }
}

void MediaApp::onStateChanged()
{
    QGst::State newState = m_player->state();
    m_playButton->setEnabled(newState != QGst::StatePlaying);
    m_pauseButton->setEnabled(newState == QGst::StatePlaying);
    m_stopButton->setEnabled(newState != QGst::StateNull);
    m_positionSlider->setEnabled(newState != QGst::StateNull);
    m_volumeSlider->setEnabled(newState != QGst::StateNull);
    m_volumeLabel->setEnabled(newState != QGst::StateNull);
    m_volumeSlider->setValue(m_player->volume());

    //if we are in Null state, call onPositionChanged() to restore
    //the position of the slider and the text on the label
    if (newState == QGst::StateNull) {
        onPositionChanged();
    }
}

/* Called when the positionChanged() is received from the player */
void MediaApp::onPositionChanged()
{
    QTime length(0,0);
    QTime curpos(0,0);

    if (m_player->state() != QGst::StateReady &&
        m_player->state() != QGst::StateNull)
    {
        length = m_player->length();
        curpos = m_player->position();
    }

    m_positionLabel->setText(curpos.toString("hh:mm:ss.zzz")
                             + "/" +
                             length.toString("hh:mm:ss.zzz"));

    if (length != QTime(0,0)) {
        m_positionSlider->setValue(curpos.msecsTo(QTime(0,0)) * 1000 / length.msecsTo(QTime(0,0)));
    } else {
        m_positionSlider->setValue(0);
    }

    if (curpos != QTime(0,0)) {
        m_positionLabel->setEnabled(true);
        m_positionSlider->setEnabled(true);
    }
}

/* Called when the user changes the slider's position */
void MediaApp::setPosition(int value)
|
||||
{
|
||||
uint length = -m_player->length().msecsTo(QTime(0,0));
|
||||
if (length != 0 && value > 0) {
|
||||
QTime pos(0,0);
|
||||
pos = pos.addMSecs(length * (value / 1000.0));
|
||||
m_player->setPosition(pos);
|
||||
}
|
||||
}
|
||||
void MediaApp::showControls(bool show)
|
||||
{
|
||||
m_openButton->setVisible(show);
|
||||
m_playButton->setVisible(show);
|
||||
m_pauseButton->setVisible(show);
|
||||
m_stopButton->setVisible(show);
|
||||
m_fullScreenButton->setVisible(show);
|
||||
m_positionSlider->setVisible(show);
|
||||
m_volumeSlider->setVisible(show);
|
||||
m_volumeLabel->setVisible(show);
|
||||
m_positionLabel->setVisible(show);
|
||||
}
|
||||
void MediaApp::mouseMoveEvent(QMouseEvent *event)
|
||||
{
|
||||
Q_UNUSED(event);
|
||||
if (isFullScreen()) {
|
||||
showControls();
|
||||
m_fullScreenTimer.start(3000); //re-hide controls after 3s
|
||||
}
|
||||
}
|
||||
QToolButton *MediaApp::initButton(QStyle::StandardPixmap icon, const QString & tip,
|
||||
QObject *dstobj, const char *slot_method, QLayout *layout)
|
||||
{
|
||||
QToolButton *button = new QToolButton;
|
||||
button->setIcon(style()->standardIcon(icon));
|
||||
button->setIconSize(QSize(36, 36));
|
||||
button->setToolTip(tip);
|
||||
connect(button, SIGNAL(clicked()), dstobj, slot_method);
|
||||
layout->addWidget(button);
|
||||
return button;
|
||||
}
|
||||
void MediaApp::createUI(QBoxLayout *appLayout)
|
||||
{
|
||||
appLayout->addWidget(m_player);
|
||||
m_positionLabel = new QLabel();
|
||||
m_positionSlider = new QSlider(Qt::Horizontal);
|
||||
m_positionSlider->setTickPosition(QSlider::TicksBelow);
|
||||
m_positionSlider->setTickInterval(10);
|
||||
m_positionSlider->setMaximum(1000);
|
||||
connect(m_positionSlider, SIGNAL(sliderMoved(int)), this, SLOT(setPosition(int)));
|
||||
m_volumeSlider = new QSlider(Qt::Horizontal);
|
||||
m_volumeSlider->setTickPosition(QSlider::TicksLeft);
|
||||
m_volumeSlider->setTickInterval(2);
|
||||
m_volumeSlider->setMaximum(10);
|
||||
m_volumeSlider->setMaximumSize(64,32);
|
||||
connect(m_volumeSlider, SIGNAL(sliderMoved(int)), m_player, SLOT(setVolume(int)));
|
||||
QGridLayout *posLayout = new QGridLayout;
|
||||
posLayout->addWidget(m_positionLabel, 1, 0);
|
||||
posLayout->addWidget(m_positionSlider, 1, 1, 1, 2);
|
||||
appLayout->addLayout(posLayout);
|
||||
QHBoxLayout *btnLayout = new QHBoxLayout;
|
||||
btnLayout->addStretch();
|
||||
m_openButton = initButton(QStyle::SP_DialogOpenButton, tr("Open File"),
|
||||
this, SLOT(open()), btnLayout);
|
||||
m_playButton = initButton(QStyle::SP_MediaPlay, tr("Play"),
|
||||
m_player, SLOT(play()), btnLayout);
|
||||
m_pauseButton = initButton(QStyle::SP_MediaPause, tr("Pause"),
|
||||
m_player, SLOT(pause()), btnLayout);
|
||||
m_stopButton = initButton(QStyle::SP_MediaStop, tr("Stop"),
|
||||
m_player, SLOT(stop()), btnLayout);
|
||||
m_fullScreenButton = initButton(QStyle::SP_TitleBarMaxButton, tr("Fullscreen"),
|
||||
this, SLOT(toggleFullScreen()), btnLayout);
|
||||
btnLayout->addStretch();
|
||||
m_volumeLabel = new QLabel();
|
||||
m_volumeLabel->setPixmap(
|
||||
style()->standardIcon(QStyle::SP_MediaVolume).pixmap(QSize(32, 32),
|
||||
QIcon::Normal, QIcon::On));
|
||||
btnLayout->addWidget(m_volumeLabel);
|
||||
btnLayout->addWidget(m_volumeSlider);
|
||||
appLayout->addLayout(btnLayout);
|
||||
}
|
||||
#include "moc_mediaapp.cpp"
|
||||
```

**player.h**

``` theme: Default; brush: cpp; gutter: true
#ifndef PLAYER_H
#define PLAYER_H

#include <QtCore/QTimer>
#include <QtCore/QTime>
#include <QGst/Pipeline>
#include <QGst/Ui/VideoWidget>

class Player : public QGst::Ui::VideoWidget
{
    Q_OBJECT
public:
    Player(QWidget *parent = 0);
    ~Player();

    void setUri(const QString &uri);

    QTime position() const;
    void setPosition(const QTime &pos);
    int volume() const;
    QTime length() const;
    QGst::State state() const;

public Q_SLOTS:
    void play();
    void pause();
    void stop();
    void setVolume(int volume);

Q_SIGNALS:
    void positionChanged();
    void stateChanged();

private:
    void onBusMessage(const QGst::MessagePtr &message);
    void handlePipelineStateChange(const QGst::StateChangedMessagePtr &scm);

    QGst::PipelinePtr m_pipeline;
    QTimer m_positionTimer;
};

#endif //PLAYER_H
```
**player.cpp**

``` theme: Default; brush: cpp; gutter: true
#include "player.h"
#include <QtCore/QDir>
#include <QtCore/QUrl>
#include <QGlib/Connect>
#include <QGlib/Error>
#include <QGst/Pipeline>
#include <QGst/ElementFactory>
#include <QGst/Bus>
#include <QGst/Message>
#include <QGst/Query>
#include <QGst/ClockTime>
#include <QGst/Event>
#include <QGst/StreamVolume>

Player::Player(QWidget *parent)
    : QGst::Ui::VideoWidget(parent)
{
    //this timer is used to tell the ui to change its position slider & label
    //every 100 ms, but only when the pipeline is playing
    connect(&m_positionTimer, SIGNAL(timeout()), this, SIGNAL(positionChanged()));
}

Player::~Player()
{
    if (m_pipeline) {
        m_pipeline->setState(QGst::StateNull);
        stopPipelineWatch();
    }
}

void Player::setUri(const QString & uri)
{
    QString realUri = uri;

    //if uri is not a real uri, assume it is a file path
    if (realUri.indexOf("://") < 0) {
        realUri = QUrl::fromLocalFile(realUri).toEncoded();
    }

    if (!m_pipeline) {
        m_pipeline = QGst::ElementFactory::make("playbin2").dynamicCast<QGst::Pipeline>();
        if (m_pipeline) {
            //let the video widget watch the pipeline for new video sinks
            watchPipeline(m_pipeline);

            //watch the bus for messages
            QGst::BusPtr bus = m_pipeline->bus();
            bus->addSignalWatch();
            QGlib::connect(bus, "message", this, &Player::onBusMessage);
        } else {
            qCritical() << "Failed to create the pipeline";
        }
    }

    if (m_pipeline) {
        m_pipeline->setProperty("uri", realUri);
    }
}

QTime Player::position() const
{
    if (m_pipeline) {
        //here we query the pipeline about its position
        //and we request that the result is returned in time format
        QGst::PositionQueryPtr query = QGst::PositionQuery::create(QGst::FormatTime);
        m_pipeline->query(query);
        return QGst::ClockTime(query->position()).toTime();
    } else {
        return QTime(0,0);
    }
}

void Player::setPosition(const QTime & pos)
{
    QGst::SeekEventPtr evt = QGst::SeekEvent::create(
        1.0, QGst::FormatTime, QGst::SeekFlagFlush,
        QGst::SeekTypeSet, QGst::ClockTime::fromTime(pos),
        QGst::SeekTypeNone, QGst::ClockTime::None
    );

    m_pipeline->sendEvent(evt);
}

int Player::volume() const
{
    if (m_pipeline) {
        QGst::StreamVolumePtr svp =
            m_pipeline.dynamicCast<QGst::StreamVolume>();

        if (svp) {
            return svp->volume(QGst::StreamVolumeFormatCubic) * 10;
        }
    }

    return 0;
}

void Player::setVolume(int volume)
{
    if (m_pipeline) {
        QGst::StreamVolumePtr svp =
            m_pipeline.dynamicCast<QGst::StreamVolume>();

        if (svp) {
            svp->setVolume((double)volume / 10, QGst::StreamVolumeFormatCubic);
        }
    }
}

QTime Player::length() const
{
    if (m_pipeline) {
        //here we query the pipeline about the content's duration
        //and we request that the result is returned in time format
        QGst::DurationQueryPtr query = QGst::DurationQuery::create(QGst::FormatTime);
        m_pipeline->query(query);
        return QGst::ClockTime(query->duration()).toTime();
    } else {
        return QTime(0,0);
    }
}

QGst::State Player::state() const
{
    return m_pipeline ? m_pipeline->currentState() : QGst::StateNull;
}

void Player::play()
{
    if (m_pipeline) {
        m_pipeline->setState(QGst::StatePlaying);
    }
}

void Player::pause()
{
    if (m_pipeline) {
        m_pipeline->setState(QGst::StatePaused);
    }
}

void Player::stop()
{
    if (m_pipeline) {
        m_pipeline->setState(QGst::StateNull);

        //once the pipeline stops, the bus is flushed so we will
        //not receive any StateChangedMessage about this.
        //so, to inform the ui, we have to emit this signal manually.
        Q_EMIT stateChanged();
    }
}

void Player::onBusMessage(const QGst::MessagePtr & message)
{
    switch (message->type()) {
    case QGst::MessageEos: //End of stream. We reached the end of the file.
        stop();
        break;
    case QGst::MessageError: //Some error occurred.
        qCritical() << message.staticCast<QGst::ErrorMessage>()->error();
        stop();
        break;
    case QGst::MessageStateChanged: //The element in message->source() has changed state
        if (message->source() == m_pipeline) {
            handlePipelineStateChange(message.staticCast<QGst::StateChangedMessage>());
        }
        break;
    default:
        break;
    }
}

void Player::handlePipelineStateChange(const QGst::StateChangedMessagePtr & scm)
{
    switch (scm->newState()) {
    case QGst::StatePlaying:
        //start the timer when the pipeline starts playing
        m_positionTimer.start(100);
        break;
    case QGst::StatePaused:
        //stop the timer when the pipeline pauses
        if (scm->oldState() == QGst::StatePlaying) {
            m_positionTimer.stop();
        }
        break;
    default:
        break;
    }

    Q_EMIT stateChanged();
}

#include "moc_player.cpp"
```
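
The volume handling in `Player::volume()` and `Player::setVolume()` above maps the UI slider's integer range 0–10 onto the 0.0–1.0 scale expected by `QGst::StreamVolume`. A standalone sketch of just that mapping (the function names here are hypothetical, not part of the example's API):

```cpp
#include <cassert>

// Hypothetical helpers mirroring the scaling in Player::volume() and
// Player::setVolume(): the slider runs 0..10, StreamVolume uses 0.0..1.0.
int volumeToSlider(double streamVolume)  // 0.0..1.0 -> 0..10
{
    return static_cast<int>(streamVolume * 10);
}

double sliderToVolume(int sliderValue)   // 0..10 -> 0.0..1.0
{
    return static_cast<double>(sliderValue) / 10;
}
```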

# Walkthrough

## Setting up GStreamer

We begin by looking at `main()`:

**main.cpp**

``` first-line: 4; theme: Default; brush: cpp; gutter: true
int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QGst::init(&argc, &argv);

    MediaApp media;
    media.show();

    if (argc == 2) {
        media.openFile(argv[1]);
    }

    return app.exec();
}
```

We first initialize QtGStreamer by calling `QGst::init()`, passing
`argc` and `argv`. Internally, this ensures that the GLib type system
and the GStreamer plugin registry are configured and initialized, along
with handling helpful environment variables such as `GST_DEBUG` and
common command-line options. Please see the [Running GStreamer
Applications](http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/gst-running.html)
section of the core reference manual for details.
Construction of the `MediaApp` (derived from
[`QWidget`](http://qt-project.org/doc/qt-5.0/qtwidgets/qwidget.html))
involves constructing the `Player` object and connecting its signals to
the UI:

**MediaApp::MediaApp()**

``` first-line: 20; theme: Default; brush: cpp; gutter: true
    //create the player
    m_player = new Player(this);
    connect(m_player, SIGNAL(positionChanged()), this, SLOT(onPositionChanged()));
    connect(m_player, SIGNAL(stateChanged()), this, SLOT(onStateChanged()));
```

Next, we instruct the `MediaApp` to open the file given on the command
line, if any:

**MediaApp::openFile()**

``` first-line: 43; theme: Default; brush: cpp; gutter: true
void MediaApp::openFile(const QString & fileName)
{
    m_baseDir = QFileInfo(fileName).path();

    m_player->stop();
    m_player->setUri(fileName);
    m_player->play();
}
```

This in turn instructs the `Player` to construct our GStreamer pipeline:

**Player::setUri()**

``` theme: Default; brush: cpp; gutter: true
void Player::setUri(const QString & uri)
{
    QString realUri = uri;

    //if uri is not a real uri, assume it is a file path
    if (realUri.indexOf("://") < 0) {
        realUri = QUrl::fromLocalFile(realUri).toEncoded();
    }

    if (!m_pipeline) {
        m_pipeline = QGst::ElementFactory::make("playbin2").dynamicCast<QGst::Pipeline>();
        if (m_pipeline) {
            //let the video widget watch the pipeline for new video sinks
            watchPipeline(m_pipeline);

            //watch the bus for messages
            QGst::BusPtr bus = m_pipeline->bus();
            bus->addSignalWatch();
            QGlib::connect(bus, "message", this, &Player::onBusMessage);
        } else {
            qCritical() << "Failed to create the pipeline";
        }
    }

    if (m_pipeline) {
        m_pipeline->setProperty("uri", realUri);
    }
}
```

Here, we first ensure that the pipeline will receive a proper URI. If
`Player::setUri()` is called with `/home/user/some/file.mp3`, the path
is modified to `file:///home/user/some/file.mp3`. `playbin2` only
accepts complete URIs.
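
The check in `Player::setUri()` boils down to "does the string contain a scheme separator?". Stripped of Qt, the logic can be sketched like this (`toUri` is a hypothetical helper, a simplification of `QUrl::fromLocalFile()` that assumes an absolute path with no characters needing percent-encoding):

```cpp
#include <cassert>
#include <string>

// Hypothetical helper mirroring Player::setUri(): anything without
// "://" is treated as a local file path and given the file:// scheme.
std::string toUri(const std::string &pathOrUri)
{
    if (pathOrUri.find("://") == std::string::npos) {
        return "file://" + pathOrUri;   // assumes an absolute path
    }
    return pathOrUri;                   // already a complete URI
}
```

With this rule, `/home/user/some/file.mp3` becomes `file:///home/user/some/file.mp3`, while an `http://` URI passes through unchanged.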

The pipeline is created via `QGst::ElementFactory::make()`. The
`Player` object inherits from the `QGst::Ui::VideoWidget` class, which
includes a function to watch for the `prepare-xwindow-id` message, which
associates the underlying video sink with a Qt widget used for
rendering. For clarity, here is a portion of the implementation:

**prepare-xwindow-id handling**

``` theme: Default; brush: cpp; gutter: true
QGlib::connect(pipeline->bus(), "sync-message",
               this, &PipelineWatch::onBusSyncMessage);
...
void PipelineWatch::onBusSyncMessage(const MessagePtr & msg)
{
    ...
    if (msg->internalStructure()->name() == QLatin1String("prepare-xwindow-id")) {
        XOverlayPtr overlay = msg->source().dynamicCast<XOverlay>();
        m_renderer->setVideoSink(overlay);
    }
}
```

Once the pipeline is created, we connect to the bus' message signal (via
`QGlib::connect()`) to dispatch state change signals:
``` theme: Default; brush: cpp; gutter: true
void Player::onBusMessage(const QGst::MessagePtr & message)
{
    switch (message->type()) {
    case QGst::MessageEos: //End of stream. We reached the end of the file.
        stop();
        break;
    case QGst::MessageError: //Some error occurred.
        qCritical() << message.staticCast<QGst::ErrorMessage>()->error();
        stop();
        break;
    case QGst::MessageStateChanged: //The element in message->source() has changed state
        if (message->source() == m_pipeline) {
            handlePipelineStateChange(message.staticCast<QGst::StateChangedMessage>());
        }
        break;
    default:
        break;
    }
}

void Player::handlePipelineStateChange(const QGst::StateChangedMessagePtr & scm)
{
    switch (scm->newState()) {
    case QGst::StatePlaying:
        //start the timer when the pipeline starts playing
        m_positionTimer.start(100);
        break;
    case QGst::StatePaused:
        //stop the timer when the pipeline pauses
        if (scm->oldState() == QGst::StatePlaying) {
            m_positionTimer.stop();
        }
        break;
    default:
        break;
    }

    Q_EMIT stateChanged();
}
```

Finally, we tell `playbin2` what to play by setting the `uri` property:

``` theme: Default; brush: cpp; gutter: false
m_pipeline->setProperty("uri", realUri);
```

## Starting Playback

After `Player::setUri()` is called, `MediaApp::openFile()` calls
`play()` on the `Player` object:

**Player::play()**

``` theme: Default; brush: cpp; gutter: true
void Player::play()
{
    if (m_pipeline) {
        m_pipeline->setState(QGst::StatePlaying);
    }
}
```

The other state control methods are equally simple:

**Player state functions**

``` theme: Default; brush: cpp; gutter: true
void Player::pause()
{
    if (m_pipeline) {
        m_pipeline->setState(QGst::StatePaused);
    }
}

void Player::stop()
{
    if (m_pipeline) {
        m_pipeline->setState(QGst::StateNull);

        //once the pipeline stops, the bus is flushed so we will
        //not receive any StateChangedMessage about this.
        //so, to inform the ui, we have to emit this signal manually.
        Q_EMIT stateChanged();
    }
}
```

Once the pipeline has entered the playing state, a state change message
is emitted on the GStreamer bus, which gets picked up by the `Player`:

**Player::onBusMessage()**

``` theme: Default; brush: cpp; gutter: true
void Player::onBusMessage(const QGst::MessagePtr & message)
{
    switch (message->type()) {
    case QGst::MessageEos: //End of stream. We reached the end of the file.
        stop();
        break;
    case QGst::MessageError: //Some error occurred.
        qCritical() << message.staticCast<QGst::ErrorMessage>()->error();
        stop();
        break;
    case QGst::MessageStateChanged: //The element in message->source() has changed state
        if (message->source() == m_pipeline) {
            handlePipelineStateChange(message.staticCast<QGst::StateChangedMessage>());
        }
        break;
    default:
        break;
    }
}
```

The `stateChanged` signal we connected to earlier is emitted and
handled:

**MediaApp::onStateChanged()**

``` theme: Default; brush: cpp; gutter: true
void MediaApp::onStateChanged()
{
    QGst::State newState = m_player->state();
    m_playButton->setEnabled(newState != QGst::StatePlaying);
    m_pauseButton->setEnabled(newState == QGst::StatePlaying);
    m_stopButton->setEnabled(newState != QGst::StateNull);
    m_positionSlider->setEnabled(newState != QGst::StateNull);
    m_volumeSlider->setEnabled(newState != QGst::StateNull);
    m_volumeLabel->setEnabled(newState != QGst::StateNull);
    m_volumeSlider->setValue(m_player->volume());

    //if we are in Null state, call onPositionChanged() to restore
    //the position of the slider and the text on the label
    if (newState == QGst::StateNull) {
        onPositionChanged();
    }
}
```

This updates the UI to reflect the current state of the player's
pipeline.

Driven by a
[`QTimer`](http://qt-project.org/doc/qt-5.0/qtcore/qtimer.html), the
`Player` emits the `positionChanged` signal at regular intervals for the
UI to handle:

**MediaApp::onPositionChanged()**

``` theme: Default; brush: cpp; gutter: true
void MediaApp::onPositionChanged()
{
    QTime length(0,0);
    QTime curpos(0,0);

    if (m_player->state() != QGst::StateReady &&
        m_player->state() != QGst::StateNull)
    {
        length = m_player->length();
        curpos = m_player->position();
    }

    m_positionLabel->setText(curpos.toString("hh:mm:ss.zzz")
                             + "/" +
                             length.toString("hh:mm:ss.zzz"));

    if (length != QTime(0,0)) {
        m_positionSlider->setValue(curpos.msecsTo(QTime(0,0)) * 1000 / length.msecsTo(QTime(0,0)));
    } else {
        m_positionSlider->setValue(0);
    }

    if (curpos != QTime(0,0)) {
        m_positionLabel->setEnabled(true);
        m_positionSlider->setEnabled(true);
    }
}
```
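
The slider arithmetic in `MediaApp::onPositionChanged()` and its inverse in `MediaApp::setPosition()` reduce to mapping a millisecond position onto the slider's 0–1000 range and back. A standalone sketch with plain millisecond counts (function names hypothetical; the Qt code obtains the counts via `QTime::msecsTo()`):

```cpp
#include <cassert>

// Hypothetical sketch of the slider math: the slider range is 0..1000,
// so its value is the playback position as a fraction of the duration,
// scaled by 1000.
int positionToSlider(long long posMs, long long lengthMs)
{
    if (lengthMs == 0) {
        return 0;   // unknown duration: park the slider at 0
    }
    return static_cast<int>(posMs * 1000 / lengthMs);
}

// The inverse mapping, applied when the user drags the slider.
long long sliderToPosition(int sliderValue, long long lengthMs)
{
    return lengthMs * sliderValue / 1000;
}
```

For example, 30 seconds into a one-minute clip gives slider value 500, and dragging the slider to 500 seeks back to the 30-second mark.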

The `MediaApp` queries the pipeline via the `Player`'s
`position()` method, which submits a position query. This is analogous
to `gst_element_query_position()`:

**Player::position()**

``` theme: Default; brush: cpp; gutter: true
QTime Player::position() const
{
    if (m_pipeline) {
        //here we query the pipeline about its position
        //and we request that the result is returned in time format
        QGst::PositionQueryPtr query = QGst::PositionQuery::create(QGst::FormatTime);
        m_pipeline->query(query);
        return QGst::ClockTime(query->position()).toTime();
    } else {
        return QTime(0,0);
    }
}
```

Due to the way Qt handles signals that cross threads, there is no need
to worry about calling UI functions from outside the UI thread in this
example.

# Conclusion

This tutorial has shown:

- How to create a basic pipeline
- How to create a video output
- Updating the GUI based on playback time

It has been a pleasure having you here, and see you soon!

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Basic tutorial 1: Hello world\!
|
||||
|
||||
This page last changed on Jun 29, 2012 by xartigas.
|
||||
|
||||
# Goal
|
||||
|
||||
Nothing better to get a first impression about a software library than
|
||||
to print “Hello World” on the screen\!
|
||||
|
||||
But since we are dealing with multimedia frameworks, we are going to
|
||||
play a video instead.
|
||||
|
||||
Do not be scared by the amount of code below: there are only 4 lines
|
||||
which do *real* work. The rest is cleanup code, and, in C, this is
|
||||
always a bit verbose.
|
||||
|
||||
Without further ado, get ready for your first GStreamer application...
|
||||
|
||||
# Hello world
|
||||
|
||||
Copy this code into a text file named `basic-tutorial-1.c` (or find it
|
||||
in the SDK installation).
|
||||
|
||||
**basic-tutorial-1.c**
|
||||
|
||||
``` theme: Default; brush: cpp; gutter: true
|
||||
#include <gst/gst.h>
|
||||
|
||||
int main(int argc, char *argv[]) {
|
||||
GstElement *pipeline;
|
||||
GstBus *bus;
|
||||
GstMessage *msg;
|
||||
|
||||
/* Initialize GStreamer */
|
||||
gst_init (&argc, &argv);
|
||||
|
||||
/* Build the pipeline */
|
||||
pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
|
||||
|
||||
/* Start playing */
|
||||
gst_element_set_state (pipeline, GST_STATE_PLAYING);
|
||||
|
||||
/* Wait until error or EOS */
|
||||
bus = gst_element_get_bus (pipeline);
|
||||
msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
|
||||
|
||||
/* Free resources */
|
||||
if (msg != NULL)
|
||||
gst_message_unref (msg);
|
||||
gst_object_unref (bus);
|
||||
gst_element_set_state (pipeline, GST_STATE_NULL);
|
||||
gst_object_unref (pipeline);
|
||||
return 0;
|
||||
}
|
||||
```
|
||||
|
||||
Compile it as described in [Installing on
|
||||
Linux](Installing%2Bon%2BLinux.html), [Installing on Mac OS
|
||||
X](Installing%2Bon%2BMac%2BOS%2BX.html) or [Installing on
|
||||
Windows](Installing%2Bon%2BWindows.html). If you get compilation errors,
|
||||
double-check the instructions given in those sections.
|
||||
|
||||
If everything built fine, fire up the executable\! You should see a
|
||||
window pop up, containing a video being played straight from the
|
||||
Internet, along with audio. Congratulations\!
|
||||
|
||||
<table>
|
||||
<tbody>
|
||||
<tr class="odd">
|
||||
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
|
||||
<td><div id="expander-750009403" class="expand-container">
|
||||
<div id="expander-control-750009403" class="expand-control">
|
||||
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
|
||||
</div>
|
||||
<div id="expander-content-750009403" class="expand-content">
|
||||
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
|
||||
<div class="panel" style="border-width: 1px;">
|
||||
<div class="panelContent">
|
||||
<p><code>gcc basic-tutorial-1.c -o basic-tutorial-1 `pkg-config --cflags --libs gstreamer-0.10`</code></p>
|
||||
</div>
|
||||
</div>
|
||||
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a></p>
|
||||
<p><span>This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. </span>Also, there is no latency management (buffering), so on slow connections, the movie might stop after a few seconds. See how <a href="Basic%2Btutorial%2B12%253A%2BStreaming.html">Basic tutorial 12: Streaming</a> solves this issue.</p>
|
||||
<p>Required libraries: <code>gstreamer-0.10</code></p>
|
||||
</div>
|
||||
</div></td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
|
||||
# Walkthrough
|
||||
|
||||
Let's review these lines of code and see what they do:
|
||||
|
||||
``` first-line: 8; theme: Default; brush: cpp; gutter: true
|
||||
/* Initialize GStreamer */
|
||||
gst_init (&argc, &argv);
|
||||
```
|
||||
|
||||
This must always be your first GStreamer command. Among other things,
|
||||
`gst_init()`:
|
||||
|
||||
- Initializes all internal structures
|
||||
|
||||
- Checks what plug-ins are available
|
||||
|
||||
- Executes any command-line option intended for GStreamer
|
||||
|
||||
If you always pass your command-line parameters `argc` and `argv` to
|
||||
`gst_init()`, your application will automatically benefit from the
|
||||
GStreamer standard command-line options (more on this in [Basic tutorial
|
||||
10: GStreamer
|
||||
tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html))
|
||||
|
||||
``` first-line: 11; theme: Default; brush: cpp; gutter: true
|
||||
/* Build the pipeline */
|
||||
pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
|
||||
```
|
||||
|
||||
This line is the heart of this tutorial, and exemplifies **two** key
|
||||
points: `gst_parse_launch()` and `playbin2`.
|
||||
|
||||
#### gst\_parse\_launch
|
||||
|
||||
GStreamer is a framework designed to handle multimedia flows. Media
|
||||
travels from the “source” elements (the producers), down to the “sink”
|
||||
elements (the consumers), passing through a series of intermediate
|
||||
elements performing all kinds of tasks. The set of all the
|
||||
interconnected elements is called a “pipeline”.
|
||||
|
||||
In GStreamer you usually build the pipeline by manually assembling the
|
||||
individual elements, but, when the pipeline is easy enough, and you do
|
||||
not need any advanced features, you can take the shortcut:
|
||||
`gst_parse_launch()`.
|
||||
|
||||
This function takes a textual representation of a pipeline and turns it
|
||||
into an actual pipeline, which is very handy. In fact, this function is
|
||||
so handy there is a tool built completely around it which you will get
|
||||
very acquainted with (see [Basic tutorial 10: GStreamer
|
||||
tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html) to
|
||||
learn about `gst-launch` and the `gst-launch` syntax).
|
||||
|
||||
#### playbin2
|
||||
|
||||
So, what kind of pipeline are we asking `gst_parse_launch()`to build for
|
||||
us? Here enters the second key point: We are building a pipeline
|
||||
composed of a single element called `playbin2`.
|
||||
|
||||
`playbin2` is a special element which acts as a source and as a sink,
|
||||
and is capable of implementing a whole pipeline. Internally, it creates
|
||||
and connects all the necessary elements to play your media, so you do
|
||||
not have to worry about it.
|
||||
|
||||
It does not allow the control granularity that a manual pipeline does,
|
||||
but, still, it permits enough customization to suffice for a wide range
|
||||
of applications. Including this tutorial.
|
||||
|
||||
In this example, we are only passing one parameter to `playbin2`, which
|
||||
is the URI of the media we want to play. Try changing it to something
|
||||
else\! Whether it is an `http://` or `file://` URI, `playbin2` will
|
||||
instantiate the appropriate GStreamer source transparently\!
|
||||
|
||||
If you mistype the URI, or the file does not exist, or you are missing a
|
||||
plug-in, GStreamer provides several notification mechanisms, but the
|
||||
only thing we are doing in this example is exiting on error, so do not
|
||||
expect much feedback.

``` first-line: 14; theme: Default; brush: cpp; gutter: true
/* Start playing */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
```

This line highlights another interesting concept: the state. Every
GStreamer element has an associated state, which you can more or less
think of as the Play/Pause button in your regular DVD player. For now,
suffice to say that playback will not start unless you set the pipeline
to the PLAYING state.

In this line, `gst_element_set_state()` is setting `pipeline` (our only
element, remember) to the PLAYING state, thus initiating playback.

``` first-line: 17; theme: Default; brush: cpp; gutter: true
/* Wait until error or EOS */
bus = gst_element_get_bus (pipeline);
msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
```

These lines will wait until an error occurs or the end of the stream is
found. `gst_element_get_bus()` retrieves the pipeline's bus, and
`gst_bus_timed_pop_filtered()` will block until you receive either an
ERROR or an EOS (End-Of-Stream) message through that bus. Do not worry much
about this line; the GStreamer bus is explained in [Basic tutorial 2:
GStreamer
concepts](Basic%2Btutorial%2B2%253A%2BGStreamer%2Bconcepts.html).

And that's it\! From this point onwards, GStreamer takes care of
everything. Execution will end when the media reaches its end (EOS) or
an error is encountered (try closing the video window, or unplugging the
network cable). The application can always be stopped by pressing
control-C in the console.

#### Cleanup

Before terminating the application, though, there are a couple of things
we need to do to tidy up correctly after ourselves.

``` first-line: 21; theme: Default; brush: cpp; gutter: true
/* Free resources */
if (msg != NULL)
  gst_message_unref (msg);
gst_object_unref (bus);
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (pipeline);
```

Always read the documentation of the functions you use, to know if you
should free the objects they return after using them.

In this case, `gst_bus_timed_pop_filtered()` returned a message which
needs to be freed with `gst_message_unref()` (more about messages in
[Basic tutorial 2: GStreamer
concepts](Basic%2Btutorial%2B2%253A%2BGStreamer%2Bconcepts.html)).

`gst_element_get_bus()` added a reference to the bus that must be freed
with `gst_object_unref()`. Setting the pipeline to the NULL state will
make sure it frees any resources it has allocated (more about states in
[Basic tutorial 3: Dynamic
pipelines](Basic%2Btutorial%2B3%253A%2BDynamic%2Bpipelines.html)).
Finally, unreferencing the pipeline will destroy it, and all its
contents.

# Conclusion

And so ends your first tutorial with GStreamer. We hope its brevity
serves as an example of how powerful this framework is\!

Let's recap a bit. Today we have learned:

- How to initialize GStreamer using `gst_init()`.

- How to quickly build a pipeline from a textual description using
  `gst_parse_launch()`.

- How to create an automatic playback pipeline using `playbin2`.

- How to signal GStreamer to start playback using
  `gst_element_set_state()`.

- How to sit back and relax, while GStreamer takes care of everything,
  using `gst_element_get_bus()` and `gst_bus_timed_pop_filtered()`.

The next tutorial will keep introducing more basic GStreamer elements,
and show you how to build a pipeline manually.

It has been a pleasure having you here, and see you soon\!

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Basic tutorial 10: GStreamer tools

This page last changed on Jun 01, 2012 by xartigas.

# Goal

GStreamer (and the GStreamer SDK) come with a set of tools which range
from handy to absolutely essential. There is no code in this tutorial;
just sit back and relax, and we will teach you:

- How to build and run GStreamer pipelines from the command line,
  without using C at all\!
- How to find out what GStreamer elements you have available and their
  capabilities.
- How to discover the internal structure of media files.

# Introduction

These tools are available in the `bin` directory of the SDK. You need to
move to this directory to execute them, because it is not added to the
system's `PATH` environment variable (to avoid polluting it too much).

Just open a terminal (or console window) and go to the `bin` directory
of your GStreamer SDK installation (read again the [Installing the
SDK](Installing%2Bthe%2BSDK.html) section to find out where this is),
and you are ready to start typing the commands given in this tutorial.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>On Linux, though, you can use the provided <code>/opt/gstreamer-sdk/bin/gst-sdk-shell</code> script to enter the GStreamer SDK shell environment, in which the <code>bin</code> directory is in the path. In this environment, you can use the GStreamer tools from any folder.</p></td>
</tr>
</tbody>
</table>

In order to allow multiple versions of GStreamer to coexist on the
same system, these tools are versioned, that is, a GStreamer version
number is appended to their name. This version of the SDK is based on
GStreamer 0.10, so the tools are called `gst-launch-0.10`,
`gst-inspect-0.10` and `gst-discoverer-0.10`.

# `gst-launch`

This tool accepts a textual description of a pipeline, instantiates it,
and sets it to the PLAYING state. It allows you to quickly check if a
given pipeline works, before going through the actual implementation
using GStreamer API calls.

Bear in mind that it can only create simple pipelines. In particular, it
can only simulate the interaction of the pipeline with the application
up to a certain level. In any case, it is extremely handy to test
pipelines quickly, and is used by GStreamer developers around the world
on a daily basis.

Please note that `gst-launch` is primarily a debugging tool for
developers. You should not build applications on top of it. Instead, use
the `gst_parse_launch()` function of the GStreamer API as an easy way to
construct pipelines from pipeline descriptions.

Although the rules to construct pipeline descriptions are very simple,
the concatenation of multiple elements can quickly make such
descriptions resemble black magic. Fear not, for everyone learns the
`gst-launch` syntax, eventually.

The command line for `gst-launch` consists of a list of options followed
by a PIPELINE-DESCRIPTION. Some simplified instructions are given next;
see the complete documentation at [the reference page](gst-launch.html)
for `gst-launch`.

#### Elements

In its simplest form, a PIPELINE-DESCRIPTION is a list of element types
separated by exclamation marks (\!). Go ahead and type in the following
command:

``` theme: Default; brush: plain; gutter: false
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! autovideosink
```

You should see a window with an animated video pattern. Use CTRL+C on
the terminal to stop the program.

This instantiates a new element of type `videotestsrc` (an element which
generates a sample video pattern), an `ffmpegcolorspace` (an element
which does color space conversion, making sure other elements can
understand each other), and an `autovideosink` (a window to which video
is rendered). Then, GStreamer tries to link the output of each element
to the input of the element appearing on its right in the description.
If more than one input or output Pad is available, the Pad Caps are used
to find two compatible Pads.

#### Properties

Properties may be appended to elements, in the form
*property=value* (multiple properties can be specified, separated by
spaces). Use the `gst-inspect` tool (explained next) to find out the
available properties for an element.

``` theme: Default; brush: plain; gutter: false
gst-launch-0.10 videotestsrc pattern=11 ! ffmpegcolorspace ! autovideosink
```

You should see a static video pattern, made of circles.

#### Named elements

Elements can be named using the `name` property; in this way, complex
pipelines involving branches can be created. Names allow linking to
elements created previously in the description, and are indispensable to
use elements with multiple output pads, like demuxers or tees, for
example.

Named elements are referred to using their name followed by a dot.

``` theme: Default; brush: plain; gutter: false
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! tee name=t ! queue ! autovideosink t. ! queue ! autovideosink
```

You should see two video windows, showing the same sample video pattern.
If you see only one, try to move it, since it is probably on top of the
second window.

This example instantiates a `videotestsrc`, linked to an
`ffmpegcolorspace`, linked to a `tee` (remember from [Basic tutorial 7:
Multithreading and Pad
Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html) that
a `tee` copies to each of its output pads everything coming through its
input pad). The `tee` is named simply ‘t’ (using the `name` property)
and then linked to a `queue` and an `autovideosink`. The same `tee` is
referred to using ‘t.’ (mind the dot) and then linked to a second
`queue` and a second `autovideosink`.

To learn why the queues are necessary read [Basic tutorial 7:
Multithreading and Pad
Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html).

#### Pads

Instead of letting GStreamer choose which Pad to use when linking two
elements, you may want to specify the Pads directly. You can do this by
adding a dot plus the Pad name after the name of the element (it must be
a named element). Learn the names of the Pads of an element by using
the `gst-inspect` tool.

This is useful, for example, when you want to retrieve one particular
stream out of a demuxer:

``` theme: Default; brush: plain; gutter: false
gst-launch-0.10.exe souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_00 ! matroskamux ! filesink location=sintel_video.mkv
```

This fetches a media file from the internet using `souphttpsrc`, which
is in webm format (a special kind of Matroska container, see [Basic
tutorial 2: GStreamer
concepts](Basic%2Btutorial%2B2%253A%2BGStreamer%2Bconcepts.html)). We
then open the container using `matroskademux`. This media contains both
audio and video, so `matroskademux` will create two output Pads, named
`video_00` and `audio_00`. We link `video_00` to a `matroskamux` element
to re-pack the video stream into a new container, and finally link it to
a `filesink`, which will write the stream into a file named
"sintel\_video.mkv" (the `location` property specifies the name of the
file).

All in all, we took a webm file, stripped it of audio, and generated a
new matroska file with the video. If we wanted to keep only the audio:

``` theme: Default; brush: plain; gutter: false
gst-launch-0.10.exe souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d d.audio_00 ! vorbisparse ! matroskamux ! filesink location=sintel_audio.mka
```

The `vorbisparse` element is required to extract some information from
the stream and put it in the Pad Caps, so the next element,
`matroskamux`, knows how to deal with the stream. In the case of video
this was not necessary, because `matroskademux` already extracted this
information and added it to the Caps.

Note that in the above two examples no media has been decoded or played.
We have just moved from one container to another (demultiplexing and
re-multiplexing again).

#### Caps filters

When an element has more than one output pad, it might happen that the
link to the next element is ambiguous: the next element may have more
than one compatible input pad, or its input pad may be compatible with
the Pad Caps of all the output pads. In these cases GStreamer will link
using the first pad that is available, which pretty much amounts to
saying that GStreamer will choose one output pad at random.

Consider the following pipeline:

``` theme: Default; brush: plain; gutter: false
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux ! filesink location=test
```

This is the same media file and demuxer as in the previous example. The
input Pad Caps of `filesink` are `ANY`, meaning that it can accept any
kind of media. Which one of the two output pads of `matroskademux` will
be linked against the filesink? `video_00` or `audio_00`? You cannot
know.

You can remove this ambiguity, though, by using named pads, as in the
previous sub-section, or by using **Caps Filters**:

``` theme: Default; brush: plain; gutter: false
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux ! video/x-vp8 ! matroskamux ! filesink location=sintel_video.mkv
```

A Caps Filter behaves like a pass-through element which does nothing and
only accepts media with the given Caps, effectively resolving the
ambiguity. In this example, between `matroskademux` and `matroskamux` we
added a `video/x-vp8` Caps Filter to specify that we are interested in
the output pad of `matroskademux` which can produce this kind of video.

To find out the Caps an element accepts and produces, use the
`gst-inspect` tool. To find out the Caps contained in a particular file,
use the `gst-discoverer` tool. To find out the Caps an element is
producing for a particular pipeline, run `gst-launch` as usual, with the
`-v` option to print Caps information.

#### Examples

Play a media file using `playbin2` (as in [Basic tutorial 1: Hello
world\!](Basic%2Btutorial%2B1%253A%2BHello%2Bworld%2521.html)):

``` theme: Default; brush: plain; gutter: false
gst-launch-0.10 playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm
```

A fully operational playback pipeline, with audio and video (more or less
the same pipeline that `playbin2` will create internally):

``` theme: Default; brush: plain; gutter: false
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d ! queue ! vp8dec ! ffmpegcolorspace ! autovideosink d. ! queue ! vorbisdec ! audioconvert ! audioresample ! autoaudiosink
```

A transcoding pipeline, which opens the webm container and decodes both
streams (via `uridecodebin`), then re-encodes the audio and video branches
with a different codec, and puts them back together in an Ogg container
(just for the sake of it):

``` theme: Default; brush: plain; gutter: false
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm name=d ! queue ! theoraenc ! oggmux name=m ! filesink location=sintel.ogg d. ! queue ! audioconvert ! audioresample ! flacenc ! m.
```

A rescaling pipeline. The `videoscale` element performs a rescaling
operation whenever the frame size is different in the input and the
output caps. The output caps are set by the Caps Filter to 320x200:

``` theme: Default; brush: plain; gutter: false
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! queue ! videoscale ! video/x-raw-yuv,width=320,height=200 ! ffmpegcolorspace ! autovideosink
```

This short description of `gst-launch` should be enough to get you
started. Remember that you have the [complete documentation available
here](gst-launch.html).

# `gst-inspect`

This tool has three modes of operation:

- Without arguments, it lists all available element types, that is,
  the types you can use to instantiate new elements.
- With a file name as an argument, it treats the file as a GStreamer
  plugin, tries to open it, and lists all the elements described
  inside.
- With a GStreamer element name as an argument, it lists all
  information regarding that element.

Let's see an example of the third mode:

``` theme: Default; brush: plain; gutter: true
gst-inspect-0.10 vp8dec

Factory Details:
  Long name:    On2 VP8 Decoder
  Class:        Codec/Decoder/Video
  Description:  Decode VP8 video streams
  Author(s):    David Schleef <ds@entropywave.com>
  Rank:         primary (256)
Plugin Details:
  Name:                 vp8
  Description:          VP8 plugin
  Filename:             I:\gstreamer-sdk\2012.5\x86\lib\gstreamer-0.10\libgstvp8.dll
  Version:              0.10.23
  License:              LGPL
  Source module:        gst-plugins-bad
  Source release date:  2012-02-20
  Binary package:       GStreamer Bad Plug-ins (GStreamer SDK)
  Origin URL:           http://www.gstreamer.com
GObject
 +----GstObject
       +----GstElement
             +----GstBaseVideoCodec
                   +----GstBaseVideoDecoder
                         +----GstVP8Dec
Pad Templates:
  SRC template: 'src'
    Availability: Always
    Capabilities:
      video/x-raw-yuv
                 format: I420
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
  SINK template: 'sink'
    Availability: Always
    Capabilities:
      video/x-vp8

Element Flags:
  no flags set
Element Implementation:
  Has change_state() function: gst_base_video_decoder_change_state
  Has custom save_thyself() function: gst_element_save_thyself
  Has custom restore_thyself() function: gst_element_restore_thyself
Element has no clocking capabilities.
Element has no indexing capabilities.
Element has no URI handling capabilities.
Pads:
  SRC: 'src'
    Implementation:
      Has custom eventfunc(): gst_base_video_decoder_src_event
      Has custom queryfunc(): gst_base_video_decoder_src_query
      Provides query types:
                (1): position (Current position)
                (2): duration (Total duration)
                (8): convert (Converting between formats)
      Has custom iterintlinkfunc(): gst_pad_iterate_internal_links_default
      Has getcapsfunc(): gst_pad_get_fixed_caps_func
      Has acceptcapsfunc(): gst_pad_acceptcaps_default
    Pad Template: 'src'
  SINK: 'sink'
    Implementation:
      Has chainfunc(): gst_base_video_decoder_chain
      Has custom eventfunc(): gst_base_video_decoder_sink_event
      Has custom queryfunc(): gst_base_video_decoder_sink_query
      Has custom iterintlinkfunc(): gst_pad_iterate_internal_links_default
      Has setcapsfunc(): gst_base_video_decoder_sink_setcaps
      Has acceptcapsfunc(): gst_pad_acceptcaps_default
    Pad Template: 'sink'
Element Properties:
  name                : The name of the object
                        flags: readable, writable
                        String. Default: "vp8dec0"
  post-processing     : Enable post processing
                        flags: readable, writable
                        Boolean. Default: false
  post-processing-flags: Flags to control post processing
                        flags: readable, writable
                        Flags "GstVP8DecPostProcessingFlags" Default: 0x00000003, "demacroblock+deblock"
                           (0x00000001): deblock      - Deblock
                           (0x00000002): demacroblock - Demacroblock
                           (0x00000004): addnoise     - Add noise
  deblocking-level    : Deblocking level
                        flags: readable, writable
                        Unsigned Integer. Range: 0 - 16 Default: 4
  noise-level         : Noise level
                        flags: readable, writable
                        Unsigned Integer. Range: 0 - 16 Default: 0
```

The most relevant sections are:

- Pad Templates (line 25): This lists all the kinds of Pads this
  element can have, along with their capabilities. This is where you
  look to find out if an element can link with another one. In this
  case, it has only one sink pad template, accepting only
  `video/x-vp8` (encoded video data in VP8 format) and only one source
  pad template, producing `video/x-raw-yuv` (decoded video data).
- Element Properties (line 70): This lists the properties of the
  element, along with their type and accepted values.

For more information, you can check the [documentation
page](gst-inspect.html) of `gst-inspect`.

# `gst-discoverer`

This tool is a wrapper around the `GstDiscoverer` object shown in [Basic
tutorial 9: Media information
gathering](Basic%2Btutorial%2B9%253A%2BMedia%2Binformation%2Bgathering.html).
It accepts a URI from the command line and prints all information
regarding the media that GStreamer can extract. It is useful to find out
what container and codecs have been used to produce the media, and
therefore what elements you need to put in a pipeline to play it.

Use `gst-discoverer --help` to obtain the list of available options,
which basically control the amount of verbosity of the output.

Let's see an example:

``` theme: Default; brush: plain; gutter: false
gst-discoverer-0.10 http://docs.gstreamer.com/media/sintel_trailer-480p.webm -v

Analyzing http://docs.gstreamer.com/media/sintel_trailer-480p.webm
Done discovering http://docs.gstreamer.com/media/sintel_trailer-480p.webm
Topology:
  container: video/webm
    audio: audio/x-vorbis, channels=(int)2, rate=(int)48000
      Codec:
        audio/x-vorbis, channels=(int)2, rate=(int)48000
      Additional info:
        None
      Language: en
      Channels: 2
      Sample rate: 48000
      Depth: 0
      Bitrate: 80000
      Max bitrate: 0
      Tags:
        taglist, language-code=(string)en, container-format=(string)Matroska, audio-codec=(string)Vorbis, application-name=(string)ffmpeg2theora-0.24, encoder=(string)"Xiph.Org\ libVorbis\ I\ 20090709", encoder-version=(uint)0, nominal-bitrate=(uint)80000, bitrate=(uint)80000;
    video: video/x-vp8, width=(int)854, height=(int)480, framerate=(fraction)25/1
      Codec:
        video/x-vp8, width=(int)854, height=(int)480, framerate=(fraction)25/1
      Additional info:
        None
      Width: 854
      Height: 480
      Depth: 0
      Frame rate: 25/1
      Pixel aspect ratio: 1/1
      Interlaced: false
      Bitrate: 0
      Max bitrate: 0
      Tags:
        taglist, video-codec=(string)"VP8\ video", container-format=(string)Matroska;

Properties:
  Duration: 0:00:52.250000000
  Seekable: yes
  Tags:
      video codec: On2 VP8
      language code: en
      container format: Matroska
      application name: ffmpeg2theora-0.24
      encoder: Xiph.Org libVorbis I 20090709
      encoder version: 0
      audio codec: Vorbis
      nominal bitrate: 80000
      bitrate: 80000
```

# Conclusion

This tutorial has shown:

- How to build and run GStreamer pipelines from the command line using
  the `gst-launch` tool.
- How to find out what GStreamer elements you have available and their
  capabilities, using the `gst-inspect` tool.
- How to discover the internal structure of media files, using
  `gst-discoverer`.

It has been a pleasure having you here, and see you soon\!
# GStreamer SDK documentation : Basic tutorial 11: Debugging tools

This page last changed on Jun 04, 2012 by xartigas.

# Goal

Sometimes things won’t go as expected and the error messages retrieved
from the bus (if any) just don’t provide enough information. Luckily,
GStreamer ships with massive amounts of debug information, which usually
hints at what the problem might be. This tutorial shows:

- How to get more debug information from GStreamer.

- How to print your own debug information into the GStreamer log.

- How to get pipeline graphs.

# Printing debug information

### The debug log

GStreamer and its plugins are full of debug traces, that is, places in
the code where a particularly interesting piece of information is
printed to the console, along with time stamping, process, category,
source code file, function and element information.

The debug output is controlled with the `GST_DEBUG` environment
variable. Here’s an example with `GST_DEBUG=2`:

``` theme: Default; brush: plain; gutter: false
0:00:00.868050000 1592 09F62420 WARN filesrc gstfilesrc.c:1044:gst_file_src_start:<filesrc0> error: No such file "non-existing-file.webm"
```

As you can see, this is quite a bit of information. In fact, the
GStreamer debug log is so verbose that, when fully enabled, it can render
applications unresponsive (due to the console scrolling) or fill up
megabytes of text files (when redirected to a file). For this reason,
the logs are categorized, and you seldom need to enable all categories
at once.

The first category is the Debug Level, which is a number specifying the
amount of desired output:

<table>
<tbody>
<tr class="odd">
<td><span style="color: rgb(192,192,192);">0</span></td>
<td>none</td>
<td>No debug information is output.</td>
</tr>
<tr class="even">
<td><span style="color: rgb(192,192,192);">1</span></td>
<td>ERROR</td>
<td>Logs all fatal errors, that is, errors that do not allow the core or elements to perform the requested operation.</td>
</tr>
<tr class="odd">
<td><span style="color: rgb(192,192,192);">2</span></td>
<td>WARNING</td>
<td>Logs all warnings: non-fatal, but user-visible, problems.</td>
</tr>
<tr class="even">
<td><span style="color: rgb(192,192,192);">3</span></td>
<td>INFO</td>
<td>Logs informational messages, for events that happen only once or are important and rare enough to be logged at this level.</td>
</tr>
<tr class="odd">
<td><span style="color: rgb(192,192,192);">4</span></td>
<td>DEBUG</td>
<td>Logs debug messages, for events that happen only a limited number of times during an object's lifetime (setup, teardown, change of parameters).</td>
</tr>
<tr class="even">
<td><span style="color: rgb(192,192,192);">5</span></td>
<td>LOG</td>
<td>Logs messages for events that happen repeatedly during an object's lifetime (streaming and steady-state conditions).</td>
</tr>
</tbody>
</table>

To enable debug output, set the `GST_DEBUG` environment variable to the
desired debug level. All levels below it will also be shown (i.e., if
you set `GST_DEBUG=2`, you will get both `ERROR` and
`WARNING` messages).

Furthermore, each plugin or part of GStreamer defines its own
category, so you can specify a debug level for each individual category.
For example, `GST_DEBUG=2,audiotestsrc:5` will use Debug Level 5 for
the `audiotestsrc` element, and 2 for all the others.

The `GST_DEBUG` environment variable, then, is a comma-separated list of
*category*:*level* pairs, with an optional *level* at the beginning,
representing the default debug level for all categories.

The `'*'` wildcard is also available. For example
`GST_DEBUG=2,audio*:5` will use Debug Level 5 for all categories
starting with the word `audio`. `GST_DEBUG=*:2` is equivalent to
`GST_DEBUG=2`.

Use `gst-launch-0.10 --gst-debug-help` to obtain the list of all
registered categories. Bear in mind that each plugin registers its own
categories, so, when installing or removing plugins, this list can
change.

Use `GST_DEBUG` when the error information posted on the GStreamer bus
does not help you nail down a problem. It is common practice to redirect
the output log to a file, and then examine it later, searching for
specific messages.

The content of each line in the debug output is:

``` theme: Default; brush: plain; gutter: false
0:00:00.868050000 1592 09F62420 WARN filesrc gstfilesrc.c:1044:gst_file_src_start:<filesrc0> error: No such file "non-existing-file.webm"
```

<table>
<thead>
<tr class="header">
<th><code>0:00:00.868050000</code></th>
<th>Time stamp in HH:MM:SS.sssssssss format since the start of the program</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><code>1592</code></td>
<td>Process ID from which the message was issued. Useful when your problem involves multiple processes</td>
</tr>
<tr class="even">
<td><code>09F62420</code></td>
<td><span>Thread ID from which the message was issued. Useful when your problem involves multiple threads</span></td>
</tr>
<tr class="odd">
<td><code>WARN</code></td>
<td>Debug level of the message</td>
</tr>
<tr class="even">
<td><code>filesrc</code></td>
<td>Debug Category of the message</td>
</tr>
<tr class="odd">
<td><code>gstfilesrc.c:1044</code></td>
<td>Source file and line in the GStreamer source code where this message is printed</td>
</tr>
<tr class="even">
<td><code>gst_file_src_start</code></td>
<td>Function from which the message was issued</td>
</tr>
<tr class="odd">
<td><code>&lt;filesrc0&gt;</code></td>
<td>Name of the object that issued the message. It can be an element, a Pad, or something else. Useful when you have multiple elements of the same kind and need to distinguish among them. Naming your elements with the name property will make this debug output more readable (otherwise, GStreamer assigns each new element a unique name).</td>
</tr>
<tr class="even">
<td><code>error: No such file "non-existing-file.webm"</code></td>
<td>The actual message.</td>
</tr>
</tbody>
</table>

### Adding your own debug information

In the parts of your code that interact with GStreamer, it is useful
to use GStreamer’s debugging facilities. This way, all debug output
ends up in the same file, and the temporal relationship between
different messages is preserved.

To do so, use the `GST_ERROR()`, `GST_WARNING()`, `GST_INFO()`,
`GST_LOG()` and `GST_DEBUG()` macros. They accept the same parameters as
`printf`, and they use the `default` category (`default` will be shown
as the Debug Category in the output log).

To change the category to something more meaningful, add these two lines
at the top of your code:

``` theme: Default; brush: cpp; gutter: true
GST_DEBUG_CATEGORY_STATIC (my_category);
#define GST_CAT_DEFAULT my_category
```

And then this one after you have initialized GStreamer with
`gst_init()`:

``` theme: Default; brush: cpp; gutter: false
GST_DEBUG_CATEGORY_INIT (my_category, "my category", 0, "This is my very own");
```

This registers a new category (that is, for the duration of your
application only: it is not stored in any file), and sets it as the
default category for your code. See the documentation
for `GST_DEBUG_CATEGORY_INIT()`.

### Getting pipeline graphs

For those cases where your pipeline starts to grow too large and you
lose track of what is connected with what, GStreamer has the capability
to output graph files. These are `.dot` files, readable with free
programs like [GraphViz](http://www.graphviz.org), that describe the
topology of your pipeline, along with the caps negotiated in each link.

This is also very handy when using all-in-one elements like `playbin2`
or `uridecodebin`, which instantiate several elements inside them. Use
the `.dot` files to learn what pipeline they have created inside (and
learn a bit of GStreamer along the way).

To obtain `.dot` files, simply set
the `GST_DEBUG_DUMP_DOT_DIR` environment variable to point to the
folder where you want the files to be placed. `gst-launch` will create
a `.dot` file at each state change, so you can see the evolution of the
caps negotiation. Unset the variable to disable this facility. From
within your application, you can use the
`GST_DEBUG_BIN_TO_DOT_FILE()` and
`GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS()` macros to generate `.dot` files
at your convenience.
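
As a sketch of this workflow from the command line (the dump directory
and URI are arbitrary choices, and `gst-launch-0.10` and GraphViz's
`dot` must be installed for the respective steps to do anything):

```shell
# Tell GStreamer where to drop the graph files (any writable folder works)
export GST_DEBUG_DUMP_DOT_DIR=/tmp/gst-dot
mkdir -p "$GST_DEBUG_DUMP_DOT_DIR"

# Any gst-launch run now writes a .dot file at each state change
if command -v gst-launch-0.10 >/dev/null; then
  gst-launch-0.10 playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm
fi

# Render the dumps to PNG with GraphViz (-O derives one output name per input)
if command -v dot >/dev/null; then
  dot -Tpng -O "$GST_DEBUG_DUMP_DOT_DIR"/*.dot
fi
```

The exact `.dot` file names encode the time of the dump and the state
transition, so you can follow the pipeline's evolution file by file.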

Here is an example of the kind of pipeline that `playbin2`
generates. It is very complex because `playbin2` can handle many
different cases: your manual pipelines normally do not need to be this
long. If your manual pipeline is starting to get very big, consider
using `playbin2`.

![](attachments/327830/2424840.png)

To download the full-size picture, use the attachments link at the top
of this page (it's the paperclip icon).

# Conclusion

This tutorial has shown:

- How to get more debug information from GStreamer using the
  `GST_DEBUG` environment variable.
- How to print your own debug information into the GStreamer log with
  the `GST_ERROR()` macro and relatives.
- How to get pipeline graphs with the
  `GST_DEBUG_DUMP_DOT_DIR` environment variable.

It has been a pleasure having you here, and see you soon!

## Attachments:

![](images/icons/bullet_blue.gif)
[playbin2.png](attachments/327830/2424840.png) (image/png)

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Basic tutorial 12: Streaming

This page last changed on Sep 28, 2012 by xartigas.

# Goal

Playing media straight from the Internet without storing it locally is
known as Streaming. We have been doing it throughout the tutorials
whenever we used a URI starting with `http://`. This tutorial shows a
couple of additional points to keep in mind when streaming. In
particular:

- How to enable buffering (to alleviate network problems)
- How to recover from interruptions (lost clock)

# Introduction

When streaming, media chunks are decoded and queued for presentation as
soon as they arrive from the network. This means that if a chunk is
delayed (which is not an uncommon situation at all on the Internet) the
presentation queue might run dry and media playback could stall.

The universal solution is to build a “buffer”, that is, to allow a
certain number of media chunks to be queued before playback starts. This
way, playback start is delayed a bit but, if some chunks are late,
reproduction is not impacted, as there are more chunks in the queue,
waiting.

As it turns out, this solution is already implemented in GStreamer, but
the previous tutorials have not been benefiting from it. Some elements,
like the `queue2` and `multiqueue` found inside `playbin2`, are capable
of building this buffer and posting bus messages regarding the buffer
level (the state of the queue). An application wanting more network
resilience should therefore listen to these messages and pause playback
if the buffer level is not high enough (usually, whenever it is below
100%).

To achieve synchronization among multiple sinks (for example, an audio
and a video sink) a global clock is used. This clock is selected by
GStreamer among all elements which can provide one. Under some
circumstances, for example, an RTP source switching streams or changing
the output device, this clock can be lost and a new one needs to be
selected. This happens mostly when dealing with streaming, so the
process is explained in this tutorial.

When the clock is lost, the application receives a message on the bus;
to select a new one, the application just needs to set the pipeline to
PAUSED and then to PLAYING again.

# A network-resilient example

Copy this code into a text file named `basic-tutorial-12.c`.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>This tutorial is included in the SDK since release 2012.7. If you cannot find it in the downloaded code, please install the latest release of the GStreamer SDK.</p></td>
</tr>
</tbody>
</table>

**basic-tutorial-12.c**

``` theme: Default; brush: cpp; gutter: true
#include <gst/gst.h>
#include <string.h>

typedef struct _CustomData {
  gboolean is_live;
  GstElement *pipeline;
  GMainLoop *loop;
} CustomData;

static void cb_message (GstBus *bus, GstMessage *msg, CustomData *data) {
  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR: {
      GError *err;
      gchar *debug;

      gst_message_parse_error (msg, &err, &debug);
      g_print ("Error: %s\n", err->message);
      g_error_free (err);
      g_free (debug);

      gst_element_set_state (data->pipeline, GST_STATE_READY);
      g_main_loop_quit (data->loop);
      break;
    }
    case GST_MESSAGE_EOS:
      /* end-of-stream */
      gst_element_set_state (data->pipeline, GST_STATE_READY);
      g_main_loop_quit (data->loop);
      break;
    case GST_MESSAGE_BUFFERING: {
      gint percent = 0;

      /* If the stream is live, we do not care about buffering. */
      if (data->is_live) break;

      gst_message_parse_buffering (msg, &percent);
      g_print ("Buffering (%3d%%)\r", percent);
      /* Wait until buffering is complete before start/resume playing */
      if (percent < 100)
        gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
      else
        gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
      break;
    }
    case GST_MESSAGE_CLOCK_LOST:
      /* Get a new clock */
      gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
      gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
      break;
    default:
      /* Unhandled message */
      break;
  }
}

int main(int argc, char *argv[]) {
  GstElement *pipeline;
  GstBus *bus;
  GstStateChangeReturn ret;
  GMainLoop *main_loop;
  CustomData data;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));

  /* Build the pipeline */
  pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
  bus = gst_element_get_bus (pipeline);

  /* Start playing */
  ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (pipeline);
    return -1;
  } else if (ret == GST_STATE_CHANGE_NO_PREROLL) {
    data.is_live = TRUE;
  }

  main_loop = g_main_loop_new (NULL, FALSE);
  data.loop = main_loop;
  data.pipeline = pipeline;

  gst_bus_add_signal_watch (bus);
  g_signal_connect (bus, "message", G_CALLBACK (cb_message), &data);

  g_main_loop_run (main_loop);

  /* Free resources */
  g_main_loop_unref (main_loop);
  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><div id="expander-8150053" class="expand-container">
<div id="expander-control-8150053" class="expand-control">
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-8150053" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc basic-tutorial-12.c -o basic-tutorial-12 `pkg-config --cflags --libs gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a></p>
<p>This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. In the console window, you should see a buffering message, and playback should only start when the buffering reaches 100%. This percentage might not change at all if your connection is fast enough and buffering is not required.</p>
<p>Required libraries: <code>gstreamer-0.10</code></p>
</div>
</div></td>
</tr>
</tbody>
</table>

# Walkthrough

The only special thing this tutorial does is react to certain messages;
therefore, the initialization code is very simple and should be
self-explanatory by now. The only new bit is the detection of live
streams:

``` first-line: 74; theme: Default; brush: cpp; gutter: true
/* Start playing */
ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
  g_printerr ("Unable to set the pipeline to the playing state.\n");
  gst_object_unref (pipeline);
  return -1;
} else if (ret == GST_STATE_CHANGE_NO_PREROLL) {
  data.is_live = TRUE;
}
```

Live streams cannot be paused, so they behave in the PAUSED state as if
they were in the PLAYING state. Setting live streams to PAUSED succeeds,
but returns `GST_STATE_CHANGE_NO_PREROLL` instead of
`GST_STATE_CHANGE_SUCCESS`, to indicate that this is a live stream. We
receive the NO\_PREROLL return code even though we are trying to
set the pipeline to PLAYING, because state changes happen progressively
(from NULL to READY, to PAUSED and then to PLAYING).

We care about live streams because we want to disable buffering for
them, so we take note of the result of `gst_element_set_state()` in the
`is_live` variable.

Let’s now review the interesting parts of the message parsing callback:

``` first-line: 31; theme: Default; brush: cpp; gutter: true
    case GST_MESSAGE_BUFFERING: {
      gint percent = 0;

      /* If the stream is live, we do not care about buffering. */
      if (data->is_live) break;

      gst_message_parse_buffering (msg, &percent);
      g_print ("Buffering (%3d%%)\r", percent);
      /* Wait until buffering is complete before start/resume playing */
      if (percent < 100)
        gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
      else
        gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
      break;
    }
```

First, if this is a live source, ignore buffering messages.

We parse the buffering message with `gst_message_parse_buffering()` to
retrieve the buffering level.

Then, we print the buffering level on the console and set the pipeline
to PAUSED if it is below 100%. Otherwise, we set the pipeline to
PLAYING.

At startup, we will see the buffering level rise up to 100% before
playback starts, which is what we wanted to achieve. If, later on, the
network becomes slow or unresponsive and our buffer depletes, we will
receive new buffering messages with levels below 100%, so we will pause
the pipeline again until enough buffer has been built up.

``` theme: Default; brush: cpp; gutter: false
    case GST_MESSAGE_CLOCK_LOST:
      /* Get a new clock */
      gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
      gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
      break;
```

For the second network issue, the loss of clock, we simply set the
pipeline to PAUSED and back to PLAYING, so a new clock is selected,
waiting for new media chunks to be received if necessary.

# Conclusion

This tutorial has described how to add network resilience to your
application with two very simple precautions:

- Taking care of buffering messages sent by the pipeline
- Taking care of clock loss

Handling these messages improves the application’s response to network
problems, increasing the overall playback smoothness.

It has been a pleasure having you here, and see you soon!

## Attachments:

![](images/icons/bullet_blue.gif)
[basic-tutorial-12.c](attachments/327806/2424843.c) (text/plain)
![](images/icons/bullet_blue.gif)
[vs2010.zip](attachments/327806/2424844.zip) (application/zip)

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Basic tutorial 13: Playback speed

This page last changed on Jul 06, 2012 by xartigas.

# Goal

Fast-forward, reverse-playback and slow-motion are all techniques
collectively known as *trick modes*, and they all have in common that
they modify the normal playback rate. This tutorial shows how to achieve
these effects and adds frame-stepping into the deal. In particular, it
shows:

- How to change the playback rate, faster and slower than normal,
  forward and backwards.
- How to advance a video frame-by-frame

# Introduction

Fast-forward is the technique that plays a media stream at a speed
higher than its normal (intended) speed, whereas slow-motion uses a
speed lower than the intended one. Reverse playback does the same thing
but backwards, from the end of the stream to the beginning.

All these techniques do is change the playback rate, which is a variable
equal to 1.0 for normal playback, greater than 1.0 (in absolute value)
for fast modes, lower than 1.0 (in absolute value) for slow modes,
positive for forward playback and negative for reverse playback.

GStreamer provides two mechanisms to change the playback rate: Step
Events and Seek Events. Step Events allow skipping a given amount of
media besides changing the subsequent playback rate (only to positive
values). Seek Events, additionally, allow jumping to any position in the
stream and setting positive and negative playback rates.

In [Basic tutorial 4: Time
management](Basic%2Btutorial%2B4%253A%2BTime%2Bmanagement.html) seek
events have already been shown, using a helper function to hide their
complexity. This tutorial explains in a bit more detail how to use these
events.

Step Events are a more convenient way of changing the playback rate, due
to the reduced number of parameters needed to create them; however,
their implementation in GStreamer still needs a bit more polishing,
so Seek Events are used in this tutorial instead.

To use these events, they are created and then passed onto the pipeline,
where they propagate upstream until they reach an element that can
handle them. If an event is passed onto a bin element like `playbin2`,
it will simply feed the event to all its sinks, which will result in
multiple seeks being performed. The common approach is to retrieve one
of `playbin2`’s sinks through the `video-sink` or
`audio-sink` properties and feed the event directly into the sink.

Frame stepping is a technique that allows playing a video frame by
frame. It is implemented by pausing the pipeline, and then sending Step
Events to skip one frame each time.

# A trick mode player

Copy this code into a text file named `basic-tutorial-13.c`.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>This tutorial is included in the SDK since release 2012.7. If you cannot find it in the downloaded code, please install the latest release of the GStreamer SDK.</p></td>
</tr>
</tbody>
</table>

**basic-tutorial-13.c**

``` theme: Default; brush: cpp; gutter: true
#include <string.h>
#include <gst/gst.h>

typedef struct _CustomData {
  GstElement *pipeline;
  GstElement *video_sink;
  GMainLoop *loop;

  gboolean playing;  /* Playing or Paused */
  gdouble rate;      /* Current playback rate (can be negative) */
} CustomData;

/* Send seek event to change rate */
static void send_seek_event (CustomData *data) {
  gint64 position;
  GstFormat format = GST_FORMAT_TIME;
  GstEvent *seek_event;

  /* Obtain the current position, needed for the seek event */
  if (!gst_element_query_position (data->pipeline, &format, &position)) {
    g_printerr ("Unable to retrieve current position.\n");
    return;
  }

  /* Create the seek event */
  if (data->rate > 0) {
    seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
        GST_SEEK_TYPE_SET, position, GST_SEEK_TYPE_NONE, 0);
  } else {
    seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
        GST_SEEK_TYPE_SET, 0, GST_SEEK_TYPE_SET, position);
  }

  if (data->video_sink == NULL) {
    /* If we have not done so, obtain the sink through which we will send the seek events */
    g_object_get (data->pipeline, "video-sink", &data->video_sink, NULL);
  }

  /* Send the event */
  gst_element_send_event (data->video_sink, seek_event);

  g_print ("Current rate: %g\n", data->rate);
}

/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
  gchar *str = NULL;

  if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) != G_IO_STATUS_NORMAL) {
    return TRUE;
  }

  switch (g_ascii_tolower (str[0])) {
    case 'p':
      data->playing = !data->playing;
      gst_element_set_state (data->pipeline, data->playing ? GST_STATE_PLAYING : GST_STATE_PAUSED);
      g_print ("Setting state to %s\n", data->playing ? "PLAYING" : "PAUSE");
      break;
    case 's':
      if (g_ascii_isupper (str[0])) {
        data->rate *= 2.0;
      } else {
        data->rate /= 2.0;
      }
      send_seek_event (data);
      break;
    case 'd':
      data->rate *= -1.0;
      send_seek_event (data);
      break;
    case 'n':
      if (data->video_sink == NULL) {
        /* If we have not done so, obtain the sink through which we will send the step events */
        g_object_get (data->pipeline, "video-sink", &data->video_sink, NULL);
      }

      gst_element_send_event (data->video_sink,
          gst_event_new_step (GST_FORMAT_BUFFERS, 1, data->rate, TRUE, FALSE));
      g_print ("Stepping one frame\n");
      break;
    case 'q':
      g_main_loop_quit (data->loop);
      break;
    default:
      break;
  }

  g_free (str);

  return TRUE;
}

int main(int argc, char *argv[]) {
  CustomData data;
  GstStateChangeReturn ret;
  GIOChannel *io_stdin;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));

  /* Print usage map */
  g_print (
    "USAGE: Choose one of the following options, then press enter:\n"
    " 'P' to toggle between PAUSE and PLAY\n"
    " 'S' to increase playback speed, 's' to decrease playback speed\n"
    " 'D' to toggle playback direction\n"
    " 'N' to move to next frame (in the current direction, better in PAUSE)\n"
    " 'Q' to quit\n");

  /* Build the pipeline */
  data.pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);

  /* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
  io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
#else
  io_stdin = g_io_channel_unix_new (fileno (stdin));
#endif
  g_io_add_watch (io_stdin, G_IO_IN, (GIOFunc)handle_keyboard, &data);

  /* Start playing */
  ret = gst_element_set_state (data.pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.pipeline);
    return -1;
  }
  data.playing = TRUE;
  data.rate = 1.0;

  /* Create a GLib Main Loop and set it to run */
  data.loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (data.loop);

  /* Free resources */
  g_main_loop_unref (data.loop);
  g_io_channel_unref (io_stdin);
  gst_element_set_state (data.pipeline, GST_STATE_NULL);
  if (data.video_sink != NULL)
    gst_object_unref (data.video_sink);
  gst_object_unref (data.pipeline);
  return 0;
}
```

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><div id="expander-1662010270" class="expand-container">
<div id="expander-control-1662010270" class="expand-control">
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-1662010270" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc basic-tutorial-13.c -o basic-tutorial-13 `pkg-config --cflags --libs gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a></p>
<p><span>This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. The console shows the available commands, composed of a single upper-case or lower-case letter, which you should input followed by the Enter key.</span></p>
<p>Required libraries: <code>gstreamer-0.10</code></p>
</div>
</div></td>
</tr>
</tbody>
</table>

# Walkthrough

There is nothing new in the initialization code in the main function: a
`playbin2` pipeline is instantiated, an I/O watch is installed to track
keystrokes and a GLib main loop is executed.

Then, in the keyboard handler function:

``` first-line: 45; theme: Default; brush: cpp; gutter: true
/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
  gchar *str = NULL;

  if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) != G_IO_STATUS_NORMAL) {
    return TRUE;
  }

  switch (g_ascii_tolower (str[0])) {
    case 'p':
      data->playing = !data->playing;
      gst_element_set_state (data->pipeline, data->playing ? GST_STATE_PLAYING : GST_STATE_PAUSED);
      g_print ("Setting state to %s\n", data->playing ? "PLAYING" : "PAUSE");
      break;
```

Pause / Playing toggle is handled with `gst_element_set_state()` as in
previous tutorials.

``` first-line: 59; theme: Default; brush: cpp; gutter: true
    case 's':
      if (g_ascii_isupper (str[0])) {
        data->rate *= 2.0;
      } else {
        data->rate /= 2.0;
      }
      send_seek_event (data);
      break;
    case 'd':
      data->rate *= -1.0;
      send_seek_event (data);
      break;
```

Use ‘S’ and ‘s’ to double or halve the current playback rate, and ‘d’ to
reverse the current playback direction. In both cases, the
`rate` variable is updated and `send_seek_event` is called. Let’s
review this function.

``` first-line: 13; theme: Default; brush: cpp; gutter: true
/* Send seek event to change rate */
static void send_seek_event (CustomData *data) {
  gint64 position;
  GstFormat format = GST_FORMAT_TIME;
  GstEvent *seek_event;

  /* Obtain the current position, needed for the seek event */
  if (!gst_element_query_position (data->pipeline, &format, &position)) {
    g_printerr ("Unable to retrieve current position.\n");
    return;
  }
```

This function creates a new Seek Event and sends it to the pipeline to
update the rate. First, the current position is recovered with
`gst_element_query_position()`. This is needed because the Seek Event
jumps to another position in the stream, and, since we do not actually
want to move, we jump to the current position. Using a Step Event would
be simpler, but this event is not currently fully functional, as
explained in the Introduction.

``` first-line: 25; theme: Default; brush: cpp; gutter: true
  /* Create the seek event */
  if (data->rate > 0) {
    seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
        GST_SEEK_TYPE_SET, position, GST_SEEK_TYPE_NONE, 0);
  } else {
    seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
        GST_SEEK_TYPE_SET, 0, GST_SEEK_TYPE_SET, position);
  }
```

The Seek Event is created with `gst_event_new_seek()`. Its parameters
are, basically, the new rate, the new start position and the new stop
position. Regardless of the playback direction, the start position must
be smaller than the stop position, so the two playback directions are
treated differently.

``` first-line: 34; theme: Default; brush: cpp; gutter: true
  if (data->video_sink == NULL) {
    /* If we have not done so, obtain the sink through which we will send the seek events */
    g_object_get (data->pipeline, "video-sink", &data->video_sink, NULL);
  }
```

As explained in the Introduction, to avoid performing multiple Seeks,
the Event is sent to only one sink, in this case, the video sink. It is
obtained from `playbin2` through the `video-sink` property. It is read
at this time instead of at initialization time because the actual sink
may change depending on the media contents, and this won’t be known
until the pipeline is PLAYING and some media has been read.

``` first-line: 39; theme: Default; brush: cpp; gutter: true
  /* Send the event */
  gst_element_send_event (data->video_sink, seek_event);
```

The new Event is finally sent to the selected sink with
`gst_element_send_event()`.

Back to the keyboard handler, we are still missing the frame stepping
code, which is really simple:

``` first-line: 71; theme: Default; brush: cpp; gutter: true
    case 'n':
      if (data->video_sink == NULL) {
        /* If we have not done so, obtain the sink through which we will send the step events */
        g_object_get (data->pipeline, "video-sink", &data->video_sink, NULL);
      }

      gst_element_send_event (data->video_sink,
          gst_event_new_step (GST_FORMAT_BUFFERS, 1, data->rate, TRUE, FALSE));
      g_print ("Stepping one frame\n");
      break;
```

A new Step Event is created with `gst_event_new_step()`, whose
parameters basically specify the amount to skip (1 frame in the example)
and the new rate (which we do not change).

The video sink is grabbed from `playbin2` in case we didn’t have it yet,
just like before.

And with this we are done. When testing this tutorial, keep in mind that
backward playback is not optimal in many elements.
|
||||
|
||||
<table>
|
||||
<tbody>
|
||||
<tr class="odd">
|
||||
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
|
||||
<td><p>Changing the playback rate might only work with local files. If you cannot modify it, try changing the URI passed to <code>playbin2</code> in line 114 to a local URI, starting with <code>file:///</code></p></td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
|
||||
# Conclusion
|
||||
|
||||
This tutorial has shown:
|
||||
|
||||
- How to change the playback rate using a Seek Event, created with
|
||||
`gst_event_new_seek()` and fed to the pipeline
|
||||
with `gst_element_send_event()`.
|
||||
- How to advance a video frame-by-frame by using Step Events, created
|
||||
with `gst_event_new_step()`.
|
||||
|
||||
It has been a pleasure having you here, and see you soon\!
|
||||
|
||||
## Attachments:
|
||||
|
||||
![](images/icons/bullet_blue.gif)
|
||||
[basic-tutorial-13.c](attachments/327800/2424883.c) (text/plain)
|
||||
![](images/icons/bullet_blue.gif)
|
||||
[vs2010.zip](attachments/327800/2424884.zip) (application/zip)
|
||||
|
||||
Document generated by Confluence on Oct 08, 2015 10:27
|
||||
|
|
|||
# GStreamer SDK documentation : Basic tutorial 14: Handy elements
|
||||
|
||||
This page last changed on May 13, 2014 by xartigas.
|
||||
|
||||
# Goal
|
||||
|
||||
This tutorial gives a list of handy GStreamer elements that are worth
|
||||
knowing. They range from powerful all-in-one elements that allow you to
|
||||
build complex pipelines easily (like `playbin2`), to little helper
|
||||
elements which are extremely useful when debugging.
|
||||
|
||||
For simplicity, the following examples are given using the
|
||||
`gst-launch` tool (Learn about it in [Basic tutorial 10: GStreamer
|
||||
tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html)). Use the
|
||||
`-v` command line parameter if you want to see the Pad Caps that are
|
||||
being negotiated.
|
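For instance, the following command (a minimal sketch; any of the pipelines in
this tutorial would do) prints to the console the Caps negotiated on every Pad:

``` theme: Default; brush: bash; gutter: false
gst-launch-0.10 -v videotestsrc ! ffmpegcolorspace ! autovideosink
```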
||||
|
||||
# Bins
|
||||
|
||||
These are Bin elements which you treat as a single element and they take
|
||||
care of instantiating all the necessary internal pipeline to accomplish
|
||||
their task.
|
||||
|
||||
### `playbin2`
|
||||
|
||||
This element has been extensively used throughout the tutorials. It
|
||||
manages all aspects of media playback, from source to display, passing
|
||||
through demuxing and decoding. It is so flexible and has so many options
|
||||
that a whole set of tutorials are devoted to it. See the [Playback
|
||||
tutorials](Playback%2Btutorials.html) for more details.
|
||||
|
||||
### `uridecodebin`
|
||||
|
||||
This element decodes data from a URI into raw media. It selects a source
|
||||
element that can handle the given URI scheme and connects it to
|
||||
a `decodebin2` element. It acts like a demuxer, so it offers as many
|
||||
source pads as streams are found in the
|
||||
media.
|
||||
|
||||
``` theme: Default; brush: bash; gutter: false
|
||||
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! ffmpegcolorspace ! autovideosink
|
||||
```
|
||||
|
||||
``` theme: Default; brush: bash; gutter: false
|
||||
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! audioconvert ! autoaudiosink
|
||||
```
|
||||
|
||||
### `decodebin2`
|
||||
|
||||
This element automatically constructs a decoding pipeline using
|
||||
available decoders and demuxers via auto-plugging until raw media is
|
||||
obtained. It is used internally by `uridecodebin` which is often more
|
||||
convenient to use, as it creates a suitable source element as well. It
|
||||
replaces the old `decodebin` element. It acts like a demuxer, so it
|
||||
offers as many source pads as streams are found in the
|
||||
media.
|
||||
|
||||
``` theme: Default; brush: bash; gutter: false
|
||||
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! decodebin2 ! autovideosink
|
||||
```
|
||||
|
||||
# File input/output
|
||||
|
||||
### `filesrc`
|
||||
|
||||
This element reads a local file and produces media with `ANY` Caps. If
|
||||
you want to obtain the correct Caps for the media, explore the stream by
|
||||
using a `typefind` element or by setting the `typefind` property
|
||||
of `filesrc` to
|
||||
`TRUE`.
|
||||
|
||||
``` theme: Default; brush: cpp; gutter: false
|
||||
gst-launch-0.10 filesrc location=f:\\media\\sintel\\sintel_trailer-480p.webm ! decodebin2 ! autovideosink
|
||||
```
|
||||
|
||||
### `filesink`
|
||||
|
||||
This element writes to a file all the media it receives. Use the
|
||||
`location` property to specify the file
|
||||
name.
|
||||
|
||||
``` theme: Default; brush: plain; gutter: false
|
||||
gst-launch-0.10 audiotestsrc ! vorbisenc ! oggmux ! filesink location=test.ogg
|
||||
```
|
||||
|
||||
# Network
|
||||
|
||||
### `souphttpsrc`
|
||||
|
||||
This element receives data as a client over the network via HTTP using
|
||||
the SOUP library. Set the URL to retrieve through the `location`
|
||||
property.
|
||||
|
||||
``` theme: Default; brush: bash; gutter: false
|
||||
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! decodebin2 ! autovideosink
|
||||
```
|
||||
|
||||
# Test media generation
|
||||
|
||||
These elements are very useful to check if other parts of the pipeline
|
||||
are working: replace the actual source with one of these test sources, which
|
||||
are “guaranteed” to work.
|
||||
|
||||
### `videotestsrc`
|
||||
|
||||
This element produces a video pattern (selectable among many different
|
||||
options with the `pattern` property). Use it to test video pipelines.
|
||||
|
||||
``` theme: Default; brush: bash; gutter: false
|
||||
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! autovideosink
|
||||
```
|
||||
|
||||
### `audiotestsrc`
|
||||
|
||||
This element produces an audio wave (selectable among many different
|
||||
options with the `wave` property). Use it to test video pipelines.
|
||||
|
||||
``` theme: Default; brush: bash; gutter: false
|
||||
gst-launch-0.10 audiotestsrc ! audioconvert ! autoaudiosink
|
||||
```
|
||||
|
||||
# Video adapters
|
||||
|
||||
### `ffmpegcolorspace`
|
||||
|
||||
This element converts from one color space (e.g. RGB) to another one
|
||||
(e.g. YUV). It can also convert between different YUV formats (e.g.
|
||||
I420, NV12, YUY2 …) or RGB format arrangements (e.g. RGBA, ARGB, BGRA…).
|
||||
|
||||
This is normally your first choice when solving negotiation problems.
|
||||
When not needed, because its upstream and downstream elements can
|
||||
already understand each other, it acts in pass-through mode having
|
||||
minimal impact on the performance.
|
||||
|
||||
As a rule of thumb, always use `ffmpegcolorspace` whenever you use
|
||||
elements whose Caps are unknown at design time, like `autovideosink`, or
|
||||
that can vary depending on external factors, like decoding a
|
||||
user-provided file.
|
||||
|
||||
``` theme: Default; brush: bash; gutter: false
|
||||
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! autovideosink
|
||||
```
|
||||
|
||||
### `videorate`
|
||||
|
||||
This element takes an incoming stream of time-stamped video frames and
|
||||
produces a stream that matches the source pad's frame rate. The
|
||||
correction is performed by dropping and duplicating frames, no fancy
|
||||
algorithm is used to interpolate frames.
|
||||
|
||||
This is useful to allow elements requiring different frame rates to
|
||||
link. As with the other adapters, if it is not needed (because there is
|
||||
a frame rate on which both Pads can agree), it acts in pass-through mode
|
||||
and does not impact performance.
|
||||
|
||||
It is therefore a good idea to always use it whenever the actual frame
|
||||
rate is unknown at design time, just in
|
||||
case.
|
||||
|
||||
``` theme: Default; brush: cpp; gutter: false
|
||||
gst-launch-0.10 videotestsrc ! video/x-raw-rgb,framerate=30/1 ! videorate ! video/x-raw-rgb,framerate=1/1 ! ffmpegcolorspace ! autovideosink
|
||||
```
|
||||
|
||||
### `videoscale`
|
||||
|
||||
This element resizes video frames. By default the element tries to
|
||||
negotiate to the same size on the source and sink Pads so that no
|
||||
scaling is needed. It is therefore safe to insert this element in a
|
||||
pipeline to get more robust behavior without any cost if no scaling is
|
||||
needed.
|
||||
|
||||
This element supports a wide range of color spaces including various YUV
|
||||
and RGB formats and is therefore generally able to operate anywhere in a
|
||||
pipeline.
|
||||
|
||||
If the video is to be output to a window whose size is controlled by the
|
||||
user, it is a good idea to use a `videoscale` element, since not all
|
||||
video sinks are capable of performing scaling
|
||||
operations.
|
||||
|
||||
``` theme: Default; brush: bash; gutter: false
|
||||
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! videoscale ! video/x-raw-yuv,width=178,height=100 ! ffmpegcolorspace ! autovideosink
|
||||
```
|
||||
|
||||
# Audio adapters
|
||||
|
||||
### `audioconvert`
|
||||
|
||||
This element converts raw audio buffers between various possible
|
||||
formats. It supports integer to float conversion, width/depth
|
||||
conversion, signedness and endianness conversion and channel
|
||||
transformations.
|
||||
|
||||
Like `ffmpegcolorspace` does for video, you use this to solve
|
||||
negotiation problems with audio, and it is generally safe to use it
|
||||
liberally, since this element does nothing if it is not needed.
|
||||
|
||||
``` theme: Default; brush: bash; gutter: false
|
||||
gst-launch-0.10 audiotestsrc ! audioconvert ! autoaudiosink
|
||||
```
|
||||
|
||||
### `audioresample`
|
||||
|
||||
This element resamples raw audio buffers to different sampling rates
|
||||
using a configurable windowing function to enhance quality.
|
||||
|
||||
Again, use it to solve negotiation problems regarding sampling rates and
|
||||
do not fear to use it
|
||||
generously.
|
||||
|
||||
``` theme: Default; brush: bash; gutter: false
|
||||
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! audioresample ! audio/x-raw-float,rate=4000 ! audioconvert ! autoaudiosink
|
||||
```
|
||||
|
||||
### `audiorate`
|
||||
|
||||
This element takes an incoming stream of time-stamped raw audio frames
|
||||
and produces a perfect stream by inserting or dropping samples as
|
||||
needed. It does not allow the sample rate to be changed
|
||||
as `videorate` does; it just fills gaps and removes overlapped samples
|
||||
so the output stream is continuous and “clean”.
|
||||
|
||||
It is useful in situations where the timestamps are going to be lost
|
||||
(when storing into certain file formats, for example) and the receiver
|
||||
will require all samples to be present. It is cumbersome to exemplify
|
||||
this, so no example is given.
|
||||
|
||||
# Multithreading
|
||||
|
||||
### `queue`
|
||||
|
||||
Queues have been explained in [Basic tutorial 7: Multithreading and Pad
|
||||
Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html).
|
||||
Basically, a queue performs two tasks:
|
||||
|
||||
- Data is queued until a selected limit is reached. Any attempt to
|
||||
push more buffers into the queue blocks the pushing thread until
|
||||
more space becomes available.
|
||||
- The queue creates a new thread on the source Pad to decouple the
|
||||
processing on sink and source Pads.
|
||||
|
||||
Additionally, `queue` triggers signals when it is about to become empty
|
||||
or full (according to some configurable thresholds), and can be
|
||||
instructed to drop buffers instead of blocking when it is full.
|
||||
|
||||
As a rule of thumb, prefer the simpler `queue` element
|
||||
over `queue2` whenever network buffering is not a concern to you.
|
||||
See [Basic tutorial 7: Multithreading and Pad
|
||||
Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html)
|
||||
for an example.
|
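As an illustration, this sketch places a `queue` right after the source, so
generation and playback of the audio run in separate threads (a minimal
example; the decoupling is not strictly needed in such a simple pipeline):

``` theme: Default; brush: bash; gutter: false
gst-launch-0.10 audiotestsrc ! queue ! audioconvert ! autoaudiosink
```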
||||
|
||||
### `queue2`
|
||||
|
||||
This element is not an evolution of `queue`. It has the same design
|
||||
goals but follows a different implementation approach, which results in
|
||||
different features. Unfortunately, it is often not easy to tell which
|
||||
queue is the best choice.
|
||||
|
||||
`queue2` performs the two tasks listed above for `queue`, and,
|
||||
additionally, is able to store the received data (or part of it) on a
|
||||
disk file, for later retrieval. It also replaces the signals with the
|
||||
more general and convenient buffering messages described in [Basic
|
||||
tutorial 12: Streaming](Basic%2Btutorial%2B12%253A%2BStreaming.html).
|
||||
|
||||
As a rule of thumb, prefer `queue2` over `queue` whenever network
|
||||
buffering is a concern to you. See [Basic tutorial 12:
|
||||
Streaming](Basic%2Btutorial%2B12%253A%2BStreaming.html) for an example
|
||||
(`queue2` is hidden inside `playbin2`).
|
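As a sketch of network buffering (assuming the sample media URL used
throughout these tutorials is still reachable), a `queue2` can be placed right
after the network source:

``` theme: Default; brush: bash; gutter: false
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! queue2 ! decodebin2 ! ffmpegcolorspace ! autovideosink
```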
||||
|
||||
### `multiqueue`
|
||||
|
||||
This element provides queues for multiple streams simultaneously, and
|
||||
eases their management, by allowing some queues to grow if no data is
|
||||
being received on other streams, or by allowing some queues to drop data
|
||||
if they are not connected to anything (instead of returning an error, as
|
||||
a simpler queue would do). Additionally, it synchronizes the different
|
||||
streams, ensuring that none of them goes too far ahead of the others.
|
||||
|
||||
This is an advanced element. It is found inside `decodebin2`, but you
|
||||
will rarely need to instantiate it yourself in a normal playback
|
||||
application.
|
||||
|
||||
### `tee`
|
||||
|
||||
[Basic tutorial 7: Multithreading and Pad
|
||||
Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html) already
|
||||
showed how to use a `tee` element, which splits data to multiple pads.
|
||||
Splitting the data flow is useful, for example, when capturing a video
|
||||
where the video is shown on the screen and also encoded and written to a
|
||||
file. Another example is playing music and hooking up a visualization
|
||||
module.
|
||||
|
||||
One needs to use separate `queue` elements in each branch to provide
|
||||
separate threads for each branch. Otherwise a blocked dataflow in one
|
||||
branch would stall the other
|
||||
branches.
|
||||
|
||||
``` theme: Default; brush: plain; gutter: false
|
||||
gst-launch-0.10 audiotestsrc ! tee name=t ! queue ! audioconvert ! autoaudiosink t. ! queue ! wavescope ! ffmpegcolorspace ! autovideosink
|
||||
```
|
||||
|
||||
# Capabilities
|
||||
|
||||
### `capsfilter`
|
||||
|
||||
[Basic tutorial 10: GStreamer
|
||||
tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html) already
|
||||
explained how to use Caps filters with `gst-launch`. When building a
|
||||
pipeline programmatically, Caps filters are implemented with
|
||||
the `capsfilter` element. This element does not modify data as such,
|
||||
but enforces limitations on the data
|
||||
format.
|
||||
|
||||
``` theme: Default; brush: bash; gutter: false
|
||||
gst-launch-0.10 videotestsrc ! video/x-raw-gray ! ffmpegcolorspace ! autovideosink
|
||||
```
|
||||
|
||||
### `typefind`
|
||||
|
||||
This element determines the type of media a stream contains. It applies
|
||||
typefind functions in the order of their rank. Once the type has been
|
||||
detected it sets its source Pad Caps to the found media type and emits
|
||||
the `have-type` signal.
|
||||
|
||||
It is instantiated internally by `decodebin2`, and you can use it too to
|
||||
find the media type, although you can normally use the
|
||||
`GstDiscoverer` which provides more information (as seen in [Basic
|
||||
tutorial 9: Media information
|
||||
gathering](Basic%2Btutorial%2B9%253A%2BMedia%2Binformation%2Bgathering.html)).
|
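As a quick sketch, combining `typefind` with `fakesink` and the `-v` switch
prints the detected media type to the console (`test.ogg` is a placeholder for
any local media file):

``` theme: Default; brush: bash; gutter: false
gst-launch-0.10 -v filesrc location=test.ogg ! typefind ! fakesink
```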
||||
|
||||
# Debugging
|
||||
|
||||
### `fakesink`
|
||||
|
||||
This sink element simply swallows any data fed to it. It is useful when
|
||||
debugging, to replace your normal sinks and rule them out of the
|
||||
equation. It can be very verbose when combined with the `-v` switch
|
||||
of `gst-launch`, so use the `silent` property to remove any unwanted
|
||||
noise.
|
||||
|
||||
``` theme: Default; brush: plain; gutter: false
|
||||
gst-launch-0.10 audiotestsrc num-buffers=1000 ! fakesink sync=false
|
||||
```
|
||||
|
||||
### `identity`
|
||||
|
||||
This is a dummy element that passes incoming data through unmodified. It
|
||||
has several useful diagnostic functions, such as offset and timestamp
|
||||
checking, or buffer dropping. Read its documentation to learn all the
|
||||
things this seemingly harmless element can
|
||||
do.
|
||||
|
||||
``` theme: Default; brush: plain; gutter: false
|
||||
gst-launch-0.10 audiotestsrc ! identity drop-probability=0.1 ! audioconvert ! autoaudiosink
|
||||
```
|
||||
|
||||
# Conclusion
|
||||
|
||||
This tutorial has listed a few elements which are worth knowing, due to
|
||||
their usefulness in the day-to-day work with GStreamer. Some are
|
||||
valuable for production pipelines, whereas others are only needed for
|
||||
debugging purposes.
|
||||
|
||||
It has been a pleasure having you here, and see you soon\!
|
||||
|
||||
|
||||
|
|
|||
# GStreamer SDK documentation : Basic tutorial 15: Clutter integration
|
||||
|
||||
This page last changed on Jul 11, 2012 by xartigas.
|
||||
|
||||
# Goal
|
||||
|
||||
“[Clutter](https://clutter-project.org/) is an open source software
|
||||
library for creating fast, compelling, portable, and dynamic graphical
|
||||
user interfaces”. GStreamer can be integrated into Clutter through the
|
||||
`cluttersink` element, allowing video to be used as a texture. This
|
||||
tutorial shows:
|
||||
|
||||
- How to use the video output of a GStreamer pipeline as a texture in
|
||||
Clutter.
|
||||
|
||||
# Introduction
|
||||
|
||||
The process to link GStreamer with Clutter is actually very simple. A
|
||||
`cluttersink` element must be instantiated (or, better,
|
||||
`autocluttersink`, if available) and used as the video sink. Through the
|
||||
`texture` property this element accepts a Clutter texture, which is
|
||||
automatically updated by GStreamer.
|
||||
|
||||
# A 3D media player
|
||||
|
||||
Copy this code into a text file named `basic-tutorial-15.c`.
|
||||
|
||||
<table>
|
||||
<tbody>
|
||||
<tr class="odd">
|
||||
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
|
||||
<td><p>This tutorial is included in the SDK since release 2012.9. If you cannot find it in the downloaded code, please install the latest release of the GStreamer SDK.</p></td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
|
||||
**basic-tutorial-15.c**
|
||||
|
||||
``` theme: Default; brush: cpp; gutter: true
|
||||
#include <clutter-gst/clutter-gst.h>
|
||||
|
||||
/* Setup the video texture once its size is known */
|
||||
void size_change (ClutterActor *texture, gint width, gint height, gpointer user_data) {
|
||||
ClutterActor *stage;
|
||||
gfloat new_x, new_y, new_width, new_height;
|
||||
gfloat stage_width, stage_height;
|
||||
ClutterAnimation *animation = NULL;
|
||||
|
||||
stage = clutter_actor_get_stage (texture);
|
||||
if (stage == NULL)
|
||||
return;
|
||||
|
||||
clutter_actor_get_size (stage, &stage_width, &stage_height);
|
||||
|
||||
/* Center video on window and calculate new size preserving aspect ratio */
|
||||
new_height = (height * stage_width) / width;
|
||||
if (new_height <= stage_height) {
|
||||
new_width = stage_width;
|
||||
|
||||
new_x = 0;
|
||||
new_y = (stage_height - new_height) / 2;
|
||||
} else {
|
||||
new_width = (width * stage_height) / height;
|
||||
new_height = stage_height;
|
||||
|
||||
new_x = (stage_width - new_width) / 2;
|
||||
new_y = 0;
|
||||
}
|
||||
clutter_actor_set_position (texture, new_x, new_y);
|
||||
clutter_actor_set_size (texture, new_width, new_height);
|
||||
clutter_actor_set_rotation (texture, CLUTTER_Y_AXIS, 0.0, stage_width / 2, 0, 0);
|
||||
/* Animate it */
|
||||
animation = clutter_actor_animate (texture, CLUTTER_LINEAR, 10000, "rotation-angle-y", 360.0, NULL);
|
||||
clutter_animation_set_loop (animation, TRUE);
|
||||
}
|
||||
|
||||
int main(int argc, char *argv[]) {
|
||||
GstElement *pipeline, *sink;
|
||||
ClutterTimeline *timeline;
|
||||
ClutterActor *stage, *texture;
|
||||
|
||||
/* clutter-gst takes care of initializing Clutter and GStreamer */
|
||||
if (clutter_gst_init (&argc, &argv) != CLUTTER_INIT_SUCCESS) {
|
||||
g_error ("Failed to initialize clutter\n");
|
||||
return -1;
|
||||
}
|
||||
|
||||
stage = clutter_stage_get_default ();
|
||||
|
||||
/* Make a timeline */
|
||||
timeline = clutter_timeline_new (1000);
|
||||
g_object_set(timeline, "loop", TRUE, NULL);
|
||||
|
||||
/* Create new texture and disable slicing so the video is properly mapped onto it */
|
||||
texture = CLUTTER_ACTOR (g_object_new (CLUTTER_TYPE_TEXTURE, "disable-slicing", TRUE, NULL));
|
||||
g_signal_connect (texture, "size-change", G_CALLBACK (size_change), NULL);
|
||||
|
||||
/* Build the GStreamer pipeline */
|
||||
pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
|
||||
|
||||
/* Instantiate the Clutter sink */
|
||||
sink = gst_element_factory_make ("autocluttersink", NULL);
|
||||
if (sink == NULL) {
|
||||
/* Revert to the older cluttersink, in case autocluttersink was not found */
|
||||
sink = gst_element_factory_make ("cluttersink", NULL);
|
||||
}
|
||||
if (sink == NULL) {
|
||||
g_printerr ("Unable to find a Clutter sink.\n");
|
||||
return -1;
|
||||
}
|
||||
|
||||
/* Link GStreamer with Clutter by passing the Clutter texture to the Clutter sink*/
|
||||
g_object_set (sink, "texture", texture, NULL);
|
||||
|
||||
/* Add the Clutter sink to the pipeline */
|
||||
g_object_set (pipeline, "video-sink", sink, NULL);
|
||||
|
||||
/* Start playing */
|
||||
gst_element_set_state (pipeline, GST_STATE_PLAYING);
|
||||
|
||||
/* start the timeline */
|
||||
clutter_timeline_start (timeline);
|
||||
|
||||
/* Add texture to the stage, and show it */
|
||||
clutter_group_add (CLUTTER_GROUP (stage), texture);
|
||||
clutter_actor_show_all (stage);
|
||||
|
||||
clutter_main();
|
||||
|
||||
/* Free resources */
|
||||
gst_element_set_state (pipeline, GST_STATE_NULL);
|
||||
gst_object_unref (pipeline);
|
||||
return 0;
|
||||
}
|
||||
```
|
||||
|
||||
<table>
|
||||
<tbody>
|
||||
<tr class="odd">
|
||||
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
|
||||
<td><div id="expander-1303246949" class="expand-container">
|
||||
<div id="expander-control-1303246949" class="expand-control">
|
||||
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
|
||||
</div>
|
||||
<div id="expander-content-1303246949" class="expand-content">
|
||||
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
|
||||
<div class="panel" style="border-width: 1px;">
|
||||
<div class="panelContent">
|
||||
<p><code>gcc basic-tutorial-15.c -o basic-tutorial-15 `pkg-config --cflags --libs clutter-gst-1.0 gstreamer-0.10`</code></p>
|
||||
</div>
|
||||
</div>
|
||||
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a></p>
|
||||
<p><span>This tutorial opens a window and displays a movie <span>on a revolving plane</span>, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed.</span></p>
|
||||
<p>Required libraries: <code>clutter-gst-1.0 gstreamer-0.10</code></p>
|
||||
</div>
|
||||
</div></td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
|
||||
# Walkthrough
|
||||
|
||||
It is not the purpose of this tutorial to teach how to use Clutter, but
|
||||
how to integrate GStreamer with it. This is accomplished through the
|
||||
clutter-gst library, so its header must be included (and the program
|
||||
must link against it):
|
||||
|
||||
``` first-line: 1; theme: Default; brush: cpp; gutter: true
|
||||
#include <clutter-gst/clutter-gst.h>
|
||||
```
|
||||
|
||||
The first thing this library does is initialize both GStreamer and
|
||||
Clutter, so you must call `clutter_gst_init()` instead of initializing
|
||||
these libraries yourself.
|
||||
|
||||
``` first-line: 43; theme: Default; brush: cpp; gutter: true
|
||||
/* clutter-gst takes care of initializing Clutter and GStreamer */
|
||||
if (clutter_gst_init (&argc, &argv) != CLUTTER_INIT_SUCCESS) {
|
||||
g_error ("Failed to initialize clutter\n");
|
||||
return -1;
|
||||
}
|
||||
```
|
||||
|
||||
The GStreamer video is to be played on a Clutter texture, so we need to
|
||||
create a texture. Just remember to disable texture slicing to allow for
|
||||
proper
|
||||
integration:
|
||||
|
||||
``` first-line: 55; theme: Default; brush: cpp; gutter: true
|
||||
/* Create new texture and disable slicing so the video is properly mapped onto it */
|
||||
texture = CLUTTER_ACTOR (g_object_new (CLUTTER_TYPE_TEXTURE, "disable-slicing", TRUE, NULL));
|
||||
g_signal_connect (texture, "size-change", G_CALLBACK (size_change), NULL);
|
||||
```
|
||||
|
||||
We connect to the size-change signal so we can perform final setup once
|
||||
the video size is known.
|
||||
|
||||
``` theme: Default; brush: cpp; gutter: true
|
||||
/* Instantiate the Clutter sink */
|
||||
sink = gst_element_factory_make ("autocluttersink", NULL);
|
||||
if (sink == NULL) {
|
||||
/* Revert to the older cluttersink, in case autocluttersink was not found */
|
||||
sink = gst_element_factory_make ("cluttersink", NULL);
|
||||
}
|
||||
if (sink == NULL) {
|
||||
g_printerr ("Unable to find a Clutter sink.\n");
|
||||
return -1;
|
||||
}
|
||||
```
|
||||
|
||||
The proper Clutter sink element to instantiate for GStreamer is
|
||||
`autocluttersink`, which works more or less like `autovideosink`, trying
|
||||
to find the best Clutter sink available. However, `autocluttersink` (for
|
||||
which there is no documentation yet) is only available since the 2012.7
|
||||
release of the SDK, so, if it cannot be found, the
|
||||
simpler `cluttersink` element is created
|
||||
instead.
|
||||
|
||||
``` first-line: 73; theme: Default; brush: cpp; gutter: true
|
||||
/* Link GStreamer with Clutter by passing the Clutter texture to the Clutter sink*/
|
||||
g_object_set (sink, "texture", texture, NULL);
|
||||
```
|
||||
|
||||
This texture is everything GStreamer needs to know about Clutter.
|
||||
|
||||
``` first-line: 76; theme: Default; brush: cpp; gutter: true
|
||||
/* Add the Clutter sink to the pipeline */
|
||||
g_object_set (pipeline, "video-sink", sink, NULL);
|
||||
```
|
||||
|
||||
Finally, tell `playbin2` to use the sink we created instead of the
|
||||
default one.
|
||||
|
||||
Then the GStreamer pipeline and the Clutter timeline are started and the
|
||||
ball starts rolling. Once the pipeline has received enough information
|
||||
to know the video size (width and height), the Clutter texture gets
|
||||
updated and we receive a notification, handled in the
|
||||
`size_change` callback. This method sets the texture to the proper
|
||||
size, centers it on the output window and starts a revolving animation,
|
||||
just for demonstration purposes. But this has nothing to do with
|
||||
GStreamer.
|
||||
|
||||
# Conclusion
|
||||
|
||||
This tutorial has shown:
|
||||
|
||||
- How to use the video output of a GStreamer pipeline as a Clutter
|
||||
texture using the `cluttersink` or `autocluttersink` elements.
|
||||
- How to link GStreamer and Clutter through the `texture` property of
|
||||
`cluttersink` or `autocluttersink`.
|
||||
|
||||
It has been a pleasure having you here, and see you soon\!
|
||||
|
||||
|
||||
|
|
|||
# GStreamer SDK documentation : Basic tutorial 16: Platform-specific elements
|
||||
|
||||
This page last changed on May 30, 2013 by xartigas.
|
||||
|
||||
# Goal
|
||||
|
||||
Even though GStreamer is a multiplatform framework, not all the elements
|
||||
are available on all platforms. For example, the audio and video sinks
|
||||
depend heavily on the underlying windowing system, and a different one
|
||||
needs to be selected depending on the platform. You normally do not need
|
||||
to worry about this when using elements like `playbin2` or
|
||||
`autovideosink`, but, for those cases when you need to use one of the
|
||||
sinks that are only available on specific platforms, this tutorial gives
|
||||
you some hints about their peculiarities.
|
||||
|
||||
# Linux
|
||||
|
||||
### `ximagesink`
|
||||
|
||||
A standard X-based video sink. It implements the XOverlay interface, so
|
||||
the video window can be re-parented (embedded inside other windows). It
|
||||
does not support scaling; it has to be performed by different means
|
||||
(using the `videoscale` element, for example).
|
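A minimal sketch, with `videoscale` in front since `ximagesink` cannot scale
by itself:

``` theme: Default; brush: bash; gutter: false
gst-launch-0.10 videotestsrc ! videoscale ! ffmpegcolorspace ! ximagesink
```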
||||
|
||||
### `xvimagesink`
|
||||
|
||||
An X-based video sink, using the [X Video
|
||||
Extension](http://en.wikipedia.org/wiki/X_video_extension) (Xv). It
|
||||
implements the XOverlay interface, so the video window can be
|
||||
re-parented (embedded inside other windows). It can perform scaling
|
||||
efficiently, on the GPU. It is only available if the hardware and
|
||||
corresponding drivers support the Xv extension.
|
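A minimal sketch (it will only work if the Xv extension is available on your
system):

``` theme: Default; brush: bash; gutter: false
gst-launch-0.10 videotestsrc ! xvimagesink
```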
||||
|
||||
### `cluttersink`

This is a GStreamer video sink element that sends data to a [ClutterTexture](http://developer.gnome.org/clutter-gst/stable/ClutterGstVideoTexture.html), which can then be used in any [Clutter](https://clutter-project.org/) scene (see [Basic tutorial 15: Clutter integration](Basic%2Btutorial%2B15%253A%2BClutter%2Bintegration.html)). Clutter is a multiplatform library, so this sink is available on every platform (as long as Clutter is installed). Clutter achieves platform independence by using [OpenGL](http://www.opengl.org) as its rendering backend, so OpenGL must also be available on the system.
### `alsasink`

This audio sink outputs to the sound card via [ALSA](http://www.alsa-project.org/) (Advanced Linux Sound Architecture). It is available on almost every Linux platform. It is often seen as a “low level” interface to the sound card, and can be complicated to configure (see the comment on [Playback tutorial 9: Digital audio pass-through](Playback%2Btutorial%2B9%253A%2BDigital%2Baudio%2Bpass-through.html)).
### `pulsesink`

This sink plays audio to a [PulseAudio](http://www.pulseaudio.org/) server. It is a higher-level abstraction of the sound card than ALSA, and is therefore easier to use and offers more advanced features. It has been known to be unstable on some older Linux distributions, though.
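Either audio sink can be exercised from the command line with a test tone (a sketch, assuming `gst-launch` from GStreamer 0.10; substitute `alsasink` for `pulsesink` to compare the two):

```
gst-launch audiotestsrc ! audioconvert ! pulsesink
```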
# Mac OS X

### `osxvideosink`

This is the only video sink available to GStreamer on Mac OS X.

### `cluttersink`

This is a GStreamer video sink element that sends data to a [ClutterTexture](http://developer.gnome.org/clutter-gst/stable/ClutterGstVideoTexture.html), which can then be used in any [Clutter](https://clutter-project.org/) scene (see [Basic tutorial 15: Clutter integration](Basic%2Btutorial%2B15%253A%2BClutter%2Bintegration.html)). Clutter is a multiplatform library, so this sink is available on every platform (as long as Clutter is installed). Clutter achieves platform independence by using [OpenGL](http://www.opengl.org/) as its rendering backend, so OpenGL must also be available on the system.

### `osxaudiosink`

This is the only audio sink available to GStreamer on Mac OS X.
# Windows

### `directdrawsink`

This is the oldest of the Windows video sinks, based on [DirectDraw](http://en.wikipedia.org/wiki/DirectDraw). It requires DirectX 7, so it is available on almost every current Windows platform. It supports rescaling and filtering of the scaled image to alleviate aliasing.
### `dshowvideosink`

This video sink is based on [DirectShow](http://en.wikipedia.org/wiki/Direct_Show). It can use different rendering back-ends, like [EVR](http://en.wikipedia.org/wiki/Enhanced_Video_Renderer), [VMR9](http://en.wikipedia.org/wiki/Direct_Show#Video_rendering_filters) or [VMR7](http://en.wikipedia.org/wiki/Direct_Show#Video_rendering_filters), EVR being available only on Windows Vista or more recent. It supports rescaling and filtering of the scaled image to alleviate aliasing. It implements the XOverlay interface, so the video window can be re-parented (embedded inside other windows).
### `d3dvideosink`

This video sink is based on [Direct3D](http://en.wikipedia.org/wiki/Direct3D) and is the most recent Windows video sink. It supports rescaling and filtering of the scaled image to alleviate aliasing. It implements the XOverlay interface, so the video window can be re-parented (embedded inside other windows).
### `cluttersink`

This is a GStreamer video sink element that sends data to a [ClutterTexture](http://developer.gnome.org/clutter-gst/stable/ClutterGstVideoTexture.html), which can then be used in any [Clutter](https://clutter-project.org/) scene (see [Basic tutorial 15: Clutter integration](Basic%2Btutorial%2B15%253A%2BClutter%2Bintegration.html)). Clutter is a multiplatform library, so this sink is available on every platform (as long as Clutter is installed). Clutter achieves platform independence by using [OpenGL](http://www.opengl.org/) as its rendering backend, so OpenGL must also be available on the system.
### `directsoundsink`

This is the default audio sink for Windows, based on [DirectSound](http://en.wikipedia.org/wiki/DirectSound), which is available on all Windows versions.
### `dshowdecwrapper`

[DirectShow](http://en.wikipedia.org/wiki/Direct_Show) is a multimedia framework similar to GStreamer. They are different enough, though, that their pipelines cannot be interconnected. However, through this element, GStreamer can benefit from the decoding elements present in DirectShow. `dshowdecwrapper` wraps multiple DirectShow decoders so they can be embedded in a GStreamer pipeline. Use the `gst-inspect` tool (see [Basic tutorial 10: GStreamer tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html)) to see the available decoders.
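For example, the inspection mentioned above could look like this (a sketch; the exact output depends on the DirectShow decoders installed on the system):

```
gst-inspect dshowdecwrapper
```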
# Android

### `eglglessink`

This video sink is based on [OpenGL ES](http://en.wikipedia.org/wiki/OpenGL_ES) and [EGL](http://en.wikipedia.org/wiki/EGL_%28OpenGL%29). It supports rescaling and filtering of the scaled image to alleviate aliasing. It implements the XOverlay interface, so the video window can be re-parented (embedded inside other windows).
### `openslessink`

This is the only audio sink available to GStreamer on Android. It is based on [OpenSL ES](http://en.wikipedia.org/wiki/OpenSL_ES).
### `androidmedia`

[android.media.MediaCodec](http://developer.android.com/reference/android/media/MediaCodec.html) is an Android-specific API to access the codecs available on the device, including hardware codecs. It has been available since API level 16 (Jelly Bean), and GStreamer can use it through the androidmedia plugin for audio and video decoding.
# iOS

### `eglglessink`

This video sink is based on [OpenGL ES](http://en.wikipedia.org/wiki/OpenGL_ES) and [EGL](http://en.wikipedia.org/wiki/EGL_%28OpenGL%29). It supports rescaling and filtering of the scaled image to alleviate aliasing. It implements the XOverlay interface, so the video window can be re-parented (embedded inside other windows).
### `osxaudiosink`

This is the only audio sink available to GStreamer on iOS.
### `iosassetsrc`

Source element to read iOS assets, that is, documents stored in the Library (like photos, music and videos). It can be instantiated automatically by `playbin2` when URIs use the `assets-library://` scheme.
### `iosavassetsrc`

Source element to read and decode iOS audiovisual assets, that is, documents stored in the Library (like photos, music and videos). It can be instantiated automatically by `playbin2` when URIs use the `ipod-library://` scheme. Decoding is performed by the system, so dedicated hardware will be used if available.
# Conclusion

This tutorial has shown a few specific details about some GStreamer elements which are not available on all platforms. You do not have to worry about them when using multiplatform elements like `playbin2` or `autovideosink`, but it is good to know their individual quirks if instantiating them manually.

It has been a pleasure having you here, and see you soon\!
Document generated by Confluence on Oct 08, 2015 10:27

372
Basic+tutorial+2%3A+GStreamer+concepts.markdown
Normal file

@ -0,0 +1,372 @@
# GStreamer SDK documentation : Basic tutorial 2: GStreamer concepts

This page last changed on Jul 02, 2012 by xartigas.

# Goal

The previous tutorial showed how to build a pipeline automatically. Now we are going to build a pipeline manually, by instantiating each element and linking them all together. In the process, we will learn:

- What is a GStreamer element and how to create one.

- How to connect elements to each other.

- How to customize an element's behavior.

- How to watch the bus for error conditions and extract information from GStreamer messages.
# Manual Hello World

Copy this code into a text file named `basic-tutorial-2.c` (or find it in the SDK installation).

**basic-tutorial-2.c**
``` theme: Default; brush: cpp; gutter: true
#include <gst/gst.h>

int main(int argc, char *argv[]) {
  GstElement *pipeline, *source, *sink;
  GstBus *bus;
  GstMessage *msg;
  GstStateChangeReturn ret;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  source = gst_element_factory_make ("videotestsrc", "source");
  sink = gst_element_factory_make ("autovideosink", "sink");

  /* Create the empty pipeline */
  pipeline = gst_pipeline_new ("test-pipeline");

  if (!pipeline || !source || !sink) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Build the pipeline */
  gst_bin_add_many (GST_BIN (pipeline), source, sink, NULL);
  if (gst_element_link (source, sink) != TRUE) {
    g_printerr ("Elements could not be linked.\n");
    gst_object_unref (pipeline);
    return -1;
  }

  /* Modify the source's properties */
  g_object_set (source, "pattern", 0, NULL);

  /* Start playing */
  ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (pipeline);
    return -1;
  }

  /* Wait until error or EOS */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

  /* Parse message */
  if (msg != NULL) {
    GError *err;
    gchar *debug_info;

    switch (GST_MESSAGE_TYPE (msg)) {
      case GST_MESSAGE_ERROR:
        gst_message_parse_error (msg, &err, &debug_info);
        g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
        g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
        g_clear_error (&err);
        g_free (debug_info);
        break;
      case GST_MESSAGE_EOS:
        g_print ("End-Of-Stream reached.\n");
        break;
      default:
        /* We should not reach here because we only asked for ERRORs and EOS */
        g_printerr ("Unexpected message received.\n");
        break;
    }
    gst_message_unref (msg);
  }

  /* Free resources */
  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```
<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><div id="expander-179983452" class="expand-container">
<div id="expander-control-179983452" class="expand-control">
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-179983452" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc basic-tutorial-2.c -o basic-tutorial-2 `pkg-config --cflags --libs gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a></p>
<p><span>This tutorial opens a window and displays a test pattern, without audio.</span></p>
<p>Required libraries: <code>gstreamer-0.10</code></p>
</div>
</div></td>
</tr>
</tbody>
</table>
# Walkthrough

The basic construction blocks of GStreamer are the elements, which process the data as it flows *downstream* from the source elements (the producers of data) to the sink elements (the consumers of data), passing through filter elements.

![](attachments/327782/1343490.png)

**Figure 1**. Example pipeline

#### Element creation

We will skip GStreamer initialization, since it is the same as in the previous tutorial:

``` first-line: 12; theme: Default; brush: cpp; gutter: true
/* Create the elements */
source = gst_element_factory_make ("videotestsrc", "source");
sink = gst_element_factory_make ("autovideosink", "sink");
```
As seen in this code, new elements can be created with `gst_element_factory_make()`. The first parameter is the type of element to create ([Basic tutorial 14: Handy elements](Basic%2Btutorial%2B14%253A%2BHandy%2Belements.html) shows a few common types, and [Basic tutorial 10: GStreamer tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html) shows how to obtain the list of all available types). The second parameter is the name we want to give to this particular instance. Naming your elements is useful to retrieve them later if you didn't keep a pointer (and for more meaningful debug output). If you pass NULL for the name, however, GStreamer will provide a unique name for you.

For this tutorial we create two elements: a `videotestsrc` and an `autovideosink`.

`videotestsrc` is a source element (it produces data), which creates a test video pattern. This element is useful for debugging purposes (and tutorials) and is not usually found in real applications.

`autovideosink` is a sink element (it consumes data), which displays the images it receives in a window. There exist several video sinks, depending on the operating system, with a varying range of capabilities. `autovideosink` automatically selects and instantiates the best one, so you do not have to worry about the details, and your code is more platform-independent.
#### Pipeline creation

``` first-line: 16; theme: Default; brush: cpp; gutter: true
/* Create the empty pipeline */
pipeline = gst_pipeline_new ("test-pipeline");
```
All elements in GStreamer must typically be contained inside a pipeline before they can be used, because the pipeline takes care of some clocking and messaging functions. We create the pipeline with `gst_pipeline_new()`.
``` first-line: 24; theme: Default; brush: cpp; gutter: true
/* Build the pipeline */
gst_bin_add_many (GST_BIN (pipeline), source, sink, NULL);
if (gst_element_link (source, sink) != TRUE) {
  g_printerr ("Elements could not be linked.\n");
  gst_object_unref (pipeline);
  return -1;
}
```
A pipeline is a particular type of `bin`, which is the element used to contain other elements. Therefore, all methods that apply to bins also apply to pipelines. In our case, we call `gst_bin_add_many()` to add the elements to the pipeline (mind the cast). This function accepts a list of elements to be added, ending with NULL. Individual elements can be added with `gst_bin_add()`.

These elements, however, are not linked with each other yet. For this, we need to use `gst_element_link()`. Its first parameter is the source, and the second one the destination. The order counts, because links must be established following the data flow (that is, from source elements to sink elements). Keep in mind that only elements residing in the same bin can be linked together, so remember to add them to the pipeline before trying to link them\!
#### Properties

``` first-line: 32; theme: Default; brush: cpp; gutter: true
/* Modify the source's properties */
g_object_set (source, "pattern", 0, NULL);
```
Most GStreamer elements have customizable properties: named attributes that can be modified to change the element's behavior (writable properties) or inquired to find out about the element's internal state (readable properties).

Properties are read with `g_object_get()` and written with `g_object_set()`.

`g_object_set()` accepts a NULL-terminated list of property-name, property-value pairs, so multiple properties can be changed in one go (GStreamer elements are all a particular kind of `GObject`, which is the entity offering property facilities: this is why the property handling methods have the `g_` prefix).

The line of code above changes the “pattern” property of `videotestsrc`, which controls the type of test video the element outputs. Try different values\!

The names and possible values of all the properties an element exposes can be found using the gst-inspect tool described in [Basic tutorial 10: GStreamer tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html).
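For example (assuming the `gst-inspect` tool from GStreamer 0.10 is on your path), this lists all properties of `videotestsrc`, including the accepted values of the “pattern” property:

```
gst-inspect videotestsrc
```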
#### Error checking

At this point, we have the whole pipeline built and set up, and the rest of the tutorial is very similar to the previous one, but we are going to add more error checking:
``` first-line: 35; theme: Default; brush: cpp; gutter: true
/* Start playing */
ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
  g_printerr ("Unable to set the pipeline to the playing state.\n");
  gst_object_unref (pipeline);
  return -1;
}
```
We call `gst_element_set_state()`, but this time we check its return value for errors. Changing states is a delicate process and a few more details are given in [Basic tutorial 3: Dynamic pipelines](Basic%2Btutorial%2B3%253A%2BDynamic%2Bpipelines.html).
``` first-line: 43; theme: Default; brush: cpp; gutter: true
/* Wait until error or EOS */
bus = gst_element_get_bus (pipeline);
msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

/* Parse message */
if (msg != NULL) {
  GError *err;
  gchar *debug_info;

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR:
      gst_message_parse_error (msg, &err, &debug_info);
      g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
      g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
      g_clear_error (&err);
      g_free (debug_info);
      break;
    case GST_MESSAGE_EOS:
      g_print ("End-Of-Stream reached.\n");
      break;
    default:
      /* We should not reach here because we only asked for ERRORs and EOS */
      g_printerr ("Unexpected message received.\n");
      break;
  }
  gst_message_unref (msg);
}
```
`gst_bus_timed_pop_filtered()` waits for execution to end and returns with a `GstMessage`, which we previously ignored. We asked `gst_bus_timed_pop_filtered()` to return when GStreamer encountered either an error condition or an EOS, so we need to check which one happened, and print a message on screen (your application will probably want to undertake more complex actions).

`GstMessage` is a very versatile structure which can deliver virtually any kind of information. Fortunately, GStreamer provides a series of parsing functions for each kind of message.

In this case, once we know the message contains an error (by using the `GST_MESSAGE_TYPE()` macro), we can use `gst_message_parse_error()`, which returns a GLib `GError` error structure and a string useful for debugging. Examine the code to see how these are used and freed afterward.
#### The GStreamer bus

At this point it is worth introducing the GStreamer bus a bit more formally. It is the object responsible for delivering the `GstMessage`s generated by the elements to the application, in order, and to the application thread. This last point is important, because the actual streaming of media is done in a different thread than the application's.

Messages can be extracted from the bus synchronously with `gst_bus_timed_pop_filtered()` and its siblings, or asynchronously, using signals (shown in the next tutorial). Your application should always keep an eye on the bus to be notified of errors and other playback-related issues.

The rest of the code is the cleanup sequence, which is the same as in [Basic tutorial 1: Hello world\!](Basic%2Btutorial%2B1%253A%2BHello%2Bworld%2521.html).
# Exercise

If you feel like practicing, try this exercise: add a video filter element in between the source and the sink of this pipeline. Use `vertigotv` for a nice effect. You will need to create it, add it to the pipeline, and link it with the other elements.

Depending on your platform and available plugins, you might get a “negotiation” error, because the sink does not understand what the filter is producing (more about negotiation in [Basic tutorial 6: Media formats and Pad Capabilities](Basic%2Btutorial%2B6%253A%2BMedia%2Bformats%2Band%2BPad%2BCapabilities.html)). In this case, try to add an element called `ffmpegcolorspace` after the filter (that is, build a pipeline of 4 elements; more on `ffmpegcolorspace` in [Basic tutorial 14: Handy elements](Basic%2Btutorial%2B14%253A%2BHandy%2Belements.html)).
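The exercise's target pipeline can be previewed from the command line before writing any C code (a sketch, assuming `gst-launch` and the `vertigotv` plugin are installed):

```
gst-launch videotestsrc ! vertigotv ! ffmpegcolorspace ! autovideosink
```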
# Conclusion

This tutorial showed:

- How to create elements with `gst_element_factory_make()`

- How to create an empty pipeline with `gst_pipeline_new()`

- How to add elements to the pipeline with `gst_bin_add_many()`

- How to link the elements with each other with `gst_element_link()`

This concludes the first of the two tutorials devoted to basic GStreamer concepts. The second one comes next.

Remember that attached to this page you should find the complete source code of the tutorial and any accessory files needed to build it.

It has been a pleasure having you here, and see you soon\!
## Attachments:

![](images/icons/bullet_blue.gif) [figure-1.png](attachments/327782/1343490.png) (image/png)
Document generated by Confluence on Oct 08, 2015 10:27

566
Basic+tutorial+3%3A+Dynamic+pipelines.markdown
Normal file

@ -0,0 +1,566 @@
# GStreamer SDK documentation : Basic tutorial 3: Dynamic pipelines

This page last changed on May 30, 2012 by xartigas.

# Goal

This tutorial shows the rest of the basic concepts required to use GStreamer, which allow building the pipeline "on the fly", as information becomes available, instead of having a monolithic pipeline defined at the beginning of your application.

After this tutorial, you will have the necessary knowledge to start the [Playback tutorials](Playback%2Btutorials.html). The points reviewed here will be:

- How to attain finer control when linking elements.

- How to be notified of interesting events so you can react in time.

- The various states in which an element can be.
# Introduction

As you are about to see, the pipeline in this tutorial is not completely built before it is set to the playing state. This is OK. If we did not take further action, data would reach the end of the pipeline and just get discarded. But we are going to take further action...

In this example we are opening a file which is multiplexed (or *muxed*), that is, audio and video are stored together inside a *container* file. The elements responsible for opening such containers are called *demuxers*, and some examples of container formats are Matroska (MKV), Quick Time (QT, MOV), Ogg, or Advanced Systems Format (ASF, WMV, WMA).

If a container embeds multiple streams (one video and two audio tracks, for example), the demuxer will separate them and expose them through different output ports. In this way, different branches can be created in the pipeline, dealing with different types of data.

The ports through which GStreamer elements communicate with each other are called pads (`GstPad`). There exist sink pads, through which data enters an element, and source pads, through which data exits an element. It follows naturally that source elements only contain source pads, sink elements only contain sink pads, and filter elements contain both.
![](attachments/327784/1540098.png) ![](attachments/327784/1540099.png) ![](attachments/327784/1540100.png)

**Figure 1**. GStreamer elements with their pads.

A demuxer contains one sink pad, through which the muxed data arrives, and multiple source pads, one for each stream found in the container:

![](attachments/327784/1540101.png)

**Figure 2**. A demuxer with two source pads.

For completeness, here you have a simplified pipeline containing a demuxer and two branches, one for audio and one for video. This is **NOT** the pipeline that will be built in this example:

![](attachments/327784/1540102.png)

**Figure 3**. Example pipeline with two branches.
The main complexity when dealing with demuxers is that they cannot produce any information until they have received some data and have had a chance to look at the container to see what is inside. That is, demuxers start with no source pads to which other elements can link, and thus the pipeline must necessarily terminate at them.

The solution is to build the pipeline from the source down to the demuxer, and set it to run (play). When the demuxer has received enough information to know about the number and kind of streams in the container, it will start creating source pads. This is the right time for us to finish building the pipeline and attach it to the newly added demuxer pads.

For simplicity, in this example, we will only link to the audio pad and ignore the video.
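The pipeline built in this tutorial can also be sketched with `gst-launch`, which performs the dynamic linking for you (assuming `gst-launch` from GStreamer 0.10 and network access to the sample media):

```
gst-launch uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! audioconvert ! autoaudiosink
```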
# Dynamic Hello World

Copy this code into a text file named `basic-tutorial-3.c` (or find it in the SDK installation).

**basic-tutorial-3.c**
``` theme: Default; brush: cpp; gutter: true
#include <gst/gst.h>

/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
  GstElement *pipeline;
  GstElement *source;
  GstElement *convert;
  GstElement *sink;
} CustomData;

/* Handler for the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *pad, CustomData *data);

int main(int argc, char *argv[]) {
  CustomData data;
  GstBus *bus;
  GstMessage *msg;
  GstStateChangeReturn ret;
  gboolean terminate = FALSE;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  data.source = gst_element_factory_make ("uridecodebin", "source");
  data.convert = gst_element_factory_make ("audioconvert", "convert");
  data.sink = gst_element_factory_make ("autoaudiosink", "sink");

  /* Create the empty pipeline */
  data.pipeline = gst_pipeline_new ("test-pipeline");

  if (!data.pipeline || !data.source || !data.convert || !data.sink) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Build the pipeline. Note that we are NOT linking the source at this
   * point. We will do it later. */
  gst_bin_add_many (GST_BIN (data.pipeline), data.source, data.convert , data.sink, NULL);
  if (!gst_element_link (data.convert, data.sink)) {
    g_printerr ("Elements could not be linked.\n");
    gst_object_unref (data.pipeline);
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.source, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);

  /* Connect to the pad-added signal */
  g_signal_connect (data.source, "pad-added", G_CALLBACK (pad_added_handler), &data);

  /* Start playing */
  ret = gst_element_set_state (data.pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.pipeline);
    return -1;
  }

  /* Listen to the bus */
  bus = gst_element_get_bus (data.pipeline);
  do {
    msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
        GST_MESSAGE_STATE_CHANGED | GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

    /* Parse message */
    if (msg != NULL) {
      GError *err;
      gchar *debug_info;

      switch (GST_MESSAGE_TYPE (msg)) {
        case GST_MESSAGE_ERROR:
          gst_message_parse_error (msg, &err, &debug_info);
          g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
          g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
          g_clear_error (&err);
          g_free (debug_info);
          terminate = TRUE;
          break;
        case GST_MESSAGE_EOS:
          g_print ("End-Of-Stream reached.\n");
          terminate = TRUE;
          break;
        case GST_MESSAGE_STATE_CHANGED:
          /* We are only interested in state-changed messages from the pipeline */
          if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data.pipeline)) {
            GstState old_state, new_state, pending_state;
            gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
            g_print ("Pipeline state changed from %s to %s:\n",
                gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));
          }
          break;
        default:
          /* We should not reach here */
          g_printerr ("Unexpected message received.\n");
          break;
      }
      gst_message_unref (msg);
    }
  } while (!terminate);

  /* Free resources */
  gst_object_unref (bus);
  gst_element_set_state (data.pipeline, GST_STATE_NULL);
  gst_object_unref (data.pipeline);
  return 0;
}

/* This function will be called by the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *new_pad, CustomData *data) {
  GstPad *sink_pad = gst_element_get_static_pad (data->convert, "sink");
  GstPadLinkReturn ret;
  GstCaps *new_pad_caps = NULL;
  GstStructure *new_pad_struct = NULL;
  const gchar *new_pad_type = NULL;

  g_print ("Received new pad '%s' from '%s':\n", GST_PAD_NAME (new_pad), GST_ELEMENT_NAME (src));

  /* If our converter is already linked, we have nothing to do here */
  if (gst_pad_is_linked (sink_pad)) {
    g_print ("  We are already linked. Ignoring.\n");
    goto exit;
  }

  /* Check the new pad's type */
  new_pad_caps = gst_pad_get_caps (new_pad);
  new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
  new_pad_type = gst_structure_get_name (new_pad_struct);
  if (!g_str_has_prefix (new_pad_type, "audio/x-raw")) {
    g_print ("  It has type '%s' which is not raw audio. Ignoring.\n", new_pad_type);
    goto exit;
  }

  /* Attempt the link */
  ret = gst_pad_link (new_pad, sink_pad);
  if (GST_PAD_LINK_FAILED (ret)) {
    g_print ("  Type is '%s' but link failed.\n", new_pad_type);
  } else {
    g_print ("  Link succeeded (type '%s').\n", new_pad_type);
  }

exit:
  /* Unreference the new pad's caps, if we got them */
  if (new_pad_caps != NULL)
    gst_caps_unref (new_pad_caps);

  /* Unreference the sink pad */
|
||||
gst_object_unref (sink_pad);
|
||||
}
|
||||
```

> ![](images/icons/emoticons/information.png) **Need help?**
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](Installing%2Bon%2BLinux.html#InstallingonLinux-Build), [Mac OS X](Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build) or [Windows](Installing%2Bon%2BWindows.html#InstallingonWindows-Build), or use this specific command on Linux:
>
> ``gcc basic-tutorial-3.c -o basic-tutorial-3 `pkg-config --cflags --libs gstreamer-0.10` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](Installing%2Bon%2BLinux.html#InstallingonLinux-Run), [Mac OS X](Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run) or [Windows](Installing%2Bon%2BWindows.html#InstallingonWindows-Run).
>
> This tutorial only plays audio. The media is fetched from the Internet, so it might take a few seconds to start, depending on your connection speed.
>
> Required libraries: `gstreamer-0.10`

# Walkthrough

```c
/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
  GstElement *pipeline;
  GstElement *source;
  GstElement *convert;
  GstElement *sink;
} CustomData;
```

So far we have kept all the information we needed (pointers to `GstElement`s, basically) as local variables. Since this tutorial (and most real applications) involves callbacks, we will group all our data in a structure for easier handling.

```c
/* Handler for the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *pad, CustomData *data);
```

This is a forward declaration of the callback, to be used later.

```c
/* Create the elements */
data.source = gst_element_factory_make ("uridecodebin", "source");
data.convert = gst_element_factory_make ("audioconvert", "convert");
data.sink = gst_element_factory_make ("autoaudiosink", "sink");
```

We create the elements as usual. `uridecodebin` will internally instantiate all the necessary elements (sources, demuxers and decoders) to turn a URI into raw audio and/or video streams. It does half the work that `playbin2` does. Since it contains demuxers, its source pads are not initially available, and we will need to link to them on the fly.

`audioconvert` is useful for converting between different audio formats, making sure that this example will work on any platform, since the format produced by the audio decoder might not be the one that the audio sink expects.

`autoaudiosink` is the audio equivalent of the `autovideosink` seen in the previous tutorial: it renders the audio stream to the audio card.

```c
if (!gst_element_link (data.convert, data.sink)) {
  g_printerr ("Elements could not be linked.\n");
  gst_object_unref (data.pipeline);
  return -1;
}
```

Here we link the converter element to the sink, but we **DO NOT** link them with the source, since at this point it contains no source pads. We just leave this branch (converter + sink) unlinked until later on.

```c
/* Set the URI to play */
g_object_set (data.source, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
```

We set the URI of the file to play via a property, just like we did in the previous tutorial.

### Signals

```c
/* Connect to the pad-added signal */
g_signal_connect (data.source, "pad-added", G_CALLBACK (pad_added_handler), &data);
```

`GSignals` are a crucial point in GStreamer. They allow you to be notified (by means of a callback) when something interesting has happened. Signals are identified by a name, and each `GObject` has its own signals.

In this line, we are *attaching* to the “pad-added” signal of our source (a `uridecodebin` element). To do so, we use `g_signal_connect()` and provide the callback function to be used (`pad_added_handler`) and a data pointer. GStreamer does nothing with this data pointer; it just forwards it to the callback so we can share information with it. In this case, we pass a pointer to the `CustomData` structure we built specially for this purpose.

The signals that a `GstElement` generates can be found in its documentation or using the `gst-inspect` tool as described in [Basic tutorial 10: GStreamer tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html).

We are now ready to go! Just set the pipeline to the PLAYING state and start listening to the bus for interesting messages (like ERROR or EOS), just like in the previous tutorials.

### The callback

When our source element finally has enough information to start producing data, it will create source pads and trigger the “pad-added” signal. At this point our callback will be called:

```c
static void pad_added_handler (GstElement *src, GstPad *new_pad, CustomData *data) {
```

`src` is the `GstElement` which triggered the signal. In this example, it can only be the `uridecodebin`, since it is the only element to which we have attached a signal handler.

`new_pad` is the `GstPad` that has just been added to the `src` element. This is usually the pad to which we want to link.

`data` is the pointer we provided when attaching to the signal. In this example, we use it to pass the `CustomData` pointer.

```c
GstPad *sink_pad = gst_element_get_static_pad (data->convert, "sink");
```

From `CustomData` we extract the converter element, and then retrieve its sink pad using `gst_element_get_static_pad ()`. This is the pad to which we want to link `new_pad`. In the previous tutorial we linked element against element, and let GStreamer choose the appropriate pads. Now we are going to link the pads directly.

```c
/* If our converter is already linked, we have nothing to do here */
if (gst_pad_is_linked (sink_pad)) {
  g_print ("  We are already linked. Ignoring.\n");
  goto exit;
}
```

`uridecodebin` can create as many pads as it sees fit, and this callback will be called once for each of them. These lines prevent us from trying to link to a new pad once we are already linked.

```c
/* Check the new pad's type */
new_pad_caps = gst_pad_get_caps (new_pad);
new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
new_pad_type = gst_structure_get_name (new_pad_struct);
if (!g_str_has_prefix (new_pad_type, "audio/x-raw")) {
  g_print ("  It has type '%s' which is not raw audio. Ignoring.\n", new_pad_type);
  goto exit;
}
```

Now we check the type of data this new pad is going to output, because we are only interested in pads producing audio. We have previously created a piece of pipeline which deals with audio (an `audioconvert` linked with an `autoaudiosink`), and we will not be able to link it to a pad producing video, for example.

`gst_pad_get_caps()` retrieves the *capabilities* of the pad (that is, the kind of data it supports), wrapped in a `GstCaps` structure. A pad can offer many capabilities, and hence a `GstCaps` can contain many `GstStructure`s, each representing a different capability.

Since, in this case, we know that the pad we want has only one capability (audio), we retrieve the first `GstStructure` with `gst_caps_get_structure()`.

Finally, with `gst_structure_get_name()` we recover the name of the structure, which contains the main description of the format (its MIME type, actually).

If the name does not start with `audio/x-raw`, this is not a decoded audio pad, and we are not interested in it.

Otherwise, attempt the link:

```c
/* Attempt the link */
ret = gst_pad_link (new_pad, sink_pad);
if (GST_PAD_LINK_FAILED (ret)) {
  g_print ("  Type is '%s' but link failed.\n", new_pad_type);
} else {
  g_print ("  Link succeeded (type '%s').\n", new_pad_type);
}
```

`gst_pad_link()` tries to link two pads. As was the case with `gst_element_link()`, the link must be specified from source to sink, and both pads must be owned by elements residing in the same bin (or pipeline).

And we are done! When a pad of the right kind appears, it will be linked to the rest of the audio-processing pipeline and execution will continue until ERROR or EOS. However, we will squeeze a bit more content from this tutorial by also introducing the concept of State.

#### GStreamer States

We already talked a bit about states when we said that playback does not start until you bring the pipeline to the PLAYING state. We will introduce here the rest of the states and their meaning. There are 4 states in GStreamer:

| State | Meaning |
|---|---|
| `NULL` | the NULL state or initial state of an element. |
| `READY` | the element is ready to go to PAUSED. |
| `PAUSED` | the element is PAUSED; it is ready to accept and process data. Sink elements, however, only accept one buffer and then block. |
| `PLAYING` | the element is PLAYING; the clock is running and the data is flowing. |

You can only move between adjacent states: you cannot go from NULL to PLAYING directly, you have to pass through the intermediate READY and PAUSED states. If you set the pipeline to PLAYING, though, GStreamer will make the intermediate transitions for you.

```c
case GST_MESSAGE_STATE_CHANGED:
  /* We are only interested in state-changed messages from the pipeline */
  if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data.pipeline)) {
    GstState old_state, new_state, pending_state;
    gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
    g_print ("Pipeline state changed from %s to %s:\n",
        gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));
  }
  break;
```

We added this piece of code, which listens to bus messages regarding state changes and prints them on screen to help you understand the transitions. Every element puts messages on the bus regarding its current state, so we filter them out and only listen to messages coming from the pipeline.

Most applications only need to worry about going to PLAYING to start playback, then to PAUSED to perform a pause, and then back to NULL at program exit to free all resources.

# Exercise

Dynamic pad linking has traditionally been a difficult topic for a lot of programmers. Prove that you have mastered it by instantiating an `autovideosink` (probably with an `ffmpegcolorspace` in front) and linking it to the demuxer when the right pad appears. Hint: you are already printing the type of the video pads on screen.

You should now see (and hear) the same movie as in [Basic tutorial 1: Hello world!](Basic%2Btutorial%2B1%253A%2BHello%2Bworld%2521.html). In that tutorial you used `playbin2`, which is a handy element that automatically takes care of all the demuxing and pad linking for you. Most of the [Playback tutorials](Playback%2Btutorials.html) are devoted to `playbin2`.

# Conclusion

In this tutorial, you learned:

- How to be notified of events using `GSignals`
- How to connect `GstPad`s directly instead of their parent elements.
- The various states of a GStreamer element.

You also combined these items to build a dynamic pipeline, which was not defined at program start, but was created as information regarding the media became available.

You can now continue with the basic tutorials and learn about performing seeks and time-related queries in [Basic tutorial 4: Time management](Basic%2Btutorial%2B4%253A%2BTime%2Bmanagement.html), or move to the [Playback tutorials](Playback%2Btutorials.html) and gain more insight about the `playbin2` element.

Remember that attached to this page you should find the complete source code of the tutorial and any accessory files needed to build it. It has been a pleasure having you here, and see you soon!

## Attachments:

![](images/icons/bullet_blue.gif) [src-element.png](attachments/327784/1540098.png) (image/png)
![](images/icons/bullet_blue.gif) [filter-element.png](attachments/327784/1540099.png) (image/png)
![](images/icons/bullet_blue.gif) [sink-element.png](attachments/327784/1540100.png) (image/png)
![](images/icons/bullet_blue.gif) [filter-element-multi.png](attachments/327784/1540101.png) (image/png)
![](images/icons/bullet_blue.gif) [simple-player.png](attachments/327784/1540102.png) (image/png)

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Basic tutorial 4: Time management

This page last changed on Jun 15, 2012 by xartigas.

# Goal

This tutorial shows how to use GStreamer time-related facilities. In particular:

- How to query the pipeline for information like stream position or duration.

- How to seek (jump) to a different position (time instant) inside the stream.

# Introduction

`GstQuery` is a mechanism that allows asking an element or pad for a piece of information. In this example we ask the pipeline whether seeking is allowed (some sources, like live streams, do not allow seeking). If it is allowed, then, once the movie has been running for ten seconds, we skip to a different position using a seek.

In the previous tutorials, once we had the pipeline set up and running, our main function just sat and waited to receive an ERROR or an EOS through the bus. Here we modify this function to periodically wake up and query the pipeline for the stream position, so we can print it on screen. This is similar to what a media player would do, updating the User Interface on a periodic basis.

Finally, the stream duration is queried and updated whenever it changes.

# Seeking example

Copy this code into a text file named `basic-tutorial-4.c` (or find it in the SDK installation).

**basic-tutorial-4.c**

```c
#include <gst/gst.h>

/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
  GstElement *playbin2;  /* Our one and only element */
  gboolean playing;      /* Are we in the PLAYING state? */
  gboolean terminate;    /* Should we terminate execution? */
  gboolean seek_enabled; /* Is seeking enabled for this media? */
  gboolean seek_done;    /* Have we performed the seek already? */
  gint64 duration;       /* How long does this media last, in nanoseconds */
} CustomData;

/* Forward definition of the message processing function */
static void handle_message (CustomData *data, GstMessage *msg);

int main(int argc, char *argv[]) {
  CustomData data;
  GstBus *bus;
  GstMessage *msg;
  GstStateChangeReturn ret;

  data.playing = FALSE;
  data.terminate = FALSE;
  data.seek_enabled = FALSE;
  data.seek_done = FALSE;
  data.duration = GST_CLOCK_TIME_NONE;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  data.playbin2 = gst_element_factory_make ("playbin2", "playbin2");

  if (!data.playbin2) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.playbin2, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);

  /* Start playing */
  ret = gst_element_set_state (data.playbin2, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.playbin2);
    return -1;
  }

  /* Listen to the bus */
  bus = gst_element_get_bus (data.playbin2);
  do {
    msg = gst_bus_timed_pop_filtered (bus, 100 * GST_MSECOND,
        GST_MESSAGE_STATE_CHANGED | GST_MESSAGE_ERROR | GST_MESSAGE_EOS | GST_MESSAGE_DURATION);

    /* Parse message */
    if (msg != NULL) {
      handle_message (&data, msg);
    } else {
      /* We got no message, this means the timeout expired */
      if (data.playing) {
        GstFormat fmt = GST_FORMAT_TIME;
        gint64 current = -1;

        /* Query the current position of the stream */
        if (!gst_element_query_position (data.playbin2, &fmt, &current)) {
          g_printerr ("Could not query current position.\n");
        }

        /* If we didn't know it yet, query the stream duration */
        if (!GST_CLOCK_TIME_IS_VALID (data.duration)) {
          if (!gst_element_query_duration (data.playbin2, &fmt, &data.duration)) {
            g_printerr ("Could not query current duration.\n");
          }
        }

        /* Print current position and total duration */
        g_print ("Position %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\r",
            GST_TIME_ARGS (current), GST_TIME_ARGS (data.duration));

        /* If seeking is enabled, we have not done it yet, and the time is right, seek */
        if (data.seek_enabled && !data.seek_done && current > 10 * GST_SECOND) {
          g_print ("\nReached 10s, performing seek...\n");
          gst_element_seek_simple (data.playbin2, GST_FORMAT_TIME,
              GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, 30 * GST_SECOND);
          data.seek_done = TRUE;
        }
      }
    }
  } while (!data.terminate);

  /* Free resources */
  gst_object_unref (bus);
  gst_element_set_state (data.playbin2, GST_STATE_NULL);
  gst_object_unref (data.playbin2);
  return 0;
}

static void handle_message (CustomData *data, GstMessage *msg) {
  GError *err;
  gchar *debug_info;

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR:
      gst_message_parse_error (msg, &err, &debug_info);
      g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
      g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
      g_clear_error (&err);
      g_free (debug_info);
      data->terminate = TRUE;
      break;
    case GST_MESSAGE_EOS:
      g_print ("End-Of-Stream reached.\n");
      data->terminate = TRUE;
      break;
    case GST_MESSAGE_DURATION:
      /* The duration has changed, mark the current one as invalid */
      data->duration = GST_CLOCK_TIME_NONE;
      break;
    case GST_MESSAGE_STATE_CHANGED: {
      GstState old_state, new_state, pending_state;
      gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
      if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin2)) {
        g_print ("Pipeline state changed from %s to %s:\n",
            gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));

        /* Remember whether we are in the PLAYING state or not */
        data->playing = (new_state == GST_STATE_PLAYING);

        if (data->playing) {
          /* We just moved to PLAYING. Check if seeking is possible */
          GstQuery *query;
          gint64 start, end;
          query = gst_query_new_seeking (GST_FORMAT_TIME);
          if (gst_element_query (data->playbin2, query)) {
            gst_query_parse_seeking (query, NULL, &data->seek_enabled, &start, &end);
            if (data->seek_enabled) {
              g_print ("Seeking is ENABLED from %" GST_TIME_FORMAT " to %" GST_TIME_FORMAT "\n",
                  GST_TIME_ARGS (start), GST_TIME_ARGS (end));
            } else {
              g_print ("Seeking is DISABLED for this stream.\n");
            }
          } else {
            g_printerr ("Seeking query failed.");
          }
          gst_query_unref (query);
        }
      }
    } break;
    default:
      /* We should not reach here */
      g_printerr ("Unexpected message received.\n");
      break;
  }
  gst_message_unref (msg);
}
```

> ![](images/icons/emoticons/information.png) **Need help?**
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](Installing%2Bon%2BLinux.html#InstallingonLinux-Build) or [Windows](Installing%2Bon%2BWindows.html#InstallingonWindows-Build), or use this specific command on Linux:
>
> ``gcc basic-tutorial-4.c -o basic-tutorial-4 `pkg-config --cflags --libs gstreamer-0.10` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](Installing%2Bon%2BLinux.html#InstallingonLinux-Run) or [Windows](Installing%2Bon%2BWindows.html#InstallingonWindows-Run).
>
> This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. 10 seconds into the movie it skips to a new position.
>
> Required libraries: `gstreamer-0.10`

# Walkthrough

```c
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
  GstElement *playbin2;  /* Our one and only element */
  gboolean playing;      /* Are we in the PLAYING state? */
  gboolean terminate;    /* Should we terminate execution? */
  gboolean seek_enabled; /* Is seeking enabled for this media? */
  gboolean seek_done;    /* Have we performed the seek already? */
  gint64 duration;       /* How long does this media last, in nanoseconds */
} CustomData;

/* Forward definition of the message processing function */
static void handle_message (CustomData *data, GstMessage *msg);
```

We start by defining a structure to contain all our information, so we can pass it around to other functions. In particular, in this example we move the message handling code to its own function, `handle_message`, because it is growing a bit too big.

We then build a pipeline composed of a single element, a `playbin2`, which we already saw in [Basic tutorial 1: Hello world!](Basic%2Btutorial%2B1%253A%2BHello%2Bworld%2521.html). `playbin2` is in itself a pipeline, and in this case it is the only element in the pipeline, so we use the `playbin2` element directly. We will skip the details: the URI of the clip is given to `playbin2` via the URI property and the pipeline is set to the playing state.

```c
msg = gst_bus_timed_pop_filtered (bus, 100 * GST_MSECOND,
    GST_MESSAGE_STATE_CHANGED | GST_MESSAGE_ERROR | GST_MESSAGE_EOS | GST_MESSAGE_DURATION);
```

Previously we did not provide a timeout to `gst_bus_timed_pop_filtered()`, meaning that it did not return until a message was received. Now we use a timeout of 100 milliseconds, so, if no message is received, the function returns NULL instead of a `GstMessage` up to 10 times per second. We are going to use this to update our “UI”. Note that the timeout period is specified in nanoseconds, so usage of the `GST_SECOND` or `GST_MSECOND` macros is highly recommended.
|
||||
|
||||
If we got a message, we process it in the `handle_message`` `function
|
||||
(next subsection), otherwise:
|
||||
|
||||
#### User interface resfreshing
|
||||
|
||||
``` first-line: 60; theme: Default; brush: cpp; gutter: true
|
||||
/* We got no message, this means the timeout expired */
|
||||
if (data.playing) {
|
||||
```
|
||||
|
||||
First off, if we are not in the PLAYING state, we do not want to do
|
||||
anything here, since most queries would fail. Otherwise, it is time to
|
||||
refresh the screen.
|
||||
|
||||
We get here approximately 10 times per second, a good enough refresh
|
||||
rate for our UI. We are going to print on screen the current media
|
||||
position, which we can learn be querying the pipeline. This involves a
|
||||
few steps that will be shown in the next subsection, but, since position
|
||||
and duration are common enough queries, `GstElement` offers easier,
|
||||
ready-made alternatives:
|
||||
|
||||
``` first-line: 65; theme: Default; brush: cpp; gutter: true
|
||||
/* Query the current position of the stream */
|
||||
if (!gst_element_query_position (data.pipeline, &fmt, ¤t)) {
|
||||
g_printerr ("Could not query current position.\n");
|
||||
}
|
||||
```
|
||||
|
||||
`gst_element_query_position()` hides the management of the query object
|
||||
and directly provides us with the result.
|
||||
|
||||
``` first-line: 70; theme: Default; brush: cpp; gutter: true
|
||||
/* If we didn't know it yet, query the stream duration */
|
||||
if (!GST_CLOCK_TIME_IS_VALID (data.duration)) {
|
||||
if (!gst_element_query_duration (data.pipeline, &fmt, &data.duration)) {
|
||||
g_printerr ("Could not query current duration.\n");
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
Now is a good moment to know the length of the stream, with
|
||||
another `GstElement` helper function: `gst_element_query_duration()`
|
||||
|
||||
``` first-line: 77; theme: Default; brush: cpp; gutter: true
/* Print current position and total duration */
g_print ("Position %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\r",
    GST_TIME_ARGS (current), GST_TIME_ARGS (data.duration));
```

Note the usage of the `GST_TIME_FORMAT` and `GST_TIME_ARGS` macros, which provide a user-friendly representation of GStreamer times.

``` first-line: 81; theme: Default; brush: cpp; gutter: true
/* If seeking is enabled, we have not done it yet, and the time is right, seek */
if (data.seek_enabled && !data.seek_done && current > 10 * GST_SECOND) {
  g_print ("\nReached 10s, performing seek...\n");
  gst_element_seek_simple (data.pipeline, GST_FORMAT_TIME,
      GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, 30 * GST_SECOND);
  data.seek_done = TRUE;
}
```

Now we perform the seek, “simply” by calling `gst_element_seek_simple()` on the pipeline. A lot of the intricacies of seeking are hidden in this method, which is a good thing!

Let's review the parameters:

`GST_FORMAT_TIME` indicates that we are specifying the destination in time, as opposed to bytes (and other more obscure mechanisms).

Then come the `GstSeekFlags`; let's review the most common:

`GST_SEEK_FLAG_FLUSH`: This discards all data currently in the pipeline before doing the seek. The pipeline might pause briefly while it is refilled and the new data starts to show up, but this flag greatly increases the “responsiveness” of the application. If the flag is not provided, “stale” data might be shown for a while until the new position appears at the end of the pipeline.

`GST_SEEK_FLAG_KEY_UNIT`: Most encoded video streams cannot seek to arbitrary positions, only to certain frames called key frames. When this flag is used, the seek actually moves to the closest key frame and starts producing data straight away. If this flag is not used, the pipeline still moves internally to the closest key frame (it has no other alternative), but data is not shown until it reaches the requested position. Not providing the flag is more accurate, but might take longer to react.

`GST_SEEK_FLAG_ACCURATE`: Some media clips do not provide enough indexing information, meaning that seeking to arbitrary positions is time-consuming. In these cases, GStreamer estimates the position to seek to, which usually works just fine. If this precision is not good enough for your case (you see seeks not going to the exact time you asked for), then provide this flag. Be warned that it might take longer to calculate the seeking position (very long, on some files).

And finally we provide the position to seek to. Since we asked for `GST_FORMAT_TIME`, this position is in nanoseconds, so we use the `GST_SECOND` macro for simplicity.

#### Message Pump

The `handle_message` function processes all messages received through the pipeline's bus. ERROR and EOS handling is the same as in previous tutorials, so we skip to the interesting part:

``` first-line: 116; theme: Default; brush: cpp; gutter: true
case GST_MESSAGE_DURATION:
  /* The duration has changed, mark the current one as invalid */
  data->duration = GST_CLOCK_TIME_NONE;
  break;
```

This message is posted on the bus whenever the duration of the stream changes. Here we simply mark the current duration as invalid, so it gets re-queried later.

``` first-line: 120; theme: Default; brush: cpp; gutter: true
case GST_MESSAGE_STATE_CHANGED: {
  GstState old_state, new_state, pending_state;
  gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
  if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data.pipeline)) {
    g_print ("Pipeline state changed from %s to %s:\n",
        gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));

    /* Remember whether we are in the PLAYING state or not */
    data.playing = (new_state == GST_STATE_PLAYING);
```

Seeks and time queries work better in the PAUSED or PLAYING states, since all elements have had a chance to receive information and configure themselves. Here we take note of whether we are in the PLAYING state with the `playing` variable.

Also, if we have just entered the PLAYING state, we do our first query: we ask the pipeline whether seeking is allowed on this stream:

``` first-line: 130; theme: Default; brush: cpp; gutter: true
if (data.playing) {
  /* We just moved to PLAYING. Check if seeking is possible */
  GstQuery *query;
  gint64 start, end;
  query = gst_query_new_seeking (GST_FORMAT_TIME);
  if (gst_element_query (data.pipeline, query)) {
    gst_query_parse_seeking (query, NULL, &data.seek_enabled, &start, &end);
    if (data.seek_enabled) {
      g_print ("Seeking is ENABLED from %" GST_TIME_FORMAT " to %" GST_TIME_FORMAT "\n",
          GST_TIME_ARGS (start), GST_TIME_ARGS (end));
    } else {
      g_print ("Seeking is DISABLED for this stream.\n");
    }
  } else {
    g_printerr ("Seeking query failed.\n");
  }
  gst_query_unref (query);
}
```

`gst_query_new_seeking()` creates a new query object of the "seeking" type, with `GST_FORMAT_TIME` format. This indicates that we are interested in seeking by specifying the new time to which we want to move. We could also ask for `GST_FORMAT_BYTES`, and then seek to a particular byte position inside the source file, but this is normally less useful.

This query object is then passed to the pipeline with `gst_element_query()`. The result is stored in the same query and can easily be retrieved with `gst_query_parse_seeking()`. It extracts a boolean indicating whether seeking is allowed, and the range in which seeking is possible.

Don't forget to unref the query object when you are done with it.

And that's it! With this knowledge a media player can be built which periodically updates a slider based on the current stream position and allows seeking by moving the slider!

# Conclusion

This tutorial has shown:

  - How to query the pipeline for information using `GstQuery`.

  - How to obtain common information like position and duration using `gst_element_query_position()` and `gst_element_query_duration()`.

  - How to seek to an arbitrary position in the stream using `gst_element_seek_simple()`.

  - In which states all these operations can be performed.

The next tutorial shows how to integrate GStreamer with a Graphical User Interface toolkit.

Remember that attached to this page you should find the complete source code of the tutorial and any accessory files needed to build it.

It has been a pleasure having you here, and see you soon!

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Basic tutorial 5: GUI toolkit integration

This page last changed on Dec 03, 2012 by xartigas.

# Goal

||||
This tutorial shows how to integrate GStreamer in a Graphical User
|
||||
Interface (GUI) toolkit like [GTK+](http://www.gtk.org). Basically,
|
||||
GStreamer takes care of media playback while the GUI toolkit handles
|
||||
user interaction. The most interesting parts are those in which both
|
||||
libraries have to interact: Instructing GStreamer to output video to a
|
||||
GTK+ window and forwarding user actions to GStreamer.
|
||||
|
||||
In particular, you will learn:

  - How to tell GStreamer to output video to a particular window (instead of creating its own window).

  - How to continuously refresh the GUI with information from GStreamer.

  - How to update the GUI from the multiple threads of GStreamer, an operation forbidden in most GUI toolkits.

  - A mechanism to subscribe only to the messages you are interested in, instead of being notified of all of them.

# Introduction

We are going to build a media player using the [GTK+](http://www.gtk.org/) toolkit, but the concepts apply to other toolkits like [Qt](http://qt-project.org/), for example. A minimum knowledge of [GTK+](http://www.gtk.org/) will help in understanding this tutorial.

The main point is telling GStreamer to output the video to a window of our choice. The specific mechanism depends on the operating system (or rather, on the windowing system), but GStreamer provides a layer of abstraction for the sake of platform independence. This independence comes through the `XOverlay` interface, which allows the application to tell a video sink the handle of the window that should receive the rendering.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><strong>GObject interfaces</strong><br />

<p>A GObject <code>interface</code> (which GStreamer uses) is a set of functions that an element can implement. If it does, then it is said to support that particular interface. For example, video sinks usually create their own windows to display video, but, if they are also capable of rendering to an external window, they can choose to implement the <code>XOverlay</code> interface and provide functions to specify this external window. From the application developer point of view, if a certain interface is supported, you can use it and forget about which kind of element is implementing it. Moreover, if you are using <code>playbin2</code>, it will automatically expose some of the interfaces supported by its internal elements: You can use your interface functions directly on <code>playbin2</code> without knowing who is implementing them!</p></td>
</tr>
</tbody>
</table>

Another issue is that GUI toolkits usually only allow manipulation of the graphical “widgets” through the main (or application) thread, whereas GStreamer usually spawns multiple threads to take care of different tasks. Calling [GTK+](http://www.gtk.org/) functions from within callbacks will usually fail, because callbacks execute in the calling thread, which need not be the main thread. This problem can be solved by posting a message on the GStreamer bus from the callback: the messages will be received by the main thread, which will then react accordingly.

Finally, so far we have registered a `handle_message` function that got called every time a message appeared on the bus, which forced us to parse every message to see if it was of interest to us. In this tutorial a different method is used: a callback is registered for each kind of message, so there is less parsing and less code overall.

# A media player in GTK+

Let's write a very simple media player based on playbin2, this time with a GUI!

Copy this code into a text file named `basic-tutorial-5.c` (or find it in the SDK installation).

**basic-tutorial-5.c**

``` theme: Default; brush: cpp; gutter: true
#include <string.h>

#include <gtk/gtk.h>
#include <gst/gst.h>
#include <gst/interfaces/xoverlay.h>

#include <gdk/gdk.h>
#if defined (GDK_WINDOWING_X11)
#include <gdk/gdkx.h>
#elif defined (GDK_WINDOWING_WIN32)
#include <gdk/gdkwin32.h>
#elif defined (GDK_WINDOWING_QUARTZ)
#include <gdk/gdkquartz.h>
#endif

/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
  GstElement *playbin2;           /* Our one and only pipeline */

  GtkWidget *slider;              /* Slider widget to keep track of current position */
  GtkWidget *streams_list;        /* Text widget to display info about the streams */
  gulong slider_update_signal_id; /* Signal ID for the slider update signal */

  GstState state;                 /* Current state of the pipeline */
  gint64 duration;                /* Duration of the clip, in nanoseconds */
} CustomData;

/* This function is called when the GUI toolkit creates the physical window that will hold the video.
 * At this point we can retrieve its handle (which has a different meaning depending on the windowing system)
 * and pass it to GStreamer through the XOverlay interface. */
static void realize_cb (GtkWidget *widget, CustomData *data) {
  GdkWindow *window = gtk_widget_get_window (widget);
  guintptr window_handle;

  if (!gdk_window_ensure_native (window))
    g_error ("Couldn't create native window needed for GstXOverlay!");

  /* Retrieve window handle from GDK */
#if defined (GDK_WINDOWING_WIN32)
  window_handle = (guintptr)GDK_WINDOW_HWND (window);
#elif defined (GDK_WINDOWING_QUARTZ)
  window_handle = gdk_quartz_window_get_nsview (window);
#elif defined (GDK_WINDOWING_X11)
  window_handle = GDK_WINDOW_XID (window);
#endif
  /* Pass it to playbin2, which implements XOverlay and will forward it to the video sink */
  gst_x_overlay_set_window_handle (GST_X_OVERLAY (data->playbin2), window_handle);
}

/* This function is called when the PLAY button is clicked */
static void play_cb (GtkButton *button, CustomData *data) {
  gst_element_set_state (data->playbin2, GST_STATE_PLAYING);
}

/* This function is called when the PAUSE button is clicked */
static void pause_cb (GtkButton *button, CustomData *data) {
  gst_element_set_state (data->playbin2, GST_STATE_PAUSED);
}

/* This function is called when the STOP button is clicked */
static void stop_cb (GtkButton *button, CustomData *data) {
  gst_element_set_state (data->playbin2, GST_STATE_READY);
}

/* This function is called when the main window is closed */
static void delete_event_cb (GtkWidget *widget, GdkEvent *event, CustomData *data) {
  stop_cb (NULL, data);
  gtk_main_quit ();
}

/* This function is called every time the video window needs to be redrawn (due to damage/exposure,
 * rescaling, etc). GStreamer takes care of this in the PAUSED and PLAYING states, otherwise,
 * we simply draw a black rectangle to avoid garbage showing up. */
static gboolean expose_cb (GtkWidget *widget, GdkEventExpose *event, CustomData *data) {
  if (data->state < GST_STATE_PAUSED) {
    GtkAllocation allocation;
    GdkWindow *window = gtk_widget_get_window (widget);
    cairo_t *cr;

    /* Cairo is a 2D graphics library which we use here to clean the video window.
     * It is used by GStreamer for other reasons, so it will always be available to us. */
    gtk_widget_get_allocation (widget, &allocation);
    cr = gdk_cairo_create (window);
    cairo_set_source_rgb (cr, 0, 0, 0);
    cairo_rectangle (cr, 0, 0, allocation.width, allocation.height);
    cairo_fill (cr);
    cairo_destroy (cr);
  }

  return FALSE;
}

/* This function is called when the slider changes its position. We perform a seek to the
 * new position here. */
static void slider_cb (GtkRange *range, CustomData *data) {
  gdouble value = gtk_range_get_value (GTK_RANGE (data->slider));
  gst_element_seek_simple (data->playbin2, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
      (gint64)(value * GST_SECOND));
}

/* This creates all the GTK+ widgets that compose our application, and registers the callbacks */
static void create_ui (CustomData *data) {
  GtkWidget *main_window;  /* The uppermost window, containing all other windows */
  GtkWidget *video_window; /* The drawing area where the video will be shown */
  GtkWidget *main_box;     /* VBox to hold main_hbox and the controls */
  GtkWidget *main_hbox;    /* HBox to hold the video_window and the stream info text widget */
  GtkWidget *controls;     /* HBox to hold the buttons and the slider */
  GtkWidget *play_button, *pause_button, *stop_button; /* Buttons */

  main_window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
  g_signal_connect (G_OBJECT (main_window), "delete-event", G_CALLBACK (delete_event_cb), data);

  video_window = gtk_drawing_area_new ();
  gtk_widget_set_double_buffered (video_window, FALSE);
  g_signal_connect (video_window, "realize", G_CALLBACK (realize_cb), data);
  g_signal_connect (video_window, "expose_event", G_CALLBACK (expose_cb), data);

  play_button = gtk_button_new_from_stock (GTK_STOCK_MEDIA_PLAY);
  g_signal_connect (G_OBJECT (play_button), "clicked", G_CALLBACK (play_cb), data);

  pause_button = gtk_button_new_from_stock (GTK_STOCK_MEDIA_PAUSE);
  g_signal_connect (G_OBJECT (pause_button), "clicked", G_CALLBACK (pause_cb), data);

  stop_button = gtk_button_new_from_stock (GTK_STOCK_MEDIA_STOP);
  g_signal_connect (G_OBJECT (stop_button), "clicked", G_CALLBACK (stop_cb), data);

  data->slider = gtk_hscale_new_with_range (0, 100, 1);
  gtk_scale_set_draw_value (GTK_SCALE (data->slider), 0);
  data->slider_update_signal_id = g_signal_connect (G_OBJECT (data->slider), "value-changed", G_CALLBACK (slider_cb), data);

  data->streams_list = gtk_text_view_new ();
  gtk_text_view_set_editable (GTK_TEXT_VIEW (data->streams_list), FALSE);

  controls = gtk_hbox_new (FALSE, 0);
  gtk_box_pack_start (GTK_BOX (controls), play_button, FALSE, FALSE, 2);
  gtk_box_pack_start (GTK_BOX (controls), pause_button, FALSE, FALSE, 2);
  gtk_box_pack_start (GTK_BOX (controls), stop_button, FALSE, FALSE, 2);
  gtk_box_pack_start (GTK_BOX (controls), data->slider, TRUE, TRUE, 2);

  main_hbox = gtk_hbox_new (FALSE, 0);
  gtk_box_pack_start (GTK_BOX (main_hbox), video_window, TRUE, TRUE, 0);
  gtk_box_pack_start (GTK_BOX (main_hbox), data->streams_list, FALSE, FALSE, 2);

  main_box = gtk_vbox_new (FALSE, 0);
  gtk_box_pack_start (GTK_BOX (main_box), main_hbox, TRUE, TRUE, 0);
  gtk_box_pack_start (GTK_BOX (main_box), controls, FALSE, FALSE, 0);
  gtk_container_add (GTK_CONTAINER (main_window), main_box);
  gtk_window_set_default_size (GTK_WINDOW (main_window), 640, 480);

  gtk_widget_show_all (main_window);
}

/* This function is called periodically to refresh the GUI */
static gboolean refresh_ui (CustomData *data) {
  GstFormat fmt = GST_FORMAT_TIME;
  gint64 current = -1;

  /* We do not want to update anything unless we are in the PAUSED or PLAYING states */
  if (data->state < GST_STATE_PAUSED)
    return TRUE;

  /* If we didn't know it yet, query the stream duration */
  if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
    if (!gst_element_query_duration (data->playbin2, &fmt, &data->duration)) {
      g_printerr ("Could not query current duration.\n");
    } else {
      /* Set the range of the slider to the clip duration, in SECONDS */
      gtk_range_set_range (GTK_RANGE (data->slider), 0, (gdouble)data->duration / GST_SECOND);
    }
  }

  if (gst_element_query_position (data->playbin2, &fmt, &current)) {
    /* Block the "value-changed" signal, so the slider_cb function is not called
     * (which would trigger a seek the user has not requested) */
    g_signal_handler_block (data->slider, data->slider_update_signal_id);
    /* Set the position of the slider to the current pipeline position, in SECONDS */
    gtk_range_set_value (GTK_RANGE (data->slider), (gdouble)current / GST_SECOND);
    /* Re-enable the signal */
    g_signal_handler_unblock (data->slider, data->slider_update_signal_id);
  }
  return TRUE;
}

/* This function is called when new metadata is discovered in the stream */
static void tags_cb (GstElement *playbin2, gint stream, CustomData *data) {
  /* We are possibly in a GStreamer working thread, so we notify the main
   * thread of this event through a message in the bus */
  gst_element_post_message (playbin2,
      gst_message_new_application (GST_OBJECT (playbin2),
          gst_structure_new ("tags-changed", NULL)));
}

/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  GError *err;
  gchar *debug_info;

  /* Print error details on the screen */
  gst_message_parse_error (msg, &err, &debug_info);
  g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
  g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
  g_clear_error (&err);
  g_free (debug_info);

  /* Set the pipeline to READY (which stops playback) */
  gst_element_set_state (data->playbin2, GST_STATE_READY);
}

/* This function is called when an End-Of-Stream message is posted on the bus.
 * We just set the pipeline to READY (which stops playback) */
static void eos_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  g_print ("End-Of-Stream reached.\n");
  gst_element_set_state (data->playbin2, GST_STATE_READY);
}

/* This function is called when the pipeline changes states. We use it to
 * keep track of the current state. */
static void state_changed_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  GstState old_state, new_state, pending_state;
  gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
  if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin2)) {
    data->state = new_state;
    g_print ("State set to %s\n", gst_element_state_get_name (new_state));
    if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED) {
      /* For extra responsiveness, we refresh the GUI as soon as we reach the PAUSED state */
      refresh_ui (data);
    }
  }
}

/* Extract metadata from all the streams and write it to the text widget in the GUI */
static void analyze_streams (CustomData *data) {
  gint i;
  GstTagList *tags;
  gchar *str, *total_str;
  guint rate;
  gint n_video, n_audio, n_text;
  GtkTextBuffer *text;

  /* Clean current contents of the widget */
  text = gtk_text_view_get_buffer (GTK_TEXT_VIEW (data->streams_list));
  gtk_text_buffer_set_text (text, "", -1);

  /* Read some properties */
  g_object_get (data->playbin2, "n-video", &n_video, NULL);
  g_object_get (data->playbin2, "n-audio", &n_audio, NULL);
  g_object_get (data->playbin2, "n-text", &n_text, NULL);

  for (i = 0; i < n_video; i++) {
    tags = NULL;
    /* Retrieve the stream's video tags */
    g_signal_emit_by_name (data->playbin2, "get-video-tags", i, &tags);
    if (tags) {
      total_str = g_strdup_printf ("video stream %d:\n", i);
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      gst_tag_list_get_string (tags, GST_TAG_VIDEO_CODEC, &str);
      total_str = g_strdup_printf ("  codec: %s\n", str ? str : "unknown");
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      g_free (str);
      gst_tag_list_free (tags);
    }
  }

  for (i = 0; i < n_audio; i++) {
    tags = NULL;
    /* Retrieve the stream's audio tags */
    g_signal_emit_by_name (data->playbin2, "get-audio-tags", i, &tags);
    if (tags) {
      total_str = g_strdup_printf ("\naudio stream %d:\n", i);
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      if (gst_tag_list_get_string (tags, GST_TAG_AUDIO_CODEC, &str)) {
        total_str = g_strdup_printf ("  codec: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
        total_str = g_strdup_printf ("  language: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      if (gst_tag_list_get_uint (tags, GST_TAG_BITRATE, &rate)) {
        total_str = g_strdup_printf ("  bitrate: %d\n", rate);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
      }
      gst_tag_list_free (tags);
    }
  }

  for (i = 0; i < n_text; i++) {
    tags = NULL;
    /* Retrieve the stream's subtitle tags */
    g_signal_emit_by_name (data->playbin2, "get-text-tags", i, &tags);
    if (tags) {
      total_str = g_strdup_printf ("\nsubtitle stream %d:\n", i);
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
        total_str = g_strdup_printf ("  language: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      gst_tag_list_free (tags);
    }
  }
}

/* This function is called when an "application" message is posted on the bus.
 * Here we retrieve the message posted by the tags_cb callback */
static void application_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  if (g_strcmp0 (gst_structure_get_name (msg->structure), "tags-changed") == 0) {
    /* If the message is the "tags-changed" (only one we are currently issuing), update
     * the stream info GUI */
    analyze_streams (data);
  }
}

int main(int argc, char *argv[]) {
  CustomData data;
  GstStateChangeReturn ret;
  GstBus *bus;

  /* Initialize GTK */
  gtk_init (&argc, &argv);

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));
  data.duration = GST_CLOCK_TIME_NONE;

  /* Create the elements */
  data.playbin2 = gst_element_factory_make ("playbin2", "playbin2");

  if (!data.playbin2) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.playbin2, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);

  /* Connect to interesting signals in playbin2 */
  g_signal_connect (G_OBJECT (data.playbin2), "video-tags-changed", (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin2), "audio-tags-changed", (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin2), "text-tags-changed", (GCallback) tags_cb, &data);

  /* Create the GUI */
  create_ui (&data);

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
  bus = gst_element_get_bus (data.playbin2);
  gst_bus_add_signal_watch (bus);
  g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback)eos_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::application", (GCallback)application_cb, &data);
  gst_object_unref (bus);

  /* Start playing */
  ret = gst_element_set_state (data.playbin2, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.playbin2);
    return -1;
  }

  /* Register a function that GLib will call every second */
  g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);

  /* Start the GTK main loop. We will not regain control until gtk_main_quit is called. */
  gtk_main ();

  /* Free resources */
  gst_element_set_state (data.playbin2, GST_STATE_NULL);
  gst_object_unref (data.playbin2);
  return 0;
}
```

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><div id="expander-700857978" class="expand-container">
<div id="expander-control-700857978" class="expand-control">
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-700857978" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc basic-tutorial-5.c -o basic-tutorial-5 `pkg-config --cflags --libs gstreamer-interfaces-0.10 gtk+-2.0 gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a></p>
<p><span>This tutorial opens a GTK+ window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. The Window has some GTK+ buttons to Pause, Stop and Play the movie, and a slider to show the current position of the stream, which can be dragged to change it. Also, information about the stream is shown on a column at the right edge of the window.</span></p>
<p><span><span>Bear in mind that there is no latency management (buffering), so on slow connections, the movie might stop after a few seconds. See how <a href="Basic%2Btutorial%2B12%253A%2BStreaming.html">Basic tutorial 12: Streaming</a> </span><span>solves this issue.</span></span></p>
<p>Required libraries: <code> gstreamer-interfaces-0.10 gtk+-2.0 gstreamer-0.10</code></p>
</div>
</div></td>
</tr>
</tbody>
</table>

# Walkthrough

Regarding this tutorial's structure, we are not going to use forward function definitions anymore: functions will be defined before they are used. Also, for clarity of explanation, the order in which the snippets of code are presented will not always match the program order. Use the line numbers to locate the snippets in the complete code.

``` first-line: 7; theme: Default; brush: cpp; gutter: true
#include <gdk/gdk.h>
#if defined (GDK_WINDOWING_X11)
#include <gdk/gdkx.h>
#elif defined (GDK_WINDOWING_WIN32)
#include <gdk/gdkwin32.h>
#elif defined (GDK_WINDOWING_QUARTZ)
#include <gdk/gdkquartzwindow.h>
#endif
```

The first thing worth noticing is that we are no longer completely
platform-independent. We need to include the appropriate GDK headers for
the windowing system we are going to use. Fortunately, there are not
that many supported windowing systems, so these three branches often
suffice: X11 for Linux, Win32 for Windows and Quartz for Mac OS X.

This tutorial is composed mostly of callback functions, which will be
called from GStreamer or GTK+, so let's review the `main` function,
which registers all these callbacks.

``` first-line: 324; theme: Default; brush: cpp; gutter: true
int main(int argc, char *argv[]) {
  CustomData data;
  GstStateChangeReturn ret;
  GstBus *bus;

  /* Initialize GTK */
  gtk_init (&argc, &argv);

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));
  data.duration = GST_CLOCK_TIME_NONE;

  /* Create the elements */
  data.playbin2 = gst_element_factory_make ("playbin2", "playbin2");

  if (!data.playbin2) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.playbin2, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
```

Standard GStreamer initialization and playbin2 pipeline creation, along
with GTK+ initialization. Not much new.

``` first-line: 350; theme: Default; brush: cpp; gutter: true
  /* Connect to interesting signals in playbin2 */
  g_signal_connect (G_OBJECT (data.playbin2), "video-tags-changed", (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin2), "audio-tags-changed", (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin2), "text-tags-changed", (GCallback) tags_cb, &data);
```

We are interested in being notified when new tags (metadata) appear in
the stream. For simplicity, we are going to handle all kinds of tags
(video, audio and text) from the same callback, `tags_cb`.

``` first-line: 355; theme: Default; brush: cpp; gutter: true
  /* Create the GUI */
  create_ui (&data);
```

All GTK+ widget creation and signal registration happens in this
function. It contains only GTK-related function calls, so we will skip
over its definition. The signals to which it registers convey user
commands, as shown below when reviewing the callbacks.

``` first-line: 359; theme: Default; brush: cpp; gutter: true
  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
  bus = gst_element_get_bus (data.playbin2);
  gst_bus_add_signal_watch (bus);
  g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback)eos_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::application", (GCallback)application_cb, &data);
  gst_object_unref (bus);
```

In [Playback tutorial 1: Playbin2
usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html), `gst_bus_add_watch()` is
used to register a function that receives every message posted to the
GStreamer bus. We can achieve a finer granularity by using signals
instead, which allow us to register only for the messages we are
interested in. By calling `gst_bus_add_signal_watch()` we instruct the
bus to emit a signal every time it receives a message. This signal has
the name `message::detail`, where *`detail`* is the message that
triggered the signal emission. For example, when the bus receives the
EOS message, it emits a signal with the name `message::eos`.

This tutorial uses the signals' details to register only for the
messages we care about. If we had registered for the `message` signal,
we would be notified of every single message, just like
`gst_bus_add_watch()` would do.

Keep in mind that, in order for the bus watches to work (be it a
`gst_bus_add_watch()` or a `gst_bus_add_signal_watch()`), there must be
a GLib main loop running. In this case, it is hidden inside the
[GTK+](http://www.gtk.org/) main loop.

``` first-line: 374; theme: Default; brush: cpp; gutter: true
  /* Register a function that GLib will call every second */
  g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);
```

Before transferring control to GTK+, we use `g_timeout_add_seconds()`
to register yet another callback, this time with a timeout, so it gets
called every second. We are going to use it to refresh the GUI from the
`refresh_ui` function.

After this, we are done with the setup and can start the GTK+ main loop.
We will regain control from our callbacks when interesting things
happen. Let's review the callbacks. Each callback has a different
signature, depending on who will call it. You can look up the signature
(the meaning of the parameters and the return value) in the
documentation of the signal.

``` first-line: 28; theme: Default; brush: cpp; gutter: true
/* This function is called when the GUI toolkit creates the physical window that will hold the video.
 * At this point we can retrieve its handle (which has a different meaning depending on the windowing system)
 * and pass it to GStreamer through the XOverlay interface. */
static void realize_cb (GtkWidget *widget, CustomData *data) {
  GdkWindow *window = gtk_widget_get_window (widget);
  guintptr window_handle;

  if (!gdk_window_ensure_native (window))
    g_error ("Couldn't create native window needed for GstXOverlay!");

  /* Retrieve window handle from GDK */
#if defined (GDK_WINDOWING_WIN32)
  window_handle = (guintptr)GDK_WINDOW_HWND (window);
#elif defined (GDK_WINDOWING_QUARTZ)
  window_handle = gdk_quartz_window_get_nsview (window);
#elif defined (GDK_WINDOWING_X11)
  window_handle = GDK_WINDOW_XID (window);
#endif
  /* Pass it to playbin2, which implements XOverlay and will forward it to the video sink */
  gst_x_overlay_set_window_handle (GST_X_OVERLAY (data->playbin2), window_handle);
}
```

The code comments speak for themselves. At this point in the life cycle
of the application, we know the handle (be it an X11 `XID`, a Windows
`HWND` or a Quartz `NSView`) of the window where GStreamer should
render the video. We simply retrieve it from the windowing system and
pass it to `playbin2` through the `XOverlay` interface using
`gst_x_overlay_set_window_handle()`. `playbin2` will locate the video
sink and pass the handle to it, so the sink does not create its own
window but uses this one instead.

Not much more to see here; `playbin2` and the `XOverlay` interface
really simplify this process a lot\!

``` first-line: 50; theme: Default; brush: cpp; gutter: true
/* This function is called when the PLAY button is clicked */
static void play_cb (GtkButton *button, CustomData *data) {
  gst_element_set_state (data->playbin2, GST_STATE_PLAYING);
}

/* This function is called when the PAUSE button is clicked */
static void pause_cb (GtkButton *button, CustomData *data) {
  gst_element_set_state (data->playbin2, GST_STATE_PAUSED);
}

/* This function is called when the STOP button is clicked */
static void stop_cb (GtkButton *button, CustomData *data) {
  gst_element_set_state (data->playbin2, GST_STATE_READY);
}
```

These three little callbacks are associated with the PLAY, PAUSE and
STOP buttons in the GUI. They simply set the pipeline to the
corresponding state. Note that for STOP we set the pipeline to `READY`.
We could have brought the pipeline all the way down to the `NULL`
state, but the transition would then be slower, since some resources
(like the audio device) would need to be released and re-acquired.

``` first-line: 65; theme: Default; brush: cpp; gutter: true
/* This function is called when the main window is closed */
static void delete_event_cb (GtkWidget *widget, GdkEvent *event, CustomData *data) {
  stop_cb (NULL, data);
  gtk_main_quit ();
}
```

`gtk_main_quit()` will eventually make the call to `gtk_main()` in
`main` return, which, in this case, finishes the program. Here, we call
it when the main window is closed, after stopping the pipeline (just
for the sake of tidiness).

``` first-line: 71; theme: Default; brush: cpp; gutter: true
/* This function is called every time the video window needs to be redrawn (due to damage/exposure,
 * rescaling, etc). GStreamer takes care of this in the PAUSED and PLAYING states, otherwise,
 * we simply draw a black rectangle to avoid garbage showing up. */
static gboolean expose_cb (GtkWidget *widget, GdkEventExpose *event, CustomData *data) {
  if (data->state < GST_STATE_PAUSED) {
    GtkAllocation allocation;
    GdkWindow *window = gtk_widget_get_window (widget);
    cairo_t *cr;

    /* Cairo is a 2D graphics library which we use here to clear the video window.
     * It is used by GStreamer for other reasons, so it will always be available to us. */
    gtk_widget_get_allocation (widget, &allocation);
    cr = gdk_cairo_create (window);
    cairo_set_source_rgb (cr, 0, 0, 0);
    cairo_rectangle (cr, 0, 0, allocation.width, allocation.height);
    cairo_fill (cr);
    cairo_destroy (cr);
  }

  return FALSE;
}
```

When there is data flow (in the `PAUSED` and `PLAYING` states) the
video sink takes care of refreshing the content of the video window. In
the other cases, however, it will not, so we have to do it ourselves.
In this example, we just fill the window with a black rectangle.

``` first-line: 93; theme: Default; brush: cpp; gutter: true
/* This function is called when the slider changes its position. We perform a seek to the
 * new position here. */
static void slider_cb (GtkRange *range, CustomData *data) {
  gdouble value = gtk_range_get_value (GTK_RANGE (data->slider));
  gst_element_seek_simple (data->playbin2, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
      (gint64)(value * GST_SECOND));
}
```

This is an example of how a complex GUI element like a seeker bar (or a
slider that allows seeking) can be implemented very easily thanks to
GStreamer and GTK+ working together. If the slider has been dragged to
a new position, we tell GStreamer to seek to that position
with `gst_element_seek_simple()` (as seen in [Basic tutorial 4: Time
management](Basic%2Btutorial%2B4%253A%2BTime%2Bmanagement.html)). The
slider has been set up so its value represents seconds.

It is worth mentioning that some performance (and responsiveness) can
be gained with some throttling, that is, by not responding to every
single user request to seek. Since the seek operation is bound to take
some time, it is often nicer to wait half a second (for example) after
a seek before allowing another one. Otherwise, the application might
look unresponsive if the user drags the slider frantically, which would
not allow any seek to complete before a new one is queued.

``` first-line: 153; theme: Default; brush: cpp; gutter: true
/* This function is called periodically to refresh the GUI */
static gboolean refresh_ui (CustomData *data) {
  GstFormat fmt = GST_FORMAT_TIME;
  gint64 current = -1;

  /* We do not want to update anything unless we are in the PAUSED or PLAYING states */
  if (data->state < GST_STATE_PAUSED)
    return TRUE;
```

This function will move the slider to reflect the current position of
the media. First off, if we are not in the `PAUSED` or `PLAYING` state,
we have nothing to do here (plus, position and duration queries will
normally fail).

``` first-line: 162; theme: Default; brush: cpp; gutter: true
  /* If we didn't know it yet, query the stream duration */
  if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
    if (!gst_element_query_duration (data->playbin2, &fmt, &data->duration)) {
      g_printerr ("Could not query current duration.\n");
    } else {
      /* Set the range of the slider to the clip duration, in SECONDS */
      gtk_range_set_range (GTK_RANGE (data->slider), 0, (gdouble)data->duration / GST_SECOND);
    }
  }
```

We recover the duration of the clip if we did not know it yet, so we
can set the range of the slider.

``` first-line: 172; theme: Default; brush: cpp; gutter: true
  if (gst_element_query_position (data->playbin2, &fmt, &current)) {
    /* Block the "value-changed" signal, so the slider_cb function is not called
     * (which would trigger a seek the user has not requested) */
    g_signal_handler_block (data->slider, data->slider_update_signal_id);
    /* Set the position of the slider to the current pipeline position, in SECONDS */
    gtk_range_set_value (GTK_RANGE (data->slider), (gdouble)current / GST_SECOND);
    /* Re-enable the signal */
    g_signal_handler_unblock (data->slider, data->slider_update_signal_id);
  }
  return TRUE;
```

We query the current pipeline position and set the position of the
slider accordingly. This would trigger the emission of the
`value-changed` signal, which we use to know when the user is dragging
the slider. Since we do not want seeks happening unless the user
requested them, we disable the `value-changed` signal emission during
this operation with `g_signal_handler_block()` and
`g_signal_handler_unblock()`.

Returning `TRUE` from this function will keep it being called in the
future. If we return `FALSE`, the timer will be removed.

``` first-line: 184; theme: Default; brush: cpp; gutter: true
/* This function is called when new metadata is discovered in the stream */
static void tags_cb (GstElement *playbin2, gint stream, CustomData *data) {
  /* We are possibly in a GStreamer working thread, so we notify the main
   * thread of this event through a message in the bus */
  gst_element_post_message (playbin2,
      gst_message_new_application (GST_OBJECT (playbin2),
          gst_structure_new ("tags-changed", NULL)));
}
```

This is one of the key points of this tutorial. This function will be
called when new tags are found in the media, **from a streaming
thread**, that is, from a thread other than the application (or main)
thread. What we want to do here is update a GTK+ widget to reflect this
new information, but **GTK+ does not allow operating from threads other
than the main one**.

The solution is to make `playbin2` post a message on the bus and return
to the calling thread. When appropriate, the main thread will pick up
this message and update GTK+.

`gst_element_post_message()` makes a GStreamer element post the given
message to the bus. `gst_message_new_application()` creates a new
message of the `APPLICATION` type. GStreamer messages have different
types, and this particular type is reserved to the application: it will
go through the bus unaffected by GStreamer. The list of types can be
found in the `GstMessageType` documentation.

Messages can deliver additional information through their embedded
`GstStructure`, which is a very flexible data container. Here, we
create a new structure with `gst_structure_new()` and name it
`tags-changed`, to avoid confusion in case we wanted to send other
application messages.

Later, once in the main thread, the bus will receive this message and
emit the `message::application` signal, which we have associated with
the `application_cb` function:

``` first-line: 314; theme: Default; brush: cpp; gutter: true
/* This function is called when an "application" message is posted on the bus.
 * Here we retrieve the message posted by the tags_cb callback */
static void application_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  if (g_strcmp0 (gst_structure_get_name (msg->structure), "tags-changed") == 0) {
    /* If the message is the "tags-changed" (only one we are currently issuing), update
     * the stream info GUI */
    analyze_streams (data);
  }
}
```

Once we have made sure it is the `tags-changed` message, we call the
`analyze_streams` function, which is also used in [Playback tutorial 1:
Playbin2 usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html)
and is described in more detail there. It basically recovers the tags
from the stream and writes them into a text widget in the GUI.

The `error_cb`, `eos_cb` and `state_changed_cb` callbacks are not
really worth explaining, since they do the same as in all previous
tutorials, just from their own functions now.

And this is it\! The amount of code in this tutorial might seem
daunting, but the required concepts are few and easy. If you have
followed the previous tutorials and have a little knowledge of GTK+,
you probably understood this one and can now enjoy your very own media
player\!

![](attachments/327796/1540121.png)

# Exercise

If this media player is not good enough for you, try to change the text
widget that displays the information about the streams into a proper
list view (or tree view). Then, when the user selects a different
stream, make GStreamer switch streams\! To switch streams, you will
need to read [Playback tutorial 1: Playbin2
usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html).

# Conclusion

This tutorial has shown:

  - How to output the video to a particular window handle
    using `gst_x_overlay_set_window_handle()`.

  - How to refresh the GUI periodically by registering a timeout
    callback with `g_timeout_add_seconds()`.

  - How to convey information to the main thread by means of
    application messages through the bus with
    `gst_element_post_message()`.

  - How to be notified only of interesting messages by making the bus
    emit signals with `gst_bus_add_signal_watch()` and discriminating
    among all message types using the signal details.

This allows you to build a somewhat complete media player with a proper
Graphical User Interface.

The following basic tutorials keep focusing on other individual
GStreamer topics.

It has been a pleasure having you here, and see you soon\!

## Attachments:

![](images/icons/bullet_blue.gif)
[basic-tutorial-5.png](attachments/327796/1540122.png) (image/png)
![](images/icons/bullet_blue.gif)
[basic-tutorial-5.png](attachments/327796/1540123.png) (image/png)
![](images/icons/bullet_blue.gif)
[basic-tutorial-5.png](attachments/327796/1540121.png) (image/png)

Document generated by Confluence on Oct 08, 2015 10:27

Basic+tutorial+6%3A+Media+formats+and+Pad+Capabilities.markdown

# GStreamer SDK documentation : Basic tutorial 6: Media formats and Pad Capabilities

This page last changed on Dec 03, 2012 by xartigas.

# Goal

Pad Capabilities are a fundamental element of GStreamer, although most
of the time they are invisible because the framework handles them
automatically. This somewhat theoretical tutorial shows:

  - What Pad Capabilities are.

  - How to retrieve them.

  - When to retrieve them.

  - Why you need to know about them.

# Introduction

### Pads

As it has already been shown, Pads allow information to enter and leave
an element. The *Capabilities* (or *Caps*, for short) of a Pad specify
what kind of information can travel through it. For example, “RGB video
with a resolution of 320x200 pixels and 30 frames per second”, or
“16-bit per sample audio, 5.1 channels at 44100 samples per second”, or
even compressed formats like MP3 or H.264.

Pads can support multiple Capabilities (for example, a video sink can
support video in the RGB or YUV formats) and Capabilities can be
specified as *ranges* (for example, an audio sink can support sample
rates from 1 to 48000 samples per second). However, the actual
information traveling from Pad to Pad must have only one well-specified
type. Through a process known as *negotiation*, two linked Pads agree
on a common type, and thus the Capabilities of the Pads become *fixed*
(they only have one type and do not contain ranges). The walkthrough of
the sample code below should make all this clear.

**In order for two elements to be linked together, they must share a
common subset of Capabilities** (otherwise they could not possibly
understand each other). This is the main goal of Capabilities.

As an application developer, you will usually build pipelines by
linking elements together (to a lesser extent if you use all-in-one
elements like `playbin2`). In this case, you need to know the *Pad
Caps* (as they are familiarly referred to) of your elements, or, at
least, know what they are when GStreamer refuses to link two elements
with a negotiation error.

### Pad templates

Pads are created from *Pad Templates*, which indicate all possible
Capabilities a Pad could have. Templates are useful for creating
several similar Pads, and also allow early refusal of connections
between elements: if the Capabilities of their Pad Templates do not
have a common subset (their *intersection* is empty), there is no need
to negotiate further.

Pad Templates can be viewed as the first step in the negotiation
process. As the process evolves, actual Pads are instantiated and their
Capabilities refined until they are fixed (or negotiation fails).

### Capabilities examples

``` theme: Default; brush: plain; gutter: false
SINK template: 'sink'
  Availability: Always
  Capabilities:
    audio/x-raw-int
                signed: true
                 width: 16
                 depth: 16
                  rate: [ 1, 2147483647 ]
              channels: [ 1, 2 ]
    audio/x-raw-int
                signed: false
                 width: 8
                 depth: 8
                  rate: [ 1, 2147483647 ]
              channels: [ 1, 2 ]
```

This pad is a sink which is always available on the element (we will
not talk about availability for now). It supports two kinds of media,
both raw audio in integer format (`audio/x-raw-int`): signed 16-bit and
unsigned 8-bit. The square brackets indicate a range: for instance, the
number of channels varies from 1 to 2.

``` theme: Default; brush: plain; gutter: false
SRC template: 'src'
  Availability: Always
  Capabilities:
    video/x-raw-yuv
                 width: [ 1, 2147483647 ]
                height: [ 1, 2147483647 ]
             framerate: [ 0/1, 2147483647/1 ]
                format: { I420, NV12, NV21, YV12, YUY2, Y42B, Y444, YUV9, YVU9, Y41B, Y800, Y8 , GREY, Y16 , UYVY, YVYU, IYU1, v308, AYUV, A420 }
```

`video/x-raw-yuv` indicates that this source pad outputs video in YUV
format (1 luminance + 2 chrominance planes). It supports a wide range
of dimensions and framerates, and a set of YUV formats (the curly
braces indicate a *list*). All these formats indicate different packing
and subsampling of the image planes.

### Last remarks

You can use the `gst-inspect-0.10` tool described in [Basic tutorial
10: GStreamer
tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html) to learn
about the Caps of any GStreamer element.

Bear in mind that some elements query the underlying hardware for
supported formats and offer their Pad Caps accordingly (they usually do
this when entering the `READY` state or higher). Therefore, the caps
shown can vary from platform to platform, or even from one execution to
the next (even though this case is rare).

This tutorial instantiates two elements (this time, through their
factories), shows their Pad Templates, links them and sets the pipeline
to play. On each state change, the Capabilities of the sink element's
Pad are shown, so you can observe how the negotiation proceeds until
the Pad Caps are fixed.

# A trivial Pad Capabilities Example

Copy this code into a text file named `basic-tutorial-6.c` (or find it
in the SDK installation).

**basic-tutorial-6.c**

``` theme: Default; brush: cpp; gutter: true
#include <gst/gst.h>

/* Functions below print the Capabilities in a human-friendly format */
static gboolean print_field (GQuark field, const GValue * value, gpointer pfx) {
  gchar *str = gst_value_serialize (value);

  g_print ("%s %15s: %s\n", (gchar *) pfx, g_quark_to_string (field), str);
  g_free (str);
  return TRUE;
}

static void print_caps (const GstCaps * caps, const gchar * pfx) {
  guint i;

  g_return_if_fail (caps != NULL);

  if (gst_caps_is_any (caps)) {
    g_print ("%sANY\n", pfx);
    return;
  }
  if (gst_caps_is_empty (caps)) {
    g_print ("%sEMPTY\n", pfx);
    return;
  }

  for (i = 0; i < gst_caps_get_size (caps); i++) {
    GstStructure *structure = gst_caps_get_structure (caps, i);

    g_print ("%s%s\n", pfx, gst_structure_get_name (structure));
    gst_structure_foreach (structure, print_field, (gpointer) pfx);
  }
}

/* Prints information about a Pad Template, including its Capabilities */
static void print_pad_templates_information (GstElementFactory * factory) {
  const GList *pads;
  GstStaticPadTemplate *padtemplate;

  g_print ("Pad Templates for %s:\n", gst_element_factory_get_longname (factory));
  if (!factory->numpadtemplates) {
    g_print ("  none\n");
    return;
  }

  pads = factory->staticpadtemplates;
  while (pads) {
    padtemplate = (GstStaticPadTemplate *) (pads->data);
    pads = g_list_next (pads);

    if (padtemplate->direction == GST_PAD_SRC)
      g_print ("  SRC template: '%s'\n", padtemplate->name_template);
    else if (padtemplate->direction == GST_PAD_SINK)
      g_print ("  SINK template: '%s'\n", padtemplate->name_template);
    else
      g_print ("  UNKNOWN!!! template: '%s'\n", padtemplate->name_template);

    if (padtemplate->presence == GST_PAD_ALWAYS)
      g_print ("    Availability: Always\n");
    else if (padtemplate->presence == GST_PAD_SOMETIMES)
      g_print ("    Availability: Sometimes\n");
    else if (padtemplate->presence == GST_PAD_REQUEST) {
      g_print ("    Availability: On request\n");
    } else
      g_print ("    Availability: UNKNOWN!!!\n");

    if (padtemplate->static_caps.string) {
      g_print ("    Capabilities:\n");
      print_caps (gst_static_caps_get (&padtemplate->static_caps), "      ");
    }

    g_print ("\n");
  }
}

/* Shows the CURRENT capabilities of the requested pad in the given element */
static void print_pad_capabilities (GstElement *element, gchar *pad_name) {
  GstPad *pad = NULL;
  GstCaps *caps = NULL;

  /* Retrieve pad */
  pad = gst_element_get_static_pad (element, pad_name);
  if (!pad) {
    g_printerr ("Could not retrieve pad '%s'\n", pad_name);
    return;
  }

  /* Retrieve negotiated caps (or acceptable caps if negotiation is not finished yet) */
  caps = gst_pad_get_negotiated_caps (pad);
  if (!caps)
    caps = gst_pad_get_caps_reffed (pad);

  /* Print and free */
  g_print ("Caps for the %s pad:\n", pad_name);
  print_caps (caps, "      ");
  gst_caps_unref (caps);
  gst_object_unref (pad);
}

int main(int argc, char *argv[]) {
  GstElement *pipeline, *source, *sink;
  GstElementFactory *source_factory, *sink_factory;
  GstBus *bus;
  GstMessage *msg;
  GstStateChangeReturn ret;
  gboolean terminate = FALSE;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the element factories */
  source_factory = gst_element_factory_find ("audiotestsrc");
  sink_factory = gst_element_factory_find ("autoaudiosink");
  if (!source_factory || !sink_factory) {
    g_printerr ("Not all element factories could be created.\n");
    return -1;
  }

  /* Print information about the pad templates of these factories */
  print_pad_templates_information (source_factory);
  print_pad_templates_information (sink_factory);

  /* Ask the factories to instantiate actual elements */
  source = gst_element_factory_create (source_factory, "source");
  sink = gst_element_factory_create (sink_factory, "sink");

  /* Create the empty pipeline */
  pipeline = gst_pipeline_new ("test-pipeline");

  if (!pipeline || !source || !sink) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Build the pipeline */
  gst_bin_add_many (GST_BIN (pipeline), source, sink, NULL);
  if (gst_element_link (source, sink) != TRUE) {
    g_printerr ("Elements could not be linked.\n");
    gst_object_unref (pipeline);
    return -1;
  }

  /* Print initial negotiated caps (in NULL state) */
  g_print ("In NULL state:\n");
  print_pad_capabilities (sink, "sink");

  /* Start playing */
  ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state (check the bus for error messages).\n");
  }

  /* Wait until error, EOS or State Change */
  bus = gst_element_get_bus (pipeline);
  do {
    msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS |
        GST_MESSAGE_STATE_CHANGED);

    /* Parse message */
    if (msg != NULL) {
      GError *err;
      gchar *debug_info;

      switch (GST_MESSAGE_TYPE (msg)) {
        case GST_MESSAGE_ERROR:
          gst_message_parse_error (msg, &err, &debug_info);
          g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
          g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
          g_clear_error (&err);
          g_free (debug_info);
          terminate = TRUE;
          break;
        case GST_MESSAGE_EOS:
          g_print ("End-Of-Stream reached.\n");
          terminate = TRUE;
          break;
        case GST_MESSAGE_STATE_CHANGED:
          /* We are only interested in state-changed messages from the pipeline */
          if (GST_MESSAGE_SRC (msg) == GST_OBJECT (pipeline)) {
            GstState old_state, new_state, pending_state;
            gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
            g_print ("\nPipeline state changed from %s to %s:\n",
                gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));
            /* Print the current capabilities of the sink element */
            print_pad_capabilities (sink, "sink");
          }
          break;
        default:
          /* We should not reach here because we only asked for ERRORs, EOS and STATE_CHANGED */
|
||||
g_printerr ("Unexpected message received.\n");
|
||||
break;
|
||||
}
|
||||
gst_message_unref (msg);
|
||||
}
|
||||
} while (!terminate);
|
||||
|
||||
/* Free resources */
|
||||
gst_object_unref (bus);
|
||||
gst_element_set_state (pipeline, GST_STATE_NULL);
|
||||
gst_object_unref (pipeline);
|
||||
gst_object_unref (source_factory);
|
||||
gst_object_unref (sink_factory);
|
||||
return 0;
|
||||
}
|
||||
```

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><div id="expander-775303622" class="expand-container">
<div id="expander-control-775303622" class="expand-control">
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-775303622" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc basic-tutorial-6.c -o basic-tutorial-6 `pkg-config --cflags --libs gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a>.</p>
<p><span>This tutorial simply displays information regarding the Pad Capabilities at different moments in time.</span></p>
<p>Required libraries: <code>gstreamer-0.10</code></p>
</div>
</div></td>
</tr>
</tbody>
</table>

# Walkthrough

The `print_field`, `print_caps` and `print_pad_templates_information`
functions simply display the capabilities structures in a
human-friendly format. If you want to learn about the internal
organization of the `GstCaps` structure, read the GStreamer
documentation regarding Pad Caps.

``` first-line: 75; theme: Default; brush: cpp; gutter: true
/* Shows the CURRENT capabilities of the requested pad in the given element */
static void print_pad_capabilities (GstElement *element, gchar *pad_name) {
  GstPad *pad = NULL;
  GstCaps *caps = NULL;

  /* Retrieve pad */
  pad = gst_element_get_static_pad (element, pad_name);
  if (!pad) {
    g_printerr ("Could not retrieve pad '%s'\n", pad_name);
    return;
  }

  /* Retrieve negotiated caps (or acceptable caps if negotiation is not finished yet) */
  caps = gst_pad_get_negotiated_caps (pad);
  if (!caps)
    caps = gst_pad_get_caps_reffed (pad);

  /* Print and free */
  g_print ("Caps for the %s pad:\n", pad_name);
  print_caps (caps, " ");
  gst_caps_unref (caps);
  gst_object_unref (pad);
}
```

`gst_element_get_static_pad()` retrieves the named Pad from the given
element. This Pad is *static* because it is always present in the
element. To learn more about Pad availability, read the GStreamer
documentation about Pads.

Then we call `gst_pad_get_negotiated_caps()` to retrieve the Pad's
current Capabilities, which can be fixed or not, depending on the state
of the negotiation process. They might not even exist, in which case
we call `gst_pad_get_caps_reffed()` to retrieve the currently
acceptable Pad Capabilities. The currently acceptable Caps are the Pad
Template's Caps in the NULL state, but they might change in later
states, as the actual hardware Capabilities might be queried.

`gst_pad_get_caps_reffed()` is usually faster than `gst_pad_get_caps()`,
and is enough if the retrieved Caps do not need to be modified.

We then print these Capabilities.

``` first-line: 110; theme: Default; brush: cpp; gutter: true
/* Create the element factories */
source_factory = gst_element_factory_find ("audiotestsrc");
sink_factory = gst_element_factory_find ("autoaudiosink");
if (!source_factory || !sink_factory) {
  g_printerr ("Not all element factories could be created.\n");
  return -1;
}

/* Print information about the pad templates of these factories */
print_pad_templates_information (source_factory);
print_pad_templates_information (sink_factory);

/* Ask the factories to instantiate actual elements */
source = gst_element_factory_create (source_factory, "source");
sink = gst_element_factory_create (sink_factory, "sink");
```

In the previous tutorials we created the elements directly using
`gst_element_factory_make()` and skipped talking about factories, but we
will do so now. A `GstElementFactory` is in charge of instantiating a
particular type of element, identified by its factory name.

You can use `gst_element_factory_find()` to obtain a factory of type
“videotestsrc”, and then use it to instantiate multiple “videotestsrc”
elements using `gst_element_factory_create()`.
`gst_element_factory_make()` is really a shortcut for
`gst_element_factory_find()` + `gst_element_factory_create()`.

The Pad Templates can already be accessed through the factories, so they
are printed as soon as the factories are created.

We skip the pipeline creation and start, and go to the State-Changed
message handling:
``` theme: Default; brush: cpp; gutter: false
|
||||
case GST_MESSAGE_STATE_CHANGED:
|
||||
/* We are only interested in state-changed messages from the pipeline */
|
||||
if (GST_MESSAGE_SRC (msg) == GST_OBJECT (pipeline)) {
|
||||
GstState old_state, new_state, pending_state;
|
||||
gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
|
||||
g_print ("\nPipeline state changed from %s to %s:\n",
|
||||
gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));
|
||||
/* Print the current capabilities of the sink element */
|
||||
print_pad_capabilities (sink, "sink");
|
||||
}
|
||||
break;
|
||||
```

This simply prints the current Pad Caps every time the state of the
pipeline changes. You should see, in the output, how the initial caps
(the Pad Template's Caps) are progressively refined until they are
completely fixed (they contain a single type with no ranges).

# Conclusion

This tutorial has shown:

  - What Pad Capabilities and Pad Template Capabilities are.

  - How to retrieve them
    with `gst_pad_get_negotiated_caps()` or `gst_pad_get_caps_reffed()`.

  - That they have different meanings depending on the state of the
    pipeline (initially they indicate all the possible Capabilities,
    later they indicate the currently negotiated Caps for the Pad).

  - That Pad Caps are important for knowing beforehand whether two
    elements can be linked together.

  - That Pad Caps can be found using the `gst-inspect` tool described
    in [Basic tutorial 10: GStreamer
    tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html).

The next tutorial shows how data can be manually injected into and
extracted from the GStreamer pipeline.

Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon!

Document generated by Confluence on Oct 08, 2015 10:27
|
363
Basic+tutorial+7%3A+Multithreading+and+Pad+Availability.markdown
Normal file
|
@ -0,0 +1,363 @@
|
|||
# GStreamer SDK documentation : Basic tutorial 7: Multithreading and Pad Availability

This page last changed on Jul 03, 2012 by xartigas.

# Goal

GStreamer handles multithreading automatically, but, under some
circumstances, you might need to decouple threads manually. This
tutorial shows how to do this and, in addition, completes the
exposition about Pad Availability. More precisely, this document
explains:

  - How to create new threads of execution for some parts of the
    pipeline

  - What Pad Availability is

  - How to replicate streams

# Introduction

### Multithreading

GStreamer is a multithreaded framework. This means that, internally, it
creates and destroys threads as it needs them, for example, to decouple
streaming from the application thread. Moreover, plugins are also free
to create threads for their own processing, for example, a video decoder
could create 4 threads to take full advantage of a CPU with 4 cores.

On top of this, when building the pipeline an application can specify
explicitly that a *branch* (a part of the pipeline) runs on a different
thread (for example, to have the audio and video decoders executing
simultaneously).

This is accomplished using the `queue` element, which works as follows.
The sink pad just enqueues data and returns control. On a different
thread, data is dequeued and pushed downstream. This element is also
used for buffering, as seen later in the streaming tutorials. The size
of the queue can be controlled through properties.
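
As a minimal sketch (not part of the tutorial's code), the queue limits could be
adjusted with `g_object_set()`; the property names below are those of the 0.10
`queue` element, and `my_queue` is an arbitrary, hypothetical element name:

```c
/* Sketch: cap a hypothetical queue at 100 buffers or 2 seconds of data,
 * whichever limit is hit first (a value of 0 disables that limit). */
GstElement *queue = gst_element_factory_make ("queue", "my_queue");
g_object_set (queue,
    "max-size-buffers", 100,                     /* guint */
    "max-size-bytes", 0,                         /* guint, 0 = no byte limit */
    "max-size-time", (guint64) (2 * GST_SECOND), /* guint64, in nanoseconds */
    NULL);
```

When any of these limits is reached, the queue blocks (or leaks, depending on its
`leaky` property) instead of growing without bound.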

### The example pipeline

This example builds the following pipeline:

![](attachments/327812/1540141.png)

The source is a synthetic audio signal (a continuous tone) which is
split using a `tee` element (it sends through its source pads everything
it receives through its sink pad). One branch then sends the signal to
the audio card, and the other renders a video of the waveform and sends
it to the screen.

As seen in the picture, queues create a new thread, so this pipeline
runs in 3 threads. Pipelines with more than one sink usually need to be
multithreaded, because, to be synchronized, sinks usually block
execution until all other sinks are ready, and they cannot get ready if
there is only one thread, which the first sink would be blocking.
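
For quick experimentation, the same graph can be described on the command line
(assuming the SDK's `gst-launch-0.10` tool is on your PATH); this mirrors the
pipeline the program below builds by hand:

```shell
gst-launch-0.10 audiotestsrc freq=215 ! tee name=t \
    t. ! queue ! audioconvert ! audioresample ! autoaudiosink \
    t. ! queue ! wavescope shader=0 style=1 ! ffmpegcolorspace ! autovideosink
```

The `name=t` / `t.` syntax is how `gst-launch` requests and links the `tee`
branches that the C code requests explicitly.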

### Request pads

In [Basic tutorial 3: Dynamic
pipelines](Basic%2Btutorial%2B3%253A%2BDynamic%2Bpipelines.html) we saw
an element (`uridecodebin`) which had no pads to begin with; they
appeared as data started to flow and the element learned about the
media. These are called **Sometimes Pads**, and contrast with the
regular pads, which are always available and are called **Always Pads**.

The third kind of pad is the **Request Pad**, which is created on
demand. The classical example is the `tee` element, which has one sink
pad and no initial source pads: they need to be requested, and then
`tee` adds them. In this way, an input stream can be replicated any
number of times. The disadvantage is that linking elements with Request
Pads is not as automatic as linking Always Pads, as the walkthrough for
this example will show.

Also, to request (or release) pads in the PLAYING or PAUSED states, you
need to take additional precautions (Pad blocking) which are not
described in this tutorial. It is safe to request (or release) pads in
the NULL or READY states, though.

Without further delay, let's see the code.

# Simple multithreaded example

Copy this code into a text file named `basic-tutorial-7.c` (or find it
in the SDK installation).

**basic-tutorial-7.c**

``` theme: Default; brush: cpp; gutter: true
|
||||
#include <gst/gst.h>
|
||||
|
||||
int main(int argc, char *argv[]) {
|
||||
GstElement *pipeline, *audio_source, *tee, *audio_queue, *audio_convert, *audio_resample, *audio_sink;
|
||||
GstElement *video_queue, *visual, *video_convert, *video_sink;
|
||||
GstBus *bus;
|
||||
GstMessage *msg;
|
||||
GstPadTemplate *tee_src_pad_template;
|
||||
GstPad *tee_audio_pad, *tee_video_pad;
|
||||
GstPad *queue_audio_pad, *queue_video_pad;
|
||||
|
||||
/* Initialize GStreamer */
|
||||
gst_init (&argc, &argv);
|
||||
|
||||
/* Create the elements */
|
||||
audio_source = gst_element_factory_make ("audiotestsrc", "audio_source");
|
||||
tee = gst_element_factory_make ("tee", "tee");
|
||||
audio_queue = gst_element_factory_make ("queue", "audio_queue");
|
||||
audio_convert = gst_element_factory_make ("audioconvert", "audio_convert");
|
||||
audio_resample = gst_element_factory_make ("audioresample", "audio_resample");
|
||||
audio_sink = gst_element_factory_make ("autoaudiosink", "audio_sink");
|
||||
video_queue = gst_element_factory_make ("queue", "video_queue");
|
||||
visual = gst_element_factory_make ("wavescope", "visual");
|
||||
video_convert = gst_element_factory_make ("ffmpegcolorspace", "csp");
|
||||
video_sink = gst_element_factory_make ("autovideosink", "video_sink");
|
||||
|
||||
/* Create the empty pipeline */
|
||||
pipeline = gst_pipeline_new ("test-pipeline");
|
||||
|
||||
if (!pipeline || !audio_source || !tee || !audio_queue || !audio_convert || !audio_resample || !audio_sink ||
|
||||
!video_queue || !visual || !video_convert || !video_sink) {
|
||||
g_printerr ("Not all elements could be created.\n");
|
||||
return -1;
|
||||
}
|
||||
|
||||
/* Configure elements */
|
||||
g_object_set (audio_source, "freq", 215.0f, NULL);
|
||||
g_object_set (visual, "shader", 0, "style", 1, NULL);
|
||||
|
||||
/* Link all elements that can be automatically linked because they have "Always" pads */
|
||||
gst_bin_add_many (GST_BIN (pipeline), audio_source, tee, audio_queue, audio_convert, audio_resample, audio_sink,
|
||||
video_queue, visual, video_convert, video_sink, NULL);
|
||||
if (gst_element_link_many (audio_source, tee, NULL) != TRUE ||
|
||||
gst_element_link_many (audio_queue, audio_convert, audio_resample, audio_sink, NULL) != TRUE ||
|
||||
gst_element_link_many (video_queue, visual, video_convert, video_sink, NULL) != TRUE) {
|
||||
g_printerr ("Elements could not be linked.\n");
|
||||
gst_object_unref (pipeline);
|
||||
return -1;
|
||||
}
|
||||
|
||||
/* Manually link the Tee, which has "Request" pads */
|
||||
tee_src_pad_template = gst_element_class_get_pad_template (GST_ELEMENT_GET_CLASS (tee), "src%d");
|
||||
tee_audio_pad = gst_element_request_pad (tee, tee_src_pad_template, NULL, NULL);
|
||||
g_print ("Obtained request pad %s for audio branch.\n", gst_pad_get_name (tee_audio_pad));
|
||||
queue_audio_pad = gst_element_get_static_pad (audio_queue, "sink");
|
||||
tee_video_pad = gst_element_request_pad (tee, tee_src_pad_template, NULL, NULL);
|
||||
g_print ("Obtained request pad %s for video branch.\n", gst_pad_get_name (tee_video_pad));
|
||||
queue_video_pad = gst_element_get_static_pad (video_queue, "sink");
|
||||
if (gst_pad_link (tee_audio_pad, queue_audio_pad) != GST_PAD_LINK_OK ||
|
||||
gst_pad_link (tee_video_pad, queue_video_pad) != GST_PAD_LINK_OK) {
|
||||
g_printerr ("Tee could not be linked.\n");
|
||||
gst_object_unref (pipeline);
|
||||
return -1;
|
||||
}
|
||||
gst_object_unref (queue_audio_pad);
|
||||
gst_object_unref (queue_video_pad);
|
||||
|
||||
/* Start playing the pipeline */
|
||||
gst_element_set_state (pipeline, GST_STATE_PLAYING);
|
||||
|
||||
/* Wait until error or EOS */
|
||||
bus = gst_element_get_bus (pipeline);
|
||||
msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
|
||||
|
||||
/* Release the request pads from the Tee, and unref them */
|
||||
gst_element_release_request_pad (tee, tee_audio_pad);
|
||||
gst_element_release_request_pad (tee, tee_video_pad);
|
||||
gst_object_unref (tee_audio_pad);
|
||||
gst_object_unref (tee_video_pad);
|
||||
|
||||
/* Free resources */
|
||||
if (msg != NULL)
|
||||
gst_message_unref (msg);
|
||||
gst_object_unref (bus);
|
||||
gst_element_set_state (pipeline, GST_STATE_NULL);
|
||||
|
||||
gst_object_unref (pipeline);
|
||||
return 0;
|
||||
}
|
||||
```

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><div id="expander-863959374" class="expand-container">
<div id="expander-control-863959374" class="expand-control">
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-863959374" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc basic-tutorial-7.c -o basic-tutorial-7 `pkg-config --cflags --libs gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a>.</p>
<p><span>This tutorial plays an audible tone through the audio card and opens a window with a waveform representation of the tone. The waveform should be a sinusoid, but due to the refreshing of the window it might not appear so.</span></p>
<p>Required libraries: <code>gstreamer-0.10</code></p>
</div>
</div></td>
</tr>
</tbody>
</table>

# Walkthrough

``` first-line: 15; theme: Default; brush: cpp; gutter: true
/* Create the elements */
audio_source = gst_element_factory_make ("audiotestsrc", "audio_source");
tee = gst_element_factory_make ("tee", "tee");
audio_queue = gst_element_factory_make ("queue", "audio_queue");
audio_convert = gst_element_factory_make ("audioconvert", "audio_convert");
audio_resample = gst_element_factory_make ("audioresample", "audio_resample");
audio_sink = gst_element_factory_make ("autoaudiosink", "audio_sink");
video_queue = gst_element_factory_make ("queue", "video_queue");
visual = gst_element_factory_make ("wavescope", "visual");
video_convert = gst_element_factory_make ("ffmpegcolorspace", "csp");
video_sink = gst_element_factory_make ("autovideosink", "video_sink");
```

All the elements in the above picture are instantiated here:

`audiotestsrc` produces a synthetic tone. `wavescope` consumes an audio
signal and renders a waveform as if it were an (admittedly cheap)
oscilloscope. We have already worked with the `autoaudiosink` and
`autovideosink`.

The conversion elements (`audioconvert`, `audioresample` and
`ffmpegcolorspace`) are necessary to guarantee that the pipeline can be
linked. Indeed, the Capabilities of the audio and video sinks depend on
the hardware, and you do not know at design time whether they will
match the Caps produced by the `audiotestsrc` and `wavescope`. If the
Caps match, though, these elements act in “pass-through” mode and do
not modify the signal, having negligible impact on performance.

``` first-line: 36; theme: Default; brush: cpp; gutter: true
/* Configure elements */
g_object_set (audio_source, "freq", 215.0f, NULL);
g_object_set (visual, "shader", 0, "style", 1, NULL);
```

Small adjustments for a better demonstration: the “freq” property of
`audiotestsrc` controls the frequency of the wave (215Hz makes the wave
appear almost stationary in the window), and this style and shader for
`wavescope` make the wave continuous. Use the `gst-inspect` tool
described in [Basic tutorial 10: GStreamer
tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html) to learn all
the properties of these elements.

``` first-line: 40; theme: Default; brush: cpp; gutter: true
/* Link all elements that can be automatically linked because they have "Always" pads */
gst_bin_add_many (GST_BIN (pipeline), audio_source, tee, audio_queue, audio_convert, audio_resample, audio_sink,
    video_queue, visual, video_convert, video_sink, NULL);
if (gst_element_link_many (audio_source, tee, NULL) != TRUE ||
    gst_element_link_many (audio_queue, audio_convert, audio_resample, audio_sink, NULL) != TRUE ||
    gst_element_link_many (video_queue, visual, video_convert, video_sink, NULL) != TRUE) {
  g_printerr ("Elements could not be linked.\n");
  gst_object_unref (pipeline);
  return -1;
}
```

This code block adds all elements to the pipeline and then links the
ones that can be automatically linked (the ones with Always Pads, as the
comment says).

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p><code>gst_element_link_many()</code> can actually link elements with Request Pads. It internally requests the Pads, so you do not have to worry about whether the elements being linked have Always or Request Pads. Strange as it might seem, this is actually inconvenient, because you still need to release the requested Pads afterwards, and, if the Pad was requested automatically by <code>gst_element_link_many()</code>, it is easy to forget. Stay out of trouble by always requesting Request Pads manually, as shown in the next code block.</p></td>
</tr>
</tbody>
</table>

``` first-line: 51; theme: Default; brush: cpp; gutter: true
/* Manually link the Tee, which has "Request" pads */
tee_src_pad_template = gst_element_class_get_pad_template (GST_ELEMENT_GET_CLASS (tee), "src%d");
tee_audio_pad = gst_element_request_pad (tee, tee_src_pad_template, NULL, NULL);
g_print ("Obtained request pad %s for audio branch.\n", gst_pad_get_name (tee_audio_pad));
queue_audio_pad = gst_element_get_static_pad (audio_queue, "sink");
tee_video_pad = gst_element_request_pad (tee, tee_src_pad_template, NULL, NULL);
g_print ("Obtained request pad %s for video branch.\n", gst_pad_get_name (tee_video_pad));
queue_video_pad = gst_element_get_static_pad (video_queue, "sink");
if (gst_pad_link (tee_audio_pad, queue_audio_pad) != GST_PAD_LINK_OK ||
    gst_pad_link (tee_video_pad, queue_video_pad) != GST_PAD_LINK_OK) {
  g_printerr ("Tee could not be linked.\n");
  gst_object_unref (pipeline);
  return -1;
}
gst_object_unref (queue_audio_pad);
gst_object_unref (queue_video_pad);
```

To link Request Pads, they need to be obtained by “requesting” them from
the element. An element might be able to produce different kinds of
Request Pads, so, when requesting them, the desired Pad Template must be
provided. Pad templates are obtained with
`gst_element_class_get_pad_template()` and are identified by their name.
In the documentation for the `tee` element we see that it has two pad
templates named “sink” (for its sink Pads) and “src%d” (for the Request
Pads).

Once we have the Pad template, we request two Pads from the tee (for the
audio and video branches) with `gst_element_request_pad()`.

We then obtain the Pads from the downstream elements to which these
Request Pads need to be linked. These are normal Always Pads, so we
obtain them with `gst_element_get_static_pad()`.

Finally, we link the pads with `gst_pad_link()`. This is the function
that `gst_element_link()` and `gst_element_link_many()` use internally.

The sink Pads we have obtained need to be released with
`gst_object_unref()`. The Request Pads will be released when we no
longer need them, at the end of the program.

We then set the pipeline to playing as usual, and wait until an error
message or an EOS is produced. The only thing left to do is clean up the
requested Pads:

``` first-line: 75; theme: Default; brush: cpp; gutter: true
/* Release the request pads from the Tee, and unref them */
gst_element_release_request_pad (tee, tee_audio_pad);
gst_element_release_request_pad (tee, tee_video_pad);
gst_object_unref (tee_audio_pad);
gst_object_unref (tee_video_pad);
```

`gst_element_release_request_pad()` releases the pad from the `tee`, but
it still needs to be unreferenced (freed) with `gst_object_unref()`.

# Conclusion

This tutorial has shown:

  - How to make parts of a pipeline run on a different thread by using
    `queue` elements.

  - What a Request Pad is and how to link elements with Request Pads,
    with `gst_element_class_get_pad_template()`, `gst_element_request_pad()`, `gst_pad_link()` and
    `gst_element_release_request_pad()`.

  - How to have the same stream available in different branches by using
    `tee` elements.

The next tutorial builds on top of this one to show how data can be
manually injected into and extracted from a running pipeline.

It has been a pleasure having you here, and see you soon!

## Attachments:

![](images/icons/bullet_blue.gif)
[basic-tutorial-7.png](attachments/327812/1540161.png) (image/png)
![](images/icons/bullet_blue.gif)
[basic-tutorial-7.png](attachments/327812/2424839.png) (image/png)
![](images/icons/bullet_blue.gif)
[basic-tutorial-7.png](attachments/327812/1540141.png) (image/png)

Document generated by Confluence on Oct 08, 2015 10:27
|
569
Basic+tutorial+8%3A+Short-cutting+the+pipeline.markdown
Normal file
|
@ -0,0 +1,569 @@
|
|||
# GStreamer SDK documentation : Basic tutorial 8: Short-cutting the pipeline

This page last changed on Jun 19, 2012 by xartigas.

# Goal

Pipelines constructed with GStreamer do not need to be completely
closed. Data can be injected into the pipeline and extracted from it at
any time, in a variety of ways. This tutorial shows:

  - How to inject external data into a general GStreamer pipeline.

  - How to extract data from a general GStreamer pipeline.

  - How to access and manipulate this data.

[Playback tutorial 3: Short-cutting the
pipeline](Playback%2Btutorial%2B3%253A%2BShort-cutting%2Bthe%2Bpipeline.html) explains
how to achieve the same goals in a playbin2-based pipeline.

# Introduction

Applications can interact with the data flowing through a GStreamer
pipeline in several ways. This tutorial describes the easiest one, since
it uses elements that have been created for this sole purpose.

The element used to inject application data into a GStreamer pipeline is
`appsrc`, and its counterpart, used to extract GStreamer data back into
the application, is `appsink`. To avoid confusing the names, think of
them from GStreamer's point of view: `appsrc` is just a regular source,
that provides data magically fallen from the sky (provided by the
application, actually). `appsink` is a regular sink, where the data
flowing through a GStreamer pipeline goes to die (it is recovered by the
application, actually).

`appsrc` and `appsink` are so versatile that they offer their own API
(see their documentation), which can be accessed by linking against the
`gstreamer-app` library. In this tutorial, however, we will use a
simpler approach and control them through signals.

`appsrc` can work in a variety of modes: in **pull** mode, it requests
data from the application every time it needs it. In **push** mode, the
application pushes data at its own pace. Furthermore, in push mode, the
application can choose to be blocked in the push function when enough
data has already been provided, or it can listen to the
`enough-data` and `need-data` signals to control flow. This example
implements the latter approach. Information regarding the other methods
can be found in the `appsrc` documentation.
|
||||
|
||||
### Buffers

Data travels through a GStreamer pipeline in chunks called **buffers**.
Since this example produces and consumes data, we need to know about
`GstBuffer`s.

Source Pads produce buffers, which are consumed by Sink Pads; GStreamer
takes these buffers and passes them from element to element.

A buffer simply represents a piece of data. Do not assume that all
buffers will have the same size, or represent the same amount of time.
Neither should you assume that if a single buffer enters an element, a
single buffer will come out. Elements are free to do with the received
buffers as they please.

Every buffer has an attached `GstCaps` structure that describes the kind
of media contained in the buffer. Buffers also carry a time stamp and a
duration, which describe at which moment the content of the buffer
should be rendered or displayed. Time stamping is a very complex and
delicate subject, but this simplified vision should suffice for now.

As an example, a `filesrc` (a GStreamer element that reads files)
produces buffers with the “ANY” caps and no time-stamping information.
After demuxing (see [Basic tutorial 3: Dynamic
pipelines](Basic%2Btutorial%2B3%253A%2BDynamic%2Bpipelines.html))
buffers can have some specific caps, for example “video/x-h264”. After
decoding, each buffer will contain a single video frame with raw caps
(for example, “video/x-raw-yuv”) and very precise time stamps indicating
when that frame should be displayed.
### This tutorial

This tutorial expands [Basic tutorial 7: Multithreading and Pad
Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html) in
two ways: firstly, the `audiotestsrc` is replaced by an `appsrc` that
generates the audio data. Secondly, a new branch is added to the
`tee`, so the data going into the audio sink and the wave display is also
replicated into an `appsink`. The `appsink` uploads the information back
into the application, which, in this example, just notifies the user that
data has been received, but it could obviously perform more complex tasks.

![](attachments/1442189/1540158.png)

# A crude waveform generator

Copy this code into a text file named `basic-tutorial-8.c` (or find it
in the SDK installation).
``` theme: Default; brush: cpp; gutter: true
#include <gst/gst.h>
#include <string.h>

#define CHUNK_SIZE 1024   /* Amount of bytes we are sending in each buffer */
#define SAMPLE_RATE 44100 /* Samples per second we are sending */
#define AUDIO_CAPS "audio/x-raw-int,channels=1,rate=%d,signed=(boolean)true,width=16,depth=16,endianness=BYTE_ORDER"

/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
  GstElement *pipeline, *app_source, *tee, *audio_queue, *audio_convert1, *audio_resample, *audio_sink;
  GstElement *video_queue, *audio_convert2, *visual, *video_convert, *video_sink;
  GstElement *app_queue, *app_sink;

  guint64 num_samples;   /* Number of samples generated so far (for timestamp generation) */
  gfloat a, b, c, d;     /* For waveform generation */

  guint sourceid;        /* To control the GSource */

  GMainLoop *main_loop;  /* GLib's Main Loop */
} CustomData;

/* This method is called by the idle GSource in the mainloop, to feed CHUNK_SIZE bytes into appsrc.
 * The idle handler is added to the mainloop when appsrc requests us to start sending data (need-data signal)
 * and is removed when appsrc has enough data (enough-data signal).
 */
static gboolean push_data (CustomData *data) {
  GstBuffer *buffer;
  GstFlowReturn ret;
  int i;
  gint16 *raw;
  gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
  gfloat freq;

  /* Create a new empty buffer */
  buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);

  /* Set its timestamp and duration */
  GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (data->num_samples, GST_SECOND, SAMPLE_RATE);
  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (num_samples, GST_SECOND, SAMPLE_RATE);

  /* Generate some psychedelic waveforms */
  raw = (gint16 *)GST_BUFFER_DATA (buffer);
  data->c += data->d;
  data->d -= data->c / 1000;
  freq = 1100 + 1000 * data->d;
  for (i = 0; i < num_samples; i++) {
    data->a += data->b;
    data->b -= data->a / freq;
    raw[i] = (gint16)(500 * data->a);
  }
  data->num_samples += num_samples;

  /* Push the buffer into the appsrc */
  g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);

  /* Free the buffer now that we are done with it */
  gst_buffer_unref (buffer);

  if (ret != GST_FLOW_OK) {
    /* We got some error, stop sending data */
    return FALSE;
  }

  return TRUE;
}

/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
 * to the mainloop to start pushing data into the appsrc */
static void start_feed (GstElement *source, guint size, CustomData *data) {
  if (data->sourceid == 0) {
    g_print ("Start feeding\n");
    data->sourceid = g_idle_add ((GSourceFunc) push_data, data);
  }
}

/* This callback triggers when appsrc has enough data and we can stop sending.
 * We remove the idle handler from the mainloop */
static void stop_feed (GstElement *source, CustomData *data) {
  if (data->sourceid != 0) {
    g_print ("Stop feeding\n");
    g_source_remove (data->sourceid);
    data->sourceid = 0;
  }
}

/* The appsink has received a buffer */
static void new_buffer (GstElement *sink, CustomData *data) {
  GstBuffer *buffer;

  /* Retrieve the buffer */
  g_signal_emit_by_name (sink, "pull-buffer", &buffer);
  if (buffer) {
    /* The only thing we do in this example is print a * to indicate a received buffer */
    g_print ("*");
    gst_buffer_unref (buffer);
  }
}

/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  GError *err;
  gchar *debug_info;

  /* Print error details on the screen */
  gst_message_parse_error (msg, &err, &debug_info);
  g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
  g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
  g_clear_error (&err);
  g_free (debug_info);

  g_main_loop_quit (data->main_loop);
}

int main(int argc, char *argv[]) {
  CustomData data;
  GstPadTemplate *tee_src_pad_template;
  GstPad *tee_audio_pad, *tee_video_pad, *tee_app_pad;
  GstPad *queue_audio_pad, *queue_video_pad, *queue_app_pad;
  gchar *audio_caps_text;
  GstCaps *audio_caps;
  GstBus *bus;

  /* Initialize custom data structure */
  memset (&data, 0, sizeof (data));
  data.b = 1; /* For waveform generation */
  data.d = 1;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  data.app_source = gst_element_factory_make ("appsrc", "audio_source");
  data.tee = gst_element_factory_make ("tee", "tee");
  data.audio_queue = gst_element_factory_make ("queue", "audio_queue");
  data.audio_convert1 = gst_element_factory_make ("audioconvert", "audio_convert1");
  data.audio_resample = gst_element_factory_make ("audioresample", "audio_resample");
  data.audio_sink = gst_element_factory_make ("autoaudiosink", "audio_sink");
  data.video_queue = gst_element_factory_make ("queue", "video_queue");
  data.audio_convert2 = gst_element_factory_make ("audioconvert", "audio_convert2");
  data.visual = gst_element_factory_make ("wavescope", "visual");
  data.video_convert = gst_element_factory_make ("ffmpegcolorspace", "csp");
  data.video_sink = gst_element_factory_make ("autovideosink", "video_sink");
  data.app_queue = gst_element_factory_make ("queue", "app_queue");
  data.app_sink = gst_element_factory_make ("appsink", "app_sink");

  /* Create the empty pipeline */
  data.pipeline = gst_pipeline_new ("test-pipeline");

  if (!data.pipeline || !data.app_source || !data.tee || !data.audio_queue || !data.audio_convert1 ||
      !data.audio_resample || !data.audio_sink || !data.video_queue || !data.audio_convert2 || !data.visual ||
      !data.video_convert || !data.video_sink || !data.app_queue || !data.app_sink) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Configure wavescope */
  g_object_set (data.visual, "shader", 0, "style", 0, NULL);

  /* Configure appsrc */
  audio_caps_text = g_strdup_printf (AUDIO_CAPS, SAMPLE_RATE);
  audio_caps = gst_caps_from_string (audio_caps_text);
  g_object_set (data.app_source, "caps", audio_caps, NULL);
  g_signal_connect (data.app_source, "need-data", G_CALLBACK (start_feed), &data);
  g_signal_connect (data.app_source, "enough-data", G_CALLBACK (stop_feed), &data);

  /* Configure appsink */
  g_object_set (data.app_sink, "emit-signals", TRUE, "caps", audio_caps, NULL);
  g_signal_connect (data.app_sink, "new-buffer", G_CALLBACK (new_buffer), &data);
  gst_caps_unref (audio_caps);
  g_free (audio_caps_text);

  /* Link all elements that can be automatically linked because they have "Always" pads */
  gst_bin_add_many (GST_BIN (data.pipeline), data.app_source, data.tee, data.audio_queue, data.audio_convert1, data.audio_resample,
      data.audio_sink, data.video_queue, data.audio_convert2, data.visual, data.video_convert, data.video_sink, data.app_queue,
      data.app_sink, NULL);
  if (gst_element_link_many (data.app_source, data.tee, NULL) != TRUE ||
      gst_element_link_many (data.audio_queue, data.audio_convert1, data.audio_resample, data.audio_sink, NULL) != TRUE ||
      gst_element_link_many (data.video_queue, data.audio_convert2, data.visual, data.video_convert, data.video_sink, NULL) != TRUE ||
      gst_element_link_many (data.app_queue, data.app_sink, NULL) != TRUE) {
    g_printerr ("Elements could not be linked.\n");
    gst_object_unref (data.pipeline);
    return -1;
  }

  /* Manually link the Tee, which has "Request" pads */
  tee_src_pad_template = gst_element_class_get_pad_template (GST_ELEMENT_GET_CLASS (data.tee), "src%d");
  tee_audio_pad = gst_element_request_pad (data.tee, tee_src_pad_template, NULL, NULL);
  g_print ("Obtained request pad %s for audio branch.\n", gst_pad_get_name (tee_audio_pad));
  queue_audio_pad = gst_element_get_static_pad (data.audio_queue, "sink");
  tee_video_pad = gst_element_request_pad (data.tee, tee_src_pad_template, NULL, NULL);
  g_print ("Obtained request pad %s for video branch.\n", gst_pad_get_name (tee_video_pad));
  queue_video_pad = gst_element_get_static_pad (data.video_queue, "sink");
  tee_app_pad = gst_element_request_pad (data.tee, tee_src_pad_template, NULL, NULL);
  g_print ("Obtained request pad %s for app branch.\n", gst_pad_get_name (tee_app_pad));
  queue_app_pad = gst_element_get_static_pad (data.app_queue, "sink");
  if (gst_pad_link (tee_audio_pad, queue_audio_pad) != GST_PAD_LINK_OK ||
      gst_pad_link (tee_video_pad, queue_video_pad) != GST_PAD_LINK_OK ||
      gst_pad_link (tee_app_pad, queue_app_pad) != GST_PAD_LINK_OK) {
    g_printerr ("Tee could not be linked\n");
    gst_object_unref (data.pipeline);
    return -1;
  }
  gst_object_unref (queue_audio_pad);
  gst_object_unref (queue_video_pad);
  gst_object_unref (queue_app_pad);

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
  bus = gst_element_get_bus (data.pipeline);
  gst_bus_add_signal_watch (bus);
  g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
  gst_object_unref (bus);

  /* Start playing the pipeline */
  gst_element_set_state (data.pipeline, GST_STATE_PLAYING);

  /* Create a GLib Main Loop and set it to run */
  data.main_loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (data.main_loop);

  /* Release the request pads from the Tee, and unref them */
  gst_element_release_request_pad (data.tee, tee_audio_pad);
  gst_element_release_request_pad (data.tee, tee_video_pad);
  gst_element_release_request_pad (data.tee, tee_app_pad);
  gst_object_unref (tee_audio_pad);
  gst_object_unref (tee_video_pad);
  gst_object_unref (tee_app_pad);

  /* Free resources */
  gst_element_set_state (data.pipeline, GST_STATE_NULL);
  gst_object_unref (data.pipeline);
  return 0;
}
```

> **Need help?**
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](Installing%2Bon%2BLinux.html#InstallingonLinux-Build), [Mac OS X](Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build) or [Windows](Installing%2Bon%2BWindows.html#InstallingonWindows-Build), or use this specific command on Linux:
>
>     gcc basic-tutorial-8.c -o basic-tutorial-8 `pkg-config --cflags --libs gstreamer-0.10`
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](Installing%2Bon%2BLinux.html#InstallingonLinux-Run), [Mac OS X](Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run) or [Windows](Installing%2Bon%2BWindows.html#InstallingonWindows-Run).
>
> This tutorial plays an audible tone of varying frequency through the audio card and opens a window with a waveform representation of the tone. The waveform should be a sinusoid, but due to the refreshing of the window it might not appear so.
>
> Required libraries: `gstreamer-0.10`

# Walkthrough

The code to create the pipeline (Lines 131 to 205) is an enlarged
version of [Basic tutorial 7: Multithreading and Pad
Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html).
It involves instantiating all the elements, linking the elements that
have Always Pads, and manually linking the Request Pads of the `tee` element.

Regarding the configuration of the `appsrc` and `appsink` elements:
``` first-line: 159; theme: Default; brush: cpp; gutter: true
/* Configure appsrc */
audio_caps_text = g_strdup_printf (AUDIO_CAPS, SAMPLE_RATE);
audio_caps = gst_caps_from_string (audio_caps_text);
g_object_set (data.app_source, "caps", audio_caps, NULL);
g_signal_connect (data.app_source, "need-data", G_CALLBACK (start_feed), &data);
g_signal_connect (data.app_source, "enough-data", G_CALLBACK (stop_feed), &data);
```

The first property that needs to be set on the `appsrc` is `caps`. It
specifies the kind of data that the element is going to produce, so
GStreamer can check whether linking with downstream elements is possible
(that is, whether the downstream elements will understand this kind of data).
This property must be a `GstCaps` object, which is easily built from a
string with `gst_caps_from_string()`.

We then connect to the `need-data` and `enough-data` signals. These are
fired by `appsrc` when its internal queue of data is running low or
almost full, respectively. We will use these signals to start and stop
(respectively) our signal generation process.
``` first-line: 166; theme: Default; brush: cpp; gutter: true
/* Configure appsink */
g_object_set (data.app_sink, "emit-signals", TRUE, "caps", audio_caps, NULL);
g_signal_connect (data.app_sink, "new-buffer", G_CALLBACK (new_buffer), &data);
gst_caps_unref (audio_caps);
g_free (audio_caps_text);
```

Regarding the `appsink` configuration, we connect to the
`new-buffer` signal, which is emitted every time the sink receives a
buffer. Also, the signal emission needs to be enabled through the
`emit-signals` property, because, by default, it is disabled.

Starting the pipeline, waiting for messages and the final cleanup are done
as usual. Let's review the callbacks we have just registered:
``` first-line: 67; theme: Default; brush: cpp; gutter: true
/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
 * to the mainloop to start pushing data into the appsrc */
static void start_feed (GstElement *source, guint size, CustomData *data) {
  if (data->sourceid == 0) {
    g_print ("Start feeding\n");
    data->sourceid = g_idle_add ((GSourceFunc) push_data, data);
  }
}
```

This function is called when the internal queue of `appsrc` is about to
starve (run out of data). The only thing we do here is register a GLib
idle function with `g_idle_add()` that feeds data to `appsrc` until it
is full again. A GLib idle function is a method that GLib will call from
its main loop whenever it is “idle”, that is, when it has no
higher-priority tasks to perform. It requires a GLib `GMainLoop` to be
instantiated and running, obviously.

This is only one of the multiple approaches that `appsrc` allows. In
particular, buffers do not need to be fed into `appsrc` from the main
thread using GLib, and you do not need to use the `need-data` and
`enough-data` signals to synchronize with `appsrc` (although this is
arguably the most convenient approach).

We take note of the sourceid that `g_idle_add()` returns, so we can
disable it later.
``` first-line: 76; theme: Default; brush: cpp; gutter: true
/* This callback triggers when appsrc has enough data and we can stop sending.
 * We remove the idle handler from the mainloop */
static void stop_feed (GstElement *source, CustomData *data) {
  if (data->sourceid != 0) {
    g_print ("Stop feeding\n");
    g_source_remove (data->sourceid);
    data->sourceid = 0;
  }
}
```

This function is called when the internal queue of `appsrc` is full
enough, so we stop pushing data. Here we simply remove the idle function
by using `g_source_remove()` (the idle function is implemented as a
`GSource`).
``` first-line: 22; theme: Default; brush: cpp; gutter: true
/* This method is called by the idle GSource in the mainloop, to feed CHUNK_SIZE bytes into appsrc.
 * The idle handler is added to the mainloop when appsrc requests us to start sending data (need-data signal)
 * and is removed when appsrc has enough data (enough-data signal).
 */
static gboolean push_data (CustomData *data) {
  GstBuffer *buffer;
  GstFlowReturn ret;
  int i;
  gint16 *raw;
  gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
  gfloat freq;

  /* Create a new empty buffer */
  buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);

  /* Set its timestamp and duration */
  GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (data->num_samples, GST_SECOND, SAMPLE_RATE);
  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (num_samples, GST_SECOND, SAMPLE_RATE);

  /* Generate some psychedelic waveforms */
  raw = (gint16 *)GST_BUFFER_DATA (buffer);
```

This is the function that feeds `appsrc`. It will be called by GLib at
times and rates which are out of our control, but we know that we will
disable it when its job is done (when the queue in `appsrc` is full).

Its first task is to create a new buffer of a given size (in this
example, it is arbitrarily set to 1024 bytes) with
`gst_buffer_new_and_alloc()`.

We count the number of samples that we have generated so far with the
`CustomData.num_samples` variable, so we can time-stamp this buffer
using the `GST_BUFFER_TIMESTAMP` macro in `GstBuffer`.

Since we are producing buffers of the same size, their duration is the
same and is set using the `GST_BUFFER_DURATION` macro in `GstBuffer`.

`gst_util_uint64_scale()` is a utility function that scales (multiplies
and divides) numbers which can be large, without fear of overflows.

The bytes of the buffer can be accessed with `GST_BUFFER_DATA` in
`GstBuffer` (be careful not to write past the end of the buffer: you
allocated it, so you know its size).

We will skip over the waveform generation, since it is outside the scope
of this tutorial (it is simply a funny way of generating a pretty
psychedelic wave).
``` first-line: 53; theme: Default; brush: cpp; gutter: true
/* Push the buffer into the appsrc */
g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);

/* Free the buffer now that we are done with it */
gst_buffer_unref (buffer);
```

Once we have the buffer ready, we pass it to `appsrc` with the
`push-buffer` action signal (see the information box at the end of [Playback
tutorial 1: Playbin2
usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html)), and then
`gst_buffer_unref()` it, since we no longer need it.
``` first-line: 86; theme: Default; brush: cpp; gutter: true
/* The appsink has received a buffer */
static void new_buffer (GstElement *sink, CustomData *data) {
  GstBuffer *buffer;

  /* Retrieve the buffer */
  g_signal_emit_by_name (sink, "pull-buffer", &buffer);
  if (buffer) {
    /* The only thing we do in this example is print a * to indicate a received buffer */
    g_print ("*");
    gst_buffer_unref (buffer);
  }
}
```

Finally, this is the function that gets called when the
`appsink` receives a buffer. We use the `pull-buffer` action signal to
retrieve the buffer and then just print some indicator on the screen. We
can retrieve the data pointer using the `GST_BUFFER_DATA` macro and the
data size using the `GST_BUFFER_SIZE` macro in `GstBuffer`. Remember
that this buffer does not have to match the buffer that we produced in
the `push_data` function; any element in the path could have altered the
buffers in any way (not in this example: there is only a `tee` in the
path between `appsrc` and `appsink`, and it does not change the content
of the buffers).

We then `gst_buffer_unref()` the buffer, and this tutorial is done.
# Conclusion

This tutorial has shown how applications can:

- Inject data into a pipeline using the `appsrc` element.
- Retrieve data from a pipeline using the `appsink` element.
- Manipulate this data by accessing the `GstBuffer`.

In a playbin2-based pipeline, the same goals are achieved in a slightly
different way. [Playback tutorial 3: Short-cutting the
pipeline](Playback%2Btutorial%2B3%253A%2BShort-cutting%2Bthe%2Bpipeline.html) shows
how to do it.

It has been a pleasure having you here, and see you soon!
## Attachments:

![](images/icons/bullet_blue.gif)
[basic-tutorial-8.png](attachments/1442189/1540159.png) (image/png)
![](images/icons/bullet_blue.gif)
[basic-tutorial-8.png](attachments/1442189/1540160.png) (image/png)
![](images/icons/bullet_blue.gif)
[basic-tutorial-8.png](attachments/1442189/1540158.png) (image/png)

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Basic tutorial 9: Media information gathering

This page last changed on May 30, 2012 by xartigas.

# Goal

Sometimes you might want to quickly find out what kind of media a file
(or URI) contains, or whether you will be able to play the media at all.
You can build a pipeline, set it to run, and watch the bus messages, but
GStreamer has a utility that does just that for you. This tutorial
shows:

- How to recover information regarding a URI

- How to find out if a URI is playable
# Introduction

`GstDiscoverer` is a utility object found in the `pbutils` library
(Plug-in Base utilities) that accepts a URI or list of URIs, and returns
information about them. It can work in synchronous or asynchronous
modes.

In synchronous mode, there is only a single function to call,
`gst_discoverer_discover_uri()`, which blocks until the information is
ready. Due to this blocking, it is usually less interesting for
GUI-based applications, so the asynchronous mode is used instead, as
described in this tutorial.
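
For reference, the synchronous mode can be sketched in just a few lines. This is a minimal sketch, assuming the `gstreamer-pbutils-0.10` library is available; the URI and the 5-second timeout are arbitrary choices for illustration:

``` theme: Default; brush: cpp; gutter: false
#include <gst/gst.h>
#include <gst/pbutils/pbutils.h>

int main (int argc, char *argv[]) {
  GstDiscoverer *discoverer;
  GstDiscovererInfo *info;
  GError *err = NULL;
  const gchar *uri = "http://docs.gstreamer.com/media/sintel_trailer-480p.webm";

  gst_init (&argc, &argv);

  /* The first argument is the timeout, in nanoseconds */
  discoverer = gst_discoverer_new (5 * GST_SECOND, &err);
  if (!discoverer) {
    g_printerr ("Error creating discoverer: %s\n", err->message);
    g_clear_error (&err);
    return -1;
  }

  /* Blocks until the information is ready (or the timeout expires) */
  info = gst_discoverer_discover_uri (discoverer, uri, &err);
  if (info) {
    g_print ("Duration: %" GST_TIME_FORMAT "\n",
        GST_TIME_ARGS (gst_discoverer_info_get_duration (info)));
    gst_discoverer_info_unref (info);
  } else {
    g_printerr ("Discovery failed: %s\n", err ? err->message : "unknown");
    g_clear_error (&err);
  }

  g_object_unref (discoverer);
  return 0;
}
```

The rest of this tutorial uses the asynchronous mode, which avoids blocking the application while the URI is examined.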

The retrieved information includes codec descriptions, stream topology
(number of streams and sub-streams) and available metadata (like the
audio language).

As an example, this is the result of discovering
http://docs.gstreamer.com/media/sintel_trailer-480p.webm:

    Duration: 0:00:52.250000000
    Tags:
      video codec: On2 VP8
      language code: en
      container format: Matroska
      application name: ffmpeg2theora-0.24
      encoder: Xiph.Org libVorbis I 20090709
      encoder version: 0
      audio codec: Vorbis
      nominal bitrate: 80000
      bitrate: 80000
    Seekable: yes
    Stream information:
      container: WebM
        audio: Vorbis
          Tags:
            language code: en
            container format: Matroska
            audio codec: Vorbis
            application name: ffmpeg2theora-0.24
            encoder: Xiph.Org libVorbis I 20090709
            encoder version: 0
            nominal bitrate: 80000
            bitrate: 80000
        video: VP8
          Tags:
            video codec: VP8 video
            container format: Matroska
The following code tries to discover the URI provided through the
command line, and outputs the retrieved information (if no URI is
provided, it uses a default one).

This is a simplified version of what the `gst-discoverer` tool does
([Basic tutorial 10: GStreamer
tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html)), which is
an application that only displays data, but does not perform any
playback.

# The GStreamer Discoverer

Copy this code into a text file named `basic-tutorial-9.c` (or find it
in the SDK installation).

**basic-tutorial-9.c**
``` theme: Default; brush: cpp; gutter: true
#include <string.h>
#include <gst/gst.h>
#include <gst/pbutils/pbutils.h>

/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
  GstDiscoverer *discoverer;
  GMainLoop *loop;
} CustomData;

/* Print a tag in a human-readable format (name: value) */
static void print_tag_foreach (const GstTagList *tags, const gchar *tag, gpointer user_data) {
  GValue val = { 0, };
  gchar *str;
  gint depth = GPOINTER_TO_INT (user_data);

  gst_tag_list_copy_value (&val, tags, tag);

  if (G_VALUE_HOLDS_STRING (&val))
    str = g_value_dup_string (&val);
  else
    str = gst_value_serialize (&val);

  g_print ("%*s%s: %s\n", 2 * depth, " ", gst_tag_get_nick (tag), str);
  g_free (str);

  g_value_unset (&val);
}

/* Print information regarding a stream */
static void print_stream_info (GstDiscovererStreamInfo *info, gint depth) {
  gchar *desc = NULL;
  GstCaps *caps;
  const GstTagList *tags;

  caps = gst_discoverer_stream_info_get_caps (info);

  if (caps) {
    if (gst_caps_is_fixed (caps))
      desc = gst_pb_utils_get_codec_description (caps);
    else
      desc = gst_caps_to_string (caps);
    gst_caps_unref (caps);
  }

  g_print ("%*s%s: %s\n", 2 * depth, " ", gst_discoverer_stream_info_get_stream_type_nick (info), (desc ? desc : ""));

  if (desc) {
    g_free (desc);
    desc = NULL;
  }

  tags = gst_discoverer_stream_info_get_tags (info);
  if (tags) {
    g_print ("%*sTags:\n", 2 * (depth + 1), " ");
    gst_tag_list_foreach (tags, print_tag_foreach, GINT_TO_POINTER (depth + 2));
  }
}

/* Print information regarding a stream and its substreams, if any */
static void print_topology (GstDiscovererStreamInfo *info, gint depth) {
  GstDiscovererStreamInfo *next;

  if (!info)
    return;

  print_stream_info (info, depth);

  next = gst_discoverer_stream_info_get_next (info);
  if (next) {
    print_topology (next, depth + 1);
    gst_discoverer_stream_info_unref (next);
  } else if (GST_IS_DISCOVERER_CONTAINER_INFO (info)) {
    GList *tmp, *streams;

    streams = gst_discoverer_container_info_get_streams (GST_DISCOVERER_CONTAINER_INFO (info));
    for (tmp = streams; tmp; tmp = tmp->next) {
      GstDiscovererStreamInfo *tmpinf = (GstDiscovererStreamInfo *) tmp->data;
      print_topology (tmpinf, depth + 1);
    }
    gst_discoverer_stream_info_list_free (streams);
  }
}

/* This function is called every time the discoverer has information regarding
 * one of the URIs we provided. */
static void on_discovered_cb (GstDiscoverer *discoverer, GstDiscovererInfo *info, GError *err, CustomData *data) {
  GstDiscovererResult result;
  const gchar *uri;
  const GstTagList *tags;
  GstDiscovererStreamInfo *sinfo;

  uri = gst_discoverer_info_get_uri (info);
  result = gst_discoverer_info_get_result (info);
  switch (result) {
    case GST_DISCOVERER_URI_INVALID:
      g_print ("Invalid URI '%s'\n", uri);
      break;
    case GST_DISCOVERER_ERROR:
      g_print ("Discoverer error: %s\n", err->message);
      break;
    case GST_DISCOVERER_TIMEOUT:
      g_print ("Timeout\n");
      break;
    case GST_DISCOVERER_BUSY:
      g_print ("Busy\n");
      break;
    case GST_DISCOVERER_MISSING_PLUGINS:{
      const GstStructure *s;
      gchar *str;

      s = gst_discoverer_info_get_misc (info);
      str = gst_structure_to_string (s);

      g_print ("Missing plugins: %s\n", str);
      g_free (str);
      break;
    }
    case GST_DISCOVERER_OK:
      g_print ("Discovered '%s'\n", uri);
      break;
  }

  if (result != GST_DISCOVERER_OK) {
    g_printerr ("This URI cannot be played\n");
    return;
  }

  /* If we got no error, show the retrieved information */

  g_print ("\nDuration: %" GST_TIME_FORMAT "\n", GST_TIME_ARGS (gst_discoverer_info_get_duration (info)));

  tags = gst_discoverer_info_get_tags (info);
  if (tags) {
    g_print ("Tags:\n");
    gst_tag_list_foreach (tags, print_tag_foreach, GINT_TO_POINTER (1));
  }

  g_print ("Seekable: %s\n", (gst_discoverer_info_get_seekable (info) ? "yes" : "no"));

  g_print ("\n");

  sinfo = gst_discoverer_info_get_stream_info (info);
  if (!sinfo)
    return;

  g_print ("Stream information:\n");

  print_topology (sinfo, 1);

  gst_discoverer_stream_info_unref (sinfo);

  g_print ("\n");
}

/* This function is called when the discoverer has finished examining
|
||||
* all the URIs we provided.*/
|
||||
static void on_finished_cb (GstDiscoverer *discoverer, CustomData *data) {
|
||||
g_print ("Finished discovering\n");
|
||||
|
||||
g_main_loop_quit (data->loop);
|
||||
}
|
||||
|
||||
int main (int argc, char **argv) {
|
||||
CustomData data;
|
||||
GError *err = NULL;
|
||||
gchar *uri = "http://docs.gstreamer.com/media/sintel_trailer-480p.webm";
|
||||
|
||||
/* if a URI was provided, use it instead of the default one */
|
||||
if (argc > 1) {
|
||||
uri = argv[1];
|
||||
}
|
||||
|
||||
/* Initialize cumstom data structure */
|
||||
memset (&data, 0, sizeof (data));
|
||||
|
||||
/* Initialize GStreamer */
|
||||
gst_init (&argc, &argv);
|
||||
|
||||
g_print ("Discovering '%s'\n", uri);
|
||||
|
||||
/* Instantiate the Discoverer */
|
||||
data.discoverer = gst_discoverer_new (5 * GST_SECOND, &err);
|
||||
if (!data.discoverer) {
|
||||
g_print ("Error creating discoverer instance: %s\n", err->message);
|
||||
g_clear_error (&err);
|
||||
return -1;
|
||||
}
|
||||
|
||||
/* Connect to the interesting signals */
|
||||
g_signal_connect (data.discoverer, "discovered", G_CALLBACK (on_discovered_cb), &data);
|
||||
g_signal_connect (data.discoverer, "finished", G_CALLBACK (on_finished_cb), &data);
|
||||
|
||||
/* Start the discoverer process (nothing to do yet) */
|
||||
gst_discoverer_start (data.discoverer);
|
||||
|
||||
/* Add a request to process asynchronously the URI passed through the command line */
|
||||
if (!gst_discoverer_discover_uri_async (data.discoverer, uri)) {
|
||||
g_print ("Failed to start discovering URI '%s'\n", uri);
|
||||
g_object_unref (data.discoverer);
|
||||
return -1;
|
||||
}
|
||||
|
||||
/* Create a GLib Main Loop and set it to run, so we can wait for the signals */
|
||||
data.loop = g_main_loop_new (NULL, FALSE);
|
||||
g_main_loop_run (data.loop);
|
||||
|
||||
/* Stop the discoverer process */
|
||||
gst_discoverer_stop (data.discoverer);
|
||||
|
||||
/* Free resources */
|
||||
g_object_unref (data.discoverer);
|
||||
g_main_loop_unref (data.loop);
|
||||
|
||||
return 0;
|
||||
}
|
||||
```

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<p><code>gcc basic-tutorial-9.c -o basic-tutorial-9 `pkg-config --cflags --libs gstreamer-pbutils-0.10 gstreamer-0.10`</code></p>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a>.</p>
<p>This tutorial opens the URI passed as the first command-line parameter (or a default URI if none is provided) and prints information about it on the screen. If the media is located on the Internet, the application might take a while to react, depending on your connection speed.</p>
<p>Required libraries: <code>gstreamer-pbutils-0.10 gstreamer-0.10</code></p></td>
</tr>
</tbody>
</table>

# Walkthrough

These are the main steps to use the `GstDiscoverer`:

``` first-line: 182; theme: Default; brush: cpp; gutter: true
  /* Instantiate the Discoverer */
  data.discoverer = gst_discoverer_new (5 * GST_SECOND, &err);
  if (!data.discoverer) {
    g_print ("Error creating discoverer instance: %s\n", err->message);
    g_clear_error (&err);
    return -1;
  }
```

`gst_discoverer_new()` creates a new Discoverer object. The first
parameter is the timeout per file, in nanoseconds (use the
`GST_SECOND` macro for simplicity).

``` first-line: 190; theme: Default; brush: cpp; gutter: true
  /* Connect to the interesting signals */
  g_signal_connect (data.discoverer, "discovered", G_CALLBACK (on_discovered_cb), &data);
  g_signal_connect (data.discoverer, "finished", G_CALLBACK (on_finished_cb), &data);
```

Connect to the interesting signals, as usual. We will discuss them
below, when we review their callbacks.

``` first-line: 194; theme: Default; brush: cpp; gutter: true
  /* Start the discoverer process (nothing to do yet) */
  gst_discoverer_start (data.discoverer);
```

`gst_discoverer_start()` launches the discovering process, but we have
not provided any URI to discover yet. This is done next:

``` first-line: 197; theme: Default; brush: cpp; gutter: true
  /* Add a request to process asynchronously the URI passed through the command line */
  if (!gst_discoverer_discover_uri_async (data.discoverer, uri)) {
    g_print ("Failed to start discovering URI '%s'\n", uri);
    g_object_unref (data.discoverer);
    return -1;
  }
```

`gst_discoverer_discover_uri_async()` enqueues the provided URI for
discovery. Multiple URIs can be enqueued with this function. As the
discovery process for each of them finishes, the registered callback
functions will be invoked.

``` first-line: 204; theme: Default; brush: cpp; gutter: true
  /* Create a GLib Main Loop and set it to run, so we can wait for the signals */
  data.loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (data.loop);
```

The usual GLib main loop is instantiated and executed. We will get out
of it when `g_main_loop_quit()` is called from the
`on_finished_cb` callback.

``` first-line: 208; theme: Default; brush: cpp; gutter: true
  /* Stop the discoverer process */
  gst_discoverer_stop (data.discoverer);
```

Once we are done with the discoverer, we stop it with
`gst_discoverer_stop()` and unref it with `g_object_unref()`.

Let's now review the callbacks we have registered:

``` first-line: 85; theme: Default; brush: cpp; gutter: true
/* This function is called every time the discoverer has information regarding
 * one of the URIs we provided. */
static void on_discovered_cb (GstDiscoverer *discoverer, GstDiscovererInfo *info, GError *err, CustomData *data) {
  GstDiscovererResult result;
  const gchar *uri;
  const GstTagList *tags;
  GstDiscovererStreamInfo *sinfo;

  uri = gst_discoverer_info_get_uri (info);
  result = gst_discoverer_info_get_result (info);
```

We got here because the Discoverer has finished working on one URI, and
provides us a `GstDiscovererInfo` structure with all the information.

The first step is to retrieve the particular URI this call refers to (in
case we had multiple discovery processes running, which is not the case
in this example) with `gst_discoverer_info_get_uri()` and the discovery
result with `gst_discoverer_info_get_result()`.

``` first-line: 95; theme: Default; brush: cpp; gutter: true
  switch (result) {
    case GST_DISCOVERER_URI_INVALID:
      g_print ("Invalid URI '%s'\n", uri);
      break;
    case GST_DISCOVERER_ERROR:
      g_print ("Discoverer error: %s\n", err->message);
      break;
    case GST_DISCOVERER_TIMEOUT:
      g_print ("Timeout\n");
      break;
    case GST_DISCOVERER_BUSY:
      g_print ("Busy\n");
      break;
    case GST_DISCOVERER_MISSING_PLUGINS: {
      const GstStructure *s;
      gchar *str;

      s = gst_discoverer_info_get_misc (info);
      str = gst_structure_to_string (s);

      g_print ("Missing plugins: %s\n", str);
      g_free (str);
      break;
    }
    case GST_DISCOVERER_OK:
      g_print ("Discovered '%s'\n", uri);
      break;
  }

  if (result != GST_DISCOVERER_OK) {
    g_printerr ("This URI cannot be played\n");
    return;
  }
```

As the code shows, any result other than `GST_DISCOVERER_OK` means that
there has been some kind of problem, and this URI cannot be played. The
reasons can vary, but the enum values are quite explicit
(`GST_DISCOVERER_BUSY` can only happen in synchronous mode, which
is not used in this example).

If no error happened, information can be retrieved from the
`GstDiscovererInfo` structure with the different
`gst_discoverer_info_get_*` methods (such as
`gst_discoverer_info_get_duration()`, for example).

Bits of information which are made of lists, like tags and stream info,
need some extra parsing:

``` first-line: 133; theme: Default; brush: cpp; gutter: true
  tags = gst_discoverer_info_get_tags (info);
  if (tags) {
    g_print ("Tags:\n");
    gst_tag_list_foreach (tags, print_tag_foreach, GINT_TO_POINTER (1));
  }
```

Tags are metadata (labels) attached to the media. They can be examined
with `gst_tag_list_foreach()`, which will call `print_tag_foreach` for
each tag found (the list could also be traversed manually, for example,
or a specific tag could be searched for with
`gst_tag_list_get_string()`). The code for `print_tag_foreach` is pretty
much self-explanatory.

``` first-line: 143; theme: Default; brush: cpp; gutter: false
  sinfo = gst_discoverer_info_get_stream_info (info);
  if (!sinfo)
    return;

  g_print ("Stream information:\n");

  print_topology (sinfo, 1);

  gst_discoverer_stream_info_unref (sinfo);
```

`gst_discoverer_info_get_stream_info()` returns
a `GstDiscovererStreamInfo` structure that is parsed in
the `print_topology` function, and then discarded
with `gst_discoverer_stream_info_unref()`.

``` first-line: 60; theme: Default; brush: cpp; gutter: true
/* Print information regarding a stream and its substreams, if any */
static void print_topology (GstDiscovererStreamInfo *info, gint depth) {
  GstDiscovererStreamInfo *next;

  if (!info)
    return;

  print_stream_info (info, depth);

  next = gst_discoverer_stream_info_get_next (info);
  if (next) {
    print_topology (next, depth + 1);
    gst_discoverer_stream_info_unref (next);
  } else if (GST_IS_DISCOVERER_CONTAINER_INFO (info)) {
    GList *tmp, *streams;

    streams = gst_discoverer_container_info_get_streams (GST_DISCOVERER_CONTAINER_INFO (info));
    for (tmp = streams; tmp; tmp = tmp->next) {
      GstDiscovererStreamInfo *tmpinf = (GstDiscovererStreamInfo *) tmp->data;
      print_topology (tmpinf, depth + 1);
    }
    gst_discoverer_stream_info_list_free (streams);
  }
}
```

The `print_stream_info` function's code is also pretty much
self-explanatory: it prints the stream's codec description (derived
from its capabilities) and then the associated tags, using
`print_tag_foreach` too.

Then, `print_topology` looks for the next element to display. If
`gst_discoverer_stream_info_get_next()` returns a non-NULL stream info,
it refers to our descendant and that should be displayed. Otherwise, if
we are a container, recursively call `print_topology` on each of our
children obtained with `gst_discoverer_container_info_get_streams()`.
Otherwise, we are a final stream, and do not need to recurse (this part
of the Discoverer API is admittedly a bit obscure).

# Conclusion

This tutorial has shown:

- How to recover information regarding a URI using the `GstDiscoverer`.

- How to find out if a URI is playable by looking at the return code
  obtained with `gst_discoverer_info_get_result()`.

It has been a pleasure having you here, and see you soon!

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Basic tutorials

This page last changed on Mar 28, 2012 by xartigas.

# Welcome to the GStreamer SDK Basic tutorials

These tutorials describe general topics required to understand the rest
of the tutorials in the GStreamer SDK.
# GStreamer SDK documentation : Building from source using Cerbero

This page last changed on Jul 15, 2013 by ylatuya.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>This section is intended for advanced users.</p></td>
</tr>
</tbody>
</table>

#### Build requirements

The GStreamer SDK build system provides bootstrapping facilities for all
platforms, but it still needs a minimum base to bootstrap:

- python \>= 2.6 and python's `argparse` module, which is already
  included in python2.7.
- git
<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><strong>Windows users</strong><br />
<p>Cerbero can be used on Windows using the Msys/MinGW shell (a Unix-like shell for Windows). There is a bit of setup that you need to do before Cerbero can take control.</p>
<p>You need to install the following programs:</p>
<ul>
<li><a href="http://www.python.org/getit/releases/2.7/" class="external-link">Python 2.7</a></li>
<li><a href="http://code.google.com/p/msysgit/downloads/list?q=full+installer+official+git" class="external-link">Git</a> (Select the install option "Checkout as-is, Commit as-is" and install it in a path without spaces, eg: c:\Git)</li>
<li><a href="https://sourceforge.net/projects/mingw/files/Installer/mingw-get-inst/" class="external-link">Msys/MinGW</a> (Install it with all the options enabled)</li>
<li><a href="http://www.cmake.org/cmake/resources/software.html" class="external-link">CMake</a> (Select the option "Add CMake in system path for the current user")</li>
<li><p><a href="http://yasm.tortall.net/Download.html" class="external-link">Yasm</a> (Download the win32 or win64 version for your platform, name it <code>yasm.exe</code>, and place it in your MinGW <code>bin</code> directory, typically <code>C:\MinGW\bin</code>)</p></li>
<li><a href="http://wix.codeplex.com/releases/view/60102" class="external-link">WiX 3.5</a></li>
<li><a href="http://www.microsoft.com/en-us/download/details.aspx?id=8279" class="external-link">Microsoft SDK 7.1</a> (Install the SDK samples and the Visual C++ Compilers, required to build the DirectShow base classes. You might need to install the .NET 4 Framework first if the SDK installer doesn't find it)</li>
<li><a href="http://msdn.microsoft.com/en-us/windows/hardware/hh852365" class="external-link">Windows Driver Kit 7.1.0</a></li>
</ul>
<p>Your user ID can't have spaces (eg: John Smith). Paths with spaces are not correctly handled in the build system and msys uses the user ID for the home folder.</p>
<p>Cerbero must be run in the MinGW shell, which is accessible from the main menu once MinGW is installed.</p>
<p>The last step is making <code>python</code> and <code>git</code> available from the shell, for which you will need to create a <code>.profile</code> file. Issue this command from within the MinGW shell:</p>
<div class="code panel" style="border-width: 1px;">
<div class="codeContent panelContent">
<pre class="theme: Default; brush: plain; gutter: false" style="font-size:12px;"><code>echo "export PATH=\"\$PATH:/c/Python27:/c/Git/bin\"" >> ~/.profile</code></pre>
</div>
</div>
<p>Use the appropriate paths to where you installed <code>python</code> and <code>git</code>.</p>
<p>(Note that inside the shell, / is mapped to c:\Mingw\msys\1.0\ )</p></td>
</tr>
</tbody>
</table>

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><strong>OS X users</strong><br />
<p>To use Cerbero on OS X you need to install the "Command Line Tools" from Xcode. They are available from the "Preferences" dialog under "Downloads".</p></td>
</tr>
</tbody>
</table>

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><strong>iOS developers</strong><br />
<p>If you want to build the GStreamer SDK for iOS, you also need the iOS SDK. The minimum required iOS SDK version is 6.0 and is included in <a href="https://developer.apple.com/devcenter/ios/index.action#downloads" class="external-link">Xcode</a> since version 4.</p></td>
</tr>
</tbody>
</table>

#### Download the sources

To build the GStreamer SDK, you first need to download **Cerbero**.
Cerbero is a multi-platform build system for Open Source projects that
builds and creates native packages for different platforms,
architectures and distributions.

Get a copy of Cerbero by cloning the git repository:

``` theme: Default; brush: plain; gutter: false
git clone git://anongit.freedesktop.org/gstreamer-sdk/cerbero
```

Cerbero can be run uninstalled and, for convenience, you can create an
alias in your `.bashrc` file. If you prefer to skip this step,
remember that you need to replace the calls to `cerbero` with
`./cerbero-uninstalled` in the next steps.

``` theme: Default; brush: plain; gutter: false
echo "alias cerbero='~/git/cerbero/cerbero-uninstalled'" >> ~/.bashrc
```
#### Setup environment

After Cerbero and the base requirements are in place, you need to set up
the build environment.

Cerbero reads the configuration file `$HOME/.cerbero/cerbero.cbc` to
determine the build options. This file is Python code which allows
overriding/defining some options.

If the file does not exist, Cerbero will try to determine the distro you
are running and will use default build options such as the default build
directory. The default options should work fine on the supported
distributions.

An example configuration file with detailed comments can be found
[here](http://www.freedesktop.org/software/gstreamer-sdk/cerbero.cbc.template).

To fire up the bootstrapping process, go to the directory where you
cloned/unpacked Cerbero and type:

``` theme: Default; brush: plain; gutter: false
cerbero bootstrap
```

Enter the superuser/root password when prompted.

The bootstrap process will then install all packages required to build
the GStreamer SDK.
#### Build the SDK

To generate the SDK, use the following command:

``` theme: Default; brush: plain; gutter: false
cerbero package gstreamer-sdk
```

This should build all required SDK components and create packages for
your distribution at the Cerbero source directory.

A list of supported packages to build can be retrieved using:

``` theme: Default; brush: plain; gutter: false
cerbero list-packages
```

Packages are composed of 0 (in the case of a meta package) or more
components that can be built separately if desired. The components are
defined as individual recipes and can be listed with:

``` theme: Default; brush: plain; gutter: false
cerbero list
```

To build an individual recipe and its dependencies, do the following:

``` theme: Default; brush: plain; gutter: false
cerbero build <recipe_name>
```

To build or force a rebuild of a recipe without building its
dependencies, use:

``` theme: Default; brush: plain; gutter: false
cerbero buildone <recipe_name>
```

To wipe everything and start from scratch:

``` theme: Default; brush: plain; gutter: false
cerbero wipe
```

Once built, the output of the recipes will be installed at the prefix
defined in the Cerbero configuration file `$HOME/.cerbero/cerbero.cbc`,
or at `$HOME/cerbero/dist` if no prefix is defined.
#### Build a single project with the SDK

Rebuilding the whole SDK is relatively fast on Linux and OS X, but it
can be very slow on Windows, so if you only need to rebuild a single
project (eg: gst-plugins-good to patch qtdemux) there is a much faster
way of doing it. You will need to follow the steps detailed in this
page, but skipping the step "**Build the SDK**", and installing the
SDK's development files as explained in [Installing the
SDK](Installing%2Bthe%2BSDK.html).

By default, Cerbero uses as prefix a folder in the user directory with
the following schema ~/cerbero/dist/$platform\_$arch, but for the SDK we
must change this prefix to use its installation directory. This can be
done with a custom configuration file named *custom.cbc*:

``` theme: Default; brush: plain; gutter: false
# For Windows x86
prefix='/c/gstreamer-sdk/0.10/x86/'

# For Windows x86_64
#prefix='/c/gstreamer-sdk/0.10/x86_64'

# For Linux
#prefix='/opt/gstreamer-sdk'

# For OS X
#prefix='/Library/Frameworks/GStreamer.framework/Versions/0.10'
```

The prefix path might not be writable by your current user. Make sure
you fix it first, for instance with:

``` theme: Default; brush: plain; gutter: false
$ sudo chown -R <username> /Library/Frameworks/GStreamer.framework/
```

Cerbero has a shell command that starts a new shell with all the
environment set up to target the SDK. You can start a new shell using
the installation prefix defined in *custom.cbc* with the following
command:

``` theme: Default; brush: plain; gutter: false
$ cerbero -c custom.cbc shell
```

Once you are in Cerbero's shell you can compile new
projects targeting the SDK using the regular build
process:

``` theme: Default; brush: plain; gutter: false
$ git clone -b sdk-0.10.31 git://anongit.freedesktop.org/gstreamer-sdk/gst-plugins-good; cd gst-plugins-good
$ sh autogen.sh --disable-gtk-doc --prefix=<prefix>
$ make -C gst/isomp4
```
#### Cross-compilation of the SDK

Cerbero can be used to cross-compile the SDK to other platforms like
Android or Windows. You only need to use a configuration file that sets
the target platform, but we also provide a set of pre-defined
configuration files for the supported platforms (you will find them in
the `config` folder with the `.cbc` extension).

##### Android

You can cross-compile the SDK for Android from a Linux host using the
configuration file `config/cross-android.cbc`. Replace all the previous
commands with:

``` theme: Default; brush: plain; gutter: false
cerbero -c config/cross-android.cbc <command>
```

##### Windows

The SDK can also be cross-compiled to Windows from Linux, but you should
only use it for testing purposes. The DirectShow plugins cannot be
cross-compiled yet and WiX can't be used with Wine yet, so packages can
only be created from Windows.

Replace all the above commands for Windows 32 bits with:

``` theme: Default; brush: plain; gutter: false
cerbero -c config/cross-win32.cbc <command>
```

Or use the following for Windows 64 bits:

``` theme: Default; brush: plain; gutter: false
cerbero -c config/cross-win64.cbc <command>
```

##### iOS

To cross-compile for iOS from OS X, use the configuration file
`config/cross-ios-universal.cbc`. Replace all previous commands with:

``` theme: Default; brush: plain; gutter: false
cerbero -c config/cross-ios-universal.cbc <command>
```

# GStreamer SDK documentation : Contact

This page last changed on Dec 03, 2012 by xartigas.

<table>
<colgroup>
<col width="50%" />
<col width="50%" />
</colgroup>
<tbody>
<tr class="odd">
<td><h3 id="Contact-MainSite">Main Site</h3>
<p>For general information</p></td>
<td><h3 id="Contact-wwwgstreamercom"><a href="http://www.gstreamer.com/" class="external-link">www.gstreamer.com</a></h3></td>
</tr>
<tr class="even">
<td><h3 id="Contact-DocumentationSite">Documentation Site</h3>
<p>Technical information, tutorials and reference guide</p></td>
<td><h3 id="Contact-docsgstreamercom"><a href="http://docs.gstreamer.com/" class="external-link">docs.gstreamer.com</a></h3></td>
</tr>
<tr class="odd">
<td><h3 id="Contact-Commercialsupport">Commercial support</h3>
<p>For commercial enquiries</p></td>
<td><h3 id="Contact-wwwgstreamercomsupporthtml"><a href="http://www.gstreamer.com/contact.html" class="external-link">www.gstreamer.com/support.html</a></h3></td>
</tr>
<tr class="even">
<td><h3 id="Contact-Bugtracker">Bug tracker</h3>
<p>File a bug, make a suggestion or simply leave a comment.<br />
We want to hear from you!</p></td>
<td><h3 id="Contact-TheGStreamerSDKBugzilla"><a href="https://bugs.freedesktop.org/enter_bug.cgi?product=GStreamer%20SDK" class="external-link">The GStreamer SDK Bugzilla</a></h3></td>
</tr>
</tbody>
</table>

# GStreamer SDK documentation : Deploying your application

This page last changed on Jun 12, 2013 by xartigas.

Once the development of your application is finished, you will need to
deploy it to the target machine, usually in the form of a package or
installer. You have several options here and, even though this subject
is not really in the scope of this documentation, we will give you some
hints to try to help you.

# Multiplatform vs. single-platform packaging system

The first choice you need to make is whether you want to deploy your
application to more than one platform. If yes, you then have the choice
of using a different packaging system for each platform, or one that
can deliver to all platforms simultaneously. This table summarizes the
pros and cons of each option.

<table>
<colgroup>
<col width="33%" />
<col width="33%" />
<col width="33%" />
</colgroup>
<thead>
<tr class="header">
<th> </th>
<th>Pros</th>
<th>Cons</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><p><strong>Multiplatform packaging system</strong></p>
<p>The same system is used to package your application for all platforms.</p></td>
<td><ul>
<li><p>You only need to develop your packaging system once, and it works for all supported platforms.</p></li>
</ul></td>
<td><ul>
<li>On some platforms, the packaging system might impose artificial restrictions inherited from the other platforms.</li>
</ul></td>
</tr>
<tr class="even">
<td><p><strong>Single-platform packaging system</strong></p>
<p>Your application is packaged using a different system on each platform.</p></td>
<td><ul>
<li><p>You can make use of all the advantages each packaging system can offer.</p></li>
</ul></td>
<td><ul>
<li><p>You need to develop a new packaging system for each supported platform.</p></li>
</ul></td>
</tr>
</tbody>
</table>

The GStreamer SDK itself supports three different platforms (Linux, Mac OS X and Windows) and has been built using a multiplatform packaging system named **Cerbero**, which is available for you to use, should you choose to go down this route.

# Shared vs. private GStreamer deployment

You can install the GStreamer SDK on the target machine in the same way you installed it on your development machine, you can deploy it privately, or you can even customize it before deploying. The main options are:

<table>
<colgroup>
<col width="33%" />
<col width="33%" />
<col width="33%" />
</colgroup>
<thead>
<tr class="header">
<th> </th>
<th>Pros</th>
<th>Cons</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><p><strong>Shared SDK</strong></p>
<p>The GStreamer SDK is installed independently of your application, as a prerequisite, in a common place on the target computer (<code>C:\Program Files</code>, for example). Your application uses an environment variable to locate it.</p></td>
<td><ul>
<li><p>If more than one application on the target computer uses the SDK, it is installed only once and shared, reducing disk usage.</p></li>
</ul></td>
<td><ul>
<li>Tampering or corruption of the shared SDK installation can make your application fail.</li>
<li><p>The SDK libraries are unprotected and open to tampering.</p></li>
</ul></td>
</tr>
<tr class="even">
<td><p><strong>Private SDK with dynamic linking</strong></p>
<p>Your application deploys the GStreamer SDK to a private folder.</p></td>
<td><ul>
<li><p>Your SDK is independent of other applications, so it does not get corrupted if other applications mess with their installations.</p></li>
</ul></td>
<td><ul>
<li><p>If multiple applications on the target computer use the GStreamer SDK, it won’t be shared, consuming more disk space.</p></li>
<li><p>The SDK libraries are unprotected and open to tampering.</p></li>
</ul></td>
</tr>
<tr class="odd">
<td><p><strong>Private SDK with static linking</strong></p>
<p>Your application links statically against the GStreamer SDK, so it effectively becomes part of your application binary.</p></td>
<td><ul>
<li>Your SDK is independent of other applications, so it does not get corrupted if other applications mess with their installations.</li>
<li>It is much harder to tamper with the SDK, since it is embedded in your application.</li>
</ul></td>
<td><ul>
<li>If multiple applications on the target computer use the GStreamer SDK, it won’t be shared, consuming more disk space.</li>
</ul></td>
</tr>
</tbody>
</table>
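
With the “Shared SDK” option, the application has to find the SDK at run time through an environment variable. The snippet below is a minimal sketch of such a check; the variable name `GSTREAMER_SDK_ROOT` and the fallback path are illustrative assumptions, not names defined by the SDK itself.

```shell
# Sketch of how a "Shared SDK" deployment can locate the SDK at run time.
# GSTREAMER_SDK_ROOT is an illustrative variable name; your installer would
# define it to point at the common install location.
sdk_root="${GSTREAMER_SDK_ROOT:-/opt/gstreamer-sdk}"   # fall back to a default path
if [ -d "$sdk_root" ]; then
    echo "Using shared GStreamer SDK at $sdk_root"
else
    echo "GStreamer SDK not found at $sdk_root; please install it first" >&2
fi
```

A real launcher script would typically refuse to start the application, or offer to install the SDK, when this check fails.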

The following pages give further directions for some of the above options.

- Platform-specific packaging methods:
  - For [Mac OS X](Mac%2BOS%2BX%2Bdeployment.html)
  - For [Windows](Windows%2Bdeployment.html)
- [Multiplatform deployment using Cerbero](Multiplatform%2Bdeployment%2Busing%2BCerbero.html)

Document generated by Confluence on Oct 08, 2015 10:27

# GStreamer SDK documentation : Frequently Asked Questions

This page last changed on Jun 12, 2013 by xartigas.

<table>
<colgroup>
<col width="50%" />
<col width="50%" />
</colgroup>
<tbody>
<tr class="odd">
<td><p>This is a list of frequently asked questions. Use the menu on the right to quickly locate your enquiry.</p>
<h1 id="FrequentlyAskedQuestions-WhatisGStreamer" class="western">What is GStreamer?</h1>
<p>As stated in the <a href="http://gstreamer.freedesktop.org/" class="external-link">GStreamer home page</a>: GStreamer is a library for constructing graphs of media-handling components. The applications it supports range from simple Ogg/Vorbis playback and audio/video streaming to complex audio (mixing) and video (non-linear editing) processing.</p>
<p>Applications can take advantage of advances in codec and filter technology transparently. Developers can add new codecs and filters by writing a simple plugin with a clean, generic interface.</p>
<p>GStreamer is released under the LGPL. The 0.10 series is API and ABI stable.</p></td>
<td><div class="panel" style="border-width: 1px;">
<div class="panelContent">
<div>
<ul>
<li><a href="#FrequentlyAskedQuestions-WhatisGStreamer">What is GStreamer?</a></li>
<li><a href="#FrequentlyAskedQuestions-WhatistheGStreamerSDK">What is the GStreamer SDK?</a></li>
<li><a href="#FrequentlyAskedQuestions-Whatisthedifferencebetweenthissiteandtheoneatfreedesktop">What is the difference between this site and the one at freedesktop?</a></li>
<li><a href="#FrequentlyAskedQuestions-WhousesGStreamer">Who uses GStreamer?</a></li>
<li><a href="#FrequentlyAskedQuestions-Whatisthetargetaudience">What is the target audience?</a></li>
<li><a href="#FrequentlyAskedQuestions-HowmanyversionsoftheGStreamerSDKarethere">How many versions of the GStreamer SDK are there?</a></li>
<li><a href="#FrequentlyAskedQuestions-IsthereanSDKforAndroid">Is there an SDK for Android?</a></li>
<li><a href="#FrequentlyAskedQuestions-WhataboutiOS">What about iOS?</a></li>
</ul>
</div>
</div>
</div></td>
</tr>
</tbody>
</table>

# What is the GStreamer SDK?

GStreamer has sometimes proven to be difficult to understand, in part due to the complex nature of the data it manipulates (multimedia graphs), and in part due to it being a live framework which continuously evolves.

This SDK, essentially, does four things for you:

- Sticks to one particular version of GStreamer, so you do not have to worry about changing documentation and features being added or deprecated.
- Provides a wide range of tutorials to get you up to speed as fast as possible.
- Provides a “build system”, that is, instructions and tools to build your application from the source code. On Linux, this is more or less straightforward, but GStreamer support for other operating systems has historically been more cumbersome. The SDK is also available on Windows and Mac OS X to ease building on these systems.
- Provides a one-stop place for all documentation related to GStreamer, including the libraries it depends on.

# What is the difference between this site and the one at freedesktop?

The main and most important difference between these two sites is the SDK: while [gstreamer.com](http://gstreamer.com/) provides a binary ready for use by anyone interested in this framework, [gstreamer.freedesktop.org](http://gstreamer.freedesktop.org/) pursues other objectives:

- [gstreamer.freedesktop.org](http://gstreamer.freedesktop.org/) is the main vehicle for the GStreamer community members to communicate with each other and improve this framework. This site is oriented to developers contributing both to the development of the framework itself and to building multimedia applications.
- In contrast, the objective of [gstreamer.com](http://gstreamer.com/) is facilitating the use of GStreamer by providing a stable version of this framework, pre-built and ready to use. People using this SDK are mainly interested in the use of GStreamer as a tool that will help them build multimedia applications.

In summary:

<table>
<thead>
<tr class="header">
<th> </th>
<th><a href="http://gstreamer.freedesktop.org/" class="external-link">gstreamer.freedesktop.org</a></th>
<th><a href="http://gstreamer.com/" class="external-link">gstreamer.com</a></th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td>Version</td>
<td>All GStreamer versions</td>
<td>One single stable version</td>
</tr>
<tr class="even">
<td>Documentation</td>
<td>Mainly for consulting, covering all aspects of developing on and for GStreamer.</td>
<td>Consulting, how-tos and tutorials only for the version of the SDK.</td>
</tr>
<tr class="odd">
<td>SDK</td>
<td>No</td>
<td>Yes</td>
</tr>
</tbody>
</table>

# Who uses GStreamer?

Some cool media apps using GStreamer:

- [Banshee](http://banshee.fm/)
- [Songbird](http://getsongbird.com/)
- [Snappy](http://live.gnome.org/snappy)
- [Empathy](https://live.gnome.org/Empathy)
- [Totem](http://projects.gnome.org/totem/)
- [Transmageddon](http://www.linuxrising.org/)
- [Flumotion](http://www.flumotion.net/)
- [Landell](http://landell.holoscopio.com/)
- [LongoMatch](http://longomatch.org/)
- [Rygel](https://live.gnome.org/Rygel)
- [Sound Juicer](http://www.burtonini.com/blog/computers/sound-juicer)
- [Buzztard](http://wiki.buzztard.org/index.php/Overview)
- [Moovida](http://www.moovida.com/) (based on Banshee)
- [Fluendo DVD Player](http://www.fluendo.com/shop/product/fluendo-dvd-player/)
- and many [more](http://gstreamer.freedesktop.org/apps/)

# What is the target audience?

This SDK is mainly intended for application developers wanting to add cross-platform multimedia capabilities to their programs.

Developers wanting to write their own plug-ins for GStreamer should already be familiar with the topics covered in this documentation, and should refer to the [GStreamer documentation](http://gstreamer.freedesktop.org/documentation/) (particularly, the plug-in writer's guide).

# How many versions of the GStreamer SDK are there?

Check out the [Releases](Releases.html) page for detailed information.

# Is there an SDK for Android?

The GStreamer SDK supports the Android platform since [version 2012.11 (Brahmaputra)](2012.11%2BBrahmaputra.html).

# What about iOS?

The GStreamer SDK supports the iOS platform since [version 2013.6 (Congo)](2013.6%2BCongo.html).

# GStreamer SDK documentation : GStreamer reference

This page last changed on Jun 25, 2012 by xartigas.

# GStreamer reference

This is the complete documentation for the GStreamer API, automatically generated from the source code of GStreamer.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/forbidden.png" width="16" height="16" /></td>
<td><p>This documentation is still being prepared. Meanwhile, all links in the tutorials point to <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/docs/2012.5/gstreamer-0.10" class="external-link">gstreamer.freedesktop.org</a></p></td>
</tr>
</tbody>
</table>

## Comments:

<table>
<colgroup>
<col width="100%" />
</colgroup>
<tbody>
<tr class="odd">
<td><a href=""></a>
<p>It seems links point to the HEAD (1.0) docs, not 0.10 docs (<a href="http://gstreamer.freedesktop.org/data/doc/gstreamer/0.10.36/" class="uri external-link">http://gstreamer.freedesktop.org/data/doc/gstreamer/0.10.36/</a>).</p>
<div class="smallfont" align="left" style="color: #666666; width: 98%; margin-bottom: 10px;">
<img src="images/icons/contenttypes/comment_16.png" width="16" height="16" /> Posted by at Jun 11, 2012 11:34
</div></td>
</tr>
<tr class="even">
<td><a href=""></a>
<p>Thanks for fixing that link, but it seems all the links in the tutorials also need to be updated. Hopefully Confluence provides for global search and replace. <img src="s/en_GB/3152/1/_/images/icons/emoticons/smile.png" alt="(smile)" class="emoticon emoticon-smile" /></p>
<div class="smallfont" align="left" style="color: #666666; width: 98%; margin-bottom: 10px;">
<img src="images/icons/contenttypes/comment_16.png" width="16" height="16" /> Posted by at Jun 11, 2012 12:49
</div></td>
</tr>
<tr class="odd">
<td><a href=""></a>
<p>Fixed! Thanks for the feedback.</p>
<div class="smallfont" align="left" style="color: #666666; width: 98%; margin-bottom: 10px;">
<img src="images/icons/contenttypes/comment_16.png" width="16" height="16" /> Posted by xartigas at Jun 11, 2012 13:20
</div></td>
</tr>
</tbody>
</table>

# ![Home Page](images/icons/contenttypes/home_page_16.png) GStreamer SDK documentation : Home

This page last changed on Jun 14, 2013 by xartigas.

# Welcome to the GStreamer SDK documentation!

Use the navigation bar on the left or choose from the following menu. Feel free to jump straight to the download section, start practicing with the tutorials, or read the F.A.Q. if you don’t know what this is all about.

<table>
<colgroup>
<col width="25%" />
<col width="25%" />
<col width="25%" />
<col width="25%" />
</colgroup>
<tbody>
<tr class="odd">
<td><a href="Installing%2Bthe%2BSDK.html"><img src="attachments/327688/2424851.png" class="confluence-embedded-image" /></a>
<h3 id="Home-Download" style="text-align: center;margin-top: 0.0px;">Download</h3>
Download and install the GStreamer SDK</td>
<td><a href="Tutorials.html"><img src="attachments/327688/2424852.png" class="confluence-embedded-image" /></a>
<h3 id="Home-Tutorials" style="text-align: center;margin-top: 0.0px;">Tutorials</h3>
Learn how to use the GStreamer SDK</td>
<td><a href="Deploying%2Byour%2Bapplication.html"><img src="attachments/327688/2424853.png" class="confluence-embedded-image" /></a>
<h3 id="Home-Deploymentinstructions" style="text-align: center;margin-top: 0.0px;">Deployment instructions</h3>
Deploy the SDK with your application</td>
<td><a href="GStreamer%2Breference.html"><img src="attachments/327688/2424870.png" class="confluence-embedded-image" /></a>
<h3 id="Home-Referenceguide" style="text-align: center;margin-top: 0.0px;">Reference guide</h3>
GStreamer API reference</td>
</tr>
<tr class="even">
<td><a href="Frequently%2BAsked%2BQuestions.html"><img src="attachments/327688/2424855.png" class="confluence-embedded-image" /></a>
<h3 id="Home-FAQ" style="text-align: center;margin-top: 0.0px;">F.A.Q.</h3>
Frequently Asked Questions</td>
<td><a href="Legal%2Binformation.html"><img src="attachments/327688/2424856.png" class="confluence-embedded-image" /></a>
<h3 id="Home-Legalinformation" style="text-align: center;margin-top: 0.0px;">Legal information</h3>
Patents, licenses and legal F.A.Q.</td>
<td><a href="Releases.html"><img src="attachments/327688/2424871.png" class="confluence-embedded-image" /></a>
<h3 id="Home-Releases" style="text-align: center;margin-top: 0.0px;">Releases</h3>
Version details</td>
<td><a href="Contact.html"><img src="attachments/327688/2424857.png" class="confluence-embedded-image" /></a>
<h3 id="Home-Contact" style="text-align: center;margin-top: 0.0px;">Contact</h3>
Commercial support, bug reporting...</td>
</tr>
</tbody>
</table>

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p><strong>The latest release is <a href="2013.6%2BCongo.html">2013.6 Congo</a>.</strong></p></td>
</tr>
</tbody>
</table>

## Attachments:

![](images/icons/bullet_blue.gif) [download.png](attachments/327688/2424858.png) (image/png)
![](images/icons/bullet_blue.gif) [tutorials.png](attachments/327688/2424859.png) (image/png)
![](images/icons/bullet_blue.gif) [deploy.png](attachments/327688/2424860.png) (image/png)
![](images/icons/bullet_blue.gif) [faq.png](attachments/327688/2424862.png) (image/png)
![](images/icons/bullet_blue.gif) [legal.png](attachments/327688/2424863.png) (image/png)
![](images/icons/bullet_blue.gif) [contact.png](attachments/327688/2424864.png) (image/png)
![](images/icons/bullet_blue.gif) [download.png](attachments/327688/2424865.png) (image/png)
![](images/icons/bullet_blue.gif) [tutorials.png](attachments/327688/2424872.png) (image/png)
![](images/icons/bullet_blue.gif) [deploy.png](attachments/327688/2424867.png) (image/png)
![](images/icons/bullet_blue.gif) [faq.png](attachments/327688/2424868.png) (image/png)
![](images/icons/bullet_blue.gif) [legal.png](attachments/327688/2424869.png) (image/png)
![](images/icons/bullet_blue.gif) [contact.png](attachments/327688/2424866.png) (image/png)
![](images/icons/bullet_blue.gif) [download.png](attachments/327688/2424851.png) (image/png)
![](images/icons/bullet_blue.gif) [contact.png](attachments/327688/2424857.png) (image/png)
![](images/icons/bullet_blue.gif) [deploy.png](attachments/327688/2424853.png) (image/png)
![](images/icons/bullet_blue.gif) [faq.png](attachments/327688/2424855.png) (image/png)
![](images/icons/bullet_blue.gif) [legal.png](attachments/327688/2424856.png) (image/png)
![](images/icons/bullet_blue.gif) [reference.png](attachments/327688/2424870.png) (image/png)
![](images/icons/bullet_blue.gif) [releases.png](attachments/327688/2424873.png) (image/png)
![](images/icons/bullet_blue.gif) [tutorials.png](attachments/327688/2424852.png) (image/png)
![](images/icons/bullet_blue.gif) [releases.png](attachments/327688/2424871.png) (image/png)

# GStreamer SDK documentation : Installing for Android development

This page last changed on May 24, 2013 by xartigas.

# Prerequisites

The development machine is where you will develop your Android application, which you will then deploy to the target machine, which should obviously be an Android device.

The development machine can be running Linux, Mac OS X or Windows, and needs to have installed:

- The latest version of the [Android SDK](http://developer.android.com/sdk/index.html)
- The latest version of the [Android NDK](http://developer.android.com/tools/sdk/ndk/index.html)
- The GStreamer SDK for Android is targeted at API version 9 (Android 2.3.1, Gingerbread) or higher. Use the SDK Manager tool to make sure you have at least one Android SDK platform installed with API version 9 or higher.

Optionally, you can use the [Eclipse IDE](http://www.eclipse.org/eclipse/). As stated in the Android documentation, *developing in Eclipse with ADT is highly recommended and is the fastest way to get started*. If you plan to use the Eclipse IDE:

- Install the [Android ADT plugin](http://developer.android.com/sdk/installing/installing-adt.html) for Eclipse
- Install the [Android NDK plugin](http://tools.android.com/recent/usingthendkplugin) for Eclipse

Before continuing, make sure you can compile and run the samples included in the Android NDK, and that you understand how the integration of C and Java works via the [Java Native Interface](http://en.wikipedia.org/wiki/Java_Native_Interface) (JNI). Besides the [Android NDK](http://developer.android.com/tools/sdk/ndk/index.html) documentation, you can find some useful [Android JNI tips here](http://developer.android.com/guide/practices/jni.html).

# Download and install the SDK

The SDK has two variants: **Debug** and **Release**. The Debug variant produces lots of debug output and is useful while developing your application. The Release variant is what you will use to produce the final version of your application, since GStreamer code runs slightly faster and the libraries are smaller.

Get the compressed file below and just unzip it into any folder of your choice (choose your preferred file format; both files have exactly the same content).

<table>
<colgroup>
<col width="100%" />
</colgroup>
<thead>
<tr class="header">
<th>Debug variant</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><ul>
<li><a href="http://cdn.gstreamer.com/android/arm/gstreamer-sdk-android-arm-debug-2013.6.tar.bz2" class="external-link">GStreamer SDK 2013.6 (Congo) for Android ARM (Debug, tar.bz2)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/android/arm/gstreamer-sdk-android-arm-debug-2013.6.tar.bz2" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/android/arm/gstreamer-sdk-android-arm-debug-2013.6.tar.bz2.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/android/arm/gstreamer-sdk-android-arm-debug-2013.6.tar.bz2.sha1" class="external-link">sha1</a></li>
<li><a href="http://cdn.gstreamer.com/android/arm/gstreamer-sdk-android-arm-debug-2013.6.zip" class="external-link">GStreamer SDK 2013.6 (Congo) for Android ARM (Debug, zip)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/android/arm/gstreamer-sdk-android-arm-debug-2013.6.zip" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/android/arm/gstreamer-sdk-android-arm-debug-2013.6.zip.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/android/arm/gstreamer-sdk-android-arm-debug-2013.6.zip.sha1" class="external-link">sha1</a></li>
</ul></td>
</tr>
<tr class="even">
<td>Release variant</td>
</tr>
<tr class="odd">
<td><ul>
<li><a href="http://cdn.gstreamer.com/android/arm/gstreamer-sdk-android-arm-release-2013.6.tar.bz2" class="external-link">GStreamer SDK 2013.6 (Congo) for Android ARM (Release, tar.bz2)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/android/arm/gstreamer-sdk-android-arm-release-2013.6.tar.bz2" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/android/arm/gstreamer-sdk-android-arm-release-2013.6.tar.bz2.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/android/arm/gstreamer-sdk-android-arm-release-2013.6.tar.bz2.sha1" class="external-link">sha1</a></li>
<li><a href="http://cdn.gstreamer.com/android/arm/gstreamer-sdk-android-arm-release-2013.6.zip" class="external-link">GStreamer SDK 2013.6 (Congo) for Android ARM (Release, zip)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/android/arm/gstreamer-sdk-android-arm-release-2013.6.zip" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/android/arm/gstreamer-sdk-android-arm-release-2013.6.zip.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/android/arm/gstreamer-sdk-android-arm-release-2013.6.zip.sha1" class="external-link">sha1</a></li>
</ul></td>
</tr>
</tbody>
</table>

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>Due to the size of these files, usage of a <a href="http://en.wikipedia.org/wiki/Download_manager" class="external-link">Download Manager</a> is <strong>highly recommended</strong>. Take a look at <a href="http://en.wikipedia.org/wiki/Comparison_of_download_managers" class="external-link">this list</a> if you do not have one installed. If, after downloading, the installer reports itself as corrupt, chances are that the connection ended before the file was complete. A download manager will typically restart the process and fetch the missing parts.</p></td>
</tr>
</tbody>
</table>
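
The md5 and sha1 links next to each download exist to detect exactly this kind of truncation. The following sketch shows the general `md5sum -c` workflow; the "archive" here is a stand-in file created on the spot so the example is self-contained, whereas in practice you would use the real downloaded archive and its published `.md5` file.

```shell
# Create a stand-in "archive" so this sketch runs anywhere; in practice this
# is the file you downloaded from the links above.
printf 'archive contents' > gstreamer-sdk-android-arm-debug-2013.6.tar.bz2

# A published .md5 file contains lines of the form "<hash>  <filename>".
md5sum gstreamer-sdk-android-arm-debug-2013.6.tar.bz2 > checksums.md5

# Re-check the file; a truncated download would make this report FAILED.
md5sum -c checksums.md5
```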

If you intend to build the tutorials in this same folder, make sure you have write permissions.

In the process of building GStreamer-enabled Android applications, some tools will need to know where you installed the SDK. You must define an environment variable called `GSTREAMER_SDK_ROOT_ANDROID` and point it to the folder where you extracted the SDK. This environment variable must be available at build time, so you may want to make it available system-wide by adding it to your `~/.profile` file (on Linux and Mac) or to the Environment Variables in the System Properties dialog (on Windows).

- Point `GSTREAMER_SDK_ROOT_ANDROID` to the folder where you unzipped the SDK.
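
On Linux and Mac OS X this boils down to a couple of shell lines; the install path used here is only an example of where you might have unzipped the SDK.

```shell
# Example path -- replace with the folder where you actually unzipped the SDK.
GSTREAMER_SDK_ROOT_ANDROID="$HOME/gstreamer-sdk-android"
export GSTREAMER_SDK_ROOT_ANDROID

# The variable must be visible at build time:
echo "SDK root: $GSTREAMER_SDK_ROOT_ANDROID"
```

Appending the same two lines to `~/.profile` makes the setting survive new shells; on Windows, use the Environment Variables dialog instead.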

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>If you plan to use Eclipse and do not want to define this environment variable globally, you can set it inside Eclipse. Go to Window → Preferences → C/C++ → Build → Build Variables and define <code>GSTREAMER_SDK_ROOT_ANDROID</code> there.</p></td>
</tr>
</tbody>
</table>

# Configure your development environment

There are two routes to using GStreamer in an Android application: writing your GStreamer code either in Java or in C.

Android applications are mainly written in Java, so adding GStreamer code to them in the same language is a huge advantage. However, this requires using [language bindings](http://en.wikipedia.org/wiki/Language_binding) for the GStreamer API which are not complete yet. In the meantime, this documentation will use Java for the User Interface (UI) part and C for the GStreamer code. Both parts interact through [JNI](http://en.wikipedia.org/wiki/Java_Native_Interface).

### Building the tutorials

There are a few Android-specific tutorials in the `$GSTREAMER_SDK_ROOT_ANDROID\share\gst-sdk\tutorials` folder. Each tutorial is a folder containing source code (in Java and C) and the resource files required to build a complete Android application.

The rest of the GStreamer SDK tutorials (basic and playback tutorials) cannot be run on Android without modification.

Android projects with GStreamer support are built like conventional Android NDK projects, so the instructions at the [Android NDK](http://developer.android.com/tools/sdk/ndk/index.html) home can be followed:

![](images/icons/grey_arrow_down.gif)Using Eclipse (Click to expand)

Make sure you have installed the ADT and NDK plugins listed in the prerequisites section, and that they are both aware of the location of the Android SDK and NDK respectively.

Import a tutorial into the Eclipse workspace: File → New → Project… → Android Project from Existing Code, and select the folder called `android-tutorial-1`.

After reading in the project and generating some extra files and folders, Eclipse might complain about missing files. **This is normal**; we are not finished yet.

Provide native development support by activating the NDK plugin: right-click on the project in the Project Explorer (this should be the top-most folder, called `com.gst_sdk_tutorials.tutorial_1.Tutorial1`) → Android Tools → Add Native Support… Here the NDK plugin asks for a library name. This is irrelevant and any valid file name will do. Accept.

Eclipse will still complain about errors in the code. **This is normal**; some files are missing because they are generated during the first build run.

Build the project: Project → Build Project. If you bring up the Eclipse Console, you should see some progress messages. Once finished, the missing files will appear and all error messages should be gone. The project is now ready to run. Hit Run → Run.

A new application called “Android tutorial 1” should now be available on your device, with the GStreamer SDK logo. If you want to run the tutorial in an Android Virtual Device (AVD), make sure to create the device with support for audio playback and GPU Emulation (to enable OpenGL ES).

![](images/icons/grey_arrow_down.gif)Using the command line (Click to expand)

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>Note that, on Windows, this procedure requires a working Cygwin shell, as explained in the <a href="http://developer.android.com/tools/sdk/ndk/index.html#Reqs" class="external-link">Android NDK System Requirements</a>.</p></td>
</tr>
</tbody>
</table>

For each tutorial, move to its folder and run:
|
||||
|
||||
``` theme: Default; brush: plain; gutter: false
|
||||
android update project -p . -s --target X
|
||||
```
|
||||
|
||||
Where `X` is one of the targets available in your system (the ones you
|
||||
installed with the SDK manager). Make sure to use a target with at least
|
||||
API level 9.
|
||||
|
||||
To get a list of all available targets in your system issue this
|
||||
command:
|
||||
|
||||
```bash
android list
```

The “update project” command generates the `build.xml` file needed by
the build system. You only need to perform this action once per project.

To build the C part, just call:

```bash
ndk-build
```

A few lines in the `Android.mk` file (reviewed later) pull in the
necessary machinery to compile the GStreamer bits and generate the
shared object libraries (`.so`) that the Java code can use as native
methods.

Finally, compile the Java code with:

```bash
ant debug
```
And install on the device with:

```bash
adb install -r bin/Tutorial1-debug.apk
```

The `-r` switch allows the installer to overwrite previous versions.
Otherwise, you need to manually uninstall previous versions of your
application.
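Collected together, the command-line steps above amount to a short script like this sketch (the target id `android-9` and the apk name are assumptions taken from the text; adjust both to your setup):

```shell
#!/bin/sh
# Sketch of the whole command-line build cycle described above.
# The steps only run when the Android SDK tools are actually installed.
TARGET=android-9
APK=bin/Tutorial1-debug.apk

if command -v android >/dev/null 2>&1; then
  android update project -p . -s --target "$TARGET"  # generate build.xml (once)
  ndk-build                                          # build the C part (.so libraries)
  ant debug                                          # build the Java part into the apk
  adb install -r "$APK"                              # -r replaces a previous install
else
  echo "Android SDK tools not found in PATH"
fi
```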
A new application called “Android tutorial 1” should now be available on
your device, with the GStreamer SDK logo. If you want to run the
tutorial in an Android Virtual Device (AVD), make sure to create the
device with support for audio playback and GPU Emulation (to enable
OpenGL ES).
<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><strong>Windows linkage problems</strong><br />
<p>Due to problems related to the standard linker, Google’s <a href="http://en.wikipedia.org/wiki/Gold_(linker)" class="external-link">Gold Linker</a> is used to build GStreamer applications. Unfortunately, the Android NDK toolchain for Windows does not include the gold linker and the standard one has to be used.</p>
<p>If you observe linkage problems, you can replace the linker in your Android NDK with the gold one from <a href="http://code.google.com/p/mingw-and-ndk/downloads/detail?name=android-ndk-r8b-ma-windows.7z&can=2&q=" class="external-link">this project</a>. Download the <code>android-ndk-r8b-ma-windows.7z</code> file, extract <code>\android-ndk-r8b\toolchains\arm-linux-androideabi-4.6\prebuilt\windows\arm-linux-androideabi\bin\ld.exe</code> (only this file is needed) and overwrite the one in the same folder in your Android NDK installation. You might need the free <a href="http://www.7-zip.org/" class="external-link">7-Zip archiving utility</a>.</p></td>
</tr>
</tbody>
</table>
### Creating new projects

Create a normal NDK project, either from the command line as described
on the [Android
NDK](http://developer.android.com/tools/sdk/ndk/index.html#GetStarted)
home page, or use Eclipse: File → New → Project… → Android Application
Project and, once the wizard is complete, right click on the project
→ Android Tools → Add Native Support…

To add GStreamer support you only need to modify the
`jni/Android.mk` file. This file describes the native files in your
project, and its barebones structure (as auto-generated by Eclipse) is:

**Android.mk**
```make
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_MODULE := NativeApplication
LOCAL_SRC_FILES := NativeApplication.c

include $(BUILD_SHARED_LIBRARY)
```
Where line 5 specifies the name of the `.so` file that will contain your
native code and line 6 states all source files that compose your native
code, separated by spaces.

Adding GStreamer support only requires adding these lines:

**Android.mk with GStreamer support**
```make
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_MODULE := NativeApplication
LOCAL_SRC_FILES := NativeApplication.c
LOCAL_SHARED_LIBRARIES := gstreamer_android
LOCAL_LDLIBS := -landroid

include $(BUILD_SHARED_LIBRARY)

GSTREAMER_SDK_ROOT := $(GSTREAMER_SDK_ROOT_ANDROID)
GSTREAMER_NDK_BUILD_PATH := $(GSTREAMER_SDK_ROOT)/share/gst-android/ndk-build/
GSTREAMER_PLUGINS := coreelements ogg theora vorbis ffmpegcolorspace playback eglglessink soup opensles
G_IO_MODULES := gnutls
GSTREAMER_EXTRA_DEPS := gstreamer-interfaces-0.10 gstreamer-video-0.10

include $(GSTREAMER_NDK_BUILD_PATH)/gstreamer.mk
```
Where line 7 specifies an extra library to be included in the project:
`libgstreamer_android.so`. This library contains all GStreamer code,
tailored for your application’s needs, as shown below.

Line 8 specifies additional system libraries, in this case, in order to
access Android-specific functionality.

Lines 12 and 13 simply define some convenient macros.

Line 14 lists the plugins you want statically linked into
`libgstreamer_android.so`. Listing only the ones you need makes your
application smaller.

Line 15 is required to have internet access from GStreamer, through the
`souphttpsrc` element.

Line 16 defines which GStreamer libraries your application requires.

Finally, line 18 includes the make files which perform the rest of the
magic.

Listing all desired plugins can be cumbersome, so they have been grouped
into categories, which can be used by including the `plugins.mk` file,
as follows:
```make
include $(GSTREAMER_NDK_BUILD_PATH)/plugins.mk
GSTREAMER_PLUGINS := $(GSTREAMER_PLUGINS_CORE) $(GSTREAMER_PLUGINS_CODECS) playbin souphttpsrc
```

**List of categories and included plugins**
<table>
<thead>
<tr class="header">
<th>Category</th>
<th>Included plugins</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><code>GSTREAMER_PLUGINS_CORE</code></td>
<td><code>coreelements coreindexers adder app audioconvert audiorate audioresample audiotestsrc</code> <code>ffmpegcolorspace gdp gio pango typefindfunctions videorate videoscale videotestsrc</code> <code>volume autodetect videofilter</code></td>
</tr>
<tr class="even">
<td><code>GSTREAMER_PLUGINS_PLAYBACK</code></td>
<td><code>decodebin2 playbin</code></td>
</tr>
<tr class="odd">
<td><code>GSTREAMER_PLUGINS_CODECS</code></td>
<td><code>subparse ogg theora vorbis alaw annodex apetag audioparsers auparse avi flac flv flxdec</code> <code>icydemux id3demux isomp4 jpeg matroska mulaw multipart png speex taglib wavenc wavpack</code> <code>wavparse y4menc adpcmdec adpcmenc aiff cdxaparse dtmf dvbsuboverlay dvdspu fragmented</code> <code>hdvparse id3tag ivfparse jp2k kate mve mxf nsf nuvdemux opus pcapparse pnm schro siren</code> <code>subenc tta videoparsersbad vmnc vp8 y4mdec</code></td>
</tr>
<tr class="even">
<td><code>GSTREAMER_PLUGINS_VIS</code></td>
<td><code>libvisual goom goom2k1 audiovisualizers</code></td>
</tr>
<tr class="odd">
<td><code>GSTREAMER_PLUGINS_EFFECTS</code></td>
<td><code>alpha alphacolor audiofx cutter debug deinterlace effectv equalizer gdkpixbuf imagefreeze</code> <code>interleave level multifile replaygain shapewipe smpte spectrum videobox videocrop videomixer</code> <code>autoconvert bayer coloreffects faceoverlay fieldanalysis freeverb frei0r gaudieffects</code> <code>geometrictransform interlace jp2kdecimator liveadder rawparse removesilence scaletempoplugin</code> <code>segmentclip smooth speed stereo videofiltersbad videomeasure videosignal</code></td>
</tr>
<tr class="even">
<td><code>GSTREAMER_PLUGINS_NET</code></td>
<td><code>rtsp rtp rtpmanager souphttpsrc udp dataurisrc rtpmux rtpvp8 sdpelem</code></td>
</tr>
<tr class="odd">
<td><code>GSTREAMER_PLUGINS_CODECS_GPL</code></td>
<td><code>assrender</code></td>
</tr>
<tr class="even">
<td><code>GSTREAMER_PLUGINS_SYS</code></td>
<td><code>eglglessink opensles amc</code></td>
</tr>
</tbody>
</table>
Build and run your application as explained in the **Building the
tutorials** section.

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Installing for iOS development

This page last changed on Jun 12, 2013 by xartigas.

# Prerequisites

For iOS development you need to download Xcode and the iOS SDK. Xcode
can be found at the App Store or
[here](https://developer.apple.com/devcenter/ios/index.action#downloads)
and the iOS SDK, if it is not already included in your version of Xcode,
can be downloaded from Xcode's preferences menu under the downloads tab.
The minimum required iOS version is 6.0. The minimum required version of
Xcode is 4, but 4.5 is recommended.

In case you are not familiar with iOS, Objective-C or Xcode, we
recommend taking a look at the available documentation at Apple's
website.
[This](http://developer.apple.com/library/ios/#DOCUMENTATION/iPhone/Conceptual/iPhone101/Articles/00_Introduction.html)
can be a good starting point.
# Download and install the SDK

The GStreamer SDK installer can be found at:
<table>
<colgroup>
<col width="100%" />
</colgroup>
<thead>
<tr class="header">
<th>Universal</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><ul>
<li><a href="http://cdn.gstreamer.com/ios/gstreamer-sdk-devel-2013.6-ios-universal.dmg" class="external-link">GStreamer SDK 2013.6 (Congo) for iOS (Installer Package)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/ios/gstreamer-sdk-devel-2013.6-ios-universal.dmg" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/ios/gstreamer-sdk-devel-2013.6-ios-universal.dmg.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/ios/gstreamer-sdk-devel-2013.6-ios-universal.dmg.sha1" class="external-link">sha1</a></li>
</ul></td>
</tr>
</tbody>
</table>

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>Due to the size of these files, usage of a <a href="http://en.wikipedia.org/wiki/Download_manager" class="external-link">Download Manager</a> is <strong>highly recommended</strong>. Take a look at <a href="http://en.wikipedia.org/wiki/Comparison_of_download_managers" class="external-link">this list</a> if you do not have one installed. If, after downloading, the installer reports itself as corrupt, chances are that the connection ended before the file was complete. A Download Manager will typically re-start the process and fetch the missing parts.</p></td>
</tr>
</tbody>
</table>
Double click the package file and follow the instructions presented by
the install wizard. In case the system complains about the package not
being signed, you can control-click it and open it to start the
installation. When you do this, it will warn you, but there is an option
to install anyway. Otherwise, you can go to System Preferences → Security
& Privacy → General and select the option to allow installation of
packages from "anywhere".

The GStreamer SDK installs itself in your home directory, so it is
available only to the user that installed it. The SDK library is
installed to `~/Library/Developer/GStreamer/iPhone.sdk`. Inside this
directory there is the GStreamer.framework that contains the libs,
headers and resources, and there is a `Templates` directory that has
Xcode application templates for GStreamer development. Those templates
are also copied to `~/Library/Developer/Xcode/Templates` during
installation so that Xcode can find them.

# Configure your development environment
GStreamer is written in C, and the iOS API uses mostly Objective-C (and
C for some parts), but this should cause no problems as those languages
interoperate freely. You can mix both in the same source code, for
example.

### Building the tutorials

The GStreamer SDK ships a few tutorials in the `xcode iOS` folder inside
the `.dmg` file. Copy them out of the package and into a more suitable
place. We recommend that you open the project in Xcode, take a look
at the sources and build them. This should confirm that the installation
works and give some insight on how simple it is to mix Objective-C and C
code.
### Creating new projects

After installation, when creating a new Xcode project, you should see
the GStreamer project templates under the `Templates` category. OS X and
iOS have a different way of organizing libraries, headers and binaries:
they are grouped into Frameworks, and that's how we ship GStreamer and
its dependencies for iOS (and OS X). Due to this difference from
traditional Linux development, we strongly recommend using the SDK
templates, as they set a few variables on your project that allow Xcode
to find, use and link GStreamer just like in traditional Linux
development. For example, if you don't use the templates, you'll have to
use:

```c
#include <GStreamer/gst/gst.h>
```

instead of the usual:

```c
#include <gst/gst.h>
```

Among some other things the template does, this was a decision made to
keep development consistent across all the platforms the SDK supports.
Once a project has been created using a GStreamer SDK Template, it is
ready to build and run. All necessary infrastructure is already in
place. To understand what files have been created and how they interact,
take a look at the [iOS tutorials](iOS%2Btutorials.html).
# GStreamer SDK documentation : Installing on Linux

This page last changed on Jun 12, 2013 by slomo.

# Prerequisites

To develop applications using the GStreamer SDK on Linux you will need
one of the following supported distributions:

- Ubuntu Precise Pangolin (12.04)
- Ubuntu Quantal Quetzal (12.10)
- Ubuntu Raring Ringtail (13.04)
- Debian Squeeze (6.0)
- Debian Wheezy (7.0)
- Fedora 17
- Fedora 18

Other distributions or versions might work, but they have not been
tested. If you want to try, do it at your own risk!

All the commands given in this section are intended to be typed in from
a terminal.
<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>Make sure you have superuser (root) access rights to install the GStreamer SDK.</p></td>
</tr>
</tbody>
</table>
# Download and install the SDK

The GStreamer SDK provides a set of binary packages for supported Linux
distributions. Detailed instructions on how to install the packages for
each supported distribution can be found below. Optionally, you can
download the source code and build the SDK yourself, which should then
work on any platform.

These packages will install the SDK at `/opt/gstreamer-sdk`. If you
build the SDK yourself, you can install it anywhere you want.

### Ubuntu
In order to install the SDK on **Ubuntu**, the public repository where
the SDK resides must be added to the apt sources list.

To do so, download the appropriate definition file for your
distribution:

- Ubuntu Precise Pangolin (12.04):
  [i386](http://www.freedesktop.org/software/gstreamer-sdk/data/packages/ubuntu/precise/i386/gstreamer-sdk.list) [amd64](http://www.freedesktop.org/software/gstreamer-sdk/data/packages/ubuntu/precise/amd64/gstreamer-sdk.list)
- Ubuntu Quantal Quetzal (12.10):
  [i386](http://www.freedesktop.org/software/gstreamer-sdk/data/packages/ubuntu/quantal/i386/gstreamer-sdk.list) [amd64](http://www.freedesktop.org/software/gstreamer-sdk/data/packages/ubuntu/quantal/amd64/gstreamer-sdk.list)
- Ubuntu Raring Ringtail (13.04):
  [i386](http://www.freedesktop.org/software/gstreamer-sdk/data/packages/ubuntu/raring/i386/gstreamer-sdk.list) [amd64](http://www.freedesktop.org/software/gstreamer-sdk/data/packages/ubuntu/raring/amd64/gstreamer-sdk.list)

Then copy it to the `/etc/apt/sources.list.d/` directory on your system:
```bash
sudo cp gstreamer-sdk.list /etc/apt/sources.list.d/
```

After adding the repository, its GPG key has to be added and the apt
package index needs to be refreshed. This can be done by running:

```bash
wget -q -O - http://www.freedesktop.org/software/gstreamer-sdk/sdk.gpg | sudo apt-key add -
sudo apt-get update
```

Now that the new repository is available, install the SDK with the
following command:

```bash
sudo apt-get install gstreamer-sdk-dev
```

Enter your password when prompted.
### Debian

In order to install the SDK on **Debian**, the public repository where
the SDK resides must be added to the apt sources list.

To do so, download the appropriate definition file for your
distribution:

- Debian Squeeze (6.0):
  [i386](http://www.freedesktop.org/software/gstreamer-sdk/data/packages/debian/squeeze/i386/gstreamer-sdk.list)
  [amd64](http://www.freedesktop.org/software/gstreamer-sdk/data/packages/debian/squeeze/amd64/gstreamer-sdk.list)
- Debian Wheezy (7.0):
  [i386](http://www.freedesktop.org/software/gstreamer-sdk/data/packages/debian/wheezy/i386/gstreamer-sdk.list)
  [amd64](http://www.freedesktop.org/software/gstreamer-sdk/data/packages/debian/wheezy/amd64/gstreamer-sdk.list)

Then copy it to the `/etc/apt/sources.list.d/` directory on your system:
```bash
su -c 'cp gstreamer-sdk.list /etc/apt/sources.list.d/'
```

After adding the repository, its GPG key has to be added and the apt
package index needs to be refreshed. This can be done by running:

```bash
su -c 'wget -q -O - http://www.freedesktop.org/software/gstreamer-sdk/sdk.gpg | apt-key add -'
su -c 'apt-get update'
```

Now that the new repository is available, install the SDK with the
following command:

```bash
su -c 'apt-get install gstreamer-sdk-dev'
```

Enter the superuser/root password when prompted.
### Fedora

In order to install the SDK on **Fedora**, the public repository where
the SDK resides must be added to the yum sources list.

To do so, download the appropriate definition file for your
distribution:

- [Fedora 17](http://www.freedesktop.org/software/gstreamer-sdk/data/packages/fedora/gstreamer-sdk.repo)
- [Fedora 18](http://www.freedesktop.org/software/gstreamer-sdk/data/packages/fedora/gstreamer-sdk.repo)

Then copy it to the `/etc/yum.repos.d/` directory on your system:
```bash
su -c 'cp gstreamer-sdk.repo /etc/yum.repos.d/'
```

After adding the repository, the yum package index needs to be
refreshed. This can be done by running:

```bash
su -c 'yum update'
```

Now that the new repository is available, install the SDK with the
following command:

```bash
su -c 'yum install gstreamer-sdk-devel'
```

Enter the superuser/root password when prompted.
# Configure your development environment

When building applications using GStreamer, the compiler must be able to
locate its libraries. However, in order to prevent possible collisions
with the GStreamer installed on the system, the GStreamer SDK is
installed in a non-standard location, `/opt/gstreamer-sdk`. The shell
script `gst-sdk-shell` sets the required environment variables for
building applications with the GStreamer SDK:

```bash
/opt/gstreamer-sdk/bin/gst-sdk-shell
```

The only other “development environment” that is required is
the `gcc` compiler and a text editor. In order to compile code that
requires the GStreamer SDK and uses the GStreamer core library, remember
to add this string to your `gcc` command:
```bash
`pkg-config --cflags --libs gstreamer-0.10`
```

If you're using other GStreamer libraries, e.g. the video library, you
have to add additional packages after `gstreamer-0.10` in the above
string (`gstreamer-video-0.10` for the video library, for example).
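For instance, a build that also uses the video library would append the extra package like this sketch (the source file name is hypothetical; only the package list matters):

```shell
# Hypothetical file name; the point is appending extra pkg-config
# packages after gstreamer-0.10, in the order the documentation gives.
PKGS="gstreamer-0.10 gstreamer-video-0.10"
echo "gcc my-video-app.c -o my-video-app \`pkg-config --cflags --libs $PKGS\`"
```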
If your application is built with the help of libtool, e.g. when using
automake/autoconf as a build system, you have to run
the `configure` script from inside the `gst-sdk-shell` environment.
<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>You also have the option to embed the SDK's path into your binaries so they do not need to be executed from within the <code>gst-sdk-shell</code>. To do so, add these options to gcc:</p>
<div class="code panel" style="border-width: 1px;">
<div class="codeContent panelContent">
<pre><code>-Wl,-rpath=/opt/gstreamer-sdk/lib `pkg-config --cflags --libs gstreamer-0.10`</code></pre>
</div>
</div>
<p>In case you are using libtool, it will automatically add the <code>-Wl</code> and <code>-rpath</code> options and you do not need to worry about it.</p></td>
</tr>
</tbody>
</table>
### Getting the tutorials' source code

The source code for the tutorials can be copied and pasted from the
tutorial pages into a text file but, for convenience, it is also
available in a git repository and distributed with the SDK.

The git repository can be cloned with:

```bash
git clone git://anongit.freedesktop.org/gstreamer-sdk/gst-sdk-tutorials
```

Alternatively, you can find the source code in
`/opt/gstreamer-sdk/share/gst-sdk/tutorials` and copy it to a working
folder of your choice.
### Building the tutorials

You need to enter the GStreamer SDK shell in order for the compiler to
use the right libraries (and avoid conflicts with the system libraries).
Run `/opt/gstreamer-sdk/bin/gst-sdk-shell` to enter this shell.

Then go to the folder where you copied/cloned the tutorials and run:

```bash
gcc basic-tutorial-1.c -o basic-tutorial-1 `pkg-config --cflags --libs gstreamer-0.10`
```

using the file name of the tutorial you are interested in
(`basic-tutorial-1` in this example).
<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p><strong>Depending on the GStreamer libraries you need to use, you will have to add more packages to the <code class="western">pkg-config</code> command, besides <code class="western">gstreamer-0.10</code></strong></p>
<p>At the bottom of each tutorial's source code you will find the command for that specific tutorial, including the required libraries, in the required order.</p>
<p>When developing your own applications, the GStreamer documentation will tell you what library a function belongs to.</p></td>
</tr>
</tbody>
</table>
### Running the tutorials

To run the tutorials, simply execute the desired tutorial (**from within
the `gst-sdk-shell`**):

```bash
./basic-tutorial-1
```
### Deploying your application

Your application built with the GStreamer SDK must be able to locate the
GStreamer libraries when deployed on the target machine. You have at
least a couple of options:

If you want to install a shared SDK, you can put your application
in `/opt/gstreamer-sdk` (next to the SDK) and create a `.desktop` file
in `/usr/share/applications` pointing to it. For this to work without
any problems you must make sure that your application is built with
the `-Wl,-rpath=/opt/gstreamer-sdk/lib` parameter (this is done
automatically by libtool, if you use it).
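For reference, the `.desktop` entry mentioned above could look like this sketch (the application name and `Exec` path are hypothetical):

```ini
# /usr/share/applications/my-application.desktop (hypothetical file name)
[Desktop Entry]
Type=Application
Name=My GStreamer Application
Exec=/opt/gstreamer-sdk/my-application
Categories=AudioVideo;
```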
Alternatively, you can deploy a wrapper script (similar to
`gst-sdk-shell`) which sets the necessary environment variables and then
calls your application, and create a `.desktop` file in
`/usr/share/applications` pointing to the wrapper script. This is the
most common approach: it does not require the use of the `-Wl,-rpath`
parameter and is more flexible. Take a look at `gst-sdk-shell` to see
what this script needs to do. It is also possible to create a custom
wrapper script with the `gensdkshell` command of the Cerbero build
system, if you built the SDK yourself as explained above.
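A minimal wrapper could look like the following sketch. It assumes the default `/opt/gstreamer-sdk` prefix and a hypothetical application name; the environment variable list is illustrative, so copy the definitive one from `gst-sdk-shell` itself:

```shell
#!/bin/sh
# Deployment-wrapper sketch. The variable list here is illustrative;
# gst-sdk-shell is the authoritative reference for what must be set.
SDK=/opt/gstreamer-sdk
export PATH="$SDK/bin:$PATH"
export LD_LIBRARY_PATH="$SDK/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export GST_PLUGIN_PATH="$SDK/lib/gstreamer-0.10"
# Hand over to the real binary (hypothetical name), forwarding arguments:
# exec "$SDK/bin/my-application" "$@"
```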
# GStreamer SDK documentation : Installing on Mac OS X

This page last changed on Jun 12, 2013 by xartigas.

# Prerequisites

To develop applications using the GStreamer SDK for OS X you will need
OS X Snow Leopard (10.6) or later and
[Xcode 3.2.6](https://developer.apple.com/devcenter/mac/index.action) or
later.

The recommended system is [Mac OS X
Lion](http://www.apple.com/macosx/) with
[Xcode 4](https://developer.apple.com/xcode/).

Installing the SDK for 32-bit platforms requires approximately 145MB of
free disk space for the runtime and 193MB for the development files.

Installing the SDK for 64-bit platforms requires approximately 152MB of
free disk space for the runtime and 223MB for the development files.
# Download and install the SDK

There are three sets of files in the SDK:

- The runtime files are needed to run GStreamer applications. You
  probably want to distribute these files with your application (or
  the installer below).
- The development files are **additional** files you need to create
  GStreamer applications.
- Mac OS X packages that you can use
  with [PackageMaker](https://developer.apple.com/library/mac/#documentation/DeveloperTools/Conceptual/PackageMakerUserGuide/Introduction/Introduction.html)
  to deploy GStreamer with your application.

Get **both the runtime and the development installers** from here:
<table>
<colgroup>
<col width="100%" />
</colgroup>
<tbody>
<tr class="odd">
<td>Universal</td>
</tr>
<tr class="even">
<td><ul>
<li><a href="http://cdn.gstreamer.com/osx/universal/gstreamer-sdk-2013.6-universal.pkg" class="external-link">GStreamer SDK 2013.6 (Congo) for Mac OS X (Runtime)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/osx/universal/gstreamer-sdk-2013.6-universal.pkg" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/osx/universal/gstreamer-sdk-2013.6-universal.pkg.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/osx/universal/gstreamer-sdk-2013.6-universal.pkg.sha1" class="external-link">sha1</a></li>
<li><a href="http://cdn.gstreamer.com/osx/universal/gstreamer-sdk-devel-2013.6-universal.pkg" class="external-link">GStreamer SDK 2013.6 (Congo) for Mac OS X (Development files)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/osx/universal/gstreamer-sdk-devel-2013.6-universal.pkg" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/osx/universal/gstreamer-sdk-devel-2013.6-universal.pkg.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/osx/universal/gstreamer-sdk-devel-2013.6-universal.pkg.sha1" class="external-link">sha1</a></li>
<li><a href="http://cdn.gstreamer.com/osx/universal/gstreamer-sdk-2013.6-universal-packages.dmg" class="external-link">GStreamer SDK 2013.6 (Congo) for Mac OS X (Deployment Packages)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/osx/universal/gstreamer-sdk-2013.6-universal-packages.dmg" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/osx/universal/gstreamer-sdk-2013.6-universal-packages.dmg.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/osx/universal/gstreamer-sdk-2013.6-universal-packages.dmg.sha1" class="external-link">sha1</a></li>
</ul></td>
</tr>
</tbody>
</table>

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>On Mac OS X 10.6 (Snow Leopard) you have to install Python 2.7 manually. It is included in later versions of OS X already. You can get it from <a href="http://www.python.org/getit" class="external-link">here</a>.</p></td>
</tr>
</tbody>
</table>

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>Due to the size of these files, usage of a <a href="http://en.wikipedia.org/wiki/Download_manager" class="external-link">Download Manager</a> is <strong>highly recommended</strong>. Take a look at <a href="http://en.wikipedia.org/wiki/Comparison_of_download_managers" class="external-link">this list</a> if you do not have one installed. If, after downloading, the installer reports itself as corrupt, chances are that the connection ended before the file was complete. A Download Manager will typically re-start the process and fetch the missing parts.</p></td>
</tr>
</tbody>
</table>
The downloads are [Apple Disk Images
(.dmg)](http://en.wikipedia.org/wiki/Apple_Disk_Image) containing an
[Installer Package
(.pkg)](http://en.wikipedia.org/wiki/Installer_%28Mac_OS_X%29).
Double-click the installer to start the installation process.

These are some paths of the GStreamer framework that you might find
useful:

- /Library/Frameworks/GStreamer.framework/: the framework's root path
- /Library/Frameworks/GStreamer.framework/Versions: path containing all
  installed versions of the framework
- /Library/Frameworks/GStreamer.framework/Versions/Current: link to the
  current version of the framework
- /Library/Frameworks/GStreamer.framework/Headers: path containing the
  development headers
- /Library/Frameworks/GStreamer.framework/Commands: link to the
  commands provided by the framework, such as gst-inspect-0.10 or
  gst-launch-0.10

For more information on the anatomy of OS X frameworks, see
[Framework Anatomy](https://developer.apple.com/library/mac/#documentation/MacOSX/Conceptual/BPFrameworks/Concepts/FrameworkAnatomy.html).

# Configure your development environment

### Building the tutorials

The tutorials' code, along with project files and a solution file for
them all, is included in the SDK. The source code and the Xcode project
files are located in
`/Library/Frameworks/GStreamer.framework/Current/share/gst-sdk/tutorials`.

To start building the tutorials, create a new folder in your Documents
directory and copy the
`/Library/Frameworks/GStreamer.framework/Current/share/gst-sdk/tutorials`
folder into it.

You can then fire up Xcode and load the project file.

Press the **Run** button to build and run the first tutorial. You can
select which tutorial to build by choosing one of the available schemes.

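The copy step described above can be sketched as follows. This is a minimal sketch: the destination folder name `gst-tutorials` is just an example, and the `SRC`/`DEST` variables are only there so you can point the script elsewhere if your SDK is installed in a non-default location.

```shell
# Copy the SDK tutorials to a work folder so Xcode can write output files there.
# SRC is the documented tutorials location; DEST is an example destination.
SRC="${SRC:-/Library/Frameworks/GStreamer.framework/Current/share/gst-sdk/tutorials}"
DEST="${DEST:-$HOME/Documents/gst-tutorials}"

mkdir -p "$DEST"
if [ -d "$SRC" ]; then
  cp -R "$SRC" "$DEST"/
  echo "Tutorials copied to $DEST/$(basename "$SRC")"
else
  echo "SDK tutorials not found at $SRC" >&2
fi
```

Working from the copy keeps the originals shipped with the SDK pristine.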
### Creating new projects

The GStreamer SDK provides a
[framework](https://developer.apple.com/library/mac/#documentation/MacOSX/Conceptual/BPFrameworks/Tasks/IncludingFrameworks.html)
that you can drag and drop into Xcode to start using it, or you can use
the linker option **-framework GStreamer**.

There is one small exception to the regular use of frameworks: you will
need to manually add the header search path
`/Library/Frameworks/GStreamer.framework/Headers`:

- Xcode: add the headers path to **Search Paths → Header Search Paths**
- GCC: use the compiler option
  `-I/Library/Frameworks/GStreamer.framework/Headers`

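Outside Xcode, the two settings above (the framework link option plus the extra header search path) translate into a compile command like the one printed below. This is only a sketch: `main.c` is a placeholder source file, so the command is echoed rather than executed.

```shell
# Build flags for a GStreamer 0.10 program on OS X: link against the
# framework and add its non-standard header search path.
GST_FW=/Library/Frameworks/GStreamer.framework
CFLAGS="-I$GST_FW/Headers"
LDFLAGS="-framework GStreamer"

# Print the full command instead of running it, since main.c is a placeholder.
echo "gcc main.c -o main $CFLAGS $LDFLAGS"
```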
## Attachments:

![](images/icons/bullet_blue.gif)
[Configurations.png](attachments/327710/2424835.png) (image/png)
![](images/icons/bullet_blue.gif)
[Schemes.png](attachments/327710/2424836.png) (image/png)
![](images/icons/bullet_blue.gif)
[Add files.png](attachments/327710/2424837.png) (image/png)
![](images/icons/bullet_blue.gif)
[Project Files.png](attachments/327710/2424838.png) (image/png)

Document generated by Confluence on Oct 08, 2015 10:27

# GStreamer SDK documentation : Installing on Windows

This page last changed on Jun 12, 2013 by xartigas.

# Prerequisites

To develop applications using the GStreamer SDK for Windows you will
need [Windows
XP](http://windows.microsoft.com/en-US/windows/products/windows-xp) or
later.

The GStreamer SDK includes C headers (`.h`) and library files (`.lib`)
valid for any version of [Microsoft Visual
Studio](http://www.microsoft.com/visualstudio). For convenience,
property pages (`.props`) are also included, which greatly simplify
creating new projects. These property pages, though, only work with
[Microsoft Visual
Studio 2010](http://www.microsoft.com/visualstudio/en-us/products/2010-editions)
(including the free [Visual C++ Express
edition](http://www.microsoft.com/visualstudio/en-us/products/2010-editions/visual-cpp-express)).

The recommended system is
[Windows 7](http://windows.microsoft.com/en-us/windows7/products/home)
with [Microsoft Visual
Studio 2010](http://www.microsoft.com/visualstudio/en-us/products/2010-editions)
(take a look at its [system
requirements](http://www.microsoft.com/visualstudio/en-us/products/2010-editions/visual-cpp-express)).

Installing the SDK for 32-bit platforms requires approximately 286MB of
free disk space for the runtime and 207MB for the development files.

Installing the SDK for 64-bit platforms requires approximately 340MB of
free disk space for the runtime and 216MB for the development files.

# Download and install the SDK

There are three sets of files in the SDK:

- The runtime files are needed to run GStreamer applications. You
  probably want to distribute these files with your application (or the
  installer below).
- The development files are **additional** files you need to create
  GStreamer applications.
- The [Merge
  Modules](http://msdn.microsoft.com/en-us/library/windows/desktop/aa369820%28v=vs.85%29.aspx)
  files are **additional** files you can use to deploy the GStreamer
  SDK alongside your application (see [Windows
  deployment](Windows%2Bdeployment.html)).

Get **the Runtime and Development files** installers appropriate for
your architecture from here:

<table>
<colgroup>
<col width="100%" />
</colgroup>
<tbody>
<tr class="odd">
<td>32 bits</td>
</tr>
<tr class="even">
<td><ul>
<li><a href="http://cdn.gstreamer.com/windows/x86/gstreamer-sdk-x86-2013.6.msi" class="external-link">GStreamer SDK 2013.6 (Congo) for Windows 32 bits (Runtime)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/windows/x86/gstreamer-sdk-x86-2013.6.msi" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/windows/x86/gstreamer-sdk-x86-2013.6.msi.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/windows/x86/gstreamer-sdk-x86-2013.6.msi.sha1" class="external-link">sha1</a></li>
<li><a href="http://cdn.gstreamer.com/windows/x86/gstreamer-sdk-devel-x86-2013.6.msi" class="external-link">GStreamer SDK 2013.6 (Congo) for Windows 32 bits (Development files)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/windows/x86/gstreamer-sdk-devel-x86-2013.6.msi" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/windows/x86/gstreamer-sdk-devel-x86-2013.6.msi.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/windows/x86/gstreamer-sdk-devel-x86-2013.6.msi.sha1" class="external-link">sha1</a></li>
<li><a href="http://cdn.gstreamer.com/windows/x86/gstreamer-sdk-x86-2013.6-merge-modules.zip" class="external-link">GStreamer SDK 2013.6 (Congo) for Windows 32 bits (Merge Modules)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/windows/x86/gstreamer-sdk-x86-2013.6-merge-modules.zip" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/windows/x86/gstreamer-sdk-x86-2013.6-merge-modules.zip.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/windows/x86/gstreamer-sdk-x86-2013.6-merge-modules.zip.sha1" class="external-link">sha1</a></li>
</ul></td>
</tr>
<tr class="odd">
<td><span style="color: rgb(0,51,102);">64 bits</span></td>
</tr>
<tr class="even">
<td><ul>
<li><a href="http://cdn.gstreamer.com/windows/x86-64/gstreamer-sdk-x86_64-2013.6.msi" class="external-link">GStreamer SDK 2013.6 (Congo) for Windows 64 bits (Runtime)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/windows/x86-64/gstreamer-sdk-x86_64-2013.6.msi" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/windows/x86-64/gstreamer-sdk-x86_64-2013.6.msi.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/windows/x86-64/gstreamer-sdk-x86_64-2013.6.msi.sha1" class="external-link">sha1</a></li>
<li><a href="http://cdn.gstreamer.com/windows/x86-64/gstreamer-sdk-devel-x86_64-2013.6.msi" class="external-link">GStreamer SDK 2013.6 (Congo) for Windows 64 bits (Development files)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/windows/x86-64/gstreamer-sdk-devel-x86_64-2013.6.msi" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/windows/x86-64/gstreamer-sdk-devel-x86_64-2013.6.msi.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/windows/x86-64/gstreamer-sdk-devel-x86_64-2013.6.msi.sha1" class="external-link">sha1</a></li>
<li><a href="http://cdn.gstreamer.com/windows/x86-64/gstreamer-sdk-x86_64-2013.6-merge-modules.zip" class="external-link">GStreamer SDK 2013.6 (Congo) for Windows 64 bits (Merge Modules)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/windows/x86-64/gstreamer-sdk-x86_64-2013.6-merge-modules.zip" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/windows/x86-64/gstreamer-sdk-x86_64-2013.6-merge-modules.zip.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/windows/x86-64/gstreamer-sdk-x86_64-2013.6-merge-modules.zip.sha1" class="external-link">sha1</a></li>
</ul></td>
</tr>
</tbody>
</table>

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>Due to the size of these files, usage of a <a href="http://en.wikipedia.org/wiki/Download_manager" class="external-link">Download Manager</a> is <strong>highly recommended</strong>. Take a look at <a href="http://en.wikipedia.org/wiki/Comparison_of_download_managers" class="external-link">this list</a> if you do not have one installed. If, after downloading, the installer reports itself as corrupt, chances are that the connection ended before the file was complete. A Download Manager will typically restart the process and fetch the missing parts.</p></td>
</tr>
</tbody>
</table>

Execute the installers and choose an installation folder. The suggested
default is usually OK.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>If you plan to use Visual Studio, <strong>close it before installing the GStreamer SDK</strong>. The installer will define new environment variables which will not be picked up by Visual Studio if it is open.</p></td>
</tr>
</tbody>
</table>

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>On <strong>Windows 8</strong>, it might be necessary to log out and log back in to your account after the installation for the newly defined environment variables to be picked up by Visual Studio.</p></td>
</tr>
</tbody>
</table>

It is the application's responsibility to ensure that, at runtime,
GStreamer can access its libraries and plugins. This can be done by
adding `%GSTREAMER_SDK_ROOT_X86%\bin` to the `%PATH%` environment
variable, or by running the application from that same folder.

At runtime, GStreamer will look for its plugins in the following
folders:

- `%HOMEDRIVE%%HOMEFOLDER%/.gstreamer-0.10/plugins`
- `C:\gstreamer-sdk\0.10\x86\lib\gstreamer-0.10`
- `<location of libgstreamer-0.10-0.dll>\..\lib\gstreamer-0.10`
- `%GST_PLUGIN_PATH%`

So, typically, if your application can find `libgstreamer-0.10-0.dll`,
it will find the GStreamer plugins, as long as the installation folder
structure is unmodified. If you do change this structure in your
application, you can use the `%GST_PLUGIN_PATH%` environment variable
to point GStreamer to its plugins. The plugins are initially found at
`%GSTREAMER_SDK_ROOT_X86%\lib\gstreamer-0.10`.

Additionally, if you want to prevent GStreamer from looking in all the
default folders listed above, you can set the
`%GST_PLUGIN_SYSTEM_PATH%` environment variable to point to where the
plugins are located.

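The two runtime requirements above (the DLLs reachable through the path, and the plugins findable) can be sketched like this. Unix-style shell syntax is used only for illustration; on Windows you would set these variables with `set` or the System control panel. `GSTREAMER_SDK_ROOT_X86` is defined by the installer, and the default below is just an example.

```shell
# Example default; the installer defines GSTREAMER_SDK_ROOT_X86 for you.
GSTREAMER_SDK_ROOT_X86="${GSTREAMER_SDK_ROOT_X86:-C:/gstreamer-sdk/0.10/x86}"

# Let the application locate libgstreamer-0.10-0.dll and friends...
PATH="$GSTREAMER_SDK_ROOT_X86/bin:$PATH"

# ...and point GStreamer at its plugins explicitly if you relocated them.
GST_PLUGIN_PATH="$GSTREAMER_SDK_ROOT_X86/lib/gstreamer-0.10"
export PATH GST_PLUGIN_PATH

echo "plugin path: $GST_PLUGIN_PATH"
```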
# Configure your development environment

### Building the tutorials

The tutorials' code, along with project files and a solution file for
Visual Studio 2010, is included in the SDK, in the
`%GSTREAMER_SDK_ROOT_X86%\share\gst-sdk\tutorials` folder.

`%GSTREAMER_SDK_ROOT_X86%` is an environment variable that the installer
defined for you, and points to the installation folder of the SDK.

In order to prevent accidental modification of the original code, and to
make sure Visual Studio has the necessary permissions to write the
output files, copy the entire `tutorials` folder to a place of your
liking, and work from there.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><strong>64-bit Users</strong><br />
<p>Use <code>%GSTREAMER_SDK_ROOT_X86_64%</code> if you have installed the SDK for 64-bit platforms. Both SDKs (32-bit and 64-bit) can be installed simultaneously, hence the separate environment variables.</p>
<p>Make sure you select the Solution Configuration that matches the GStreamer SDK that you have installed: <code>Win32</code> for 32 bits or <code>x64</code> for 64 bits. <img src="attachments/thumbnails/327707/1540165" class="confluence-embedded-image confluence-thumbnail image-right" width="100" /></p></td>
</tr>
</tbody>
</table>

You can fire up Visual Studio 2010 and load your copy of the
`tutorials.sln` solution file (click on the screenshots to enlarge
them). ![](attachments/thumbnails/327707/1540147)
![](attachments/thumbnails/327707/1540146)

Hit **F7**, press the Build Solution button
![](attachments/327707/1540148.png) or go to Build → Build Solution. All
projects should build without problems.

### Running the tutorials

In order to run the tutorials, set the current working directory to
`%GSTREAMER_SDK_ROOT_X86%\bin` in the Debugging section of the project
properties. **This property is not stored in the project files, so you
will need to manually add it to every tutorial you want to run from
within Visual Studio**. Right-click on a project in the Solution
Explorer, go to Properties → Debugging → Working Directory, and type
`$(GSTREAMER_SDK_ROOT_X86)\bin`.

(The `$(...)` notation is required to access environment variables
from within Visual Studio; you use the `%...%` notation from Windows
Explorer.)

You should now be able to run the tutorials.

### Creating new projects manually

**If you want to create 64-bit applications, remember also to create x64
Solution and Project configurations as
explained [here](http://msdn.microsoft.com/en-us/library/9yb4317s\(v=vs.100\).aspx).**

#### Include the necessary SDK Property Sheet

The included property sheets make creating new projects extremely easy.
In Visual Studio 2010, create a new project (normally a `Win32
Console` or `Win32 Application`). Then go to the Property Manager
(View → Property Manager), right-click on your project and select “Add
Existing Property Sheet...”. Navigate to
`%GSTREAMER_SDK_ROOT_X86%\share\vs\2010\libs` and
load `gstreamer-0.10.props`.

This property sheet contains the directories where the headers and
libraries are located, and the necessary options for the compiler and
linker, so you do not need to change anything else in your project.

If you cannot find the Property Manager, you might need to enable Expert
Settings: go to Tools → Settings → Expert Settings. Upon first
installation of Visual Studio, Expert Settings are disabled by
default. ![](attachments/thumbnails/327707/1540154)

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p><strong>Depending on the GStreamer libraries you need to use, you will have to add more property pages, besides <code class="western">gstreamer-0.10</code></strong> (each property page corresponds to one GStreamer library).</p>
<p>The tutorials' project files already contain all necessary property pages. When developing your own applications, the GStreamer documentation will tell you which library a function belongs to, and therefore which property pages you need to add.</p></td>
</tr>
</tbody>
</table>

#### Remove the dependency on the Visual Studio runtime

At this point, you have a working environment, which you can test by
running the tutorials. However, there is one last step remaining.

Applications built with Visual C++ 2010 depend on the Visual C++ 2010
Runtime, which is a DLL that gets installed when you install Visual
Studio. If you were to distribute your application, you would need to
distribute this DLL with it (this is known as the [Visual C++ 2010
Redistributable
Package](http://www.microsoft.com/download/en/details.aspx?id=5555)).
This happens with every version of Visual Studio, and the Runtime DLL is
different for every version of Visual Studio.

Furthermore, GStreamer itself is built using a “basic” C runtime which
comes with every Windows system since Windows XP, named
`MSVCRT.DLL`. If your application and GStreamer do not use the same C
runtime, problems are bound to crop up.

In order to avoid these issues, you must instruct your application to
use the system's C runtime. First, install the [Windows Device Driver
Kit
Version 7.1.0](http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=11800) (DDK).
When the installer asks about the features, select only “Build
Environments”. Accept the suggested location for the installation, which
is usually `C:\WinDDK\7600.16385.1`. This download is an ISO file: you
can either burn it to a DVD (as recommended on the Microsoft site; you
will need DVD burning software), mount the file in a virtual DVD device
(you will need DVD virtualization software), or unpack the file as if it
were a regular compressed file (you will need decompression software
that understands the ISO format).

Then, add the `x86.props` or `x86_64.props` (for 32 or 64 bits) property
sheet found in `%GSTREAMER_SDK_ROOT_X86%\share\vs\2010\msvc` to your
project. This will make your application use the ubiquitous
`MSVCRT.DLL`, saving you some trouble in the future.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>If you did not install the WinDDK to the standard path <code>C:\WinDDK\7600.16385.1</code>, you will need to tell Visual Studio where it is. Unfortunately, there is no automated way to do this. Once you have added <code>x86.props</code> or <code>x86_64.props</code> to your project, go to the Property Manager, expand your project and its subfolders until you find the property sheet called <code>config</code>. Double-click to edit it, and select the section called “User Macros” in the list on the left. You should see a macro called <code>WINDOWS_DRIVER_KIT</code>. Double-click to edit it, and set its value to the root folder where you installed the DDK. This is the folder containing a file called <code>samples.txt</code>.</p>
<p>That's it. Accept the changes, right-click on the <code>config</code> property sheet and select “Save”. The path to the DDK is now stored in <code>config.props</code> and you do not need to perform this operation again.</p></td>
</tr>
</tbody>
</table>

### Creating new projects using the wizard

Go to File → New → Project… and you should find a template
named **GStreamer SDK Project**. It takes no parameters, and sets all
necessary project settings, for both 32-bit and 64-bit architectures.

The generated project file includes the two required Property Sheets
described in the previous section, so, in order to link to the correct
`MSVCRT.DLL`, **you still need to install the Windows Device Driver
Kit** and change the appropriate property sheets.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>GStreamer SDK versions prior to Congo did not automatically deploy the required Visual Studio Wizard files. We recommend that you upgrade to the latest available GStreamer SDK version. To install the wizard files manually:</p>
<p>1. The three files in the <code>%GSTREAMER_SDK_ROOT_X86%\share\vs\2010\wizard</code> folder need to be copied to the following location, depending on your version of Visual Studio:</p>
<ul>
<li>Visual Studio 2010:
<ul>
<li><code>C:\Program Files\Microsoft Visual Studio 10.0\VC\vcprojects</code></li>
</ul></li>
<li>Visual Studio 2010 Express:
<ul>
<li><code>C:\Program Files\Microsoft Visual Studio 10.0\VC\Express\VCProjects</code></li>
</ul></li>
</ul>
<p>2. The entire <code>%GSTREAMER_SDK_ROOT_X86%\share\vs\2010\gst-sdk-template</code> folder (including the <code>gst-sdk-template</code> folder itself) needs to be copied to:</p>
<ul>
<li><code>C:\Program Files\Microsoft Visual Studio 10.0\VC\VCWizards</code></li>
</ul>
<p><strong>To clarify:</strong> you must end up with three additional files in the <code>VCProjects</code> folder and a new folder named <code>gst-sdk-template</code> inside the <code>VCWizards</code> folder.</p>
<p>Use the right path for your Visual Studio folder.</p></td>
</tr>
</tbody>
</table>

## Attachments:

![](images/icons/bullet_blue.gif)
[WindowsInstall1.png](attachments/327707/1540146.png) (image/png)
![](images/icons/bullet_blue.gif)
[WindowsInstall2.png](attachments/327707/1540147.png) (image/png)
![](images/icons/bullet_blue.gif)
[WindowsInstall-BuildSolution.png](attachments/327707/1540148.png) (image/png)
![](images/icons/bullet_blue.gif)
[WindowsInstall3.png](attachments/327707/1540149.png) (image/png)
![](images/icons/bullet_blue.gif)
[WindowsInstall4.png](attachments/327707/1540150.png) (image/png)
![](images/icons/bullet_blue.gif)
[WindowsInstall5.png](attachments/327707/1540151.png) (image/png)
![](images/icons/bullet_blue.gif)
[WindowsInstall6.png](attachments/327707/1540152.png) (image/png)
![](images/icons/bullet_blue.gif)
[WindowsInstall7.png](attachments/327707/1540153.png) (image/png)
![](images/icons/bullet_blue.gif)
[WindowsInstall8.png](attachments/327707/1540154.png) (image/png)
![](images/icons/bullet_blue.gif)
[WindowsInstall9.png](attachments/327707/1540155.png) (image/png)
![](images/icons/bullet_blue.gif)
[WindowsInstall10.png](attachments/327707/1540156.png) (image/png)
![](images/icons/bullet_blue.gif)
[WindowsInstall11.png](attachments/327707/1540157.png) (image/png)
![](images/icons/bullet_blue.gif)
[WindowsInstall-Configuration.png](attachments/327707/1540165.png) (image/png)

# GStreamer SDK documentation : Installing the SDK

This page last changed on Jun 12, 2013 by xartigas.

### Choose your platform

<table>
<colgroup>
<col width="33%" />
<col width="33%" />
<col width="33%" />
</colgroup>
<tbody>
<tr class="odd">
<td><p><a href="Installing%2Bon%2BLinux.html"><img src="attachments/327700/1540162.png" class="confluence-embedded-image image-center" /></a></p>
<h3 id="InstallingtheSDK-Linux" style="text-align: center;">Linux</h3>
<ul>
<li>Ubuntu 12.04 (Precise Pangolin)</li>
<li>Ubuntu 12.10 (Quantal Quetzal)</li>
<li>Debian 6.0 (Squeeze)</li>
<li>Debian 7.0 (Wheezy)</li>
<li>Fedora 16</li>
<li>Fedora 17</li>
</ul></td>
<td><p><a href="Installing%2Bon%2BMac%2BOS%2BX.html"><img src="attachments/327700/1540163.png" class="confluence-embedded-image image-center" /></a></p>
<h3 id="InstallingtheSDK-MacOSX" style="text-align: center;">Mac OS X</h3>
<ul>
<li>10.6 (Snow Leopard)</li>
<li>10.7 (Lion)</li>
<li>10.8 (Mountain Lion)</li>
</ul></td>
<td><p><a href="Installing%2Bon%2BWindows.html"><img src="attachments/327700/1540164.png" class="confluence-embedded-image image-center" /></a></p>
<h3 id="InstallingtheSDK-MicrosoftWindows" style="text-align: center;">Microsoft Windows</h3>
<ul>
<li>Windows XP</li>
<li>Windows Vista</li>
<li>Windows 7</li>
<li>Windows 8</li>
</ul></td>
</tr>
</tbody>
</table>

<table>
<colgroup>
<col width="50%" />
<col width="50%" />
</colgroup>
<tbody>
<tr class="odd">
<td><p><a href="Installing%2Bfor%2BAndroid%2Bdevelopment.html"><img src="attachments/327700/2654239.png" class="confluence-embedded-image image-center" /></a></p>
<h3 id="InstallingtheSDK-Android" style="text-align: center;">Android</h3>
<ul>
<li>2.3.1 Gingerbread and above</li>
</ul></td>
<td><p><a href="Installing%2Bfor%2BiOS%2Bdevelopment.html"><img src="attachments/327700/3539150.jpeg" class="confluence-embedded-image image-center" /></a></p>
<h3 id="InstallingtheSDK-iOS" style="text-align: center;">iOS</h3>
<ul>
<li>iOS 6 and above</li>
</ul></td>
</tr>
</tbody>
</table>

The installation instructions differ depending on your platform.
Please select the appropriate one by clicking on its logo.

## Attachments:

![](images/icons/bullet_blue.gif)
[lin200px.png](attachments/327700/1540162.png) (image/png)
![](images/icons/bullet_blue.gif)
[mac200px.png](attachments/327700/1540163.png) (image/png)
![](images/icons/bullet_blue.gif)
[win200px.png](attachments/327700/1540164.png) (image/png)
![](images/icons/bullet_blue.gif)
[andr200px.png](attachments/327700/2654239.png) (image/png)
![](images/icons/bullet_blue.gif)
[ios200px.jpeg](attachments/327700/3539150.jpeg) (image/jpeg)

# GStreamer SDK documentation : Legal information

This page last changed on Jun 11, 2012 by xartigas.

# Installer, default installation

The installer (Microsoft Windows and Mac OS X) and the default
installation (GNU/Linux) contain and install the minimal default
installation. At install time or later, optional components can also be
downloaded, but read on for certain legal cautions you might want to
take. All downloads are from the
[freedesktop.org](http://freedesktop.org) website.

# Licensing of the SDK

The GStreamer SDK minimal default installation contains only packages
which are licensed under the [GNU LGPL license
v2.1](http://www.gnu.org/licenses/old-licenses/lgpl-2.1.html). This
license gives you the freedom to use, modify, and make copies of the
software, either in its original or in a modified form, provided that
the software you redistribute is licensed under the same licensing
terms. This extends only to the software itself and modified versions of
it; you are free to link the LGPL software as a library into other
software under whichever license you choose. In other words, it is a
weak copyleft license.

Therefore, it is possible to use the SDK to build applications that are
then distributed under a different license, including a proprietary one,
provided that reverse engineering is not prohibited for debugging
modification purposes. Only the pieces of the SDK that are under the
LGPL need to be kept under the LGPL, and the corresponding source code
must be distributed along with the application (or an irrevocable offer
to do so for at least three years from distribution). Please consult
section 6 of the
[LGPL](http://www.gnu.org/licenses/old-licenses/lgpl-2.1.html) for
further details as to what the corresponding source code must contain.

Some portions of the minimal default installation may be under different
licenses, which are both more liberal than the LGPL (the conditions for
granting the license are less strict) and compatible with the LGPL. This
is advised locally.

# Optional packages

There are two types of optional packages (GPL and patented), which are
either under a different license or have potential patent issues, or
both.

### GPL code

Some of the optional packages are under the GNU GPL
[v2](http://www.gnu.org/licenses/old-licenses/gpl-2.0.html) or
[v3](http://www.gnu.org/licenses/gpl-3.0.html). This means that you
cannot link the GPL software into a program unless that program is also
under the GPL, and you are invited to seek competent advice on how this
works in your precise case and design choices. The GPL is called
“strong copyleft” because the condition to distribute under the same
license has the largest possible scope and extends to all derivative
works.

### Patents

Certain software, and in particular software that implements multimedia
standard formats such as MP3, MPEG-2 video and audio, H.264, MPEG-4
audio and video, AC-3, etc., can have patent issues. In certain
countries patents are granted on software, and even software-only
solutions are by and large considered patentable and are patented (such
as in the United States). In certain others, patents on pure software
solutions are formally prohibited, but are granted nonetheless (this is
the case in Europe), and in others again they are neither allowed nor
granted.

It is up to you to determine whether, in the countries where the SDK is
used, where products are made with it, and where those products are
distributed, a license from the applicable patent holders is required.
Receiving the SDK – or links to other downloadable software – does not
provide any license, expressed or implied, over these patents, except in
the very limited cases where the license so provides. No representation
is made.

In certain cases, the optional packages are distributed only as source
code. It is up to the receiver to make sure that, in the applicable
circumstances, compiling that code for a given platform or distributing
the object code is not an act that infringes one or more patents.

# Software is as-is

All software in the SDK, and the SDK as a whole, is provided as-is,
without any warranty whatsoever. The individual licenses contain
specific language disclaiming liability: we invite you to read all of
them. Should you need a warranty that the software works as intended, or
any kind of indemnification, you have the option of subscribing to a
software maintenance agreement with a company or entity that is in that
business. Fluendo and Collabora, as well as some other companies,
provide software maintenance agreements under certain conditions; you
are invited to contact them to receive further details and discuss the
commercial terms.

# Data protection

This website might use cookies and HTTP logs for statistical analysis,
on an aggregate basis only.

# Frequently Asked Questions

#### What licenses are there?

The SDK contains software under various licenses. See above.

#### How does this relate to the packaging system?

The packaging is only a more convenient way to install software and
decide what is right for you. GStreamer is meant to be modular, making
use of different modules, or plugins, that perform different activities.

We provide some of them by default. Others are provided as an additional
download, should you elect to install them. You could achieve the same
result by finding and downloading the same packages for your own
platform, so it is entirely up to you to decide what to do.

Also, note that SDK elements are divided into different packages,
roughly following the licensing conditions attached to them. For
instance, the codecs-gpl package contains GPL-licensed codecs. All the
packages installed by default, conversely, are licensed under the LGPL
or a more liberal license. This division is provided only for ease of
reference, but we cannot guarantee that our selection is 100% correct,
so it is up to the user to verify the actual licensing conditions before
distributing works that utilize the SDK.

#### Can I / must I distribute the SDK along with my application?

You surely can. All software is Free/Open Source software, and can be
distributed freely. You are not **required** to distribute it. Bear in
mind, however, that one of the conditions of using software under
certain licenses in a work containing such software is that you also
distribute the complete source code of the original code (or of the
modified code, if you have modified it). There are alternative ways to
comply with this obligation, some of which do not require any actual
distribution of source code, but since the SDK contains the entire
source code, you might want to include it (or the directories containing
the source code) with your application as a safe way to comply with this
requirement of the license.

#### What happens when I modify the GStreamer SDK's source code?
|
||||
|
||||
You are invited to do so, as the licenses (unless you are dealing with
|
||||
proprietary bits, but in that case you will not find the corresponding
|
||||
source code) so permit. Be reminded though that in that case you need to
|
||||
also provide the complete corresponding source code (and to preserve the
|
||||
same license, of course). You might also consider to push your
|
||||
modifications upstream, so that they are merged into the main branch of
|
||||
development if they are worth it and will be maintained by the GStreamer
|
||||
project and not by you individually. We invite you not to fork the code,
|
||||
if at all
|
||||
possible.
|
||||
|
||||
#### How does licensing relate to software patents? What about software patents in general?

This is a tricky question. We believe software patents should not exist,
so that by distributing and using software on a general purpose machine
you would not violate any of them. But the inconvenient truth is that
they do exist.

Software patents are widely available in the USA. Although they are
formally prohibited in the European Union, they are indeed granted by
the thousands by the European Patent Office, and some national patent
offices follow the same path. In other countries they are not available.

Since patent protection is a national, state-granted monopoly,
distributing software that violates patents in a given country could be
entirely safe if done in another country. Fair use exceptions also
exist. So we cannot advise you on whether the software we provide would
be considered to violate patents in your country or in any other
country, but that can be said of virtually all kinds of software. Still,
since we deal with audio-video standards, and these standards are by and
large designed to use certain patented technologies, it is common wisdom
that the pieces of software that implement these standards are sensitive
in this respect.

This is why GStreamer has taken a modular approach, so that you can use
a free plugin or a proprietary, patent-royalty-bearing plugin for a
given standard.

#### What about static vs. dynamic linking and copyleft?

We cannot provide one single answer to that question. Since copyright in
software works like copyright in literature, static linking basically
means that the programmer includes bits of code of the original library
in the resulting binary at compile time. This amounts to making
derivative code of the library, without conceivable exceptions, so you
need permission from the copyright holders of the library to do this.

A widespread line of thinking says that dynamic linking is, conversely,
not relevant to the copyleft effect, since the mingling of code in a
larger work is done at runtime. However, another equally authoritative
line of thought says that only certain types of dynamic linking are not
copyright-relevant. Under this view, using a library that is
specifically designed to be loaded into a particular kind of software,
even through an API, requires permission from the copyright holder of
the library when the two pieces are distributed together.

In all cases, since most of the software we include in the SDK is under
the LGPL, this permission is granted once and for all, subject to
compliance with the conditions it sets out. Therefore, the problem only
arises when you want to use GPL libraries to make non-GPL applications,
and in that case you need to audit your software to make sure that what
you do is not an infringement. This is why we have put the GPL libraries
in a separate set of optional components, so you have a clearer view of
what is safe to use and what might need closer investigation on a
case-by-case basis.

Please be reminded that, even for the LGPL, the recipient of the
software must be in a position to replace the current library with a
modified one, and to that effect some conditions apply, among which: for
static linking you must also provide the complete toolchain required to
relink the library (“any data and utility programs needed for
reproducing the executable from it”, except the “major components”), and
the license conditions of the resulting program must allow decompilation
to debug modifications to the library.

Document generated by Confluence on Oct 08, 2015 10:28
# GStreamer SDK documentation : Mac OS X deployment

This page last changed on Nov 28, 2012 by xartigas.

This page explains how to deploy GStreamer along with your application.
There are different mechanisms, which have been reviewed in [Deploying
your application](Deploying%2Byour%2Bapplication.html). The details for
some of these mechanisms are given here, and more options might be added
to this documentation in the future.

The recommended tool to create installer packages for Mac OS X
is [PackageMaker](http://developer.apple.com/library/mac/#documentation/DeveloperTools/Conceptual/PackageMakerUserGuide/Introduction/Introduction.html),
provided with the [Xcode developer
tools](https://developer.apple.com/technologies/tools/).

# Shared GStreamer

This is the easiest way to deploy GStreamer, although most of the time
it installs unnecessary files, which grows the size of the installer and
the free-space requirements on the target drive. Since the SDK might be
shared among all applications that use it, though, the extra space
requirement is somewhat mitigated.

![](attachments/2097292/2424841.png)
![](attachments/thumbnails/2097292/2424842)

With PackageMaker, simply add the GStreamer SDK **runtime** disk image
([the same one you used to install the runtime on your development
machine](Installing%2Bon%2BMac%2BOS%2BX.html)) inside your installer
package and create a post-install script that mounts the disk image and
installs the SDK package. You can use the following example, where you
should replace `$INSTALL_PATH` with the path where your installer copied
the SDK's disk image files (the `/tmp` directory is a good place to
install it as it will be removed at the end of the installation):

``` bash
# Mount the runtime disk image, install the package, then detach and clean up.
hdiutil attach $INSTALL_PATH/gstreamer-sdk-2012.7-x86.dmg
installer -pkg /Volumes/gstreamer-sdk-2012.7-x86/gstreamer-sdk-2012.7-x86.pkg -target "/"
hdiutil detach /Volumes/gstreamer-sdk-2012.7-x86/
rm $INSTALL_PATH/gstreamer-sdk-2012.7-x86.dmg
```

# Private deployment of GStreamer

You can decide to distribute a private copy of the SDK with your
application, although it's not the recommended method. In this case,
simply copy the framework to the application's Frameworks folder as
defined in the [bundle programming
guide](https://developer.apple.com/library/mac/documentation/CoreFoundation/Conceptual/CFBundles/BundleTypes/BundleTypes.html#//apple_ref/doc/uid/10000123i-CH101-SW19):

``` bash
cp -r /Library/Frameworks/GStreamer.framework ~/MyApp.app/Contents/Frameworks
```

Note that you can have several versions of the SDK, targeting different
architectures, installed on the system. Make sure you only copy the
version you need and that you update the
`GStreamer.framework/Versions/Current` link accordingly:

``` bash
$ ls -l Frameworks/GStreamer.framework/Versions/Current
lrwxr-xr-x 1 fluendo staff 21 Jun 5 18:46 Frameworks/GStreamer.framework/Versions/Current -> ../Versions/0.10/x86
```

Since the SDK will be relocated, you will need to follow the
instructions on how to relocate the SDK at the end of this page.

# Deploy only necessary files, by manually picking them

On the other side of the spectrum, if you want to reduce the space
requirements (and installer size) to the minimum, you can manually
choose which GStreamer libraries to deploy. Unfortunately, you are on
your own on this road, apart from the object-file display tool
[otool](https://developer.apple.com/library/mac/#documentation/darwin/reference/manpages/man1/otool.1.html).
Since this is a similar technique to deploying a private copy of the
SDK, keep in mind that you should relocate the SDK too, as explained at
the end of this page.

Also bear in mind that GStreamer is modular in nature. Plugins are
loaded depending on the media that is being played, so, if you do not
know in advance what files you are going to play, you do not know which
plugins and shared libraries you need to deploy.

# Deploy only necessary packages, using the provided ones

This will produce a smaller installer than deploying the complete
GStreamer SDK, without the added burden of having to manually pick each
library. You just need to know which packages your application requires.

Available packages:

<table>
<colgroup>
<col width="25%" />
<col width="25%" />
<col width="25%" />
<col width="25%" />
</colgroup>
<thead>
<tr class="header">
<th>Package name</th>
<th>Dependencies</th>
<th>Licenses</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td>base-system</td>
<td> </td>
<td>JPEG, FreeType, BSD-like, LGPL,<br />
LGPL-2+, LGPL-2.1, LibPNG and MIT</td>
<td>Base system dependencies</td>
</tr>
<tr class="even">
<td>gobject-python</td>
<td>base-system</td>
<td>LGPL</td>
<td>GLib/GObject python bindings</td>
</tr>
<tr class="odd">
<td>gstreamer-capture</td>
<td>base-system, gstreamer-core</td>
<td>LGPL and LGPL-2+</td>
<td>GStreamer plugins for capture</td>
</tr>
<tr class="even">
<td>gstreamer-clutter</td>
<td>base-system, gtk+-2.0, gstreamer-core</td>
<td>LGPL</td>
<td>GStreamer Clutter support</td>
</tr>
<tr class="odd">
<td>gstreamer-codecs</td>
<td>base-system, gstreamer-core</td>
<td>BSD, Jasper-2.0, BSD-like, LGPL,<br />
LGPL-2, LGPL-2+, LGPL-2.1 and LGPL-2.1+</td>
<td>GStreamer codecs</td>
</tr>
<tr class="even">
<td>gstreamer-codecs-gpl</td>
<td>base-system, gstreamer-core</td>
<td>BSD-like, LGPL, LGPL-2+ and LGPL-2.1+</td>
<td>GStreamer codecs under the GPL license and/or with patent issues</td>
</tr>
<tr class="odd">
<td>gstreamer-core</td>
<td>base-system</td>
<td>LGPL and LGPL-2+</td>
<td>GStreamer core</td>
</tr>
<tr class="even">
<td>gstreamer-dvd</td>
<td>base-system, gstreamer-core</td>
<td>GPL-2+, LGPL and LGPL-2+</td>
<td>GStreamer DVD support</td>
</tr>
<tr class="odd">
<td>gstreamer-effects</td>
<td>base-system, gstreamer-core</td>
<td>LGPL and LGPL-2+</td>
<td>GStreamer effects and instrumentation plugins</td>
</tr>
<tr class="even">
<td>gstreamer-networking</td>
<td>base-system, gstreamer-core</td>
<td>GPL-3, LGPL, LGPL-2+, LGPL-2.1+<br />
and LGPL-3+</td>
<td>GStreamer plugins for network protocols</td>
</tr>
<tr class="odd">
<td>gstreamer-playback</td>
<td>base-system, gstreamer-core</td>
<td>LGPL and LGPL-2+</td>
<td>GStreamer plugins for playback</td>
</tr>
<tr class="even">
<td>gstreamer-python</td>
<td>base-system, gobject-python,<br />
gstreamer-core</td>
<td>LGPL and LGPL-2.1+</td>
<td>GStreamer python bindings</td>
</tr>
<tr class="odd">
<td>gstreamer-system</td>
<td>base-system, gstreamer-core</td>
<td>LGPL, LGPL-2+ and LGPL-2.1+</td>
<td>GStreamer system plugins</td>
</tr>
<tr class="even">
<td>gstreamer-tutorials</td>
<td> </td>
<td>LGPL</td>
<td>Tutorials for GStreamer</td>
</tr>
<tr class="odd">
<td>gstreamer-visualizers</td>
<td>base-system, gstreamer-core</td>
<td>LGPL and LGPL-2+</td>
<td>GStreamer visualization plugins</td>
</tr>
<tr class="even">
<td>gtk+-2.0</td>
<td>base-system</td>
<td>LGPL</td>
<td>Gtk toolkit</td>
</tr>
<tr class="odd">
<td>gtk+-2.0-python</td>
<td>base-system, gtk+-2.0</td>
<td>LGPL and LGPL-2.1+</td>
<td>Gtk python bindings</td>
</tr>
<tr class="even">
<td>snappy</td>
<td><p>base-system, gstreamer-clutter,<br />
gtk+-2.0, gstreamer-playback,<br />
gstreamer-core, gstreamer-codecs</p></td>
<td>LGPL</td>
<td>Snappy media player</td>
</tr>
</tbody>
</table>

Get the disk image file with all the packages:

<table>
<colgroup>
<col width="100%" />
</colgroup>
<thead>
<tr class="header">
<th>Universal</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><p><a href="http://cdn.gstreamer.com/osx/universal/gstreamer-sdk-2013.6-universal-packages.dmg" class="external-link">GStreamer SDK 2013.6 (Congo) for Mac OS X (Deployment Packages)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/osx/universal/gstreamer-sdk-2013.6-universal-packages.dmg" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/osx/universal/gstreamer-sdk-2013.6-universal-packages.dmg.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/osx/universal/gstreamer-sdk-2013.6-universal-packages.dmg.sha1" class="external-link">sha1</a></p></td>
</tr>
</tbody>
</table>

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>Due to the size of these files, usage of a <a href="http://en.wikipedia.org/wiki/Download_manager" class="external-link">Download Manager</a> is <strong>highly recommended</strong>. Take a look at <a href="http://en.wikipedia.org/wiki/Comparison_of_download_managers" class="external-link">this list</a> if you do not have one installed. If, after downloading, the installer reports itself as corrupt, chances are that the connection ended before the file was complete. A Download Manager will typically re-start the process and fetch the missing parts.</p></td>
</tr>
</tbody>
</table>

# Relocation of the SDK in OS X

In some situations we might need to relocate the SDK, moving it to a
different place in the file system, for instance when shipping a private
copy of the SDK with our application.

### Location of dependent dynamic libraries

On Darwin operating systems, the dynamic linker doesn't locate dependent
dynamic libraries using their leaf names; instead, it uses full paths,
which makes them harder to relocate, as explained in the DYNAMIC LIBRARY
LOADING section of
[dyld](https://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/man1/dyld.1.html)'s
man page:

> Unlike many other operating systems, Darwin does not locate dependent
> dynamic libraries via their leaf file name. Instead the full path to
> each dylib is used (e.g. /usr/lib/libSystem.B.dylib). But there are
> times when a full path is not appropriate; for instance, may want your
> binaries to be installable in anywhere on the disk.

We can get the list of paths used by an object file to locate its
dependent dynamic libraries
using [otool](https://developer.apple.com/library/mac/#documentation/darwin/reference/manpages/man1/otool.1.html):

``` bash
$ otool -L /Library/Frameworks/GStreamer.framework/Commands/gst-launch-0.10
/Library/Frameworks/GStreamer.framework/Commands/gst-launch-0.10:
 /System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation (compatibility version 150.0.0, current version 550.43.0)
 /Library/Frameworks/GStreamer.framework/Versions/0.10/x86/lib/libgstreamer-0.10.0.dylib (compatibility version 31.0.0, current version 31.0.0)
 /Library/Frameworks/GStreamer.framework/Versions/0.10/x86/lib/libxml2.2.dylib (compatibility version 10.0.0, current version 10.8.0)
...
```

As you might have already noticed, if we move the SDK to a different
folder it will stop working, because the runtime linker won't be able to
find `libgstreamer-0.10` at its previous location,
`/Library/Frameworks/GStreamer.framework/Versions/0.10/x86/lib/libgstreamer-0.10.0.dylib`.

This full path is extracted from the dynamic library's ***install
name***, a path that is used by the linker to determine its location.
The install name of a library can be retrieved with
[otool](https://developer.apple.com/library/mac/#documentation/darwin/reference/manpages/man1/otool.1.html) too:

``` bash
$ otool -D /Library/Frameworks/GStreamer.framework/Libraries/libgstreamer-0.10.dylib
/Library/Frameworks/GStreamer.framework/Libraries/libgstreamer-0.10.dylib:
/Library/Frameworks/GStreamer.framework/Versions/0.10/x86/lib/libgstreamer-0.10.0.dylib
```

Any object file that links to the dynamic library `gstreamer-0.10` will
use the path
`/Library/Frameworks/GStreamer.framework/Versions/0.10/x86/lib/libgstreamer-0.10.0.dylib`
to locate it, as we saw previously with `gst-launch-0.10`.

Since working exclusively with full paths wouldn't let us install our
binaries anywhere on the disk, the linker provides a mechanism of string
substitution: three variables that can be used as a path prefix and that
the linker replaces at runtime with the appropriate path. These
variables are `@executable_path`, `@loader_path` and `@rpath`, described
in depth in the DYNAMIC LIBRARY LOADING section
of [dyld](https://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/man1/dyld.1.html)'s
man page.

For our purpose we will use the `@executable_path` variable, which is
replaced with a fixed path, the path to the directory containing the
main executable: `/Applications/MyApp.app/Contents/MacOS`.
The `@loader_path` variable can't be used in our scope, because it is
replaced with the path to the directory containing the Mach-O binary
that loaded the dynamic library, which can vary.

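To make the substitution concrete, here is a small sketch (in Python,
purely illustrative of dyld's behaviour, not its implementation) of how
an `@executable_path`-relative install name resolves against the
directory of the main executable:

```python
import os.path

def resolve_install_name(install_name, executable_dir):
    """Mimic dyld's @executable_path substitution (illustration only)."""
    prefix = "@executable_path"
    if install_name.startswith(prefix):
        install_name = executable_dir + install_name[len(prefix):]
    # normpath collapses the "MacOS/.." component of the joined path
    return os.path.normpath(install_name)

print(resolve_install_name(
    "@executable_path/../Frameworks/GStreamer.framework/Libraries/libgstreamer-0.10.dylib",
    "/Applications/MyApp.app/Contents/MacOS"))
# → /Applications/MyApp.app/Contents/Frameworks/GStreamer.framework/Libraries/libgstreamer-0.10.dylib
```
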
Therefore, in order to relocate the SDK we need to replace all paths
containing `/Library/Frameworks/GStreamer.framework/` with
`@executable_path/../Frameworks/GStreamer.framework/`, which can be done
using the
[install\_name\_tool](http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/man1/install_name_tool.1.html)
utility.

### Relocation of the binaries

As mentioned in the previous section, we can use `install_name_tool` in
combination with `otool` to list all the paths for dependent dynamic
libraries and modify them to use the new location. However, the SDK has
a huge list of binaries and doing this manually would be a painful task,
which is why a simple relocation script is provided in cerbero's
repository (`cerbero/tools/osxrelocator.py`). The script takes three
parameters:

1. `directory`: the directory to parse looking for binaries
2. `old_prefix`: the old prefix we want to change (e.g.
   `/Library/Frameworks/GStreamer.framework`)
3. `new_prefix`: the new prefix we want to use
   (e.g. `@executable_path/../Frameworks/GStreamer.framework/`)

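At its core, the relocation each binary undergoes is a prefix
substitution on every dependent-library path reported by `otool -L`. A
minimal sketch of that substitution (a hypothetical helper for
illustration, not the actual `osxrelocator.py` code, which drives
`install_name_tool`):

```python
def relocate_path(install_name, old_prefix, new_prefix):
    """Rewrite one dependent-library path if it starts with old_prefix."""
    if install_name.startswith(old_prefix):
        return new_prefix + install_name[len(old_prefix):]
    return install_name  # paths outside the SDK are left untouched

print(relocate_path(
    "/Library/Frameworks/GStreamer.framework/Versions/0.10/x86/lib/libgstreamer-0.10.0.dylib",
    "/Library/Frameworks/GStreamer.framework/",
    "@executable_path/../Frameworks/GStreamer.framework/"))
# → @executable_path/../Frameworks/GStreamer.framework/Versions/0.10/x86/lib/libgstreamer-0.10.0.dylib
```
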
When looking for binaries to fix, we will run the script in the
following directories:

``` bash
$ osxrelocator.py MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current/lib /Library/Frameworks/GStreamer.framework/ @executable_path/../Frameworks/GStreamer.framework/ -r
$ osxrelocator.py MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current/libexec /Library/Frameworks/GStreamer.framework/ @executable_path/../Frameworks/GStreamer.framework/ -r
$ osxrelocator.py MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current/bin /Library/Frameworks/GStreamer.framework/ @executable_path/../Frameworks/GStreamer.framework/ -r
$ osxrelocator.py MyApp.app/Contents/MacOS /Library/Frameworks/GStreamer.framework/ @executable_path/../Frameworks/GStreamer.framework/ -r
```

### Adjusting environment variables with the new paths

The application also needs to set the following environment variables to
help other libraries find resources in the new path:

- `GST_PLUGIN_SYSTEM_PATH=/Applications/MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current/lib/gstreamer-0.10`
- `GST_PLUGIN_SCANNER=/Applications/MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current/libexec/gstreamer-0.10/gst-plugin-scanner`
- `GTK_PATH=/Applications/MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current/`
- `GIO_EXTRA_MODULES=/Applications/MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current/lib/gio/modules`

You can use the following functions:

- C: [putenv("VAR=/foo/bar")](http://linux.die.net/man/3/putenv)
- Python: [os.environ\['VAR'\] =
  '/foo/var'](http://docs.python.org/library/os.html)

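For example, from Python this could look like the following sketch (the
bundle path is an assumption for illustration; adjust it to wherever
your application ships the framework):

```python
import os

# Hypothetical location of the relocated framework inside the app bundle.
framework = "/Applications/MyApp.app/Contents/Frameworks/GStreamer.framework/Versions/Current"

os.environ["GST_PLUGIN_SYSTEM_PATH"] = framework + "/lib/gstreamer-0.10"
os.environ["GST_PLUGIN_SCANNER"] = framework + "/libexec/gstreamer-0.10/gst-plugin-scanner"
os.environ["GTK_PATH"] = framework + "/"
os.environ["GIO_EXTRA_MODULES"] = framework + "/lib/gio/modules"
```

These variables must be set before GStreamer is initialized.
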
## Attachments:

![](images/icons/bullet_blue.gif)
[PackageMaker1.png](attachments/2097292/2424841.png) (image/png)
![](images/icons/bullet_blue.gif)
[PackageMaker2.png](attachments/2097292/2424842.png) (image/png)

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Multiplatform deployment using Cerbero

This page last changed on Nov 21, 2012 by slomo.

Cerbero is the build and packaging system used to construct the
GStreamer SDK. It uses “recipe” files that indicate how to build
particular projects, and which other projects they depend on.
Moreover, the built projects can be combined into packages for
distribution. These packages are, depending on the target platform,
Windows or OS X installers or Linux packages.

To use Cerbero to build and package your application, you just need to
add a recipe explaining how to build your application and make it depend
on the `gstreamer-sdk` project. Then Cerbero can take care of building
your application and its dependencies and packaging them all together.

Read [Building from source using
Cerbero](Building%2Bfrom%2Bsource%2Busing%2BCerbero.html) to learn how
to install and use Cerbero.

At this point, after reading the Build from source section in [Building
from source using
Cerbero](Building%2Bfrom%2Bsource%2Busing%2BCerbero.html), you should be
able to build the GStreamer SDK from source and are ready to create
recipe and package files for your application.

In the Cerbero installation directory you will find the
`cerbero-uninstalled` script. Execute it without parameters to see the
list of commands it accepts:

``` bash
./cerbero-uninstalled
```

# Adding a recipe for your application

The first step is to create an empty recipe that you can then tailor to
your needs:

``` bash
./cerbero-uninstalled add-recipe my-app 1.0
```

This will create an initial recipe file in `recipes/my-app.recipe`,
which contains the smallest necessary recipe. This file is a Python
script; set the following attributes to describe your application:

| Attribute Name | Description | Required | Example |
|---|---|---|---|
| `name` | The recipe name. | Yes | `name = 'my-app'` |
| `version` | The software version. | Yes | `version = '1.0'` |
| `licenses` | A list of licenses of the software (see `cerbero/enums.py:License` for allowed licenses). | Yes | `licenses = [License.LGPLv2Plus]` |
| `deps` | A list of build dependencies of the software, as recipe names. | No | `deps = ['other', 'recipe', 'names']` |
| `platform_deps` | Platform-specific build dependencies (see `cerbero/enums.py:Platform` for allowed platforms). | No | `platform_deps = {Platform.LINUX: ['some-recipe'], Platform.WINDOWS: ['another-recipe']}` |
| `remotes` | A dictionary specifying the git remote URLs that sources are pulled from. | No | `remotes = {'origin': 'git://somewhere'}` |
| `commit` | The git commit, tag or branch to use, defaulting to `sdk-<version>`. | No | `commit = 'my-app-branch'` |
| `config_sh` | Used to select the configuration script. | No | `config_sh = 'autoreconf -fiv && sh ./configure'` |
| `configure_options` | Additional options that should be passed to the `configure` script. | No | `configure_options = '--enable-something'` |
| `use_system_libs` | Whether to use system-provided libraries. | No | `use_system_libs = True` |
| `btype` | The build type (see `cerbero/build/build.py:BuildType` for allowed build types). | No | `btype = BuildType.CUSTOM` |
| `stype` | The source type (see `cerbero/build/source.py:SourceType` for allowed source types). | No | `stype = SourceType.CUSTOM` |
| `files_category` | A list of files that should be shipped with packages including this recipe *category*. Cerbero comes with predefined categories that should be used when the installed files match the category criteria: `libs` (libraries), `bins` (binaries), `devel` (development files: headers, pkgconfig files, etc.), `python` (Python files) and `lang` (language files). For the `bins` and `libs` categories there is no need to specify file extensions, as Cerbero adds them for you. | Yes\* | `files_bins = ['some-binary']`, `files_libs = ['libsomelib']`, `files_devel = ['include/something']`, `files_python = ['site-packages/some/pythonfile%(pext)s']`, `files_lang = ['foo']` |
| `platform_files_category` | Same as `files_category`, but for platform-specific files. | No | `platform_files_some_category = {Platform.LINUX: ['/some/file']}` |

\* At least one “files” category should be set.


Apart from the attributes listed above, it is also possible to override
some `Recipe` methods. For example, the `prepare` method can be
overridden to do anything before the software is built, and the
`install` and `post_install` methods can be overridden to change what is
done during or after installation. Take a look at the existing recipes
in `cerbero/recipes` for examples.
|
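To illustrate the override pattern only (the real base class is Cerbero's `recipe.Recipe`, which drives these hooks during a build; the `RecipeStub` class below is invented for this self-contained sketch and would not appear in an actual recipe file):

```python
class RecipeStub:
    """Stand-in for cerbero's recipe.Recipe, invented for this sketch."""
    configure_options = ''

    def run(self):
        # Simplified build driver: Cerbero calls hooks like these
        # around the actual build and installation steps.
        self.prepare()
        self.install()
        self.post_install()

    def prepare(self):
        pass

    def install(self):
        pass

    def post_install(self):
        pass


class Recipe(RecipeStub):
    name = 'my-app'
    version = '1.0'

    def prepare(self):
        # Runs before the build: a good place to tweak configure options.
        self.configure_options += ' --enable-something'

    def post_install(self):
        # Runs after installation: e.g. fix up installed files.
        self.post_install_ran = True


r = Recipe()
r.run()
print(r.configure_options.strip())  # → --enable-something
```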

Alternatively, you can pass some options to `cerbero-uninstalled` so
some of these attributes are already set for you. For example:

``` bash
./cerbero-uninstalled add-recipe --licenses "LGPL" --deps "glib,gtk+" --origin "git://git.my-app.com" --commit "git-commit-to-use" my-app 1.0
```

See `./cerbero-uninstalled add-recipe -h` for help.

As an example, this is the recipe used to build the Snappy media player:

``` python
class Recipe(recipe.Recipe):
    name = 'snappy'
    version = '0.2+git'
    licenses = [License.GPLv2Plus]
    config_sh = 'autoreconf -fiv && sh ./configure'
    deps = ['glib', 'gstreamer', 'gst-plugins-base', 'clutter', 'clutter-gst']
    platform_deps = { Platform.LINUX: ['libXtst'] }
    use_system_libs = True
    remotes = {'upstream': 'git://git.gnome.org/snappy'}

    files_bins = ['snappy']
    files_data = ['share/snappy']

    def prepare(self):
        if self.config.target_platform == Platform.LINUX:
            self.configure_options += ' --enable-dbus'
```

Cerbero fetches the sources to build from a Git repository, which is
specified via the `git_root` configuration variable in the Cerbero
configuration file (see the "Build from software" section in [Installing
on Linux](Installing%2Bon%2BLinux.html)); it can be overridden by the
`remotes` attribute inside the recipe (by setting the `origin` remote).
Since this recipe specifies no `commit` attribute, Cerbero will use the
commit named “sdk-0.2+git” from the Git repository when building Snappy.

Once the recipe is ready, instruct Cerbero to build it:

``` bash
./cerbero-uninstalled build my-app
```


# Adding a package for your software

To distribute your software with the SDK it is necessary to put it into
a package or installer, depending on the target platform. This is done
by selecting the files that should be included. To add a package you
have to create a package file in `cerbero/packages`. The package files
are Python scripts too, and there are already many examples of package
files in `cerbero/packages`.

Now, to create an empty package, do:

``` bash
./cerbero-uninstalled add-package my-app 1.0
```

This will create an initial package file in `packages/my-app.package`.

The following `Package` attributes are used to describe your package:

| Attribute Name | Description | Required | Example |
|---|---|---|---|
| `name` | The package name. | Yes | `name = 'my-app'` |
| `shortdesc` | A short description of the package. | No | `shortdesc = 'some-short-desc'` |
| `longdesc` | A long description of the package. | No | `longdesc = 'Some Longer Description'` |
| `codename` | The release codename. | No | `codename = 'MyAppReleaseName'` |
| `vendor` | Vendor for this package. | No | `vendor = 'MyCompany'` |
| `url` | The package URL. | No | `url = 'http://www.my-app.com'` |
| `version` | The package version. | Yes | `version = '1.0'` |
| `license` | The package license (see `cerbero/enums.py:License` for allowed licenses). | Yes | `license = License.LGPLv2Plus` |
| `uuid` | The package's unique id. | Yes | `uuid = '6cd161c2-4535-411f-8287-e8f6a892f853'` |
| `deps` | A list of package dependencies, as package names. | No | `deps = ['other', 'package', 'names']` |
| `sys_deps` | The system dependencies for this package. | No | `sys_deps = {Distro.DEBIAN: ['python']}` |
| `files` | A list of files included in the **runtime** package, in the form *“recipe_name:category1:category2:...”*. If the recipe category is omitted, all categories are included. | Yes\* | `files = ['my-app']`, `files = ['my-app:category1']` |
| `files_devel` | A list of files included in the **devel** package, in the form *“recipe_name:category1:category2:...”*. | Yes\* | `files_devel = ['my-app:category_devel']` |
| `platform_files` | Same as `files`, but allows specifying different files for different platforms. | Yes\* | `platform_files = {Platform.WINDOWS: ['my-app:windows_only_category']}` |
| `platform_files_devel` | Same as `files_devel`, but allows specifying different files for different platforms. | Yes\* | `platform_files_devel = {Platform.WINDOWS: ['my-app:windows_only_category_devel']}` |

\* At least one of the “files” attributes should be set.

Alternatively, you can also pass some options to `cerbero-uninstalled`,
for example:

``` bash
./cerbero-uninstalled add-package my-app 1.0 --license "LGPL" --codename MyApp --vendor MyAppVendor --url "http://www.my-app.com" --files=my-app:bins:libs --files-devel=my-app:devel --platform-files=linux:my-app:linux_specific --platform-files-devel=linux:my-app:linux_specific_devel,windows:my-app:windows_specific_devel --deps base-system --includes gstreamer-core
```

See `./cerbero-uninstalled add-package -h` for help.

As an example, this is the package file that is used for packaging the
`gstreamer-codecs` package:

``` python
class Package(package.Package):
    name = 'gstreamer-codecs'
    shortdesc = 'GStreamer codecs'
    version = '2012.5'
    codename = 'Amazon'
    url = "http://www.gstreamer.com"
    license = License.LGPL
    vendor = 'GStreamer Project'
    uuid = '6cd161c2-4535-411f-8287-e8f6a892f853'
    deps = ['gstreamer-core']

    files = ['flac:libs',
             'jasper:libs', 'libkate:libs',
             'libogg:libs', 'schroedinger:libs', 'speex:libs',
             'libtheora:libs', 'libvorbis:libs', 'wavpack:libs', 'libvpx:libs',
             'taglib:libs',
             'gst-plugins-base:codecs', 'gst-plugins-good:codecs',
             'gst-plugins-bad:codecs', 'gst-plugins-ugly:codecs']
    files_devel = ['gst-plugins-base-static:codecs_devel',
                   'gst-plugins-good-static:codecs_devel',
                   'gst-plugins-bad-static:codecs_devel',
                   'gst-plugins-ugly-static:codecs_devel']
    platform_files = {
        Platform.LINUX: ['libdv:libs'],
        Platform.DARWIN: ['libdv:libs']
    }
```

At this point you have two main options: you can either ship a single
package that contains everything your software needs, or depend on a
shared version of the SDK.

### Having a private version of the SDK

To include a private version of the SDK in a single package, do not set
the `deps` variable in the package file; instead, list all the files you
need in the `files` variables. If you go this route, you must make sure
that you use a different prefix than the GStreamer SDK in the Cerbero
configuration file, otherwise your package will have file conflicts with
the GStreamer SDK.

### Having a shared version of the SDK

If you decide to use a shared version of the SDK, you can create a
package file like the other package files in the GStreamer SDK. Just
list all packages you need in the `deps` variable and put the files your
software needs inside the `files` variables. When building a package
this way, you must make sure that you use the same `prefix` and
`packages_prefix` as the ones in your Cerbero configuration file.
|
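For instance, a shared-SDK package file might be sketched like this (a minimal sketch with hypothetical names and a placeholder uuid; the `deps` packages and file categories must match your actual recipes):

``` python
class Package(package.Package):
    name = 'my-app'
    version = '1.0'
    shortdesc = 'My application'
    license = License.LGPLv2Plus
    uuid = '00000000-0000-0000-0000-000000000000'  # replace with a real uuid
    deps = ['gstreamer-core']      # shared SDK packages this package depends on
    files = ['my-app:bins:libs']   # only your own files ship in this package
```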

Finally, build your package by using:

``` bash
./cerbero-uninstalled package your-package
```

Where `your-package` is the name of the `.package` file that you created
in the `packages` directory. This command will build your software and
all its dependencies, and then make individual packages for them (both
the dependencies and your software). The resulting files will be in the
current working directory.

Document generated by Confluence on Oct 08, 2015 10:27

616
Playback+tutorial+1%3A+Playbin2+usage.markdown
Normal file

@ -0,0 +1,616 @@

# GStreamer SDK documentation : Playback tutorial 1: Playbin2 usage

This page last changed on Jun 26, 2012 by xartigas.

# Goal

We have already worked with the `playbin2` element, which is capable of
building a complete playback pipeline without much work on our side.
This tutorial shows how to further customize `playbin2` in case its
default values do not suit our particular needs.

We will learn:

  - How to find out how many streams a file contains, and how to switch
    among them.

  - How to gather information regarding each stream.

As a side note, even though its name is `playbin2`, you can pronounce it
“playbin”, since the original `playbin` element is deprecated and nobody
should be using it.

# Introduction

More often than not, multiple audio, video and subtitle streams can be
found embedded in a single file. The most common case is regular
movies, which contain one video and one audio stream (stereo or 5.1
audio tracks are considered a single stream). It is also increasingly
common to find movies with one video and multiple audio streams, to
account for different languages. In this case, the user selects one
audio stream, and the application will only play that one, ignoring the
others.

To be able to select the appropriate stream, the user needs to know
certain information about them, for example, their language. This
information is embedded in the streams in the form of “metadata”
(annexed data), and this tutorial shows how to retrieve it.

Subtitles can also be embedded in a file, along with audio and video,
but they are dealt with in more detail in [Playback tutorial 2: Subtitle
management](Playback%2Btutorial%2B2%253A%2BSubtitle%2Bmanagement.html).
Finally, multiple video streams can also be found in a single file, for
example, in DVDs with multiple angles of the same scene, but they are
somewhat rare.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>Embedding multiple streams inside a single file is called “multiplexing” or “muxing”, and such a file is then known as a “container”. Common container formats are Matroska (.mkv), Quicktime (.qt, .mov, .mp4), Ogg (.ogg) or WebM (.webm).</p>
<p>Retrieving the individual streams from within the container is called “demultiplexing” or “demuxing”.</p></td>
</tr>
</tbody>
</table>

The following code retrieves the number of streams in the file and their
associated metadata, and allows switching the audio stream while the
media is playing.

# The multilingual player

Copy this code into a text file named `playback-tutorial-1.c` (or find
it in the SDK installation).

**playback-tutorial-1.c**

``` c
#include <gst/gst.h>

/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
  GstElement *playbin2;  /* Our one and only element */

  gint n_video;          /* Number of embedded video streams */
  gint n_audio;          /* Number of embedded audio streams */
  gint n_text;           /* Number of embedded subtitle streams */

  gint current_video;    /* Currently playing video stream */
  gint current_audio;    /* Currently playing audio stream */
  gint current_text;     /* Currently playing subtitle stream */

  GMainLoop *main_loop;  /* GLib's Main Loop */
} CustomData;

/* playbin2 flags */
typedef enum {
  GST_PLAY_FLAG_VIDEO = (1 << 0), /* We want video output */
  GST_PLAY_FLAG_AUDIO = (1 << 1), /* We want audio output */
  GST_PLAY_FLAG_TEXT  = (1 << 2)  /* We want subtitle output */
} GstPlayFlags;

/* Forward declarations for the message and keyboard processing functions */
static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data);
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data);

int main(int argc, char *argv[]) {
  CustomData data;
  GstBus *bus;
  GstStateChangeReturn ret;
  gint flags;
  GIOChannel *io_stdin;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  data.playbin2 = gst_element_factory_make ("playbin2", "playbin2");

  if (!data.playbin2) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.playbin2, "uri", "http://docs.gstreamer.com/media/sintel_cropped_multilingual.webm", NULL);

  /* Set flags to show Audio and Video but ignore Subtitles */
  g_object_get (data.playbin2, "flags", &flags, NULL);
  flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO;
  flags &= ~GST_PLAY_FLAG_TEXT;
  g_object_set (data.playbin2, "flags", flags, NULL);

  /* Set connection speed. This will affect some internal decisions of playbin2 */
  g_object_set (data.playbin2, "connection-speed", 56, NULL);

  /* Add a bus watch, so we get notified when a message arrives */
  bus = gst_element_get_bus (data.playbin2);
  gst_bus_add_watch (bus, (GstBusFunc)handle_message, &data);

  /* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
  io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
#else
  io_stdin = g_io_channel_unix_new (fileno (stdin));
#endif
  g_io_add_watch (io_stdin, G_IO_IN, (GIOFunc)handle_keyboard, &data);

  /* Start playing */
  ret = gst_element_set_state (data.playbin2, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.playbin2);
    return -1;
  }

  /* Create a GLib Main Loop and set it to run */
  data.main_loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (data.main_loop);

  /* Free resources */
  g_main_loop_unref (data.main_loop);
  g_io_channel_unref (io_stdin);
  gst_object_unref (bus);
  gst_element_set_state (data.playbin2, GST_STATE_NULL);
  gst_object_unref (data.playbin2);
  return 0;
}

/* Extract some metadata from the streams and print it on the screen */
static void analyze_streams (CustomData *data) {
  gint i;
  GstTagList *tags;
  gchar *str;
  guint rate;

  /* Read some properties */
  g_object_get (data->playbin2, "n-video", &data->n_video, NULL);
  g_object_get (data->playbin2, "n-audio", &data->n_audio, NULL);
  g_object_get (data->playbin2, "n-text", &data->n_text, NULL);

  g_print ("%d video stream(s), %d audio stream(s), %d text stream(s)\n",
    data->n_video, data->n_audio, data->n_text);

  g_print ("\n");
  for (i = 0; i < data->n_video; i++) {
    tags = NULL;
    /* Retrieve the stream's video tags */
    g_signal_emit_by_name (data->playbin2, "get-video-tags", i, &tags);
    if (tags) {
      g_print ("video stream %d:\n", i);
      gst_tag_list_get_string (tags, GST_TAG_VIDEO_CODEC, &str);
      g_print ("  codec: %s\n", str ? str : "unknown");
      g_free (str);
      gst_tag_list_free (tags);
    }
  }

  g_print ("\n");
  for (i = 0; i < data->n_audio; i++) {
    tags = NULL;
    /* Retrieve the stream's audio tags */
    g_signal_emit_by_name (data->playbin2, "get-audio-tags", i, &tags);
    if (tags) {
      g_print ("audio stream %d:\n", i);
      if (gst_tag_list_get_string (tags, GST_TAG_AUDIO_CODEC, &str)) {
        g_print ("  codec: %s\n", str);
        g_free (str);
      }
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
        g_print ("  language: %s\n", str);
        g_free (str);
      }
      if (gst_tag_list_get_uint (tags, GST_TAG_BITRATE, &rate)) {
        g_print ("  bitrate: %d\n", rate);
      }
      gst_tag_list_free (tags);
    }
  }

  g_print ("\n");
  for (i = 0; i < data->n_text; i++) {
    tags = NULL;
    /* Retrieve the stream's subtitle tags */
    g_signal_emit_by_name (data->playbin2, "get-text-tags", i, &tags);
    if (tags) {
      g_print ("subtitle stream %d:\n", i);
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
        g_print ("  language: %s\n", str);
        g_free (str);
      }
      gst_tag_list_free (tags);
    }
  }

  g_object_get (data->playbin2, "current-video", &data->current_video, NULL);
  g_object_get (data->playbin2, "current-audio", &data->current_audio, NULL);
  g_object_get (data->playbin2, "current-text", &data->current_text, NULL);

  g_print ("\n");
  g_print ("Currently playing video stream %d, audio stream %d and text stream %d\n",
    data->current_video, data->current_audio, data->current_text);
  g_print ("Type any number and hit ENTER to select a different audio stream\n");
}

/* Process messages from GStreamer */
static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data) {
  GError *err;
  gchar *debug_info;

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR:
      gst_message_parse_error (msg, &err, &debug_info);
      g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
      g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
      g_clear_error (&err);
      g_free (debug_info);
      g_main_loop_quit (data->main_loop);
      break;
    case GST_MESSAGE_EOS:
      g_print ("End-Of-Stream reached.\n");
      g_main_loop_quit (data->main_loop);
      break;
    case GST_MESSAGE_STATE_CHANGED: {
      GstState old_state, new_state, pending_state;
      gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
      if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin2)) {
        if (new_state == GST_STATE_PLAYING) {
          /* Once we are in the playing state, analyze the streams */
          analyze_streams (data);
        }
      }
    } break;
    default:
      break;
  }

  /* We want to keep receiving messages */
  return TRUE;
}

/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
  gchar *str = NULL;

  if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) == G_IO_STATUS_NORMAL) {
    int index = atoi (str);
    if (index < 0 || index >= data->n_audio) {
      g_printerr ("Index out of bounds\n");
    } else {
      /* If the input was a valid audio stream index, set the current audio stream */
      g_print ("Setting current audio stream to %d\n", index);
      g_object_set (data->playbin2, "current-audio", index, NULL);
    }
  }
  g_free (str);
  return TRUE;
}
```

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><div id="expander-1972852059" class="expand-container">
<div id="expander-control-1972852059" class="expand-control">
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-1972852059" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc playback-tutorial-1.c -o playback-tutorial-1 `pkg-config --cflags --libs gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a></p>
<p><span>This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. The number of audio streams is shown in the terminal, and the user can switch from one to another by entering a number and pressing enter. A small delay is to be expected.</span></p>
<p><span><span>Bear in mind that there is no latency management (buffering), so on slow connections, the movie might stop after a few seconds. See how </span><a href="http://docs.gstreamer.com/display/GstSDK/Tutorial+12%3A+Live+streaming">Tutorial 12: Live streaming</a><span> solves this issue.</span></span></p>
<p>Required libraries: <code>gstreamer-0.10</code></p>
</div>
</div></td>
</tr>
</tbody>
</table>

# Walkthrough

``` c
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
  GstElement *playbin2;  /* Our one and only element */

  gint n_video;          /* Number of embedded video streams */
  gint n_audio;          /* Number of embedded audio streams */
  gint n_text;           /* Number of embedded subtitle streams */

  gint current_video;    /* Currently playing video stream */
  gint current_audio;    /* Currently playing audio stream */
  gint current_text;     /* Currently playing subtitle stream */

  GMainLoop *main_loop;  /* GLib's Main Loop */
} CustomData;
```

We start, as usual, by putting all our variables in a structure, so we
can pass it around to functions. For this tutorial, we need the number
of streams of each type, and the currently playing one. Also, we are
going to use a different mechanism to wait for messages, one that allows
interactivity, so we need a GLib main loop object.

``` c
/* playbin2 flags */
typedef enum {
  GST_PLAY_FLAG_VIDEO = (1 << 0), /* We want video output */
  GST_PLAY_FLAG_AUDIO = (1 << 1), /* We want audio output */
  GST_PLAY_FLAG_TEXT  = (1 << 2)  /* We want subtitle output */
} GstPlayFlags;
```

Later we are going to set some of `playbin2`'s flags. We would like to
have a handy enum that allows manipulating these flags easily, but since
`playbin2` is a plug-in and not a part of the GStreamer core, this enum
is not available to us. The “trick” is simply to declare the enum in
our code, as it appears in the `playbin2` documentation: `GstPlayFlags`.
GObject allows introspection, so the possible values for these flags
could also be retrieved at runtime without using this trick, but in a
far more cumbersome way.

``` c
/* Forward declarations for the message and keyboard processing functions */
static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data);
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data);
```


These are forward declarations for the two callbacks we will be using:
`handle_message` for the GStreamer messages, as we have already seen,
and `handle_keyboard` for key strokes, since this tutorial introduces a
limited amount of interactivity.

We skip over the creation of the pipeline, the instantiation of
`playbin2` and pointing it to our test media through the `uri`
property. `playbin2` is in itself a pipeline, and in this case it is
the only element in the pipeline, so we skip the creation of the
pipeline completely and use the `playbin2` element directly.

We focus on some of the other properties of `playbin2`, though:

``` c
/* Set flags to show Audio and Video but ignore Subtitles */
g_object_get (data.playbin2, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO;
flags &= ~GST_PLAY_FLAG_TEXT;
g_object_set (data.playbin2, "flags", flags, NULL);
```


`playbin2`'s behavior can be changed through its `flags` property, which
can hold any combination of `GstPlayFlags`. The most interesting values
are:

  - `GST_PLAY_FLAG_VIDEO`: enable video rendering.
  - `GST_PLAY_FLAG_AUDIO`: enable audio rendering.
  - `GST_PLAY_FLAG_TEXT`: enable subtitle rendering.
  - `GST_PLAY_FLAG_VIS`: enable rendering of visualizations when there
    is no video stream.
  - `GST_PLAY_FLAG_DOWNLOAD`: enable progressive download of the media.
  - `GST_PLAY_FLAG_BUFFERING`: enable buffering of the demuxed or parsed
    data.
  - `GST_PLAY_FLAG_DEINTERLACE`: deinterlace the video if necessary.


In our case, for demonstration purposes, we are enabling audio and video
and disabling subtitles, leaving the rest of the flags at their default
values (this is why we read the current value of the flags with
`g_object_get()` before overwriting it with `g_object_set()`).
|

``` c
/* Set connection speed. This will affect some internal decisions of playbin2 */
g_object_set (data.playbin2, "connection-speed", 56, NULL);
```

This property is not really useful in this example.
`connection-speed` informs `playbin2` of the maximum speed of our
network connection, so, in case multiple versions of the requested media
are available on the server, `playbin2` chooses the most appropriate
one. This is mostly used in combination with streaming protocols like
`mms` or `rtsp`.

We have set all these properties one by one, but we could have set all
of them with a single call to `g_object_set()`:

``` theme: Default; brush: cpp; gutter: false
g_object_set (data.playbin2, "uri", "http://docs.gstreamer.com/media/sintel_cropped_multilingual.webm", "flags", flags, "connection-speed", 56, NULL);
```

This is why `g_object_set()` requires a NULL as the last parameter: it
marks the end of the variable-length list of property names and values.

``` first-line: 63; theme: Default; brush: cpp; gutter: true
/* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
#else
io_stdin = g_io_channel_unix_new (fileno (stdin));
#endif
g_io_add_watch (io_stdin, G_IO_IN, (GIOFunc)handle_keyboard, &data);
```

These lines connect a callback function to the standard input (the
keyboard). The mechanism shown here is specific to GLib, and not really
related to GStreamer, so there is no point in going into much depth.
Applications normally have their own way of handling user input, and
GStreamer has little to do with it besides the Navigation interface
discussed briefly in [Tutorial 17: DVD
playback](http://docs.gstreamer.com/display/GstSDK/Tutorial+17%3A+DVD+playback).

``` first-line: 79; theme: Default; brush: cpp; gutter: true
/* Create a GLib Main Loop and set it to run */
data.main_loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.main_loop);
```

To allow interactivity, we will no longer poll the GStreamer bus
manually. Instead, we create a `GMainLoop` (GLib main loop) and set it
running with `g_main_loop_run()`. This function blocks and will not
return until `g_main_loop_quit()` is issued. In the meantime, it will
call the callbacks we have registered at the appropriate
times: `handle_message` when a message appears on the bus, and
`handle_keyboard` when the user presses any key.

There is nothing new in `handle_message`, except that when the pipeline
moves to the PLAYING state, it will call the `analyze_streams` function:

``` first-line: 92; theme: Default; brush: cpp; gutter: true
/* Extract some metadata from the streams and print it on the screen */
static void analyze_streams (CustomData *data) {
  gint i;
  GstTagList *tags;
  gchar *str;
  guint rate;

  /* Read some properties */
  g_object_get (data->playbin2, "n-video", &data->n_video, NULL);
  g_object_get (data->playbin2, "n-audio", &data->n_audio, NULL);
  g_object_get (data->playbin2, "n-text", &data->n_text, NULL);
```

As the comment says, this function just gathers information from the
media and prints it on the screen. The number of video, audio and
subtitle streams is directly available through the `n-video`,
`n-audio` and `n-text` properties.

``` first-line: 108; theme: Default; brush: cpp; gutter: true
for (i = 0; i < data->n_video; i++) {
  tags = NULL;
  /* Retrieve the stream's video tags */
  g_signal_emit_by_name (data->playbin2, "get-video-tags", i, &tags);
  if (tags) {
    g_print ("video stream %d:\n", i);
    gst_tag_list_get_string (tags, GST_TAG_VIDEO_CODEC, &str);
    g_print ("  codec: %s\n", str ? str : "unknown");
    g_free (str);
    gst_tag_list_free (tags);
  }
}
```

Now, for each stream, we want to retrieve its metadata. Metadata is
stored as tags in a `GstTagList` structure, which is a list of data
pieces identified by a name. The `GstTagList` associated with a stream
can be recovered with `g_signal_emit_by_name()`, and then individual
tags are extracted with the `gst_tag_list_get_*` functions,
like `gst_tag_list_get_string()` for example.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>This rather unintuitive way of retrieving the tag list is called an Action Signal. Action signals are emitted by the application to a specific element, which then performs an action and returns a result. They behave like a dynamic function call, in which methods of a class are identified by their name (the signal's name) instead of their memory address. These signals are listed in the documentation along with the regular signals, and are tagged “Action”. See <code>playbin2</code>, for example.</p></td>
</tr>
</tbody>
</table>

`playbin2` defines 3 action signals to retrieve
metadata: `get-video-tags`, `get-audio-tags` and `get-text-tags`. The
names of the tags are standardized, and the list can be found in the
`GstTagList` documentation. In this example we are interested in the
`GST_TAG_LANGUAGE_CODE` of the streams and their
`GST_TAG_*_CODEC` (audio, video or text).

``` first-line: 158; theme: Default; brush: cpp; gutter: true
g_object_get (data->playbin2, "current-video", &data->current_video, NULL);
g_object_get (data->playbin2, "current-audio", &data->current_audio, NULL);
g_object_get (data->playbin2, "current-text", &data->current_text, NULL);
```

Once we have extracted all the metadata we want, we get the streams that
are currently selected through 3 more properties of `playbin2`:
`current-video`, `current-audio` and `current-text`.

Always check the currently selected streams, and never make
assumptions: multiple internal conditions can make `playbin2` behave
differently from one execution to another. Also, the order in which the
streams are listed can change from one run to another, so checking the
metadata to identify one particular stream becomes crucial.

``` first-line: 202; theme: Default; brush: cpp; gutter: true
/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
  gchar *str = NULL;

  if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) == G_IO_STATUS_NORMAL) {
    int index = atoi (str);
    if (index < 0 || index >= data->n_audio) {
      g_printerr ("Index out of bounds\n");
    } else {
      /* If the input was a valid audio stream index, set the current audio stream */
      g_print ("Setting current audio stream to %d\n", index);
      g_object_set (data->playbin2, "current-audio", index, NULL);
    }
  }
  g_free (str);
  return TRUE;
}
```

Finally, we allow the user to switch the running audio stream. This very
basic function just reads a string from the standard input (the
keyboard), interprets it as a number, and tries to set the
`current-audio` property of `playbin2` (which we had previously only
read).

Bear in mind that the switch is not immediate. Some of the previously
decoded audio will still be flowing through the pipeline, while the new
stream becomes active and is decoded. The delay depends on the
particular multiplexing of the streams in the container, and the length
`playbin2` has selected for its internal queues (which depends on the
network conditions).

If you execute the tutorial, you will be able to switch from one
language to another while the movie is running by pressing 0, 1 or 2
(and ENTER). This concludes this tutorial.

# Conclusion

This tutorial has shown:

- A few more of `playbin2`'s properties: `flags`, `connection-speed`,
  `n-video`, `n-audio`, `n-text`, `current-video`, `current-audio` and
  `current-text`.

- How to retrieve the list of tags associated with a stream
  with `g_signal_emit_by_name()`.

- How to retrieve a particular tag from the list with
  `gst_tag_list_get_string()` or `gst_tag_list_get_uint()`.

- How to switch the current audio stream simply by writing to the
  `current-audio` property.

The next playback tutorial shows how to handle subtitles, either
embedded in the container or in an external file.

Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.

It has been a pleasure having you here, and see you soon!

Document generated by Confluence on Oct 08, 2015 10:27

# GStreamer SDK documentation : Playback tutorial 2: Subtitle management

This page last changed on May 16, 2012 by xartigas.

# Goal

This tutorial is very similar to the previous one, but instead of
switching among different audio streams, we will use subtitle streams.
This will allow us to learn:

- How to choose the subtitle stream

- How to add external subtitles

- How to customize the font used for the subtitles

# Introduction

We already know (from the previous tutorial) that container files can
hold multiple audio and video streams, and that we can very easily
choose among them by changing the `current-audio` or
`current-video` `playbin2` properties. Switching subtitles is just as
easy.

It is worth noting that, just as with audio and video, `playbin2` takes
care of choosing the right decoder for the subtitles, and that the
plugin structure of GStreamer allows adding support for new formats as
easily as copying a file. Everything is invisible to the application
developer.

Besides subtitles embedded in the container, `playbin2` offers the
possibility to add an extra subtitle stream from an external URI.

This tutorial opens a file which already contains 5 subtitle streams,
and adds another one from another file (for the Greek language).

# The multilingual player with subtitles

Copy this code into a text file named `playback-tutorial-2.c` (or find
it in the SDK installation).

**playback-tutorial-2.c**

``` theme: Default; brush: cpp; gutter: true
#include <gst/gst.h>

/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
  GstElement *playbin2;  /* Our one and only element */

  gint n_video;          /* Number of embedded video streams */
  gint n_audio;          /* Number of embedded audio streams */
  gint n_text;           /* Number of embedded subtitle streams */

  gint current_video;    /* Currently playing video stream */
  gint current_audio;    /* Currently playing audio stream */
  gint current_text;     /* Currently playing subtitle stream */

  GMainLoop *main_loop;  /* GLib's Main Loop */
} CustomData;

/* playbin2 flags */
typedef enum {
  GST_PLAY_FLAG_VIDEO = (1 << 0),  /* We want video output */
  GST_PLAY_FLAG_AUDIO = (1 << 1),  /* We want audio output */
  GST_PLAY_FLAG_TEXT  = (1 << 2)   /* We want subtitle output */
} GstPlayFlags;

/* Forward definition for the message and keyboard processing functions */
static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data);
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data);

int main(int argc, char *argv[]) {
  CustomData data;
  GstBus *bus;
  GstStateChangeReturn ret;
  gint flags;
  GIOChannel *io_stdin;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  data.playbin2 = gst_element_factory_make ("playbin2", "playbin2");

  if (!data.playbin2) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.playbin2, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.ogv", NULL);

  /* Set the subtitle URI to play and some font description */
  g_object_set (data.playbin2, "suburi", "http://docs.gstreamer.com/media/sintel_trailer_gr.srt", NULL);
  g_object_set (data.playbin2, "subtitle-font-desc", "Sans, 18", NULL);

  /* Set flags to show Audio, Video and Subtitles */
  g_object_get (data.playbin2, "flags", &flags, NULL);
  flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO | GST_PLAY_FLAG_TEXT;
  g_object_set (data.playbin2, "flags", flags, NULL);

  /* Add a bus watch, so we get notified when a message arrives */
  bus = gst_element_get_bus (data.playbin2);
  gst_bus_add_watch (bus, (GstBusFunc)handle_message, &data);

  /* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
  io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
#else
  io_stdin = g_io_channel_unix_new (fileno (stdin));
#endif
  g_io_add_watch (io_stdin, G_IO_IN, (GIOFunc)handle_keyboard, &data);

  /* Start playing */
  ret = gst_element_set_state (data.playbin2, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.playbin2);
    return -1;
  }

  /* Create a GLib Main Loop and set it to run */
  data.main_loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (data.main_loop);

  /* Free resources */
  g_main_loop_unref (data.main_loop);
  g_io_channel_unref (io_stdin);
  gst_object_unref (bus);
  gst_element_set_state (data.playbin2, GST_STATE_NULL);
  gst_object_unref (data.playbin2);
  return 0;
}

/* Extract some metadata from the streams and print it on the screen */
static void analyze_streams (CustomData *data) {
  gint i;
  GstTagList *tags;
  gchar *str;
  guint rate;

  /* Read some properties */
  g_object_get (data->playbin2, "n-video", &data->n_video, NULL);
  g_object_get (data->playbin2, "n-audio", &data->n_audio, NULL);
  g_object_get (data->playbin2, "n-text", &data->n_text, NULL);

  g_print ("%d video stream(s), %d audio stream(s), %d text stream(s)\n",
      data->n_video, data->n_audio, data->n_text);

  g_print ("\n");
  for (i = 0; i < data->n_video; i++) {
    tags = NULL;
    /* Retrieve the stream's video tags */
    g_signal_emit_by_name (data->playbin2, "get-video-tags", i, &tags);
    if (tags) {
      g_print ("video stream %d:\n", i);
      gst_tag_list_get_string (tags, GST_TAG_VIDEO_CODEC, &str);
      g_print ("  codec: %s\n", str ? str : "unknown");
      g_free (str);
      gst_tag_list_free (tags);
    }
  }

  g_print ("\n");
  for (i = 0; i < data->n_audio; i++) {
    tags = NULL;
    /* Retrieve the stream's audio tags */
    g_signal_emit_by_name (data->playbin2, "get-audio-tags", i, &tags);
    if (tags) {
      g_print ("audio stream %d:\n", i);
      if (gst_tag_list_get_string (tags, GST_TAG_AUDIO_CODEC, &str)) {
        g_print ("  codec: %s\n", str);
        g_free (str);
      }
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
        g_print ("  language: %s\n", str);
        g_free (str);
      }
      if (gst_tag_list_get_uint (tags, GST_TAG_BITRATE, &rate)) {
        g_print ("  bitrate: %d\n", rate);
      }
      gst_tag_list_free (tags);
    }
  }

  g_print ("\n");
  for (i = 0; i < data->n_text; i++) {
    tags = NULL;
    /* Retrieve the stream's subtitle tags */
    g_print ("subtitle stream %d:\n", i);
    g_signal_emit_by_name (data->playbin2, "get-text-tags", i, &tags);
    if (tags) {
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
        g_print ("  language: %s\n", str);
        g_free (str);
      }
      gst_tag_list_free (tags);
    } else {
      g_print ("  no tags found\n");
    }
  }

  g_object_get (data->playbin2, "current-video", &data->current_video, NULL);
  g_object_get (data->playbin2, "current-audio", &data->current_audio, NULL);
  g_object_get (data->playbin2, "current-text", &data->current_text, NULL);

  g_print ("\n");
  g_print ("Currently playing video stream %d, audio stream %d and subtitle stream %d\n",
      data->current_video, data->current_audio, data->current_text);
  g_print ("Type any number and hit ENTER to select a different subtitle stream\n");
}

/* Process messages from GStreamer */
static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data) {
  GError *err;
  gchar *debug_info;

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR:
      gst_message_parse_error (msg, &err, &debug_info);
      g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
      g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
      g_clear_error (&err);
      g_free (debug_info);
      g_main_loop_quit (data->main_loop);
      break;
    case GST_MESSAGE_EOS:
      g_print ("End-Of-Stream reached.\n");
      g_main_loop_quit (data->main_loop);
      break;
    case GST_MESSAGE_STATE_CHANGED: {
      GstState old_state, new_state, pending_state;
      gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
      if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin2)) {
        if (new_state == GST_STATE_PLAYING) {
          /* Once we are in the playing state, analyze the streams */
          analyze_streams (data);
        }
      }
    } break;
  }

  /* We want to keep receiving messages */
  return TRUE;
}

/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
  gchar *str = NULL;

  if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) == G_IO_STATUS_NORMAL) {
    int index = atoi (str);
    if (index < 0 || index >= data->n_text) {
      g_printerr ("Index out of bounds\n");
    } else {
      /* If the input was a valid subtitle stream index, set the current subtitle stream */
      g_print ("Setting current subtitle stream to %d\n", index);
      g_object_set (data->playbin2, "current-text", index, NULL);
    }
  }
  g_free (str);
  return TRUE;
}
```

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc playback-tutorial-2.c -o playback-tutorial-2 `pkg-config --cflags --libs gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a>.</p>
<p>This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. The number of subtitle streams is shown in the terminal, and the user can switch from one to another by entering a number and pressing enter. A small delay is to be expected. <strong>Please read the note at the bottom of this page.</strong></p>
<p>Bear in mind that there is no latency management (buffering), so on slow connections the movie might stop after a few seconds. See how <a href="http://docs.gstreamer.com/display/GstSDK/Tutorial+12%3A+Live+streaming">Tutorial 12: Live streaming</a> solves this issue.</p>
<p>Required libraries: <code>gstreamer-0.10</code></p></td>
</tr>
</tbody>
</table>

# Walkthrough

This tutorial is copied from [Playback tutorial 1: Playbin2
usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html) with some
changes, so let's review only the changes.

``` first-line: 50; theme: Default; brush: cpp; gutter: true
/* Set the subtitle URI to play and some font description */
g_object_set (data.playbin2, "suburi", "http://docs.gstreamer.com/media/sintel_trailer_gr.srt", NULL);
g_object_set (data.playbin2, "subtitle-font-desc", "Sans, 18", NULL);
```

After setting the media URI, we set the `suburi` property, which points
`playbin2` to a file containing a subtitle stream. In this case, the
media file already contains multiple subtitle streams, so the one
provided in the `suburi` is added to the list, and will be the currently
selected one.

Note that metadata concerning a subtitle stream (like its language)
resides in the container file; therefore, subtitles not embedded in a
container will not have metadata. When running this tutorial you will
find that the first subtitle stream does not have a language tag.

The `subtitle-font-desc` property allows specifying the font used to
render the subtitles. Since [Pango](http://www.pango.org/) is the
library used to render fonts, you can check its documentation to see how
this font should be specified: in particular, the
[pango-font-description-from-string](http://developer.gnome.org/pango/stable/pango-Fonts.html#pango-font-description-from-string) function.

In a nutshell, the format of the string representation is `[FAMILY-LIST]
[STYLE-OPTIONS] [SIZE]`, where `FAMILY-LIST` is a comma-separated list
of families optionally terminated by a comma, `STYLE-OPTIONS` is a
whitespace-separated list of words where each word describes one of
style, variant, weight, or stretch, and `SIZE` is a decimal number
(size in points). For example, the following are all valid string
representations:

- sans bold 12
- serif, monospace bold italic condensed 16
- normal 10

The commonly available font families are: Normal, Sans, Serif and
Monospace.

The available styles are: Normal (the font is upright), Oblique (the
font is slanted, but in a roman style) and Italic (the font is slanted
in an italic style).

The available weights are: Ultra-Light, Light, Normal, Bold, Ultra-Bold
and Heavy.

The available variants are: Normal and Small-Caps (a font with the
lower-case characters replaced by smaller variants of the capital
characters).

The available stretch styles are: Ultra-Condensed, Extra-Condensed,
Condensed, Semi-Condensed, Normal, Semi-Expanded, Expanded,
Extra-Expanded and Ultra-Expanded.

``` first-line: 54; theme: Default; brush: cpp; gutter: true
/* Set flags to show Audio, Video and Subtitles */
g_object_get (data.playbin2, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO | GST_PLAY_FLAG_TEXT;
g_object_set (data.playbin2, "flags", flags, NULL);
```

We set the `flags` property to allow Audio, Video and Text (Subtitles).

The rest of the tutorial is the same as [Playback tutorial 1: Playbin2
usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html), except
that the keyboard input changes the `current-text` property instead of
`current-audio`. As before, keep in mind that stream changes are not
immediate, since there is a lot of information flowing through the
pipeline that needs to reach the end of it before the new stream shows
up.

# Conclusion

This tutorial showed how to handle subtitles from `playbin2`, whether
they are embedded in the container or in a different file:

- Subtitles are chosen using the `current-text` and `n-text`
  properties of `playbin2`.

- External subtitle files can be selected using the `suburi` property.

- Subtitle appearance can be customized with the
  `subtitle-font-desc` property.

The next playback tutorial shows how to change the playback speed.

Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.

It has been a pleasure having you here, and see you soon!

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>There is a bug in the current version of the SDK (<a href="https://bugzilla.gnome.org/show_bug.cgi?id=638168" class="uri external-link">https://bugzilla.gnome.org/show_bug.cgi?id=638168</a>):</p>
<p>Switching subtitle tracks while there is a subtitle on the screen gives this warning:</p>
<p><code>WARN katedec gstkatedec.c:309:gst_kate_dec_chain:&lt;katedec1&gt; failed to push buffer: wrong-state</code></p>
<p>And after a while it freezes.</p></td>
</tr>
</tbody>
</table>

# GStreamer SDK documentation : Playback tutorial 3: Short-cutting the pipeline

This page last changed on Jun 26, 2012 by xartigas.

# Goal

[Basic tutorial 8: Short-cutting the
pipeline](Basic%2Btutorial%2B8%253A%2BShort-cutting%2Bthe%2Bpipeline.html) showed
how an application can manually extract or inject data into a pipeline
by using two special elements called `appsrc` and `appsink`.
`playbin2` allows using these elements too, but the method to connect
them is different. To connect an `appsink` to `playbin2`, see [Playback
tutorial 7: Custom playbin2
sinks](Playback%2Btutorial%2B7%253A%2BCustom%2Bplaybin2%2Bsinks.html).
This tutorial shows:

- How to connect `appsrc` with `playbin2`
- How to configure the `appsrc`

# A playbin2 waveform generator

Copy this code into a text file named `playback-tutorial-3.c`.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>This tutorial is included in the SDK since release 2012.7. If you cannot find it in the downloaded code, please install the latest release of the GStreamer SDK.</p></td>
</tr>
</tbody>
</table>

**playback-tutorial-3.c**

``` theme: Default; brush: cpp; gutter: true
#include <gst/gst.h>
#include <string.h>

#define CHUNK_SIZE 1024   /* Amount of bytes we are sending in each buffer */
#define SAMPLE_RATE 44100 /* Samples per second we are sending */
#define AUDIO_CAPS "audio/x-raw-int,channels=1,rate=%d,signed=(boolean)true,width=16,depth=16,endianness=BYTE_ORDER"

/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
  GstElement *pipeline;
  GstElement *app_source;

  guint64 num_samples;   /* Number of samples generated so far (for timestamp generation) */
  gfloat a, b, c, d;     /* For waveform generation */

  guint sourceid;        /* To control the GSource */

  GMainLoop *main_loop;  /* GLib's Main Loop */
} CustomData;

/* This method is called by the idle GSource in the mainloop, to feed CHUNK_SIZE bytes into appsrc.
 * The idle handler is added to the mainloop when appsrc requests us to start sending data (need-data signal)
 * and is removed when appsrc has enough data (enough-data signal).
 */
static gboolean push_data (CustomData *data) {
  GstBuffer *buffer;
  GstFlowReturn ret;
  int i;
  gint16 *raw;
  gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
  gfloat freq;

  /* Create a new empty buffer */
  buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);

  /* Set its timestamp and duration */
  GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (data->num_samples, GST_SECOND, SAMPLE_RATE);
  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (CHUNK_SIZE, GST_SECOND, SAMPLE_RATE);

  /* Generate some psychedelic waveforms */
  raw = (gint16 *)GST_BUFFER_DATA (buffer);
  data->c += data->d;
  data->d -= data->c / 1000;
  freq = 1100 + 1000 * data->d;
  for (i = 0; i < num_samples; i++) {
    data->a += data->b;
    data->b -= data->a / freq;
    raw[i] = (gint16)(500 * data->a);
  }
  data->num_samples += num_samples;

  /* Push the buffer into the appsrc */
  g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);

  /* Free the buffer now that we are done with it */
  gst_buffer_unref (buffer);

  if (ret != GST_FLOW_OK) {
    /* We got some error, stop sending data */
    return FALSE;
  }

  return TRUE;
}

/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
 * to the mainloop to start pushing data into the appsrc */
static void start_feed (GstElement *source, guint size, CustomData *data) {
  if (data->sourceid == 0) {
    g_print ("Start feeding\n");
    data->sourceid = g_idle_add ((GSourceFunc) push_data, data);
  }
}

/* This callback triggers when appsrc has enough data and we can stop sending.
 * We remove the idle handler from the mainloop */
static void stop_feed (GstElement *source, CustomData *data) {
  if (data->sourceid != 0) {
    g_print ("Stop feeding\n");
    g_source_remove (data->sourceid);
    data->sourceid = 0;
  }
}

/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  GError *err;
  gchar *debug_info;

  /* Print error details on the screen */
  gst_message_parse_error (msg, &err, &debug_info);
  g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
  g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
  g_clear_error (&err);
  g_free (debug_info);

  g_main_loop_quit (data->main_loop);
}

/* This function is called when playbin2 has created the appsrc element, so we have
 * a chance to configure it. */
static void source_setup (GstElement *pipeline, GstElement *source, CustomData *data) {
  gchar *audio_caps_text;
  GstCaps *audio_caps;

  g_print ("Source has been created. Configuring.\n");
  data->app_source = source;

  /* Configure appsrc */
  audio_caps_text = g_strdup_printf (AUDIO_CAPS, SAMPLE_RATE);
  audio_caps = gst_caps_from_string (audio_caps_text);
  g_object_set (source, "caps", audio_caps, NULL);
  g_signal_connect (source, "need-data", G_CALLBACK (start_feed), data);
  g_signal_connect (source, "enough-data", G_CALLBACK (stop_feed), data);
  gst_caps_unref (audio_caps);
  g_free (audio_caps_text);
}

int main(int argc, char *argv[]) {
  CustomData data;
  GstBus *bus;

  /* Initialize custom data structure */
  memset (&data, 0, sizeof (data));
  data.b = 1; /* For waveform generation */
  data.d = 1;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the playbin2 element */
  data.pipeline = gst_parse_launch ("playbin2 uri=appsrc://", NULL);
  g_signal_connect (data.pipeline, "source-setup", G_CALLBACK (source_setup), &data);

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
|
||||
bus = gst_element_get_bus (data.pipeline);
|
||||
gst_bus_add_signal_watch (bus);
|
||||
g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
|
||||
gst_object_unref (bus);
|
||||
|
||||
/* Start playing the pipeline */
|
||||
gst_element_set_state (data.pipeline, GST_STATE_PLAYING);
|
||||
|
||||
/* Create a GLib Main Loop and set it to run */
|
||||
data.main_loop = g_main_loop_new (NULL, FALSE);
|
||||
g_main_loop_run (data.main_loop);
|
||||
|
||||
/* Free resources */
|
||||
gst_element_set_state (data.pipeline, GST_STATE_NULL);
|
||||
gst_object_unref (data.pipeline);
|
||||
return 0;
|
||||
}
|
||||
```

To use an `appsrc` as the source for the pipeline, simply instantiate a
`playbin2` and set its URI to `appsrc://`:

``` first-line: 131; theme: Default; brush: cpp; gutter: true
/* Create the playbin2 element */
data.pipeline = gst_parse_launch ("playbin2 uri=appsrc://", NULL);
```

`playbin2` will create an internal `appsrc` element and fire the
`source-setup` signal to allow the application to configure it:

``` first-line: 133; theme: Default; brush: cpp; gutter: true
g_signal_connect (data.pipeline, "source-setup", G_CALLBACK (source_setup), &data);
```

In particular, it is important to set the caps property of `appsrc`,
since, once the signal handler returns, `playbin2` will instantiate the
next element in the pipeline according to these caps:

``` first-line: 100; theme: Default; brush: cpp; gutter: true
/* This function is called when playbin2 has created the appsrc element, so we have
 * a chance to configure it. */
static void source_setup (GstElement *pipeline, GstElement *source, CustomData *data) {
  gchar *audio_caps_text;
  GstCaps *audio_caps;

  g_print ("Source has been created. Configuring.\n");
  data->app_source = source;

  /* Configure appsrc */
  audio_caps_text = g_strdup_printf (AUDIO_CAPS, SAMPLE_RATE);
  audio_caps = gst_caps_from_string (audio_caps_text);
  g_object_set (source, "caps", audio_caps, NULL);
  g_signal_connect (source, "need-data", G_CALLBACK (start_feed), data);
  g_signal_connect (source, "enough-data", G_CALLBACK (stop_feed), data);
  gst_caps_unref (audio_caps);
  g_free (audio_caps_text);
}
```

The configuration of the `appsrc` is exactly the same as in [Basic
tutorial 8: Short-cutting the
pipeline](Basic%2Btutorial%2B8%253A%2BShort-cutting%2Bthe%2Bpipeline.html):
the caps are set to `audio/x-raw-int`, and two callbacks are registered,
so the element can tell the application when it needs to start and stop
pushing data. See [Basic tutorial 8: Short-cutting the
pipeline](Basic%2Btutorial%2B8%253A%2BShort-cutting%2Bthe%2Bpipeline.html)
for more details.

From this point onwards, `playbin2` takes care of the rest of the
pipeline, and the application only needs to worry about generating more
data when told to do so.

To learn how data can be extracted from `playbin2` using the
`appsink` element, see [Playback tutorial 7: Custom playbin2
sinks](Playback%2Btutorial%2B7%253A%2BCustom%2Bplaybin2%2Bsinks.html).

# Conclusion

This tutorial applies the concepts shown in [Basic tutorial 8:
Short-cutting the
pipeline](Basic%2Btutorial%2B8%253A%2BShort-cutting%2Bthe%2Bpipeline.html) to
`playbin2`. In particular, it has shown:

- How to connect `appsrc` with `playbin2` using the special
  URI `appsrc://`
- How to configure the `appsrc` using the `source-setup` signal

It has been a pleasure having you here, and see you soon\!

## Attachments:

![](images/icons/bullet_blue.gif)
[playback-tutorial-3.c](attachments/1442200/2424850.c) (text/plain)
![](images/icons/bullet_blue.gif)
[vs2010.zip](attachments/1442200/2424849.zip) (application/zip)
![](images/icons/bullet_blue.gif)
[playback-tutorial-3.c](attachments/1442200/2424848.c) (text/plain)

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Playback tutorial 4: Progressive streaming

This page last changed on Sep 13, 2012 by xartigas.

# Goal

[Basic tutorial 12:
Streaming](Basic%2Btutorial%2B12%253A%2BStreaming.html) showed how to
enhance the user experience in poor network conditions, by taking
buffering into account. This tutorial further expands [Basic tutorial
12: Streaming](Basic%2Btutorial%2B12%253A%2BStreaming.html) by enabling
the local storage of the streamed media, and describes the advantages of
this technique. In particular, it shows:

- How to enable progressive downloading
- How to know what has been downloaded
- How to know where it has been downloaded
- How to limit the amount of downloaded data that is kept

# Introduction

When streaming, data is fetched from the network and a small buffer of
future-data is kept to ensure smooth playback (see [Basic tutorial 12:
Streaming](Basic%2Btutorial%2B12%253A%2BStreaming.html)). However, data
is discarded as soon as it is displayed or rendered (there is no
past-data buffer). This means that if a user wants to jump back and
continue playback from a point in the past, data needs to be
re-downloaded.

Media players tailored for streaming, like YouTube, usually keep all
downloaded data stored locally for this contingency. A graphical widget
is also normally used to show how much of the file has already been
downloaded.

`playbin2` offers similar functionalities through the `DOWNLOAD` flag,
which stores the media in a local temporary file for faster playback of
already-downloaded chunks.

This code also shows how to use the Buffering Query, which allows
knowing what parts of the file are available.

# A network-resilient example with local storage

Copy this code into a text file named `playback-tutorial-4.c`.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>This tutorial is included in the SDK since release 2012.7. If you cannot find it in the downloaded code, please install the latest release of the GStreamer SDK.</p></td>
</tr>
</tbody>
</table>

**playback-tutorial-4.c**

``` theme: Default; brush: cpp; gutter: true
#include <gst/gst.h>
#include <string.h>

#define GRAPH_LENGTH 80

/* playbin2 flags */
typedef enum {
  GST_PLAY_FLAG_DOWNLOAD = (1 << 7) /* Enable progressive download (on selected formats) */
} GstPlayFlags;

typedef struct _CustomData {
  gboolean is_live;
  GstElement *pipeline;
  GMainLoop *loop;
  gint buffering_level;
} CustomData;

static void got_location (GstObject *gstobject, GstObject *prop_object, GParamSpec *prop, gpointer data) {
  gchar *location;
  g_object_get (G_OBJECT (prop_object), "temp-location", &location, NULL);
  g_print ("Temporary file: %s\n", location);
  /* Uncomment this line to keep the temporary file after the program exits */
  /* g_object_set (G_OBJECT (prop_object), "temp-remove", FALSE, NULL); */
}

static void cb_message (GstBus *bus, GstMessage *msg, CustomData *data) {

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR: {
      GError *err;
      gchar *debug;

      gst_message_parse_error (msg, &err, &debug);
      g_print ("Error: %s\n", err->message);
      g_error_free (err);
      g_free (debug);

      gst_element_set_state (data->pipeline, GST_STATE_READY);
      g_main_loop_quit (data->loop);
      break;
    }
    case GST_MESSAGE_EOS:
      /* end-of-stream */
      gst_element_set_state (data->pipeline, GST_STATE_READY);
      g_main_loop_quit (data->loop);
      break;
    case GST_MESSAGE_BUFFERING:
      /* If the stream is live, we do not care about buffering. */
      if (data->is_live) break;

      gst_message_parse_buffering (msg, &data->buffering_level);

      /* Wait until buffering is complete before start/resume playing */
      if (data->buffering_level < 100)
        gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
      else
        gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
      break;
    case GST_MESSAGE_CLOCK_LOST:
      /* Get a new clock */
      gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
      gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
      break;
    default:
      /* Unhandled message */
      break;
  }
}

static gboolean refresh_ui (CustomData *data) {
  GstQuery *query;
  gboolean result;

  query = gst_query_new_buffering (GST_FORMAT_PERCENT);
  result = gst_element_query (data->pipeline, query);
  if (result) {
    gint n_ranges, range, i;
    gchar graph[GRAPH_LENGTH + 1];
    GstFormat format = GST_FORMAT_TIME;
    gint64 position = 0, duration = 0;

    memset (graph, ' ', GRAPH_LENGTH);
    graph[GRAPH_LENGTH] = '\0';

    n_ranges = gst_query_get_n_buffering_ranges (query);
    for (range = 0; range < n_ranges; range++) {
      gint64 start, stop;
      gst_query_parse_nth_buffering_range (query, range, &start, &stop);
      start = start * GRAPH_LENGTH / 100;
      stop = stop * GRAPH_LENGTH / 100;
      for (i = (gint)start; i < stop; i++)
        graph [i] = '-';
    }
    if (gst_element_query_position (data->pipeline, &format, &position) &&
        GST_CLOCK_TIME_IS_VALID (position) &&
        gst_element_query_duration (data->pipeline, &format, &duration) &&
        GST_CLOCK_TIME_IS_VALID (duration)) {
      i = (gint)(GRAPH_LENGTH * (double)position / (double)(duration + 1));
      graph [i] = data->buffering_level < 100 ? 'X' : '>';
    }
    g_print ("[%s]", graph);
    if (data->buffering_level < 100) {
      g_print (" Buffering: %3d%%", data->buffering_level);
    } else {
      g_print (" ");
    }
    g_print ("\r");
  }

  return TRUE;

}

int main(int argc, char *argv[]) {
  GstElement *pipeline;
  GstBus *bus;
  GstStateChangeReturn ret;
  GMainLoop *main_loop;
  CustomData data;
  guint flags;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));
  data.buffering_level = 100;

  /* Build the pipeline */
  pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
  bus = gst_element_get_bus (pipeline);

  /* Set the download flag */
  g_object_get (pipeline, "flags", &flags, NULL);
  flags |= GST_PLAY_FLAG_DOWNLOAD;
  g_object_set (pipeline, "flags", flags, NULL);

  /* Uncomment this line to limit the amount of downloaded data */
  /* g_object_set (pipeline, "ring-buffer-max-size", (guint64)4000000, NULL); */

  /* Start playing */
  ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (pipeline);
    return -1;
  } else if (ret == GST_STATE_CHANGE_NO_PREROLL) {
    data.is_live = TRUE;
  }

  main_loop = g_main_loop_new (NULL, FALSE);
  data.loop = main_loop;
  data.pipeline = pipeline;

  gst_bus_add_signal_watch (bus);
  g_signal_connect (bus, "message", G_CALLBACK (cb_message), &data);
  g_signal_connect (pipeline, "deep-notify::temp-location", G_CALLBACK (got_location), NULL);

  /* Register a function that GLib will call every second */
  g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);

  g_main_loop_run (main_loop);

  /* Free resources */
  g_main_loop_unref (main_loop);
  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  g_print ("\n");
  return 0;
}
```

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><div id="expander-1295673640" class="expand-container">
<div id="expander-control-1295673640" class="expand-control">
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-1295673640" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc playback-tutorial-4.c -o playback-tutorial-4 `pkg-config --cflags --libs gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a></p>
<p>This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. In the console window, you should see a message indicating where the media is being stored, and a text graph representing the downloaded portions and the current position. A buffering message appears whenever buffering is required, which might never happen if your network connection is fast enough.</p>
<p>Required libraries: <code>gstreamer-0.10</code></p>
</div>
</div></td>
</tr>
</tbody>
</table>

# Walkthrough

This code is based on that of [Basic tutorial 12:
Streaming](Basic%2Btutorial%2B12%253A%2BStreaming.html). Let’s review
only the differences.

#### Setup

``` first-line: 133; theme: Default; brush: cpp; gutter: true
/* Set the download flag */
g_object_get (pipeline, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_DOWNLOAD;
g_object_set (pipeline, "flags", flags, NULL);
```

By setting this flag, `playbin2` instructs its internal queue (a
`queue2` element, actually) to store all downloaded data.

``` first-line: 157; theme: Default; brush: cpp; gutter: true
g_signal_connect (pipeline, "deep-notify::temp-location", G_CALLBACK (got_location), NULL);
```

`deep-notify` signals are emitted by `GstObject` elements (like
`playbin2`) when the properties of any of their children elements
change. In this case we want to know when the `temp-location` property
changes, indicating that the `queue2` has decided where to store the
downloaded data.

``` first-line: 18; theme: Default; brush: cpp; gutter: true
static void got_location (GstObject *gstobject, GstObject *prop_object, GParamSpec *prop, gpointer data) {
  gchar *location;
  g_object_get (G_OBJECT (prop_object), "temp-location", &location, NULL);
  g_print ("Temporary file: %s\n", location);
  /* Uncomment this line to keep the temporary file after the program exits */
  /* g_object_set (G_OBJECT (prop_object), "temp-remove", FALSE, NULL); */
}
```

The `temp-location` property is read from the element that triggered the
signal (the `queue2`) and printed on screen.

When the pipeline state changes from `PAUSED` to `READY`, this file is
removed. As the comment reads, you can keep it by setting the
`temp-remove` property of the `queue2` to `FALSE`.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>On Windows this file is usually created inside the <code>Temporary Internet Files</code> folder, which might hide it from Windows Explorer. If you cannot find the downloaded files, try to use the console.</p></td>
</tr>
</tbody>
</table>

#### User Interface

In `main` we also install a timer which we use to refresh the UI every
second.

``` first-line: 159; theme: Default; brush: cpp; gutter: true
/* Register a function that GLib will call every second */
g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);
```

The `refresh_ui` method queries the pipeline to find out which parts of
the file have been downloaded and what the currently playing position
is. It builds a graph to display this information (sort of a text-mode
user interface) and prints it on screen, overwriting the previous one so
it looks like it is animated:

    [---->------- ]

The dashes ‘`-`’ indicate the downloaded parts, and the greater-than
sign ‘`>`’ shows the current position (turning into an ‘`X`’ when the
pipeline is paused). Keep in mind that if your network is fast enough,
you will not see the download bar (the dashes) advance at all; it will
be completely full from the beginning.

``` first-line: 70; theme: Default; brush: cpp; gutter: true
static gboolean refresh_ui (CustomData *data) {
  GstQuery *query;
  gboolean result;
  query = gst_query_new_buffering (GST_FORMAT_PERCENT);
  result = gst_element_query (data->pipeline, query);
```

The first thing we do in `refresh_ui` is construct a new Buffering
`GstQuery` with `gst_query_new_buffering()` and pass it to the pipeline
(`playbin2`) with `gst_element_query()`. In [Basic tutorial 4: Time
management](Basic%2Btutorial%2B4%253A%2BTime%2Bmanagement.html) we have
already seen how to perform simple queries like Position and Duration
using specific methods. More complex queries, like Buffering, need to
use the more general `gst_element_query()`.

The Buffering query can be made in different `GstFormat` (TIME, BYTES,
PERCENTAGE and a few more). Not all elements can answer the query in all
the formats, so you need to check which ones are supported in your
particular pipeline. If `gst_element_query()` returns `TRUE`, the query
succeeded. The answer to the query is contained in the same
`GstQuery` structure we created, and can be retrieved using multiple
parse methods:

``` first-line: 85; theme: Default; brush: cpp; gutter: true
n_ranges = gst_query_get_n_buffering_ranges (query);
for (range = 0; range < n_ranges; range++) {
  gint64 start, stop;
  gst_query_parse_nth_buffering_range (query, range, &start, &stop);
  start = start * GRAPH_LENGTH / 100;
  stop = stop * GRAPH_LENGTH / 100;
  for (i = (gint)start; i < stop; i++)
    graph [i] = '-';
}
```

Data does not need to be downloaded in consecutive pieces from the
beginning of the file: seeking, for example, might force downloading to
start from a new position and leave a downloaded chunk behind.
Therefore, `gst_query_get_n_buffering_ranges()` returns the number of
chunks, or *ranges*, of downloaded data, and then the position and size
of each range is retrieved with `gst_query_parse_nth_buffering_range()`.

The format of the returned values (start and stop position for each
range) depends on what we requested in the
`gst_query_new_buffering()` call. In this case, PERCENTAGE. These
values are used to generate the graph.

``` first-line: 94; theme: Default; brush: cpp; gutter: true
if (gst_element_query_position (data->pipeline, &format, &position) &&
    GST_CLOCK_TIME_IS_VALID (position) &&
    gst_element_query_duration (data->pipeline, &format, &duration) &&
    GST_CLOCK_TIME_IS_VALID (duration)) {
  i = (gint)(GRAPH_LENGTH * (double)position / (double)(duration + 1));
  graph [i] = data->buffering_level < 100 ? 'X' : '>';
}
```

Next, the current position is queried. It could be queried in the
PERCENT format, so code similar to the one used for the ranges could be
reused, but currently this format is not well supported for position
queries. Instead, we use the TIME format and also query the duration to
obtain a percentage.

The current position is indicated with either a ‘`>`’ or an ‘`X`’
depending on the buffering level. If it is below 100%, the code in the
`cb_message` method will have set the pipeline to `PAUSED`, so we print
an ‘`X`’. If the buffering level is 100% the pipeline is in the
`PLAYING` state and we print a ‘`>`’.

``` first-line: 102; theme: Default; brush: cpp; gutter: true
if (data->buffering_level < 100) {
  g_print (" Buffering: %3d%%", data->buffering_level);
} else {
  g_print (" ");
}
```

Finally, if the buffering level is below 100%, we report this
information (and delete it otherwise).

#### Limiting the size of the downloaded file

``` first-line: 138; theme: Default; brush: cpp; gutter: true
/* Uncomment this line to limit the amount of downloaded data */
/* g_object_set (pipeline, "ring-buffer-max-size", (guint64)4000000, NULL); */
```

Uncomment line 139 to see how this can be achieved. This reduces the
size of the temporary file, by overwriting already played regions.
Observe the download bar to see which regions are kept available in the
file.

# Conclusion

This tutorial has shown:

- How to enable progressive downloading with the
  `GST_PLAY_FLAG_DOWNLOAD` `playbin2` flag
- How to know what has been downloaded using a Buffering `GstQuery`
- How to know where it has been downloaded with the
  `deep-notify::temp-location` signal
- How to limit the size of the temporary file with
  the `ring-buffer-max-size` property of `playbin2`.

It has been a pleasure having you here, and see you soon\!

## Attachments:

![](images/icons/bullet_blue.gif)
[playback-tutorial-4.c](attachments/327808/2424846.c) (text/plain)
![](images/icons/bullet_blue.gif)
[vs2010.zip](attachments/327808/2424847.zip) (application/zip)

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Playback tutorial 5: Color Balance

This page last changed on Jun 25, 2012 by xartigas.

# Goal

Brightness, Contrast, Hue and Saturation are common video adjustments,
which are collectively known as Color Balance settings in GStreamer.
This tutorial shows:

- How to find out the available color balance channels
- How to change them

# Introduction

[Basic tutorial 5: GUI toolkit
integration](Basic%2Btutorial%2B5%253A%2BGUI%2Btoolkit%2Bintegration.html) has
already explained the concept of GObject interfaces: applications use
them to find out if certain functionality is available, regardless of
the actual element which implements it.

`playbin2` implements the Color Balance interface (`gstcolorbalance`),
which allows access to the color balance settings. If any of the
elements in the `playbin2` pipeline support this interface,
`playbin2` simply forwards it to the application; otherwise, a
colorbalance element is inserted in the pipeline.

This interface allows querying for the available color balance channels
(`gstcolorbalancechannel`), along with their name and valid range of
values, and then modifying the current value of any of them.
|
||||
|
||||
# Color balance example
|
||||
|
||||
Copy this code into a text file named `playback-tutorial-5.c`.
|
||||
|
||||
<table>
|
||||
<tbody>
|
||||
<tr class="odd">
|
||||
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
|
||||
<td><p>This tutorial is included in the SDK since release 2012.7. If you cannot find it in the downloaded code, please install the latest release of the GStreamer SDK.</p></td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
|
||||
**playback-tutorial-5.c**
|
||||
|
||||
``` theme: Default; brush: cpp; gutter: true
|
||||
#include <string.h>
|
||||
#include <gst/gst.h>
|
||||
#include <gst/interfaces/colorbalance.h>
|
||||
|
||||
typedef struct _CustomData {
|
||||
GstElement *pipeline;
|
||||
GMainLoop *loop;
|
||||
} CustomData;
|
||||
|
||||
/* Process a color balance command */
|
||||
static void update_color_channel (const gchar *channel_name, gboolean increase, GstColorBalance *cb) {
|
||||
gdouble step;
|
||||
gint value;
|
||||
GstColorBalanceChannel *channel = NULL;
|
||||
const GList *channels, *l;
|
||||
|
||||
/* Retrieve the list of channels and locate the requested one */
|
||||
channels = gst_color_balance_list_channels (cb);
|
||||
for (l = channels; l != NULL; l = l->next) {
|
||||
GstColorBalanceChannel *tmp = (GstColorBalanceChannel *)l->data;
|
||||
|
||||
if (g_strrstr (tmp->label, channel_name)) {
|
||||
channel = tmp;
|
||||
break;
|
||||
}
|
||||
}
|
||||
if (!channel)
|
||||
return;
|
||||
|
||||
/* Change the channel's value */
|
||||
step = 0.1 * (channel->max_value - channel->min_value);
|
||||
value = gst_color_balance_get_value (cb, channel);
|
||||
if (increase) {
|
||||
value = (gint)(value + step);
|
||||
if (value > channel->max_value)
|
||||
value = channel->max_value;
|
||||
} else {
|
||||
value = (gint)(value - step);
|
||||
if (value < channel->min_value)
|
||||
value = channel->min_value;
|
||||
}
|
||||
gst_color_balance_set_value (cb, channel, value);
|
||||
}
|
||||
|
||||
/* Output the current values of all Color Balance channels */
|
||||
static void print_current_values (GstElement *pipeline) {
|
||||
const GList *channels, *l;
|
||||
|
||||
/* Output Color Balance values */
|
||||
  channels = gst_color_balance_list_channels (GST_COLOR_BALANCE (pipeline));
  for (l = channels; l != NULL; l = l->next) {
    GstColorBalanceChannel *channel = (GstColorBalanceChannel *)l->data;
    gint value = gst_color_balance_get_value (GST_COLOR_BALANCE (pipeline), channel);
    g_print ("%s: %3d%% ", channel->label,
        100 * (value - channel->min_value) / (channel->max_value - channel->min_value));
  }
  g_print ("\n");
}

/* Process keyboard input */
static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomData *data) {
  gchar *str = NULL;

  if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) != G_IO_STATUS_NORMAL) {
    return TRUE;
  }

  switch (g_ascii_tolower (str[0])) {
  case 'c':
    update_color_channel ("CONTRAST", g_ascii_isupper (str[0]), GST_COLOR_BALANCE (data->pipeline));
    break;
  case 'b':
    update_color_channel ("BRIGHTNESS", g_ascii_isupper (str[0]), GST_COLOR_BALANCE (data->pipeline));
    break;
  case 'h':
    update_color_channel ("HUE", g_ascii_isupper (str[0]), GST_COLOR_BALANCE (data->pipeline));
    break;
  case 's':
    update_color_channel ("SATURATION", g_ascii_isupper (str[0]), GST_COLOR_BALANCE (data->pipeline));
    break;
  case 'q':
    g_main_loop_quit (data->loop);
    break;
  default:
    break;
  }

  g_free (str);

  print_current_values (data->pipeline);

  return TRUE;
}

int main(int argc, char *argv[]) {
  CustomData data;
  GstStateChangeReturn ret;
  GIOChannel *io_stdin;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));

  /* Print usage map */
  g_print (
    "USAGE: Choose one of the following options, then press enter:\n"
    " 'C' to increase contrast, 'c' to decrease contrast\n"
    " 'B' to increase brightness, 'b' to decrease brightness\n"
    " 'H' to increase hue, 'h' to decrease hue\n"
    " 'S' to increase saturation, 's' to decrease saturation\n"
    " 'Q' to quit\n");

  /* Build the pipeline */
  data.pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);

  /* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
  io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
#else
  io_stdin = g_io_channel_unix_new (fileno (stdin));
#endif
  g_io_add_watch (io_stdin, G_IO_IN, (GIOFunc)handle_keyboard, &data);

  /* Start playing */
  ret = gst_element_set_state (data.pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.pipeline);
    return -1;
  }
  print_current_values (data.pipeline);

  /* Create a GLib Main Loop and set it to run */
  data.loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (data.loop);

  /* Free resources */
  g_main_loop_unref (data.loop);
  g_io_channel_unref (io_stdin);
  gst_element_set_state (data.pipeline, GST_STATE_NULL);
  gst_object_unref (data.pipeline);
  return 0;
}
```

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><div id="expander-1909648074" class="expand-container">
<div id="expander-control-1909648074" class="expand-control">
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-1909648074" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc playback-tutorial-5.c -o playback-tutorial-5 `pkg-config --cflags --libs gstreamer-interfaces-0.10 gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a>.</p>
<p>This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed.</p>
<p>The console should list all the available commands (each command is a single upper-case or lower-case letter) and all available Color Balance channels, typically CONTRAST, BRIGHTNESS, HUE and SATURATION. Type a command letter followed by the Enter key.</p>
<p>Required libraries: <code>gstreamer-interfaces-0.10 gstreamer-0.10</code></p>
</div>
</div></td>
</tr>
</tbody>
</table>
# Walkthrough

The `main()` function is fairly simple. A `playbin2` pipeline is
instantiated and set to run, and a keyboard watch is installed so
keystrokes can be monitored.

``` c
/* Output the current values of all Color Balance channels */
static void print_current_values (GstElement *pipeline) {
  const GList *channels, *l;

  /* Output Color Balance values */
  channels = gst_color_balance_list_channels (GST_COLOR_BALANCE (pipeline));
  for (l = channels; l != NULL; l = l->next) {
    GstColorBalanceChannel *channel = (GstColorBalanceChannel *)l->data;
    gint value = gst_color_balance_get_value (GST_COLOR_BALANCE (pipeline), channel);
    g_print ("%s: %3d%% ", channel->label,
        100 * (value - channel->min_value) / (channel->max_value - channel->min_value));
  }
  g_print ("\n");
}
```

This method prints the current value for all channels, and exemplifies
how to retrieve the list of channels. This is accomplished through the
`gst_color_balance_list_channels()` method. It returns a `GList` which
needs to be traversed.

Each element in the list is a `GstColorBalanceChannel` structure,
holding the channel's name, minimum value and maximum value.
`gst_color_balance_get_value()` can then be called on each channel to
retrieve its current value.

In this example, the minimum and maximum values are used to output the
current value as a percentage.

``` c
/* Process a color balance command */
static void update_color_channel (const gchar *channel_name, gboolean increase, GstColorBalance *cb) {
  gdouble step;
  gint value;
  GstColorBalanceChannel *channel = NULL;
  const GList *channels, *l;

  /* Retrieve the list of channels and locate the requested one */
  channels = gst_color_balance_list_channels (cb);
  for (l = channels; l != NULL; l = l->next) {
    GstColorBalanceChannel *tmp = (GstColorBalanceChannel *)l->data;

    if (g_strrstr (tmp->label, channel_name)) {
      channel = tmp;
      break;
    }
  }
  if (!channel)
    return;
```

This method locates the specified channel by name and increases or
decreases it as requested. Again, the list of channels is retrieved and
scanned for the channel with the specified name. Obviously, this
list could be parsed only once, with the channel pointers stored
and indexed by something more efficient than a string.

``` c
  /* Change the channel's value */
  step = 0.1 * (channel->max_value - channel->min_value);
  value = gst_color_balance_get_value (cb, channel);
  if (increase) {
    value = (gint)(value + step);
    if (value > channel->max_value)
      value = channel->max_value;
  } else {
    value = (gint)(value - step);
    if (value < channel->min_value)
      value = channel->min_value;
  }
  gst_color_balance_set_value (cb, channel, value);
}
```

The current value for the channel is then retrieved, changed (the
increment is proportional to its dynamic range), clamped (to avoid
out-of-range values) and set using `gst_color_balance_set_value()`.

And there is not much more to it. Run the program and observe the effect
of changing each of the channels in real time.

# Conclusion

This tutorial has shown how to use the color balance interface.
In particular, it has shown:

- How to retrieve the list of available color balance channels
  with `gst_color_balance_list_channels()`
- How to manipulate the current value of each channel using
  `gst_color_balance_get_value()` and `gst_color_balance_set_value()`

It has been a pleasure having you here, and see you soon!

## Attachments:

![](images/icons/bullet_blue.gif)
[playback-tutorial-5.c](attachments/327804/2424874.c) (text/plain)
![](images/icons/bullet_blue.gif)
[vs2010.zip](attachments/327804/2424875.zip) (application/zip)

Document generated by Confluence on Oct 08, 2015 10:27
# GStreamer SDK documentation : Playback tutorial 6: Audio visualization

This page last changed on Jun 26, 2012 by xartigas.

# Goal

GStreamer comes with a set of elements that turn audio into video. They
can be used for scientific visualization or to spice up your music
player, for example. This tutorial shows:

- How to enable audio visualization
- How to select the visualization element

# Introduction

Enabling audio visualization in `playbin2` is actually very easy. Just
set the appropriate `playbin2` flag and, when an audio-only stream is
found, it will instantiate the necessary elements to create and display
the visualization.

If you want to specify the actual element to use to generate the
visualization, instantiate it yourself and then tell `playbin2` about
it through the `vis-plugin` property.

This tutorial searches the GStreamer registry for all the elements of
the Visualization class, tries to select `goom` (or another one if it is
not available) and passes it to `playbin2`.

# A fancy music player

Copy this code into a text file named `playback-tutorial-6.c`.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>This tutorial is included in the SDK since release 2012.7. If you cannot find it in the downloaded code, please install the latest release of the GStreamer SDK.</p></td>
</tr>
</tbody>
</table>

**playback-tutorial-6.c**
``` c
#include <gst/gst.h>

/* playbin2 flags */
typedef enum {
  GST_PLAY_FLAG_VIS = (1 << 3) /* Enable rendering of visualizations when there is no video stream. */
} GstPlayFlags;

/* Return TRUE if this is a Visualization element */
static gboolean filter_vis_features (GstPluginFeature *feature, gpointer data) {
  GstElementFactory *factory;

  if (!GST_IS_ELEMENT_FACTORY (feature))
    return FALSE;
  factory = GST_ELEMENT_FACTORY (feature);
  if (!g_strrstr (gst_element_factory_get_klass (factory), "Visualization"))
    return FALSE;

  return TRUE;
}

int main(int argc, char *argv[]) {
  GstElement *pipeline, *vis_plugin;
  GstBus *bus;
  GstMessage *msg;
  GList *list, *walk;
  GstElementFactory *selected_factory = NULL;
  guint flags;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Get a list of all visualization plugins */
  list = gst_registry_feature_filter (gst_registry_get_default (), filter_vis_features, FALSE, NULL);

  /* Print their names */
  g_print ("Available visualization plugins:\n");
  for (walk = list; walk != NULL; walk = g_list_next (walk)) {
    const gchar *name;
    GstElementFactory *factory;

    factory = GST_ELEMENT_FACTORY (walk->data);
    name = gst_element_factory_get_longname (factory);
    g_print (" %s\n", name);

    if (selected_factory == NULL || g_str_has_prefix (name, "GOOM")) {
      selected_factory = factory;
    }
  }

  /* Don't use the factory if it's still empty */
  /* e.g. no visualization plugins found */
  if (!selected_factory) {
    g_print ("No visualization plugins found!\n");
    return -1;
  }

  /* We have now selected a factory for the visualization element */
  g_print ("Selected '%s'\n", gst_element_factory_get_longname (selected_factory));
  vis_plugin = gst_element_factory_create (selected_factory, NULL);
  if (!vis_plugin)
    return -1;

  /* Build the pipeline */
  pipeline = gst_parse_launch ("playbin2 uri=http://radio.hbr1.com:19800/ambient.ogg", NULL);

  /* Set the visualization flag */
  g_object_get (pipeline, "flags", &flags, NULL);
  flags |= GST_PLAY_FLAG_VIS;
  g_object_set (pipeline, "flags", flags, NULL);

  /* set vis plugin for playbin2 */
  g_object_set (pipeline, "vis-plugin", vis_plugin, NULL);

  /* Start playing */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Wait until error or EOS */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

  /* Free resources */
  if (msg != NULL)
    gst_message_unref (msg);
  gst_plugin_feature_list_free (list);
  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><div id="expander-2064723206" class="expand-container">
<div id="expander-control-2064723206" class="expand-control">
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-2064723206" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc playback-tutorial-6.c -o playback-tutorial-6 `pkg-config --cflags --libs gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a>.</p>
<p>This tutorial plays music streamed from the <a href="http://www.hbr1.com/" class="external-link">HBR1</a> Internet radio station. A window should open displaying somewhat psychedelic color patterns moving with the music. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed.</p>
<p>Required libraries: <code>gstreamer-0.10</code></p>
</div>
</div></td>
</tr>
</tbody>
</table>

# Walkthrough

First off, we tell `playbin2` that we want an audio visualization by
setting the `GST_PLAY_FLAG_VIS` flag. If the media already contains
video, this flag has no effect.

``` c
/* Set the visualization flag */
g_object_get (pipeline, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIS;
g_object_set (pipeline, "flags", flags, NULL);
```

If no visualization plugin is enforced by the user, `playbin2` will use
`goom` (audio visualization will be disabled if `goom` is not
available). The rest of the tutorial shows how to find the available
visualization elements and enforce one on `playbin2`.

``` c
/* Get a list of all visualization plugins */
list = gst_registry_feature_filter (gst_registry_get_default (), filter_vis_features, FALSE, NULL);
```

`gst_registry_feature_filter()` examines all elements currently in the
GStreamer registry and selects those for which
the `filter_vis_features` function returns TRUE. This function selects
only the Visualization plugins:

``` c
/* Return TRUE if this is a Visualization element */
static gboolean filter_vis_features (GstPluginFeature *feature, gpointer data) {
  GstElementFactory *factory;

  if (!GST_IS_ELEMENT_FACTORY (feature))
    return FALSE;
  factory = GST_ELEMENT_FACTORY (feature);
  if (!g_strrstr (gst_element_factory_get_klass (factory), "Visualization"))
    return FALSE;

  return TRUE;
}
```

A bit of theory regarding the organization of GStreamer elements is in
order: each of the files that GStreamer loads at runtime is known as a
Plugin (`GstPlugin`). A Plugin can contain many Features
(`GstPluginFeature`). There are different kinds of Features; among them,
the Element Factories (`GstElementFactory`) that we have been using to
build Elements (`GstElement`).

This function simply disregards all Features which are not Factories,
and then all Factories whose class (obtained with
`gst_element_factory_get_klass()`) does not include “Visualization”. As
stated in the documentation for `GstElementFactory`, a Factory’s class
is a “string describing the type of element, as an unordered list
separated with slashes (/)”. Examples of classes are “Source/Network”,
“Codec/Decoder/Video”, “Codec/Encoder/Audio” or “Visualization”.

``` c
/* Print their names */
g_print ("Available visualization plugins:\n");
for (walk = list; walk != NULL; walk = g_list_next (walk)) {
  const gchar *name;
  GstElementFactory *factory;

  factory = GST_ELEMENT_FACTORY (walk->data);
  name = gst_element_factory_get_longname (factory);
  g_print (" %s\n", name);

  if (selected_factory == NULL || g_str_has_prefix (name, "GOOM")) {
    selected_factory = factory;
  }
}
```

Once we have the list of Visualization plugins, we print their names
(`gst_element_factory_get_longname()`) and choose one (in this case,
GOOM).

``` c
/* We have now selected a factory for the visualization element */
g_print ("Selected '%s'\n", gst_element_factory_get_longname (selected_factory));
vis_plugin = gst_element_factory_create (selected_factory, NULL);
if (!vis_plugin)
  return -1;
```

The selected factory is used to instantiate an actual `GstElement` which
is then passed to `playbin2` through the `vis-plugin` property:

``` c
/* set vis plugin for playbin2 */
g_object_set (pipeline, "vis-plugin", vis_plugin, NULL);
```

And we are done.

# Conclusion

This tutorial has shown:

- How to enable Audio Visualization in `playbin2` with the
  `GST_PLAY_FLAG_VIS` flag
- How to enforce one particular visualization element with the
  `vis-plugin` `playbin2` property

It has been a pleasure having you here, and see you soon!

## Attachments:

![](images/icons/bullet_blue.gif)
[vs2010.zip](attachments/327802/2424878.zip) (application/zip)
![](images/icons/bullet_blue.gif)
[playback-tutorial-6.c](attachments/327802/2424879.c) (text/plain)

# GStreamer SDK documentation : Playback tutorial 7: Custom playbin2 sinks

This page last changed on Dec 03, 2012 by xartigas.

# Goal

`playbin2` can be further customized by manually selecting its audio and
video sinks. This allows applications to rely on `playbin2` to retrieve
and decode the media and then manage the final render/display
themselves. This tutorial shows:

- How to replace the sinks selected by `playbin2`.
- How to use a complex pipeline as a sink.

# Introduction

Two properties of `playbin2` allow selecting the desired audio and video
sinks: `audio-sink` and `video-sink` (respectively). The application
only needs to instantiate the appropriate `GstElement` and pass it to
`playbin2` through these properties.

This method, though, only allows using a single Element as sink. If a
more complex pipeline is required, for example, an equalizer plus an
audio sink, it needs to be wrapped in a Bin, so it looks to
`playbin2` as if it was a single Element.

A Bin (`GstBin`) is a container that encapsulates partial pipelines so
they can be managed as single elements. As an example, the
`GstPipeline` we have been using in all tutorials is a type of
`GstBin`, which does not interact with external Elements. Elements
inside a Bin connect to external elements through Ghost Pads
(`GstGhostPad`), that is, Pads on the surface of the Bin which simply
forward data from an external Pad to a given Pad on an internal Element.

![](attachments/1441842/2424880.png)

**Figure 1:** A Bin with two Elements and one Ghost Pad.

`GstBin`s are also a type of `GstElement`, so they can be used wherever
an Element is required, in particular, as sinks for `playbin2` (and they
are then known as **sink-bins**).

# An equalized player

Copy this code into a text file named `playback-tutorial-7.c`.

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><p>This tutorial is included in the SDK since release 2012.7. If you cannot find it in the downloaded code, please install the latest release of the GStreamer SDK.</p></td>
</tr>
</tbody>
</table>

**playback-tutorial-7.c**
``` c
#include <gst/gst.h>

int main(int argc, char *argv[]) {
  GstElement *pipeline, *bin, *equalizer, *convert, *sink;
  GstPad *pad, *ghost_pad;
  GstBus *bus;
  GstMessage *msg;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Build the pipeline */
  pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);

  /* Create the elements inside the sink bin */
  equalizer = gst_element_factory_make ("equalizer-3bands", "equalizer");
  convert = gst_element_factory_make ("audioconvert", "convert");
  sink = gst_element_factory_make ("autoaudiosink", "audio_sink");
  if (!equalizer || !convert || !sink) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Create the sink bin, add the elements and link them */
  bin = gst_bin_new ("audio_sink_bin");
  gst_bin_add_many (GST_BIN (bin), equalizer, convert, sink, NULL);
  gst_element_link_many (equalizer, convert, sink, NULL);
  pad = gst_element_get_static_pad (equalizer, "sink");
  ghost_pad = gst_ghost_pad_new ("sink", pad);
  gst_pad_set_active (ghost_pad, TRUE);
  gst_element_add_pad (bin, ghost_pad);
  gst_object_unref (pad);

  /* Configure the equalizer */
  g_object_set (G_OBJECT (equalizer), "band1", (gdouble)-24.0, NULL);
  g_object_set (G_OBJECT (equalizer), "band2", (gdouble)-24.0, NULL);

  /* Set playbin2's audio sink to be our sink bin */
  g_object_set (GST_OBJECT (pipeline), "audio-sink", bin, NULL);

  /* Start playing */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Wait until error or EOS */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

  /* Free resources */
  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><div id="expander-1371267928" class="expand-container">
<div id="expander-control-1371267928" class="expand-control">
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-1371267928" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc playback-tutorial-7.c -o playback-tutorial-7 `pkg-config --cflags --libs gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a>.</p>
<p>This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. The higher frequency bands have been attenuated, so the movie sound should have a more powerful bass component.</p>
<p>Required libraries: <code>gstreamer-0.10</code></p>
</div>
</div></td>
</tr>
</tbody>
</table>

# Walkthrough
|
||||
|
||||
``` first-line: 15; theme: Default; brush: cpp; gutter: true
|
||||
/* Create the elements inside the sink bin */
|
||||
equalizer = gst_element_factory_make ("equalizer-3bands", "equalizer");
|
||||
convert = gst_element_factory_make ("audioconvert", "convert");
|
||||
sink = gst_element_factory_make ("autoaudiosink", "audio_sink");
|
||||
if (!equalizer || !convert || !sink) {
|
||||
g_printerr ("Not all elements could be created.\n");
|
||||
return -1;
|
||||
}
|
||||
```
|
||||
|
||||
All the Elements that compose our sink-bin are instantiated. We use an
|
||||
`equalizer-3bands` and an `autoaudiosink`, with an `audioconvert` in
|
||||
between, because we are not sure of the capabilities of the audio sink
|
||||
(since they are hardware-dependant).
|
||||
|
||||
``` first-line: 24; theme: Default; brush: cpp; gutter: true
|
||||
/* Create the sink bin, add the elements and link them */
|
||||
bin = gst_bin_new ("audio_sink_bin");
|
||||
gst_bin_add_many (GST_BIN (bin), equalizer, convert, sink, NULL);
|
||||
gst_element_link_many (equalizer, convert, sink, NULL);
|
||||
```
|
||||
|
||||
This adds the new Elements to the Bin and links them just as we would do
|
||||
if this was a pipeline.
|
||||
|
||||
``` first-line: 28; theme: Default; brush: cpp; gutter: true
|
||||
pad = gst_element_get_static_pad (equalizer, "sink");
|
||||
ghost_pad = gst_ghost_pad_new ("sink", pad);
|
||||
gst_pad_set_active (ghost_pad, TRUE);
|
||||
gst_element_add_pad (bin, ghost_pad);
|
||||
gst_object_unref (pad);
|
||||
```
|
||||
|
||||
Now we need to create a Ghost Pad so this partial pipeline inside the
|
||||
Bin can be connected to the outside. This Ghost Pad will be connected to
|
||||
a Pad in one of the internal Elements (the sink pad of the equalizer),
|
||||
so we retrieve this Pad with `gst_element_get_static_pad()`. Remember
|
||||
from [Basic tutorial 7: Multithreading and Pad
|
||||
Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html) that
|
||||
if this was a Request Pad instead of an Always Pad, we would need to use
|
||||
`gst_element_request_pad()`.
|
||||
|
||||
The Ghost Pad is created with `gst_ghost_pad_new()` (pointing to the
|
||||
inner Pad we just acquired), and activated with `gst_pad_set_active()`.
|
||||
It is then added to the Bin with `gst_element_add_pad()`, transferring
|
||||
ownership of the Ghost Pad to the bin, so we do not have to worry about
|
||||
releasing it.
|
||||
|
||||
Finally, the sink Pad we obtained from the equalizer needs to be release
|
||||
with `gst_object_unref()`.
|
||||
|
||||
At this point, we have a functional sink-bin, which we can use as the
|
||||
audio sink in `playbin2`. We just need to instruct `playbin2` to use it:
|
||||
|
||||
``` first-line: 38; theme: Default; brush: cpp; gutter: true
|
||||
/* Set playbin2's audio sink to be our sink bin */
|
||||
g_object_set (GST_OBJECT (pipeline), "audio-sink", bin, NULL);
|
||||
```
|
||||
|
||||
It is as simple as setting the `audio-sink` property on `playbin2` to
|
||||
the newly created sink.
|
||||
|
||||
``` first-line: 34; theme: Default; brush: cpp; gutter: true
|
||||
/* Configure the equalizer */
|
||||
g_object_set (G_OBJECT (equalizer), "band1", (gdouble)-24.0, NULL);
|
||||
g_object_set (G_OBJECT (equalizer), "band2", (gdouble)-24.0, NULL);
|
||||
```
|
||||
|
||||
The only bit remaining is to configure the equalizer. For this example,
|
||||
the two higher frequency bands are set to the maximum attenuation so the
|
||||
bass is boosted. Play a bit with the values to feel the difference (Look
|
||||
at the documentation for the `equalizer-3bands` element for the allowed
|
||||
range of values).
|
||||
|
||||
# Exercise
|
||||
|
||||
Build a video bin instead of an audio bin, using one of the many
|
||||
interesting video filters GStreamer offers, like `solarize`,
|
||||
`vertigotv` or any of the Elements in the `effectv` plugin. Remember to
|
||||
use the color space conversion element `ffmpegcolorspace` if your
|
||||
pipeline fails to link due to incompatible caps.
|
||||
|
||||
# Conclusion
|
||||
|
||||
This tutorial has shown:
|
||||
|
||||
- How to set your own sinks to `playbin2` using the audio-sink and
|
||||
video-sink properties.
|
||||
- How to wrap a piece of pipeline into a `GstBin` so it can be used as
|
||||
a **sink-bin** by `playbin2`.
|
||||
|
||||
It has been a pleasure having you here, and see you soon\!
|
||||
|
||||
## Attachments:

- [bin-element-ghost.png](attachments/1441842/2424880.png) (image/png)
- [playback-tutorial-7.c](attachments/1441842/2424881.c) (text/plain)
- [vs2010.zip](attachments/1441842/2424882.zip) (application/zip)

Document generated by Confluence on Oct 08, 2015 10:27

# GStreamer SDK documentation : Playback tutorial 8: Hardware-accelerated video decoding

This page last changed on Jul 24, 2012 by xartigas.

# Goal

Hardware-accelerated video decoding has rapidly become a necessity, as
low-power devices grow more common. This tutorial (more of a lecture,
actually) gives some background on hardware acceleration and explains
how GStreamer benefits from it.

Sneak peek: if properly set up, you do not need to do anything special
to activate hardware acceleration; GStreamer automatically takes
advantage of it.

# Introduction

Video decoding can be an extremely CPU-intensive task, especially for
higher resolutions like 1080p HDTV. Fortunately, modern graphics cards,
equipped with programmable GPUs, are able to take care of this job,
allowing the CPU to concentrate on other duties. Having dedicated
hardware becomes essential for low-power CPUs which are simply
incapable of decoding such media fast enough.

In the current state of things (July 2012) each GPU manufacturer offers
a different method to access their hardware (a different API), and a
strong industry standard has not emerged yet.

As of July 2012, there exist at least 8 different video decoding
acceleration APIs:

[VAAPI](http://en.wikipedia.org/wiki/Video_Acceleration_API) (*Video
Acceleration API*): Initially designed by
[Intel](http://en.wikipedia.org/wiki/Intel) in 2007, targeted at the X
Window System on Unix-based operating systems, now open-source. It is
currently not limited to Intel GPUs, as other manufacturers are free to
use this API, for example, [Imagination
Technologies](http://en.wikipedia.org/wiki/Imagination_Technologies) or
[S3 Graphics](http://en.wikipedia.org/wiki/S3_Graphics). Accessible to
GStreamer through
the [gstreamer-vaapi](http://gitorious.org/vaapi/gstreamer-vaapi) and
[Fluendo](http://en.wikipedia.org/wiki/Fluendo)’s Video Acceleration
Decoder (fluvadec) plugins.

[VDPAU](http://en.wikipedia.org/wiki/VDPAU) (*Video Decode and
Presentation API for UNIX*): Initially designed by
[NVidia](http://en.wikipedia.org/wiki/NVidia) in 2008, targeted at the
X Window System on Unix-based operating systems, now open-source.
Although it is also an open-source library, no manufacturer other than
NVidia is using it yet. Accessible to GStreamer through
the [vdpau](http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/sys/vdpau) element
in plugins-bad and [Fluendo](http://en.wikipedia.org/wiki/Fluendo)’s
Video Acceleration Decoder (fluvadec) plugins.

[DXVA](http://en.wikipedia.org/wiki/DXVA) (*DirectX Video
Acceleration*): [Microsoft](http://en.wikipedia.org/wiki/Microsoft) API
specification for the Microsoft Windows and Xbox 360 platforms.
Accessible to GStreamer through
[Fluendo](http://en.wikipedia.org/wiki/Fluendo)’s Video Acceleration
Decoder (fluvadec) plugin.

[XVBA](http://en.wikipedia.org/wiki/Xvba) (*X-Video Bitstream
Acceleration*): Designed by [AMD
Graphics](http://en.wikipedia.org/wiki/AMD_Graphics), it is an
arbitrary extension of the X video extension (Xv) for the X Window
System on Linux operating systems. Currently, only AMD ATI Radeon
graphics hardware with support for Unified Video Decoder version 2.0 or
later is supported, through the proprietary ATI Catalyst device driver.
Accessible to GStreamer through
[Fluendo](http://en.wikipedia.org/wiki/Fluendo)’s Video Acceleration
Decoder (fluvadec) plugin.

[VDA](http://developer.apple.com/library/mac/#technotes/tn2267/_index.html)
(*Video Decode Acceleration*): Available on [Mac OS
X](http://en.wikipedia.org/wiki/OS_X) v10.6.3 and later with Mac models
equipped with the NVIDIA GeForce 9400M, GeForce 320M, GeForce GT 330M,
ATI HD Radeon GFX, Intel HD Graphics and others. Only accelerates
decoding of H.264 media. Accessible to GStreamer through
[Fluendo](http://en.wikipedia.org/wiki/Fluendo)’s Video Acceleration
Decoder (fluvadec) plugin.

[OpenMAX](http://en.wikipedia.org/wiki/OpenMAX) (*Open Media
Acceleration*): Managed by the non-profit technology consortium
[Khronos Group](http://en.wikipedia.org/wiki/Khronos_Group "Khronos Group"),
it is a "royalty-free, cross-platform set of C-language programming
interfaces that provides abstractions for routines especially useful
for audio, video, and still images". Accessible to GStreamer through
the [gstreamer-omx](http://git.freedesktop.org/gstreamer/gst-omx) plugin.

[OVD](http://developer.amd.com/sdks/AMDAPPSDK/assets/OpenVideo_Decode_API.PDF)
(*Open Video Decode*): Another API from [AMD
Graphics](http://en.wikipedia.org/wiki/AMD_Graphics), designed to be a
platform-agnostic method for software developers to leverage the
[Universal Video
Decode](http://en.wikipedia.org/wiki/Unified_Video_Decoder) (UVD)
hardware inside AMD Radeon graphics cards. Currently unavailable to
GStreamer.

[DCE](http://en.wikipedia.org/wiki/Distributed_Codec_Engine)
(*Distributed Codec Engine*): An open source software library
("libdce") and API specification by [Texas
Instruments](http://en.wikipedia.org/wiki/Texas_Instruments), targeted
at Linux systems and ARM platforms. Accessible to GStreamer through
the [gstreamer-ducati](https://github.com/robclark/gst-ducati) plugin.

There exist some GStreamer plugins, like the
[gstreamer-vaapi](http://gitorious.org/vaapi/gstreamer-vaapi) project
or the
[vdpau](http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/sys/vdpau)
element in plugins-bad, which target one particular hardware
acceleration API and expose its functionality through different
GStreamer elements. The application is then responsible for selecting
the appropriate plugin depending on the available APIs.

Some other GStreamer plugins, like
[Fluendo](http://en.wikipedia.org/wiki/Fluendo)’s Video Acceleration
Decoder (fluvadec), detect at runtime the available APIs and select one
automatically. This makes any program using these plugins independent
of the API, or even the operating system.

# Inner workings of hardware-accelerated video decoding plugins

These APIs generally offer a number of functionalities, like video
decoding, post-processing, presentation of the decoded frames, or
download of such frames to system memory. Correspondingly, plugins
generally offer a different GStreamer element for each of these
functions, so pipelines can be built to accommodate any need.

For example, the `gstreamer-vaapi` plugin offers the `vaapidecode`,
`vaapiupload`, `vaapidownload` and `vaapisink` elements that allow
hardware-accelerated decoding through VAAPI, upload of raw video frames
to GPU memory, download of GPU frames to system memory and presentation
of GPU frames, respectively.

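For instance, a fully accelerated decode-and-display path built from
these elements could be sketched as the following `gst-launch`
pipeline. This is a hypothetical example: it assumes a VAAPI-capable
system with `gstreamer-vaapi` installed, and the demuxer depends on the
actual container format:

```
gst-launch-0.10 filesrc location=movie.mp4 ! qtdemux ! vaapidecode ! vaapisink
```

Here the decoded frames stay in GPU memory all the way from
`vaapidecode` to `vaapisink`, so no costly download to system memory
takes place.
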
It is important to distinguish between conventional GStreamer frames,
which reside in system memory, and frames generated by
hardware-accelerated APIs. The latter reside in GPU memory and cannot
be touched by GStreamer. They can usually be downloaded to system
memory and treated as conventional GStreamer frames, but it is far more
efficient to leave them in the GPU and display them from there.

GStreamer needs to keep track of where these “hardware buffers” are,
though, so conventional buffers still travel from element to element,
but their only content is a hardware buffer ID, or handle. If retrieved
with an `appsink`, for example, hardware buffers make no sense, since
they are meant to be handled only by the plugin that generated them.

To indicate this, these buffers have special Caps, like
`video/x-vdpau-output` or `video/x-fluendo-va`. In this way, the
auto-plugging mechanism of GStreamer will not try to feed hardware
buffers to conventional elements, as they would not understand the
received buffers. Moreover, using these Caps, the auto-plugger is able
to automatically build pipelines that use hardware acceleration, since,
after a VAAPI decoder, a VAAPI sink is the only element that fits.

This all means that, if a particular hardware acceleration API is
present in the system, and the corresponding GStreamer plugin is also
available, auto-plugging elements like `playbin2` are free to use
hardware acceleration to build their pipelines; the application does
not need to do anything special to enable it. Almost:

When `playbin2` has to choose among different equally valid elements,
like conventional software decoding (through `vp8dec`, for example) or
hardware-accelerated decoding (through `vaapidecode`, for example), it
uses their *rank* to decide. The rank is a property of each element
that indicates its priority; `playbin2` will simply select the element
that is able to build a complete pipeline and has the highest rank.

So, whether `playbin2` uses hardware acceleration or not depends on the
relative ranks of all elements capable of dealing with that media type.
Therefore, the easiest way to make sure hardware acceleration is
enabled or disabled is to change the rank of the associated element, as
shown in this code:

``` c
static void enable_factory (const gchar *name, gboolean enable) {
  GstRegistry *registry = NULL;
  GstElementFactory *factory = NULL;

  registry = gst_registry_get_default ();
  if (!registry) return;

  factory = gst_element_factory_find (name);
  if (!factory) return;

  if (enable) {
    gst_plugin_feature_set_rank (GST_PLUGIN_FEATURE (factory), GST_RANK_PRIMARY + 1);
  }
  else {
    gst_plugin_feature_set_rank (GST_PLUGIN_FEATURE (factory), GST_RANK_NONE);
  }

  gst_registry_add_feature (registry, GST_PLUGIN_FEATURE (factory));
  return;
}
```

The first parameter passed to this method is the name of the element to
modify, for example, `vaapidecode` or `fluvadec`.

The key method is `gst_plugin_feature_set_rank()`, which will set the
rank of the requested element factory to the desired level. For
convenience, ranks are divided into NONE, MARGINAL, SECONDARY and
PRIMARY, but any number will do. When enabling an element, we set it to
PRIMARY+1, so it has a higher rank than the rest of the elements, which
commonly have PRIMARY rank. Setting an element’s rank to NONE prevents
the auto-plugging mechanism from ever selecting it.

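As a usage sketch (hedged: the element names are examples, and a
GStreamer 0.10 runtime with those elements installed is assumed), an
application could call the `enable_factory()` function above right
after `gst_init()`, before creating `playbin2`:

```c
int main (int argc, char *argv[]) {
  gst_init (&argc, &argv);

  /* Prefer hardware decoding: rank vaapidecode above every PRIMARY element */
  enable_factory ("vaapidecode", TRUE);
  /* Example of the opposite: never auto-plug this software decoder */
  enable_factory ("vp8dec", FALSE);

  /* ... create playbin2 and run the pipeline as usual ... */
  return 0;
}
```

Since `playbin2` consults the registry when it builds its pipeline,
these calls must happen before the pipeline starts to preroll.
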
# Hardware-accelerated video decoding and the GStreamer SDK

There are no plugins deployed in the GStreamer SDK Amazon 2012.7 that
allow hardware-accelerated video decoding. The main reasons are that
some of them are not yet fully operational, still have issues, or are
proprietary. Bear in mind that this situation is bound to change in the
near future, as this is a very active area of development.

Some of these plugins can be built from their publicly available
sources, using the Cerbero build system (see [Installing on
Linux](Installing%2Bon%2BLinux.html)) or independently (linking against
the GStreamer SDK libraries, obviously). Some other plugins are readily
available in binary form from their vendors.

The following sections try to summarize the current state of some of
these plugins.

### vdpau in gst-plugins-bad

  - GStreamer element for VDPAU, present in
    [gst-plugins-bad](http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/sys/vdpau).
  - Supported codecs:

<table>
<thead>
<tr class="header">
<th>MPEG2</th>
<th>MPEG4</th>
<th>H.264</th>
</tr>
</thead>
<tbody>
</tbody>
</table>

### gstreamer-vaapi

  - GStreamer element for VAAPI. Standalone project hosted at
    [gstreamer-vaapi](http://gitorious.org/vaapi/gstreamer-vaapi).
  - Supported codecs:

<table>
<thead>
<tr class="header">
<th>MPEG2</th>
<th>MPEG4</th>
<th>H.264</th>
<th>VC1</th>
<th>WMV3</th>
</tr>
</thead>
<tbody>
</tbody>
</table>

  - Can interface directly with Clutter (see [Basic tutorial 15: Clutter
    integration](Basic%2Btutorial%2B15%253A%2BClutter%2Bintegration.html)),
    so frames do not need to leave the GPU.
  - Compatible with `playbin2`.

### gst-omx

  - GStreamer element for OpenMAX. Standalone project hosted at
    [gst-omx](http://git.freedesktop.org/gstreamer/gst-omx/).
  - Supported codecs greatly vary depending on the underlying hardware.

### fluvadec

  - GStreamer element for VAAPI, VDPAU, DXVA2, XVBA and VDA from
    [Fluendo](http://en.wikipedia.org/wiki/Fluendo) (proprietary).
  - Supported codecs depend on the chosen API, which is selected at
    runtime depending on what is available on the system:

<table>
<thead>
<tr class="header">
<th> </th>
<th>MPEG2</th>
<th>MPEG4</th>
<th>H.264</th>
<th>VC1</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td>VAAPI</td>
<td><span>✓</span></td>
<td><span>✓</span></td>
<td><span>✓</span></td>
<td><span>✓</span></td>
</tr>
<tr class="even">
<td>VDPAU</td>
<td><span>✓</span></td>
<td><span>✓</span></td>
<td><span>✓</span></td>
<td><span>✓</span></td>
</tr>
<tr class="odd">
<td>XVBA</td>
<td> </td>
<td> </td>
<td><span>✓</span></td>
<td><span>✓</span></td>
</tr>
<tr class="even">
<td>DXVA2</td>
<td> </td>
<td> </td>
<td><span>✓</span></td>
<td> </td>
</tr>
<tr class="odd">
<td>VDA</td>
<td> </td>
<td> </td>
<td><span>✓</span></td>
<td> </td>
</tr>
</tbody>
</table>

  - Can interface directly with Clutter (see [Basic tutorial 15: Clutter
    integration](Basic%2Btutorial%2B15%253A%2BClutter%2Bintegration.html)),
    so frames do not need to leave the GPU.
  - Compatible with `playbin2`.

# Conclusion

This tutorial has shown a bit of how GStreamer internally manages
hardware-accelerated video decoding. In particular:

  - Applications do not need to do anything special to enable hardware
    acceleration if a suitable API and the corresponding GStreamer
    plugin are available.
  - Hardware acceleration can be enabled or disabled by changing the
    rank of the decoding element with `gst_plugin_feature_set_rank()`.

It has been a pleasure having you here, and see you soon\!

Document generated by Confluence on Oct 08, 2015 10:27

# GStreamer SDK documentation : Playback tutorial 9: Digital audio pass-through

This page last changed on Jul 24, 2012 by xartigas.

# Goal

This tutorial shows how GStreamer handles digital audio pass-through.

# Introduction

Besides the common analog format, high-end audio systems usually also
accept data in digital form, either compressed or uncompressed. This is
convenient because the audio signal then travels from the computer to
the speakers in a form that is more resilient to interference and
noise, resulting in higher quality.

The connection is typically made through an
[S/PDIF](http://en.wikipedia.org/wiki/SPDIF) cable, which can either be
optical (with [TOSLINK](http://en.wikipedia.org/wiki/TOSLINK)
connectors) or coaxial (with [RCA](http://en.wikipedia.org/wiki/RCA)
connectors). S/PDIF is also known as IEC 60958 type II (IEC 958 before
1998).

In this scenario, GStreamer does not need to perform audio decoding; it
can simply output the encoded data, acting in *pass-through* mode, and
let the external audio system perform the decoding.

# Inner workings of GStreamer audio sinks

First off, digital audio output must be enabled at the system level.
The method to achieve this depends on the operating system, but it
generally involves going to the audio control panel and activating a
checkbox reading “Digital Audio Output” or similar.

The main GStreamer audio sinks for each platform, Pulse Audio
(`pulsesink`) for Linux, `osxaudiosink` for OS X and Direct Sound
(`directsoundsink`) for Windows, detect when digital audio output is
available and change their input caps accordingly to accept encoded
data. For example, these elements typically accept `audio/x-raw-int`
or `audio/x-raw-float` data: when digital audio output is enabled in
the system, they may also accept `audio/mpeg`, `audio/x-ac3`,
`audio/x-eac3` or `audio/x-dts`.

Then, when `playbin2` builds the decoding pipeline, it realizes that
the audio sink can be directly connected to the encoded data (typically
coming out of a demuxer), so there is no need for a decoder. This
process is automatic and does not need any action from the application.

On Linux, there exist other audio sinks, like ALSA (`alsasink`), which
work differently (a “digital device” needs to be manually selected
through the `device` property of the sink). Pulse Audio, though, is the
commonly preferred audio sink on Linux.

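For illustration, selecting a digital device with `alsasink` could look
like the following sketch. The device string is a hypothetical example;
actual IEC958 device names vary per sound card and can be listed with
`aplay -L`:

```c
/* Route audio to an ALSA S/PDIF device instead of the default one.
 * "iec958:CARD=PCH,DEV=0" is an example name; query your system for real ones. */
GstElement *sink = gst_element_factory_make ("alsasink", "spdif-sink");
g_object_set (sink, "device", "iec958:CARD=PCH,DEV=0", NULL);
g_object_set (pipeline, "audio-sink", sink, NULL); /* pipeline is a playbin2 */
```
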
# Precautions with digital formats

When Digital Audio Output is enabled at the system level, the GStreamer
audio sinks automatically expose all possible digital audio caps,
regardless of whether the actual audio decoder at the end of the S/PDIF
cable is able to decode all those formats. This is so because there is
no mechanism to query an external audio decoder for the formats it
supports, and, in fact, the cable can even be disconnected during this
process.

For example, after enabling Digital Audio Output in the system’s
Control Panel, `directsoundsink` will automatically expose
`audio/x-ac3`, `audio/x-eac3` and `audio/x-dts` caps in addition to
`audio/x-raw-int`. However, one particular external decoder might only
understand raw integer streams and would try to play the compressed
data as such (a painful experience for your ears, rest assured).

Solving this issue requires user intervention, since only the user
knows the formats supported by the external decoder.

On some systems, the simplest solution is to inform the operating
system of the formats that the external audio decoder can accept. In
this way, the GStreamer audio sinks will only offer these formats. The
acceptable audio formats are commonly selected from the operating
system’s audio configuration panel, from the same place where Digital
Audio Output is enabled, but, unfortunately, this option is not
available in all audio drivers.

Another solution involves using a custom sink-bin (see [Playback
tutorial 7: Custom playbin2
sinks](Playback%2Btutorial%2B7%253A%2BCustom%2Bplaybin2%2Bsinks.html))
which includes a `capsfilter` element (see [Basic tutorial 14: Handy
elements](Basic%2Btutorial%2B14%253A%2BHandy%2Belements.html)) and an
audio sink. The caps that the external decoder supports are then set in
the capsfilter, so the wrong format is never output. This allows the
application to enforce the appropriate format instead of relying on the
user to have the system correctly configured. This still requires user
intervention, but can be used regardless of the options the audio
driver offers.

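A minimal sketch of such a sink-bin follows (assumptions: GStreamer
0.10 API, `pulsesink` as the audio sink, and an external decoder that
only accepts AC-3 and raw integer audio; the caps string is an
illustrative example, not a recommendation):

```c
GstElement *bin    = gst_bin_new ("audio-sink-bin");
GstElement *filter = gst_element_factory_make ("capsfilter", "format-filter");
GstElement *sink   = gst_element_factory_make ("pulsesink", "audio-sink");
GstCaps *caps;
GstPad *pad;

/* Only offer upstream the formats the external decoder understands */
caps = gst_caps_from_string ("audio/x-ac3; audio/x-raw-int");
g_object_set (filter, "caps", caps, NULL);
gst_caps_unref (caps);

gst_bin_add_many (GST_BIN (bin), filter, sink, NULL);
gst_element_link (filter, sink);

/* Ghost the filter's sink pad so playbin2 can link to the bin */
pad = gst_element_get_static_pad (filter, "sink");
gst_element_add_pad (bin, gst_ghost_pad_new ("sink", pad));
gst_object_unref (pad);

g_object_set (pipeline, "audio-sink", bin, NULL); /* pipeline is a playbin2 */
```
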
Please do not use `autoaudiosink` as the audio sink, as it currently
only supports raw audio, and will ignore any compressed format.

# Conclusion

This tutorial has shown a bit of how GStreamer deals with digital
audio. In particular, it has shown that:

  - Applications using `playbin2` do not need to do anything special to
    enable digital audio output: it is managed from the audio control
    panel of the operating system.

It has been a pleasure having you here, and see you soon\!

Document generated by Confluence on Oct 08, 2015 10:27

# GStreamer SDK documentation : Playback tutorials

This page last changed on Mar 28, 2012 by xartigas.

# Welcome to the GStreamer SDK Playback tutorials

These tutorials explain everything you need to know to produce a media
playback application using GStreamer.

Document generated by Confluence on Oct 08, 2015 10:27

# GStreamer SDK documentation : Qt tutorials

This page last changed on May 02, 2013 by tdfischer.

# Welcome to the GStreamer SDK Qt tutorials

These tutorials describe Qt-specific topics. General GStreamer concepts
will not be explained in these tutorials, so the [Basic
tutorials](http://docs.gstreamer.com/display/GstSDK/Basic+tutorials) should
be reviewed first. The reader should also be familiar with basic Qt
programming techniques.

The Qt tutorials have the same structure as the [Android
tutorials](Android%2Btutorials.html): each one builds on top of the
previous one and adds progressively more functionality, until a working
media player application is obtained in \#FIXME\#

Document generated by Confluence on Oct 08, 2015 10:27

# GStreamer SDK documentation : QtGStreamer vs C GStreamer

This page last changed on May 24, 2013 by xartigas.

QtGStreamer is designed to mirror the C GStreamer API as closely as
possible. There are, of course, minor differences. They are documented
here.

# Common Functions

<table>
<colgroup>
<col width="50%" />
<col width="50%" />
</colgroup>
<thead>
<tr class="header">
<th>C GStreamer</th>
<th>QtGStreamer</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><code>gst_element_factory_make()</code></td>
<td><code>QGst::ElementFactory::make(const QString &factoryName, const char *elementName=NULL)</code></td>
</tr>
<tr class="even">
<td><code>gst_parse_bin_from_description()</code></td>
<td><code>QGst::Bin::fromDescription(const QString &description, BinFromDescriptionOption ghostUnlinkedPads=Ghost)</code></td>
</tr>
<tr class="odd">
<td><code>gst_caps_from_string()</code></td>
<td><p><code>QGst::Caps::fromString(const QString &string)</code></p></td>
</tr>
<tr class="even">
<td><code>g_signal_connect()</code></td>
<td><code>QGlib::connect(GObject* instance, const char *detailedSignal, T *receiver, R(T::*)(Args...) slot, ConnectFlags flags)</code></td>
</tr>
</tbody>
</table>

# Naming Convention

QtGStreamer follows a strict naming policy to help make
cross-referencing easier:

### Namespaces

The "G" namespace (`GObject`, `GValue`, etc.) is referred to as
"QGlib".

The "Gst" namespace (`GstObject`, `GstElement`, etc.) is referred to
as "QGst".

### Class Names

Class names should be the same as their G\* equivalents, with the
namespace prefix removed. For example, "`GstObject`" becomes
"`QGst::Object`", "`GParamSpec`" becomes "`QGlib::ParamSpec`", etc.

### Method Names

In general, the method names should be the same as the GStreamer ones,
with the g\[st\]\_\<class\> prefix removed and converted to camel case.

For example,

``` c
gboolean gst_caps_is_empty(const GstCaps *caps);
```

becomes:

``` cpp
namespace QGst {
  class Caps {
    bool isEmpty() const;
  };
}
```

There are cases where this may not be followed:

1.  **Properties**. Most property getters have a "get" prefix, for
    example, `gst_object_get_name()`. In QtGStreamer the "get" prefix
    is omitted, so this becomes just `name()`.
2.  **Overloaded members**. In C there is no possibility to have two
    methods with the same name, so overloaded members usually have some
    extra suffix, like "\_full". For example, `g_object_set_data()` and
    `g_object_set_data_full()`. In C++ we just add a method with the
    same name, or put optional parameters in the existing method.
3.  **Other cases where the glib/gstreamer method name doesn't make
    much sense**. For example, `gst_element_is_locked_state()`. That
    doesn't make sense in English, as "state" is the subject and should
    go before the verb "is". So, it becomes `stateIsLocked()`.

# Reference Counting

Reference counting is handled the same way as Qt does it. There is no
need to call `g_object_ref()` and `g_object_unref()`.

# Access to GStreamer Elements

QtGStreamer provides access to the underlying C objects, in case you
need them. This is accessible with a simple cast:

``` cpp
ElementPtr qgstElement = QGst::ElementFactory::make("playbin2");
GstElement* gstElement = GST_ELEMENT(qgstElement);
```

Document generated by Confluence on Oct 08, 2015 10:27

# GStreamer SDK documentation : Releases

This page last changed on Jun 12, 2013 by xartigas.

The GStreamer SDK releases are named using a date and a name. The date
is in year.month form and corresponds to the release date. The name is
meant to be easier to remember and matches an actual river’s name, in
alphabetical order, increasing with every release.

The current release is:

### [2013.6 Congo](2013.6%2BCongo.html)

The older releases were (oldest at the bottom):

### [2012.11 Brahmaputra](2012.11%2BBrahmaputra.html)

### [2012.9 Amazon (Bugfix Release 2)](2012.9%2BAmazon%2B%2528Bugfix%2BRelease%2B2%2529.html)

### [2012.7 Amazon (Bugfix Release 1)](2012.7%2BAmazon%2B%2528Bugfix%2BRelease%2B1%2529.html)

### [2012.5 Amazon](2012.5%2BAmazon.html)

Document generated by Confluence on Oct 08, 2015 10:27

# GStreamer SDK documentation : Table of Concepts

This page last changed on Jun 06, 2012 by xartigas.

This table shows in which tutorial each of the following key GStreamer
|
||||
concepts is discussed.
|
||||
|
||||
- Action signals: [Playback tutorial 1: Playbin2
|
||||
usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html)
|
||||
- Audio switching: [Playback tutorial 1: Playbin2
|
||||
usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html)
|
||||
- Buffers: [Basic tutorial 8: Short-cutting the
|
||||
pipeline](Basic%2Btutorial%2B8%253A%2BShort-cutting%2Bthe%2Bpipeline.html)
|
||||
- Bus: [Basic tutorial 2: GStreamer
|
||||
concepts](Basic%2Btutorial%2B2%253A%2BGStreamer%2Bconcepts.html)
|
||||
- Capabilities: [Basic tutorial 6: Media formats and Pad
|
||||
Capabilities](Basic%2Btutorial%2B6%253A%2BMedia%2Bformats%2Band%2BPad%2BCapabilities.html)
|
||||
- Debugging: [Basic tutorial 11: Debugging
|
||||
tools](Basic%2Btutorial%2B11%253A%2BDebugging%2Btools.html)
|
||||
- Discoverer: [Basic tutorial 9: Media information
|
||||
gathering](Basic%2Btutorial%2B9%253A%2BMedia%2Binformation%2Bgathering.html)

- Elements: [Basic tutorial 2: GStreamer concepts](Basic%2Btutorial%2B2%253A%2BGStreamer%2Bconcepts.html)
- gst-discoverer: [Basic tutorial 10: GStreamer tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html)
- gst-inspect: [Basic tutorial 10: GStreamer tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html), [gst-inspect](gst-inspect.html)
- gst-launch: [Basic tutorial 10: GStreamer tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html), [gst-launch](gst-launch.html)
- GUI: [Basic tutorial 5: GUI toolkit integration](Basic%2Btutorial%2B5%253A%2BGUI%2Btoolkit%2Bintegration.html)
- Links: [Basic tutorial 2: GStreamer concepts](Basic%2Btutorial%2B2%253A%2BGStreamer%2Bconcepts.html)
- Pads: [Basic tutorial 3: Dynamic pipelines](Basic%2Btutorial%2B3%253A%2BDynamic%2Bpipelines.html)
- Pad Availability: [Basic tutorial 7: Multithreading and Pad Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html)
- Pipelines: [Basic tutorial 2: GStreamer concepts](Basic%2Btutorial%2B2%253A%2BGStreamer%2Bconcepts.html)
- Queries: [Basic tutorial 4: Time management](Basic%2Btutorial%2B4%253A%2BTime%2Bmanagement.html)
- Seeks: [Basic tutorial 4: Time management](Basic%2Btutorial%2B4%253A%2BTime%2Bmanagement.html)
- Signals: [Basic tutorial 3: Dynamic pipelines](Basic%2Btutorial%2B3%253A%2BDynamic%2Bpipelines.html)
- States: [Basic tutorial 3: Dynamic pipelines](Basic%2Btutorial%2B3%253A%2BDynamic%2Bpipelines.html)
- Subtitles: [Playback tutorial 2: Subtitle management](Playback%2Btutorial%2B2%253A%2BSubtitle%2Bmanagement.html)
- Tags: [Playback tutorial 1: Playbin2 usage](Playback%2Btutorial%2B1%253A%2BPlaybin2%2Busage.html)
- Tools: [Basic tutorial 10: GStreamer tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html)
- Threads: [Basic tutorial 7: Multithreading and Pad Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html)

Document generated by Confluence on Oct 08, 2015 10:27

93
Tutorials.markdown
Normal file

@ -0,0 +1,93 @@
# GStreamer SDK documentation : Tutorials

This page last changed on Jun 12, 2013 by xartigas.

# Welcome to the GStreamer SDK Tutorials!

The following sections introduce a series of tutorials designed to help
you learn how to use GStreamer, the multi-platform, modular,
open-source media streaming framework.

### Prerequisites

Before following these tutorials, you need to set up your development
environment according to your platform. If you have not done so yet,
follow the appropriate link
for [Linux](Installing%2Bon%2BLinux.html), [Mac OS
X](Installing%2Bon%2BMac%2BOS%2BX.html) or [Windows](Installing%2Bon%2BWindows.html) and
come back here afterwards.

The tutorials are currently written only in the C programming language,
so you need to be comfortable with it. Even though C is not an
Object-Oriented (OO) language per se, the GStreamer framework uses
`GObject`s, so some knowledge of OO concepts will come in handy.
Knowledge of the `GObject` and `GLib` libraries is not mandatory, but
will make the trip easier.

### Source code

Every tutorial represents a self-contained project, with full source
code in C (and eventually in other languages too). Source code snippets
are introduced alongside the text, and the full code (with any other
required files like makefiles or project files) is distributed with the
SDK, as explained in the installation instructions.

### A short note on GObject and GLib

GStreamer is built on top of the `GObject` (for object orientation) and
`GLib` (for common algorithms) libraries, which means that every now and
then you will have to call functions of these libraries. Even though the
tutorials make sure that deep knowledge of these libraries is not
required, familiarity with them will certainly ease the process of
learning GStreamer.

You can always tell which library you are calling, because all GStreamer
functions, structures and types have the `gst_` prefix, whereas GLib and
GObject use `g_`.

### Sources of documentation

Besides what you can find on this site, you have
the `GObject` and `GLib` reference guides, and, of course, the
upstream [GStreamer
documentation](http://gstreamer.freedesktop.org/documentation/).

When accessing external documentation, be careful to check that the
version of the documented library matches the versions used in this SDK
(which you can find on the [Releases](Releases.html) page).

### Structure

The tutorials are organized in sections, each revolving around a common
theme:

- [Basic tutorials](Basic%2Btutorials.html): Describe general topics
  required to understand the rest of the tutorials in the GStreamer SDK.
- [Playback tutorials](Playback%2Btutorials.html): Explain everything
  you need to know to produce a media playback application using
  GStreamer.
- [Android tutorials](Android%2Btutorials.html): Tutorials dealing
  with the few Android-specific topics you need to know.
- [iOS tutorials](iOS%2Btutorials.html): Tutorials dealing with the
  few iOS-specific topics you need to know.

If you cannot remember in which tutorial a certain GStreamer concept is
explained, use the following:

- [Table of Concepts](Table%2Bof%2BConcepts.html)

Furthermore, you can find the list of planned tutorials here:

- [Upcoming tutorials](Upcoming%2Btutorials.html)

### Sample media

The audio and video clips used throughout these tutorials are all
publicly available, and the copyright remains with their respective
authors. In some cases they have been re-encoded for demonstration
purposes. All the files are hosted
at [docs.gstreamer.com](http://docs.gstreamer.com).

- [Sintel, the Durian Open Movie Project](http://www.sintel.org/)

21
Upcoming+tutorials.markdown
Normal file

@ -0,0 +1,21 @@
# GStreamer SDK documentation : Upcoming tutorials

This page last changed on May 24, 2013 by xartigas.

This is the list of planned tutorials (in no particular order). They
will move out of this list and into their own pages as they are
finished. Use the [Contact](Contact.html) page to send us a mail or open
a feature request through Bugzilla if there is a subject you would like
to see discussed in a tutorial.

Basic tutorials:

- Static builds
- Pad Probes

Playback tutorials:

- DVD playback

246
Using+appsink%2Fappsrc+in+Qt.markdown
Normal file

@ -0,0 +1,246 @@
# GStreamer SDK documentation : Using appsink/appsrc in Qt

This page last changed on May 24, 2013 by xartigas.

# Goal

For those times when you need to stream data into or out of GStreamer
through your application, GStreamer includes two helpful elements:

- `appsink` - Allows applications to easily extract data from a
  GStreamer pipeline
- `appsrc` - Allows applications to easily stream data into a
  GStreamer pipeline

This tutorial demonstrates how to use both of them by constructing a
pipeline that decodes an audio file, streams it into an application's
code, and then streams it back into your audio output device, all using
QtGStreamer.

# Steps

First, the files. These are also available in the
`examples/appsink-src` directory of the QtGStreamer SDK.

**CMakeLists.txt**

``` cmake
project(qtgst-example-appsink-src)
find_package(QtGStreamer REQUIRED)
find_package(Qt4 REQUIRED)
include_directories(${QTGSTREAMER_INCLUDES} ${QT_QTCORE_INCLUDE_DIRS})
add_definitions(${QTGSTREAMER_DEFINITIONS})
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${QTGSTREAMER_FLAGS}")
add_executable(appsink-src main.cpp)
target_link_libraries(appsink-src ${QTGSTREAMER_UTILS_LIBRARIES} ${QT_QTCORE_LIBRARIES})
```

**main.cpp**

``` cpp
#include <iostream>
#include <QtCore/QCoreApplication>
#include <QtCore/QDebug>  /* for qCritical() */
#include <QGlib/Error>
#include <QGlib/Connect>
#include <QGst/Init>
#include <QGst/Bus>
#include <QGst/Pipeline>
#include <QGst/Parse>
#include <QGst/Message>
#include <QGst/Utils/ApplicationSink>
#include <QGst/Utils/ApplicationSource>

class MySink : public QGst::Utils::ApplicationSink
{
public:
    MySink(QGst::Utils::ApplicationSource *src)
        : QGst::Utils::ApplicationSink(), m_src(src) {}

protected:
    /* Called when the source pipeline reaches end-of-stream:
     * propagate the EOS to the appsrc of the second pipeline */
    virtual void eos()
    {
        m_src->endOfStream();
    }

    /* Called whenever a new buffer is ready in the appsink:
     * pull it and push it into the appsrc */
    virtual QGst::FlowReturn newBuffer()
    {
        m_src->pushBuffer(pullBuffer());
        return QGst::FlowOk;
    }

private:
    QGst::Utils::ApplicationSource *m_src;
};

class Player : public QCoreApplication
{
public:
    Player(int argc, char **argv);
    ~Player();

private:
    void onBusMessage(const QGst::MessagePtr & message);

    QGst::Utils::ApplicationSource m_src;
    MySink m_sink;
    QGst::PipelinePtr pipeline1;
    QGst::PipelinePtr pipeline2;
};

Player::Player(int argc, char **argv)
    : QCoreApplication(argc, argv), m_sink(&m_src)
{
    QGst::init(&argc, &argv);

    if (argc <= 1) {
        std::cerr << "Usage: " << argv[0] << " <audio_file>" << std::endl;
        std::exit(1);
    }

    const char *caps = "audio/x-raw-int,channels=1,rate=8000,"
                       "signed=(boolean)true,width=16,depth=16,endianness=1234";

    /* source pipeline */
    QString pipe1Descr = QString("filesrc location=\"%1\" ! "
                                 "decodebin2 ! "
                                 "audioconvert ! "
                                 "audioresample ! "
                                 "appsink name=\"mysink\" caps=\"%2\"").arg(argv[1], caps);
    pipeline1 = QGst::Parse::launch(pipe1Descr).dynamicCast<QGst::Pipeline>();
    m_sink.setElement(pipeline1->getElementByName("mysink"));
    QGlib::connect(pipeline1->bus(), "message::error", this, &Player::onBusMessage);
    pipeline1->bus()->addSignalWatch();

    /* sink pipeline */
    QString pipe2Descr = QString("appsrc name=\"mysrc\" caps=\"%1\" ! autoaudiosink").arg(caps);
    pipeline2 = QGst::Parse::launch(pipe2Descr).dynamicCast<QGst::Pipeline>();
    m_src.setElement(pipeline2->getElementByName("mysrc"));
    QGlib::connect(pipeline2->bus(), "message", this, &Player::onBusMessage);
    pipeline2->bus()->addSignalWatch();

    /* start playing */
    pipeline1->setState(QGst::StatePlaying);
    pipeline2->setState(QGst::StatePlaying);
}

Player::~Player()
{
    pipeline1->setState(QGst::StateNull);
    pipeline2->setState(QGst::StateNull);
}

void Player::onBusMessage(const QGst::MessagePtr & message)
{
    switch (message->type()) {
    case QGst::MessageEos:
        quit();
        break;
    case QGst::MessageError:
        qCritical() << message.staticCast<QGst::ErrorMessage>()->error();
        break;
    default:
        break;
    }
}

int main(int argc, char **argv)
{
    Player p(argc, argv);
    return p.exec();
}
```

## Walkthrough

As this is a very simple example, most of the action happens in the
`Player`'s constructor. First, GStreamer is initialized through
`QGst::init()`:

**GStreamer Initialization**

``` cpp
QGst::init(&argc, &argv);
```

Now we can construct the first half of the pipeline:

**Pipeline Setup**

``` cpp
const char *caps = "audio/x-raw-int,channels=1,rate=8000,"
                   "signed=(boolean)true,width=16,depth=16,endianness=1234";

/* source pipeline */
QString pipe1Descr = QString("filesrc location=\"%1\" ! "
                             "decodebin2 ! "
                             "audioconvert ! "
                             "audioresample ! "
                             "appsink name=\"mysink\" caps=\"%2\"").arg(argv[1], caps);
pipeline1 = QGst::Parse::launch(pipe1Descr).dynamicCast<QGst::Pipeline>();
m_sink.setElement(pipeline1->getElementByName("mysink"));
QGlib::connect(pipeline1->bus(), "message::error", this, &Player::onBusMessage);
pipeline1->bus()->addSignalWatch();
```

`QGst::Parse::launch()` parses the text description of a pipeline and
returns a `QGst::PipelinePtr`. In this case, the pipeline is composed
of:

- A `filesrc` element to read the file
- `decodebin2` to automatically examine the stream and pick the right
  decoder(s)
- `audioconvert` and `audioresample` to convert the output of the
  `decodebin2` into the caps specified for the `appsink`
- An `appsink` element with specific caps

Next, we tell our `MySink` class (which is a subclass
of `QGst::Utils::ApplicationSink`) which `appsink` element to use.

The second half of the pipeline is created similarly:

**Second Pipeline**

``` cpp
/* sink pipeline */
QString pipe2Descr = QString("appsrc name=\"mysrc\" caps=\"%1\" ! autoaudiosink").arg(caps);
pipeline2 = QGst::Parse::launch(pipe2Descr).dynamicCast<QGst::Pipeline>();
m_src.setElement(pipeline2->getElementByName("mysrc"));
QGlib::connect(pipeline2->bus(), "message", this, &Player::onBusMessage);
pipeline2->bus()->addSignalWatch();
```

Finally, the pipelines are started:

**Starting the pipelines**

``` cpp
/* start playing */
pipeline1->setState(QGst::StatePlaying);
pipeline2->setState(QGst::StatePlaying);
```

Once the pipelines are started, the first one begins pushing buffers
into the `appsink` element. Our `MySink` class implements the
`newBuffer()` method, which QtGStreamer calls when a new buffer is
ready for processing:

**MySink::newBuffer()**

``` cpp
virtual QGst::FlowReturn newBuffer()
{
    m_src->pushBuffer(pullBuffer());
    return QGst::FlowOk;
}
```

Our implementation takes the new buffer and pushes it into the
`appsrc` element, which was assigned in the `Player` constructor:

**Player::Player()**

``` cpp
Player::Player(int argc, char **argv)
    : QCoreApplication(argc, argv), m_sink(&m_src)
```

From there, buffers flow into the `autoaudiosink` element, which
automatically figures out a way to send them to your speakers.

# Conclusion

You should now have an understanding of how to push and pull arbitrary
data into and out of a GStreamer pipeline.

It has been a pleasure having you here, and see you soon!

261
Windows+deployment.markdown
Normal file

@ -0,0 +1,261 @@
# GStreamer SDK documentation : Windows deployment

This page last changed on Nov 28, 2012 by xartigas.

This page explains how to deploy GStreamer along with your application.
There are different mechanisms, which have been reviewed in [Deploying
your application](Deploying%2Byour%2Bapplication.html). The details for
some of the mechanisms are given here, and more options might be added
to this documentation in the future.

# Shared GStreamer

This is the easiest way to deploy GStreamer, although most of the time
it installs unnecessary files, which grows the size of the installer and
the free space required on the target drive. Since the SDK may be shared
among all applications that use it, though, the extra space requirement
is somewhat mitigated.

Simply pack the GStreamer SDK **runtime** installer ([the same one you
installed on your development machine](Installing%2Bon%2BWindows.html))
inside your installer (or download it from your installer) and execute
it silently using `msiexec`. `msiexec` is the tool that wraps most of
the Windows Installer functionality and offers a number of options to
suit your needs. You can review these options by
executing `msiexec` without parameters. For example:

```
msiexec /i gstreamer-sdk-2012.9-x86.msi
```

This will bring up the installation dialog as if the user had
double-clicked on the `msi` file. Usually, you will want to let the user
choose where to install the SDK. An environment variable will
let your application locate it later on.

# Private deployment of GStreamer

You can use the same method as for the shared SDK, but instruct its
installer to deploy to your application's folder (or a
subfolder). Again, use the `msiexec` parameters that suit you best. For
example:

```
msiexec /passive INSTALLDIR=C:\Desired\Folder /i gstreamer-sdk-2012.9-x86.msi
```

This will install the SDK to `C:\Desired\Folder`, showing a progress
dialog but not requiring user intervention.

# Deploy only necessary files, by manually picking them

At the other end of the spectrum, if you want to reduce the space
requirements (and installer size) to the minimum, you can manually
choose which GStreamer libraries to deploy. Unfortunately, you are
largely on your own here; the [Dependency
Walker](http://www.dependencywalker.com/) tool can at least help you
discover inter-DLL dependencies.

Bear in mind that GStreamer is modular in nature. Plug-ins are loaded
depending on the media that is being played, so, if you do not know in
advance what files you are going to play, you do not know which DLLs you
need to deploy.

# Deploy only necessary packages, using provided Merge Modules

If you are building your installer using one of the Professional
editions of [Visual
Studio](http://www.microsoft.com/visualstudio/en-us/products/2010-editions/professional/overview)
or [WiX](http://wix.sf.net), you can take advantage of pre-packaged
[Merge
Modules](http://msdn.microsoft.com/en-us/library/windows/desktop/aa369820\(v=vs.85\).aspx).
The GStreamer SDK is divided into packages, each roughly taking care of
a different task. There is the core package, the playback package, the
networking package, etc. Each package contains the necessary libraries
and files to accomplish its task.

The Merge Modules are pieces that can be put together to build a larger
Windows Installer. In this case, you just need to create a deployment
project for your application with Visual Studio and then add the Merge
Modules for the GStreamer packages your application needs.
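
With WiX, for instance, pulling merge modules into an installer looks roughly like the following sketch. The `Id`s, `.msm` file names and directory reference are hypothetical, not the SDK's actual ones; the structure (a `Merge` element per module, referenced from a `Feature` via `MergeRef`) is standard WiX v3 authoring:

``` xml
<!-- Hypothetical WiX v3 fragment: merge the GStreamer core and playback
     modules into the application's install directory -->
<DirectoryRef Id="INSTALLDIR">
  <Merge Id="GStreamerCore" SourceFile="gstreamer-core.msm"
         Language="1033" DiskId="1" />
  <Merge Id="GStreamerPlayback" SourceFile="gstreamer-playback.msm"
         Language="1033" DiskId="1" />
</DirectoryRef>

<Feature Id="MainApplication" Title="My Application" Level="1">
  <MergeRef Id="GStreamerCore" />
  <MergeRef Id="GStreamerPlayback" />
</Feature>
```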

This will produce a smaller installer than deploying the complete
GStreamer SDK, without the added burden of having to manually pick each
library. You just need to know which packages your application requires.

Available packages:

<table>
<colgroup>
<col width="25%" />
<col width="25%" />
<col width="25%" />
<col width="25%" />
</colgroup>
<thead>
<tr class="header">
<th>Package name</th>
<th>Dependencies</th>
<th>Licenses</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td>base-system</td>
<td> </td>
<td>JPEG, FreeType, BSD-like, LGPL,<br />
LGPL-2+, LGPL-2.1, LibPNG and MIT</td>
<td>Base system dependencies</td>
</tr>
<tr class="even">
<td>gobject-python</td>
<td>base-system</td>
<td>LGPL</td>
<td>GLib/GObject Python bindings</td>
</tr>
<tr class="odd">
<td>gstreamer-capture</td>
<td>base-system, gstreamer-core</td>
<td>LGPL and LGPL-2+</td>
<td>GStreamer plugins for capture</td>
</tr>
<tr class="even">
<td>gstreamer-clutter</td>
<td>base-system, gtk+-2.0, gstreamer-core</td>
<td>LGPL</td>
<td>GStreamer Clutter support</td>
</tr>
<tr class="odd">
<td>gstreamer-codecs</td>
<td>base-system, gstreamer-core</td>
<td>BSD, Jasper-2.0, BSD-like, LGPL,<br />
LGPL-2, LGPL-2+, LGPL-2.1 and LGPL-2.1+</td>
<td>GStreamer codecs</td>
</tr>
<tr class="even">
<td>gstreamer-codecs-gpl</td>
<td>base-system, gstreamer-core</td>
<td>BSD-like, LGPL, LGPL-2+ and LGPL-2.1+</td>
<td>GStreamer codecs under the GPL license and/or with patent issues</td>
</tr>
<tr class="odd">
<td>gstreamer-core</td>
<td>base-system</td>
<td>LGPL and LGPL-2+</td>
<td>GStreamer core</td>
</tr>
<tr class="even">
<td>gstreamer-dvd</td>
<td>base-system, gstreamer-core</td>
<td>GPL-2+, LGPL and LGPL-2+</td>
<td>GStreamer DVD support</td>
</tr>
<tr class="odd">
<td>gstreamer-effects</td>
<td>base-system, gstreamer-core</td>
<td>LGPL and LGPL-2+</td>
<td>GStreamer effects and instrumentation plugins</td>
</tr>
<tr class="even">
<td>gstreamer-networking</td>
<td>base-system, gstreamer-core</td>
<td>GPL-3, LGPL, LGPL-2+, LGPL-2.1+<br />
and LGPL-3+</td>
<td>GStreamer plugins for network protocols</td>
</tr>
<tr class="odd">
<td>gstreamer-playback</td>
<td>base-system, gstreamer-core</td>
<td>LGPL and LGPL-2+</td>
<td>GStreamer plugins for playback</td>
</tr>
<tr class="even">
<td>gstreamer-python</td>
<td>base-system, gobject-python,<br />
gstreamer-core</td>
<td>LGPL and LGPL-2.1+</td>
<td>GStreamer Python bindings</td>
</tr>
<tr class="odd">
<td>gstreamer-system</td>
<td>base-system, gstreamer-core</td>
<td>LGPL, LGPL-2+ and LGPL-2.1+</td>
<td>GStreamer system plugins</td>
</tr>
<tr class="even">
<td>gstreamer-tutorials</td>
<td> </td>
<td>LGPL</td>
<td>Tutorials for GStreamer</td>
</tr>
<tr class="odd">
<td>gstreamer-visualizers</td>
<td>base-system, gstreamer-core</td>
<td>LGPL and LGPL-2+</td>
<td>GStreamer visualization plugins</td>
</tr>
<tr class="even">
<td>gtk+-2.0</td>
<td>base-system</td>
<td>LGPL</td>
<td>Gtk toolkit</td>
</tr>
<tr class="odd">
<td>gtk+-2.0-python</td>
<td>base-system, gtk+-2.0</td>
<td>LGPL and LGPL-2.1+</td>
<td>Gtk Python bindings</td>
</tr>
<tr class="even">
<td>snappy</td>
<td><p>base-system, gstreamer-clutter,<br />
gtk+-2.0, gstreamer-playback,<br />
gstreamer-core, gstreamer-codecs</p></td>
<td>LGPL</td>
<td>Snappy media player</td>
</tr>
</tbody>
</table>

If you include a merge module in your deployment project, remember to
also include its dependencies. Otherwise, the project will build
correctly and install flawlessly, but your application will be missing
files when it runs.

Get the ZIP file with all Merge Modules for your architecture:

<table>
<colgroup>
<col width="100%" />
</colgroup>
<thead>
<tr class="header">
<th>32 bits</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td><p><a href="http://cdn.gstreamer.com/windows/x86/gstreamer-sdk-x86-2013.6-merge-modules.zip" class="external-link">GStreamer SDK 2013.6 (Congo) for Windows 32 bits (Merge Modules)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/windows/x86/gstreamer-sdk-x86-2013.6-merge-modules.zip" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/windows/x86/gstreamer-sdk-x86-2013.6-merge-modules.zip.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/windows/x86/gstreamer-sdk-x86-2013.6-merge-modules.zip.sha1" class="external-link">sha1</a></p></td>
</tr>
<tr class="even">
<td><span style="color: rgb(0,0,0);">64 bits</span></td>
</tr>
<tr class="odd">
<td><a href="http://cdn.gstreamer.com/windows/x86-64/gstreamer-sdk-x86_64-2013.6-merge-modules.zip" class="external-link">GStreamer SDK 2013.6 (Congo) for Windows 64 bits (Merge Modules)</a> - <a href="http://www.freedesktop.org/software/gstreamer-sdk/data/packages/windows/x86-64/gstreamer-sdk-x86_64-2013.6-merge-modules.zip" class="external-link">mirror</a> - <a href="http://cdn.gstreamer.com/windows/x86-64/gstreamer-sdk-x86_64-2013.6-merge-modules.zip.md5" class="external-link">md5</a> - <a href="http://cdn.gstreamer.com/windows/x86-64/gstreamer-sdk-x86_64-2013.6-merge-modules.zip.sha1" class="external-link">sha1</a></td>
</tr>
</tbody>
</table>

<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>Due to the size of these files, usage of a <a href="http://en.wikipedia.org/wiki/Download_manager" class="external-link">Download Manager</a> is <strong>highly recommended</strong>. Take a look at <a href="http://en.wikipedia.org/wiki/Comparison_of_download_managers" class="external-link">this list</a> if you do not have one installed. If, after downloading, the installer reports itself as corrupt, chances are that the connection ended before the file was complete. A Download Manager will typically re-start the process and fetch the missing parts.</p></td>
</tr>
</tbody>
</table>

(Binary image attachments added under `attachments/`; contents not shown.)