diff --git a/docs/manual/advanced-autoplugging.xml b/docs/manual/advanced-autoplugging.xml
index 177e5f8ffb..474a66fdfc 100644
--- a/docs/manual/advanced-autoplugging.xml
+++ b/docs/manual/advanced-autoplugging.xml
@@ -20,32 +20,32 @@
without needing any adaptations to its autopluggers.
- We will first introduce the concept of MIME types as a dynamic and
+ We will first introduce the concept of media types as a dynamic and
extendible way of identifying media streams. After that, we will introduce
the concept of typefinding to find the type of a media stream. Lastly,
we will explain how autoplugging and the &GStreamer; registry can be
- used to setup a pipeline that will convert media from one mimetype to
+ used to set up a pipeline that will convert media from one media type to
another, for example for media decoding.
-
- MIME-types as a way to identify streams
+
+ Media types as a way to identify streams
We have previously introduced the concept of capabilities as a way
for elements (or, rather, pads) to agree on a media type when
streaming data from one element to the next (see ). We have explained that a capability is
- a combination of a mimetype and a set of properties. For most
+ a combination of a media type and a set of properties. For most
container formats (those are the files that you will find on your
hard disk; Ogg, for example, is a container format), no properties
- are needed to describe the stream. Only a MIME-type is needed. A
- full list of MIME-types and accompanying properties can be found
+ are needed to describe the stream. Only a media type is needed. A
+ full list of media types and accompanying properties can be found
in the
Plugin Writer's Guide.
- An element must associate a MIME-type to its source and sink pads
+ An element must associate a media type to its source and sink pads
when it is loaded into the system. &GStreamer; knows about the
different elements and what type of data they expect and emit through
the &GStreamer; registry. This allows for very dynamic and extensible
@@ -54,14 +54,14 @@
In , we've learned to build a
- music player for Ogg/Vorbis files. Let's look at the MIME-types
+ music player for Ogg/Vorbis files. Let's look at the media types
associated with each pad in this pipeline. <xref
- linkend="section-mime-img"/> shows what MIME-type belongs to each
+ linkend="section-mime-img"/> shows what media type belongs to each
pad in this pipeline.
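
As an editorial illustration (not part of the original manual text), the short C sketch below shows how an application can inspect the media type a pad is currently streaming; the pad variable and the surrounding, already-negotiated pipeline are assumptions of this example.

    #include <gst/gst.h>

    /* Print the media type a pad is currently streaming. This assumes the
     * pipeline containing the pad has already negotiated caps (e.g. it is
     * in the PAUSED or PLAYING state). */
    static void
    print_pad_media_type (GstPad * pad)
    {
      GstCaps *caps = gst_pad_get_current_caps (pad);

      if (caps != NULL) {
        const GstStructure *s = gst_caps_get_structure (caps, 0);

        g_print ("pad %s streams %s\n", GST_PAD_NAME (pad),
            gst_structure_get_name (s));
        gst_caps_unref (caps);
      } else {
        g_print ("pad %s has no caps yet\n", GST_PAD_NAME (pad));
      }
    }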
diff --git a/docs/manual/advanced-interfaces.xml b/docs/manual/advanced-interfaces.xml
index 1fa58975c5..ef41f4caee 100644
--- a/docs/manual/advanced-interfaces.xml
+++ b/docs/manual/advanced-interfaces.xml
@@ -48,78 +48,6 @@
-
- The Mixer interface
-
-
- The mixer interface provides a uniform way to control the volume on a
- hardware (or software) mixer. The interface is primarily intended to
- be implemented by elements for audio inputs and outputs that talk
- directly to the hardware (e.g. OSS or ALSA plugins).
-
-
- Using this interface, it is possible to control a list of tracks
- (such as Line-in, Microphone, etc.) from a mixer element. They can
- be muted, their volume can be changed and, for input tracks, their
- record flag can be set as well.
-
-
- Example plugins implementing this interface include the OSS elements
- (osssrc, osssink, ossmixer) and the ALSA plugins (alsasrc, alsasink
- and alsamixer).
-
-
- You should not use this interface for volume control in a playback
- application. Either use a volume element or use
- playbin's volume property, or use
- the audiosink's volume property (if it has one).
-
-
-
- In order for the GstMixer
- interface to be
- usable, the element implementing it needs to be in the right state,
- so that the underlying mixer device is open. This usually means the
- element needs to be at least in GST_STATE_READY
- before you can use this interface. You will get confusing warnings
- if the element is not in the right state when the interface is used.
-
-
-
-
-
- The Tuner interface
-
-
- The tuner interface is a uniform way to control inputs and outputs
- on a multi-input selection device. This is primarily used for input
- selection on elements for TV- and capture-cards.
-
-
- Using this interface, it is possible to select one track from a list
- of tracks supported by that tuner-element. The tuner will then select
- that track for media-processing internally. This can, for example, be
- used to switch inputs on a TV-card (e.g. from Composite to S-video).
-
-
- This interface is currently only implemented by the Video4linux and
- Video4linux2 elements.
-
-
-
- In order for the GstTuner
- interface to be
- usable, the element implementing it needs to be in the right state,
- so that the underlying device is open. This usually means the
- element needs to be at least in GST_STATE_READY
- before you can use this interface. You will get confusing warnings
- if the element is not in the right state when the interface is used.
-
-
-
-
The Color Balance interface
@@ -132,45 +60,23 @@
The colorbalance interface is implemented by several plugins, including
- xvimagesink and the Video4linux and Video4linux2 elements.
+ xvimagesink and the Video4linux2 elements.
-
- The Property Probe interface
+
+ The Video Overlay interface
- The property probe is a way to autodetect allowed values for a
- GObject property. It's primary use is to autodetect
- devices in several elements. For example, the OSS elements use this
- interface to detect all OSS devices on a system. Applications can then
- probe this property and get a list of detected devices.
-
-
-
- Given the overlap between HAL and the practical implementations of this
- interface, this might in time be deprecated in favour of HAL.
-
-
-
- This interface is currently implemented by many elements, including
- the ALSA, OSS, XVideo, Video4linux and Video4linux2 elements.
-
-
-
-
- The X Overlay interface
-
-
- The X Overlay interface was created to solve the problem of embedding
+ The Video Overlay interface was created to solve the problem of embedding
video streams in an application window. The application provides an
- X-window to the element implementing this interface to draw on, and
- the element will then use this X-window to draw on rather than creating
+ identifier of the window (a window handle) to the element implementing
+ this interface, and the element will then draw into that window rather than creating
a new toplevel window. This is useful to embed video in video players.
- This interface is implemented by, amongst others, the Video4linux and
- Video4linux2 elements and by ximagesink, xvimagesink and sdlvideosink.
+ This interface is implemented by, amongst others, the Video4linux2
+ elements and by ximagesink, xvimagesink and sdlvideosink.
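
As a hedged sketch (not original manual text) of how an application typically uses this interface, the fragment below installs a synchronous bus handler and hands a native window handle to whichever element posts the prepare-window-handle message; window_handle is an assumed variable filled in by the GUI toolkit (e.g. an X11 window ID).

    #include <gst/gst.h>
    #include <gst/video/videooverlay.h>

    static guintptr window_handle;  /* obtained from the GUI toolkit */

    static GstBusSyncReply
    bus_sync_handler (GstBus * bus, GstMessage * message, gpointer user_data)
    {
      /* Only react to the prepare-window-handle message */
      if (!gst_is_video_overlay_prepare_window_handle_message (message))
        return GST_BUS_PASS;

      /* Tell the video sink to render into our window */
      gst_video_overlay_set_window_handle (
          GST_VIDEO_OVERLAY (GST_MESSAGE_SRC (message)), window_handle);

      gst_message_unref (message);
      return GST_BUS_DROP;
    }

The handler would be installed with gst_bus_set_sync_handler (bus, bus_sync_handler, NULL, NULL) before setting the pipeline to PLAYING.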
diff --git a/docs/manual/advanced-threads.xml b/docs/manual/advanced-threads.xml
index 218606e8df..cabdcf5479 100644
--- a/docs/manual/advanced-threads.xml
+++ b/docs/manual/advanced-threads.xml
@@ -80,16 +80,14 @@
Scheduling in &GStreamer;
- Scheduling of pipelines in &GStreamer; is done by using a thread for
- each group, where a group is a set of elements separated
- by queue elements. Within such a group, scheduling is
- either push-based or pull-based, depending on which mode is supported
- by the particular element. If elements support random access to data,
- such as file sources, then elements downstream in the pipeline become
- the entry point of this group (i.e. the element controlling the
- scheduling of other elements). The entry point pulls data from upstream
- and pushes data downstream, thereby calling data handling functions on
- either type of element.
+ Each element in a &GStreamer; pipeline decides how it is going to
+ be scheduled. Elements can choose to be scheduled in push-based or
+ pull-based mode.
+ If an element supports random access to its data, such as a file source,
+ then a downstream element in the pipeline can ask to schedule that
+ element in pull-based mode. Data is then pulled from upstream
+ and pushed downstream. If pull-based mode is not supported, the element can
+ decide to operate in push-based mode.
In practice, most elements in &GStreamer;, such as decoders, encoders,
diff --git a/docs/manual/appendix-checklist.xml b/docs/manual/appendix-checklist.xml
index f6e01436f3..8022a5c9bf 100644
--- a/docs/manual/appendix-checklist.xml
+++ b/docs/manual/appendix-checklist.xml
@@ -74,7 +74,7 @@
For example, would turn
on debugging for the Ogg demuxer element. You can use wildcards as
well. A debugging level of 0 will turn off all debugging, and a level
- of 5 will turn on all debugging. Intermediate values only turn on
+ of 9 will turn on all debugging. Intermediate values only turn on
some debugging (based on message severity; 2, for example, will only
display errors and warnings). Here's a list of all available options:
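
As a side note added for illustration, the same debugging thresholds can also be set programmatically from C with the core debug API; a minimal sketch, where the "oggdemux" category name is just an example:

    #include <gst/gst.h>

    int
    main (int argc, char *argv[])
    {
      gst_init (&argc, &argv);

      /* Roughly equivalent to a global debug level of 2: errors and warnings */
      gst_debug_set_default_threshold (GST_LEVEL_WARNING);

      /* Raise the level for a single category, similar to oggdemux:5 */
      gst_debug_set_threshold_for_name ("oggdemux", GST_LEVEL_DEBUG);

      /* ... build and run a pipeline here ... */

      return 0;
    }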
@@ -187,13 +187,5 @@
-
- GstEditor
-
- GstEditor is a set of widgets to display a graphical representation of a
- pipeline.
-
-
-
diff --git a/docs/manual/appendix-integration.xml b/docs/manual/appendix-integration.xml
index c55edb8f56..007f614775 100644
--- a/docs/manual/appendix-integration.xml
+++ b/docs/manual/appendix-integration.xml
@@ -23,16 +23,15 @@
For audio input and output, &GStreamer; provides input and
output elements for several audio subsystems. Amongst others,
- &GStreamer; includes elements for ALSA (alsasrc, alsamixer,
- alsasink), OSS (osssrc, ossmixer, osssink) and Sun audio
- (sunaudiosrc, sunaudiomixer, sunaudiosink).
+ &GStreamer; includes elements for ALSA (alsasrc,
+ alsasink), OSS (osssrc, osssink), PulseAudio (pulsesrc, pulsesink)
+ and Sun audio (sunaudiosrc, sunaudiomixer, sunaudiosink).
For video input, &GStreamer; contains source elements for
- Video4linux (v4lsrc, v4lmjpegsrc, v4lelement and v4lmjpegisnk)
- and Video4linux2 (v4l2src, v4l2element).
+ Video4linux2 (v4l2src, v4l2element, v4l2sink).
diff --git a/docs/manual/appendix-porting.xml b/docs/manual/appendix-porting.xml
index fe2c8cb410..107db3e591 100644
--- a/docs/manual/appendix-porting.xml
+++ b/docs/manual/appendix-porting.xml
@@ -128,3 +128,178 @@
+
+ Porting 0.10 applications to 1.0
+
+ This section of the appendix briefly discusses the changes needed
+ to quickly and conveniently port most
+ applications from &GStreamer;-0.10 to &GStreamer;-1.0, with references
+ to the relevant sections in this Application Development Manual
+ where needed. With this list, it should be possible to port simple
+ applications to &GStreamer;-1.0 in less than a day.
+
+
+
+ List of changes
+
+
+
+ All deprecated methods were removed. Recompile against 0.10 with
+ DISABLE_DEPRECATED and fix issues before attempting to port to 1.0.
+
+
+
+
+ "playbin2" has been renamed to "playbin", with similar API
+
+
+
+
+ "decodebin2" has been renamed to "decodebin", with similar API. Note
+ that there is no longer a "new-decoded-pad" signal; just use GstElement's
+ "pad-added" signal instead (but don't forget to remove the 'gboolean last'
+ argument from your old signal callback function signature). A minimal
+ "pad-added" callback sketch is included after this list.
+
+
+
+
+ The names of some "formatted" pad templates have been changed from e.g.
+ "src%d" to "src%u" or "src_%u" or similar, since we don't want to see
+ negative numbers in pad names. This mostly affects applications that
+ create request pads from elements.
+
+
+
+
+ Some elements that used to have a single dynamic source pad now have an
+ always source pad. Examples: wavparse, id3demux, icydemux, apedemux.
+ (This does not affect applications using decodebin or playbin).
+
+
+
+
+ playbin now proxies the GstVideoOverlay (former GstXOverlay) interface,
+ so most applications can just remove the sync bus handler where they
+ would set the window ID, and instead just set the window ID on playbin
+ from the application thread before starting playback.
+
+
+ playbin also proxies the GstColorBalance and GstNavigation interfaces,
+ so applications that use these interfaces no longer need to go fishing for
+ elements that may implement them, but can just use them unconditionally.
+
+
+
+
+ For multifdsink, tcpclientsink, tcpclientsrc and tcpserversrc, the protocol
+ property has been removed; use gdppay and gdpdepay instead.
+
+
+
+
+ XML serialization was removed.
+
+
+
+
+ Probes and pad blocking were merged into the new pad probe API (see the
+ probe sketch after this list).
+
+
+
+
+ Position, duration and convert functions no longer use an inout parameter
+ for the destination format; see the query example after this list.
+
+
+
+
+ Video and audio caps were simplified. audio/x-raw-int and audio/x-raw-float
+ are now all under the audio/x-raw media type. Similarly, video/x-raw-rgb
+ and video/x-raw-yuv are now video/x-raw (see the caps sketch after this list).
+
+
+
+
+ ffmpegcolorspace was removed and replaced with videoconvert.
+
+
+
+
+ GstMixerInterface / GstTunerInterface were removed without replacement.
+
+
+
+
+ The GstXOverlay interface was renamed to GstVideoOverlay, and is now part
+ of the video library in gst-plugins-base, as the interfaces library
+ no longer exists.
+
+
+ The name of the GstXOverlay "prepare-xwindow-id" message has changed
+ to "prepare-window-handle" (and GstXOverlay has been renamed to
+ GstVideoOverlay). Code that checks for the string directly should be
+ changed to use gst_is_video_overlay_prepare_window_handle_message(message)
+ instead.
+
+
+
+
+ The GstPropertyProbe interface was removed. There is no replacement yet,
+ but a more featureful replacement for device discovery and feature
+ querying is planned, see https://bugzilla.gnome.org/show_bug.cgi?id=678402
+
+
+
+
+ gst_uri_handler_get_uri() and the get_uri vfunc now return a copy of
+ the URI string.
+
+
+ gst_uri_handler_set_uri() and the set_uri vfunc now take an additional
+ GError argument so the handler can notify the caller why it didn't
+ accept a particular URI.
+
+
+ gst_uri_handler_set_uri() now checks if the protocol of the URI passed
+ is one of the protocols advertised by the uri handler, so set_uri vfunc
+ implementations no longer need to check that as well.
+
+
+
+
+ GstTagList is now an opaque mini object instead of being typedefed to a
+ GstStructure. While it was previously okay (and in some cases required because of
+ missing taglist API) to cast a GstTagList to a GstStructure or use
+ gst_structure_* API on taglists, you can no longer do that. Doing so will
+ cause crashes.
+
+
+ Also, tag lists are refcounted now, and can therefore not be freely
+ modified any longer. Make sure to call gst_tag_list_make_writable (taglist)
+ before adding, removing or changing tags in the taglist (a short example
+ follows this list).
+
+
+ GST_TAG_IMAGE, GST_TAG_PREVIEW_IMAGE, GST_TAG_ATTACHMENT: many tags that
+ used to be of type GstBuffer are now of type GstSample (which is basically
+ a struct containing a buffer alongside caps and some other info).
+
+
+
+
+ GstController has now been merged into GstObject. It no longer exists as an
+ individual object. In addition, the core now contains a GstControlSource base
+ class and the GstControlBinding. The actual control sources are in the controller
+ library as before. The second big change is that control sources generate
+ a sequence of gdouble values and those are mapped to the property type and
+ value range by GstControlBindings.
+
+
+ The whole gst_controller_* API is gone; its functionality is now available in
+ simplified form under gst_object_*. Control sources are now attached to
+ properties via a GstControlBinding, and GValue arguments are no longer used
+ when programming control sources. A short control-binding sketch follows
+ this list.
+
+
+
+
+
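
The sketches below were added for illustration and are not part of the original porting notes. First, the 1.0-style "pad-added" callback mentioned in the decodebin item; the sink element passed as user data is an assumption of this example.

    #include <gst/gst.h>

    /* 1.0-style "pad-added" callback: no 'gboolean last' argument anymore */
    static void
    on_pad_added (GstElement * decodebin, GstPad * new_pad, gpointer user_data)
    {
      GstElement *sink = GST_ELEMENT (user_data);
      GstPad *sinkpad = gst_element_get_static_pad (sink, "sink");

      if (!gst_pad_is_linked (sinkpad))
        gst_pad_link (new_pad, sinkpad);

      gst_object_unref (sinkpad);
    }

    /* connected with:
     *   g_signal_connect (decodebin, "pad-added",
     *       G_CALLBACK (on_pad_added), sink);
     */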
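
Next, a minimal sketch of the unified pad probe API; the callback simply removes the probe once the pad is blocked, and what the application does while blocked is up to it.

    #include <gst/gst.h>

    static GstPadProbeReturn
    block_probe_cb (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
    {
      /* The pad is blocked here; relink or reconfigure, then drop the probe */
      return GST_PAD_PROBE_REMOVE;
    }

    /* installed with:
     *   gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM,
     *       block_probe_cb, NULL, NULL);
     */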
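
The query example for the position/duration item: in 1.0 the format is passed in and only the value comes back.

    #include <gst/gst.h>

    static void
    print_position (GstElement * pipeline)
    {
      gint64 pos, dur;

      if (gst_element_query_position (pipeline, GST_FORMAT_TIME, &pos) &&
          gst_element_query_duration (pipeline, GST_FORMAT_TIME, &dur))
        g_print ("%" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\n",
            GST_TIME_ARGS (pos), GST_TIME_ARGS (dur));
    }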
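
The caps sketch for the simplified raw caps; the concrete format, rate and channel values below are arbitrary.

    #include <gst/gst.h>

    static GstCaps *
    make_audio_caps (void)
    {
      /* audio/x-raw with an explicit "format" field replaces the old
       * audio/x-raw-int and audio/x-raw-float media types */
      return gst_caps_new_simple ("audio/x-raw",
          "format", G_TYPE_STRING, "S16LE",
          "rate", G_TYPE_INT, 44100,
          "channels", G_TYPE_INT, 2,
          NULL);
    }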
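
A short example for the tag list item: make the list writable before modifying it (the title string is arbitrary).

    #include <gst/gst.h>

    static GstTagList *
    set_title (GstTagList * taglist)
    {
      /* Tag lists are refcounted; get a writable copy before changing tags */
      taglist = gst_tag_list_make_writable (taglist);
      gst_tag_list_add (taglist, GST_TAG_MERGE_REPLACE,
          GST_TAG_TITLE, "Some title", NULL);
      return taglist;
    }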
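
Finally, the control-binding sketch for the controller item: an interpolation control source driving a "volume" property (assumed to exist on the element) through a direct control binding.

    #include <gst/gst.h>
    #include <gst/controller/gstinterpolationcontrolsource.h>
    #include <gst/controller/gstdirectcontrolbinding.h>

    static void
    fade_in (GstElement * element)
    {
      GstControlSource *cs = gst_interpolation_control_source_new ();

      g_object_set (cs, "mode", GST_INTERPOLATION_MODE_LINEAR, NULL);

      /* Control values are plain gdoubles in the 0.0-1.0 range; the binding
       * maps them onto the property's type and value range */
      gst_timed_value_control_source_set (GST_TIMED_VALUE_CONTROL_SOURCE (cs),
          0 * GST_SECOND, 0.0);
      gst_timed_value_control_source_set (GST_TIMED_VALUE_CONTROL_SOURCE (cs),
          5 * GST_SECOND, 1.0);

      gst_object_add_control_binding (GST_OBJECT (element),
          gst_direct_control_binding_new (GST_OBJECT (element), "volume", cs));

      gst_object_unref (cs);
    }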
diff --git a/docs/manual/basics-bins.xml b/docs/manual/basics-bins.xml
index af736eedb2..1e3155e8ee 100644
--- a/docs/manual/basics-bins.xml
+++ b/docs/manual/basics-bins.xml
@@ -19,11 +19,8 @@
The bin will also manage the elements contained in it. It will
- figure out how the data will flow in the bin and generate an
- optimal plan for that data flow. Plan generation is one of the
- most complicated procedures in &GStreamer;. You will learn more
- about this process, called scheduling, in .
+ perform state changes on the elements as well as collect and
+ forward bus messages.
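
As an added illustration of the bin behaviour described above (a sketch, not original manual text), the fragment below builds a small pipeline and relies on the bin to propagate the state change to its children; audiotestsrc and autoaudiosink are just convenient example elements.

    #include <gst/gst.h>

    int
    main (int argc, char *argv[])
    {
      GstElement *pipeline, *src, *sink;

      gst_init (&argc, &argv);

      pipeline = gst_pipeline_new ("example");      /* a pipeline is a bin */
      src = gst_element_factory_make ("audiotestsrc", "src");
      sink = gst_element_factory_make ("autoaudiosink", "sink");

      gst_bin_add_many (GST_BIN (pipeline), src, sink, NULL);
      gst_element_link (src, sink);

      /* Setting the state on the bin propagates it to the contained elements */
      gst_element_set_state (pipeline, GST_STATE_PLAYING);

      /* ... run a main loop and watch the bus for messages ... */

      gst_element_set_state (pipeline, GST_STATE_NULL);
      gst_object_unref (pipeline);
      return 0;
    }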