docs: use the gtk-doc syntax to link to properties

Don't use docbook unless needed. Also strip other docbook tags in the files we fix.
Stefan Sauer 2014-02-18 21:44:24 +01:00
parent 3abad7af66
commit 35da463618
6 changed files with 95 additions and 172 deletions
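
For reference, the substitution this commit applies inside the gtk-doc comments boils down to the following (shown with GstCapsSetter:caps as one representative property from the diff below):

/* docbook (removed):  <link linkend="GstCapsSetter--caps">caps</link>
 * gtk-doc  (added):   #GstCapsSetter:caps
 */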


@@ -20,34 +20,24 @@
/**
* SECTION:element-capssetter
*
* <refsect2>
* <para>
* Sets or merges caps on a stream's buffers.
* That is, a buffer's caps are updated using (fields of)
* <link linkend="GstCapsSetter--caps">caps</link>. Note that this may
* contain multiple structures (though not likely recommended), but each
* of these must be fixed (or will otherwise be rejected).
* </para>
* <para>
* If <link linkend="GstCapsSetter--join">join</link>
* is TRUE, then the incoming caps' mime-type is compared to the mime-type(s)
* of provided caps and only matching structure(s) are considered for updating.
* </para>
* <para>
* If <link linkend="GstCapsSetter--replace">replace</link>
* is TRUE, then any caps update is preceded by clearing existing fields,
* making provided fields (as a whole) replace incoming ones.
* Otherwise, no clearing is performed, in which case provided fields are
* added/merged onto incoming caps
* </para>
* <para>
* Sets or merges caps on a stream's buffers. That is, a buffer's caps are
* updated using (fields of) #GstCapsSetter:caps. Note that this may contain
* multiple structures (though not likely recommended), but each of these must
* be fixed (or will otherwise be rejected).
*
* If #GstCapsSetter:join is %TRUE, then the incoming caps' mime-type is
* compared to the mime-type(s) of provided caps and only matching structure(s)
* are considered for updating.
*
* If #GstCapsSetter:replace is %TRUE, then any caps update is preceded by
* clearing existing fields, making provided fields (as a whole) replace
* incoming ones. Otherwise, no clearing is performed, in which case provided
* fields are added/merged onto incoming caps
*
* Although this element might mainly serve as debug helper,
* it can also practically be used to correct a faulty pixel-aspect-ratio,
* or to modify a yuv fourcc value to effectively swap chroma components or such
* alike.
* </para>
* </refsect2>
*
*/
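
A minimal usage sketch for the capssetter properties described above; the element and property names come from the documentation, while the caps string and the surrounding pipeline context are only assumed examples:

#include <gst/gst.h>

/* Sketch: merge a corrected pixel-aspect-ratio into passing buffers' caps. */
static GstElement *
make_par_fixer (void)
{
  GstElement *setter = gst_element_factory_make ("capssetter", NULL);
  GstCaps *fix =
      gst_caps_from_string ("video/x-raw, pixel-aspect-ratio=(fraction)1/2");

  /* join: only update structures whose media type matches the provided caps;
   * replace: FALSE, so the field is merged into incoming caps rather than
   * replacing them wholesale. */
  g_object_set (setter, "caps", fix, "join", TRUE, "replace", FALSE, NULL);
  gst_caps_unref (fix);

  return setter;
}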


@@ -65,35 +65,21 @@
* The fragmented file features defined (only) in ISO Base Media are used by
* ISMV files making up (a.o.) Smooth Streaming (ismlmux).
*
* A few properties (<link linkend="GstMP4Mux--movie-timescale">movie-timescale</link>,
* <link linkend="GstMP4Mux--trak-timescale">trak-timescale</link>) allow adjusting
* some technical parameters, which might be useful in (rare) cases to resolve
* compatibility issues in some situations.
* A few properties (#GstMp4Mux:movie-timescale, #GstMp4Mux:trak-timescale)
* allow adjusting some technical parameters, which might be useful in (rare)
* cases to resolve compatibility issues in some situations.
*
* Some other properties influence the result more fundamentally.
* A typical mov/mp4 file's metadata (aka moov) is located at the end of the file,
* somewhat contrary to this usually being called "the header".
* However, a <link linkend="GstMP4Mux--faststart">faststart</link> file will
* (with some effort) arrange this to be located near start of the file,
* which then allows it e.g. to be played while downloading.
* Alternatively, rather than having one chunk of metadata at start (or end),
* there can be some metadata at start and most of the other data can be spread
* out into fragments of <link linkend="GstMP4Mux--fragment-duration">fragment-duration</link>.
* A typical mov/mp4 file's metadata (aka moov) is located at the end of the
* file, somewhat contrary to this usually being called "the header".
* However, a #GstMp4Mux:faststart file will (with some effort) arrange this to
* be located near start of the file, which then allows it e.g. to be played
* while downloading. Alternatively, rather than having one chunk of metadata at
* start (or end), there can be some metadata at start and most of the other
* data can be spread out into fragments of #GstMp4Mux:fragment-duration.
* If such fragmented layout is intended for streaming purposes, then
* <link linkend="GstMP4Mux--streamable">streamable</link> allows foregoing to add
* index metadata (at the end of file).
*
* <link linkend="GstMP4Mux--dts-method">dts-method</link> allows selecting a
* method for managing input timestamps (stay tuned for 0.11 to have this
* automagically settled). The default delta/duration method should handle nice
* (aka perfect streams) just fine, but may experience problems otherwise
* (e.g. input stream with re-ordered B-frames and/or with frame dropping).
* The re-ordering approach re-assigns incoming timestamps in ascending order
* to incoming buffers and offers an alternative in such cases. In cases where
* that might fail, the remaining method can be tried, which is exact and
* according to specs, but might experience playback on not so spec-wise players.
* Note that this latter approach also requires one to enable
* <link linkend="GstMP4Mux--presentation-timestamp">presentation-timestamp</link>.
* #GstMp4Mux:streamable allows foregoing to add index metadata (at the end of
* file).
*
* <refsect2>
* <title>Example pipelines</title>
@@ -129,35 +115,21 @@
* The fragmented file features defined (only) in ISO Base Media are used by
* ISMV files making up (a.o.) Smooth Streaming (ismlmux).
*
* A few properties (<link linkend="Gst3GPPMux--movie-timescale">movie-timescale</link>,
* <link linkend="Gst3GPPMux--trak-timescale">trak-timescale</link>) allow adjusting
* some technical parameters, which might be useful in (rare) cases to resolve
* compatibility issues in some situations.
* A few properties (#Gst3GPPMux:movie-timescale, #Gst3GPPMux:trak-timescale)
* allow adjusting some technical parameters, which might be useful in (rare)
* cases to resolve compatibility issues in some situations.
*
* Some other properties influence the result more fundamentally.
* A typical mov/mp4 file's metadata (aka moov) is located at the end of the file,
* somewhat contrary to this usually being called "the header".
* However, a <link linkend="Gst3GPPMux--faststart">faststart</link> file will
* (with some effort) arrange this to be located near start of the file,
* which then allows it e.g. to be played while downloading.
* Alternatively, rather than having one chunk of metadata at start (or end),
* there can be some metadata at start and most of the other data can be spread
* out into fragments of <link linkend="Gst3GPPMux--fragment-duration">fragment-duration</link>.
* If such fragmented layout is intended for streaming purposes, then
* <link linkend="Gst3GPPMux--streamable">streamable</link> allows foregoing to add
* index metadata (at the end of file).
*
* <link linkend="Gst3GPPMux--dts-method">dts-method</link> allows selecting a
* method for managing input timestamps (stay tuned for 0.11 to have this
* automagically settled). The default delta/duration method should handle nice
* (aka perfect streams) just fine, but may experience problems otherwise
* (e.g. input stream with re-ordered B-frames and/or with frame dropping).
* The re-ordering approach re-assigns incoming timestamps in ascending order
* to incoming buffers and offers an alternative in such cases. In cases where
* that might fail, the remaining method can be tried, which is exact and
* according to specs, but might experience playback on not so spec-wise players.
* Note that this latter approach also requires one to enable
* <link linkend="Gst3GPPMux--presentation-timestamp">presentation-timestamp</link>.
* somewhat contrary to this usually being called "the header". However, a
* #Gst3GPPMux:faststart file will (with some effort) arrange this to be located
* near start of the file, which then allows it e.g. to be played while
* downloading. Alternatively, rather than having one chunk of metadata at start
* (or end), there can be some metadata at start and most of the other data can
* be spread out into fragments of #Gst3GPPMux:fragment-duration. If such
* fragmented layout is intended for streaming purposes, then
* #Gst3GPPMux:streamable allows foregoing to add index metadata (at the end of
* file).
*
* <refsect2>
* <title>Example pipelines</title>
@@ -193,35 +165,21 @@
* The fragmented file features defined (only) in ISO Base Media are used by
* ISMV files making up (a.o.) Smooth Streaming (ismlmux).
*
* A few properties (<link linkend="GstMJ2Mux--movie-timescale">movie-timescale</link>,
* <link linkend="GstMJ2Mux--trak-timescale">trak-timescale</link>) allow adjusting
* some technical parameters, which might be useful in (rare) cases to resolve
* compatibility issues in some situations.
* A few properties (#GstMJ2Mux:movie-timescale, #GstMJ2Mux:trak-timescale)
* allow adjusting some technical parameters, which might be useful in (rare)
* cases to resolve compatibility issues in some situations.
*
* Some other properties influence the result more fundamentally.
* A typical mov/mp4 file's metadata (aka moov) is located at the end of the file,
* somewhat contrary to this usually being called "the header".
* However, a <link linkend="GstMJ2Mux--faststart">faststart</link> file will
* (with some effort) arrange this to be located near start of the file,
* which then allows it e.g. to be played while downloading.
* Alternatively, rather than having one chunk of metadata at start (or end),
* there can be some metadata at start and most of the other data can be spread
* out into fragments of <link linkend="GstMJ2Mux--fragment-duration">fragment-duration</link>.
* If such fragmented layout is intended for streaming purposes, then
* <link linkend="GstMJ2Mux--streamable">streamable</link> allows foregoing to add
* index metadata (at the end of file).
*
* <link linkend="GstMJ2Mux--dts-method">dts-method</link> allows selecting a
* method for managing input timestamps (stay tuned for 0.11 to have this
* automagically settled). The default delta/duration method should handle nice
* (aka perfect streams) just fine, but may experience problems otherwise
* (e.g. input stream with re-ordered B-frames and/or with frame dropping).
* The re-ordering approach re-assigns incoming timestamps in ascending order
* to incoming buffers and offers an alternative in such cases. In cases where
* that might fail, the remaining method can be tried, which is exact and
* according to specs, but might experience playback on not so spec-wise players.
* Note that this latter approach also requires one to enable
* <link linkend="GstMJ2Mux--presentation-timestamp">presentation-timestamp</link>.
* somewhat contrary to this usually being called "the header". However, a
* #GstMJ2Mux:faststart file will (with some effort) arrange this to be located
* near start of the file, which then allows it e.g. to be played while
* downloading. Alternatively, rather than having one chunk of metadata at start
* (or end), there can be some metadata at start and most of the other data can
* be spread out into fragments of #GstMJ2Mux:fragment-duration. If such
* fragmented layout is intended for streaming purposes, then
* #GstMJ2Mux:streamable allows foregoing to add index metadata (at the end of
* file).
*
* <refsect2>
* <title>Example pipelines</title>
@@ -257,35 +215,21 @@
* The fragmented file features defined (only) in ISO Base Media are used by
* ISMV files making up (a.o.) Smooth Streaming (ismlmux).
*
* A few properties (<link linkend="GstISMLMux--movie-timescale">movie-timescale</link>,
* <link linkend="GstISMLMux--trak-timescale">trak-timescale</link>) allow adjusting
* some technical parameters, which might be useful in (rare) cases to resolve
* compatibility issues in some situations.
* A few properties (#GstISMLMux:movie-timescale, #GstISMLMux:trak-timescale)
* allow adjusting some technical parameters, which might be useful in (rare)
* cases to resolve compatibility issues in some situations.
*
* Some other properties influence the result more fundamentally.
* A typical mov/mp4 file's metadata (aka moov) is located at the end of the file,
* somewhat contrary to this usually being called "the header".
* However, a <link linkend="GstISMLMux--faststart">faststart</link> file will
* (with some effort) arrange this to be located near start of the file,
* which then allows it e.g. to be played while downloading.
* Alternatively, rather than having one chunk of metadata at start (or end),
* there can be some metadata at start and most of the other data can be spread
* out into fragments of <link linkend="GstISMLMux--fragment-duration">fragment-duration</link>.
* If such fragmented layout is intended for streaming purposes, then
* <link linkend="GstISMLMux--streamable">streamable</link> allows foregoing to add
* index metadata (at the end of file).
*
* <link linkend="GstISMLMux--dts-method">dts-method</link> allows selecting a
* method for managing input timestamps (stay tuned for 0.11 to have this
* automagically settled). The default delta/duration method should handle nice
* (aka perfect streams) just fine, but may experience problems otherwise
* (e.g. input stream with re-ordered B-frames and/or with frame dropping).
* The re-ordering approach re-assigns incoming timestamps in ascending order
* to incoming buffers and offers an alternative in such cases. In cases where
* that might fail, the remaining method can be tried, which is exact and
* according to specs, but might experience playback on not so spec-wise players.
* Note that this latter approach also requires one to enable
* <link linkend="GstISMLMux--presentation-timestamp">presentation-timestamp</link>.
* somewhat contrary to this usually being called "the header". However, a
* #GstISMLMux:faststart file will (with some effort) arrange this to be located
* near start of the file, which then allows it e.g. to be played while
* downloading. Alternatively, rather than having one chunk of metadata at start
* (or end), there can be some metadata at start and most of the other data can
* be spread out into fragments of #GstISMLMux:fragment-duration. If such
* fragmented layout is intended for streaming purposes, then
* #GstISMLMux:streamable allows foregoing to add index metadata (at the end of
* file).
*
* <refsect2>
* <title>Example pipelines</title>
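
As a rough illustration of the mux properties discussed above (using mp4mux as the representative element; the fragment duration value and its unit are assumptions, not taken from this diff):

#include <gst/gst.h>

/* Sketch: configure mp4mux for a fragmented, streamable layout. */
static GstElement *
make_fragmented_muxer (void)
{
  GstElement *mux = gst_element_factory_make ("mp4mux", NULL);

  /* fragment-duration: assumed here to be in milliseconds;
   * streamable: skip writing index metadata at the end of the file. */
  g_object_set (mux,
      "fragment-duration", 2000,
      "streamable", TRUE,
      NULL);

  return mux;
}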


@@ -64,23 +64,21 @@
* The fragmented file features defined (only) in ISO Base Media are used by
* ISMV files making up (a.o.) Smooth Streaming (ismlmux).
*
* A few properties (<link linkend="GstQTMux--movie-timescale">movie-timescale</link>,
* <link linkend="GstQTMux--trak-timescale">trak-timescale</link>) allow adjusting
* some technical parameters, which might be useful in (rare) cases to resolve
* compatibility issues in some situations.
* A few properties (#GstQTMux:movie-timescale, #GstQTMux:trak-timescale) allow
* adjusting some technical parameters, which might be useful in (rare) cases to
* resolve compatibility issues in some situations.
*
* Some other properties influence the result more fundamentally.
* A typical mov/mp4 file's metadata (aka moov) is located at the end of the file,
* somewhat contrary to this usually being called "the header".
* However, a <link linkend="GstQTMux--faststart">faststart</link> file will
* (with some effort) arrange this to be located near start of the file,
* which then allows it e.g. to be played while downloading.
* Alternatively, rather than having one chunk of metadata at start (or end),
* there can be some metadata at start and most of the other data can be spread
* out into fragments of <link linkend="GstQTMux--fragment-duration">fragment-duration</link>.
* A typical mov/mp4 file's metadata (aka moov) is located at the end of the
* file, somewhat contrary to this usually being called "the header".
* However, a #GstQTMux:faststart file will (with some effort) arrange this to
* be located near start of the file, which then allows it e.g. to be played
* while downloading. Alternatively, rather than having one chunk of metadata at
* start (or end), there can be some metadata at start and most of the other
* data can be spread out into fragments of #GstQTMux:fragment-duration.
* If such fragmented layout is intended for streaming purposes, then
* <link linkend="GstQTMux--streamable">streamable</link> allows foregoing to add
* index metadata (at the end of file).
* #GstQTMux:streamable allows foregoing to add index metadata (at the end of
* file).
*
* <refsect2>
* <title>Example pipelines</title>


@@ -76,12 +76,10 @@
* #GValueArray of #gdouble
* <classname>&quot;decay&quot;</classname>:
* the decaying peak power level in dB for each channel
* the decaying peak level follows the peak level, but starts dropping
* if no new peak is reached after the time given by
* the <link linkend="GstLevel--peak-ttl">the time to live</link>.
* When the decaying peak level drops, it does so at the decay rate
* as specified by the
* <link linkend="GstLevel--peak-falloff">the peak falloff rate</link>.
* The decaying peak level follows the peak level, but starts dropping if no
* new peak is reached after the time given by the #GstLevel:peak-ttl.
* When the decaying peak level drops, it does so at the decay rate as
* specified by the #GstLevel:peak-falloff.
* </para>
* </listitem>
* <listitem>
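
A hedged sketch of how an application could read the decaying peak values from the level element's messages; the "level" structure and "decay" field names follow the documentation above, the bus-watch wiring itself is illustrative:

static gboolean
on_bus_message (GstBus * bus, GstMessage * msg, gpointer user_data)
{
  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_ELEMENT) {
    const GstStructure *s = gst_message_get_structure (msg);

    if (s && gst_structure_has_name (s, "level")) {
      /* "decay" is a GValueArray of gdouble, one entry per channel. */
      const GValue *array = gst_structure_get_value (s, "decay");
      GValueArray *decay = (GValueArray *) g_value_get_boxed (array);
      guint i;

      for (i = 0; i < decay->n_values; i++)
        g_print ("channel %u: decay %.2f dB\n", i,
            g_value_get_double (g_value_array_get_nth (decay, i)));
    }
  }

  return TRUE;
}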


@@ -39,9 +39,9 @@
* their output since they generally save metadata in the file header.
* Therefore, it is often necessary that applications read the results in a bus
* event handler for the tag message. Obtaining the values this way is always
* needed for <link linkend="GstRgAnalysis--num-tracks">album processing</link>
* since the album gain and peak values need to be associated with all tracks of
* an album, not just the last one.
* needed for album processing (see #GstRgAnalysis:num-tracks property) since
* the album gain and peak values need to be associated with all tracks of an
* album, not just the last one.
*
* <refsect2>
* <title>Example launch lines</title>
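
A hedged sketch of the bus-side tag handling that the paragraph above asks applications to do; the ReplayGain tag names are the standard GStreamer ones, the handler itself is only illustrative:

static void
on_tag_message (GstMessage * msg)
{
  GstTagList *tags = NULL;
  gdouble gain, peak;

  gst_message_parse_tag (msg, &tags);

  /* For album processing, collect the album values from every track's tags. */
  if (gst_tag_list_get_double (tags, GST_TAG_ALBUM_GAIN, &gain) &&
      gst_tag_list_get_double (tags, GST_TAG_ALBUM_PEAK, &peak))
    g_print ("album gain %.2f dB, album peak %.6f\n", gain, peak);

  gst_tag_list_unref (tags);
}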


@@ -162,8 +162,7 @@ gst_rg_volume_class_init (GstRgVolumeClass * klass)
*
* If album mode is enabled but the album gain tag is absent in the stream,
* the track gain is used instead. If both gain tags are missing, the value
* of the <link linkend="GstRgVolume--fallback-gain">fallback-gain</link>
* property is used instead.
* of the #GstRgVolume:fallback-gain property is used instead.
*/
g_object_class_install_property (gobject_class, PROP_ALBUM_MODE,
g_param_spec_boolean ("album-mode", "Album mode",
@@ -223,24 +222,22 @@ gst_rg_volume_class_init (GstRgVolumeClass * klass)
*
* Applied gain [dB]. This gain is applied to processed buffer data.
*
* This is set to the <link linkend="GstRgVolume--target-gain">target
* gain</link> if amplification by that amount can be applied safely.
* "Safely" means that the volume adjustment does not inflict clipping
* distortion. Should this not be the case, the result gain is set to an
* appropriately reduced value (by applying peak normalization). The proposed
* standard calls this "clipping prevention".
* This is set to the #GstRgVolume:target-gain if amplification by that amount
* can be applied safely. "Safely" means that the volume adjustment does not
* inflict clipping distortion. Should this not be the case, the result gain
* is set to an appropriately reduced value (by applying peak normalization).
* The proposed standard calls this "clipping prevention".
*
* The difference between target and result gain reflects the necessary amount
* of reduction. Applications can make use of this information to temporarily
* reduce the <link linkend="GstRgVolume--pre-amp">pre-amp</link> for
* subsequent streams, as recommended by the ReplayGain standard.
* reduce the #GstRgVolume:pre-amp for subsequent streams, as recommended by
* the ReplayGain standard.
*
* Note that target and result gain differing for a great majority of streams
* indicates a problem: What happens in this case is that most streams receive
* peak normalization instead of amplification by the ideal replay gain. To
* prevent this, the <link linkend="GstRgVolume--pre-amp">pre-amp</link> has
* to be lowered and/or a limiter has to be used which facilitates the use of
* <link linkend="GstRgVolume--headroom">headroom</link>.
* prevent this, the #GstRgVolume:pre-amp has to be lowered and/or a limiter
* has to be used which facilitates the use of #GstRgVolume:headroom.
*/
g_object_class_install_property (gobject_class, PROP_RESULT_GAIN,
g_param_spec_double ("result-gain", "Result-gain", "Applied gain [dB]",
@@ -250,18 +247,14 @@ gst_rg_volume_class_init (GstRgVolumeClass * klass)
*
* Applicable gain [dB]. This gain is supposed to be applied.
*
* Depending on the value of the <link
* linkend="GstRgVolume--album-mode">album-mode</link> property and the
* Depending on the value of the #GstRgVolume:album-mode property and the
* presence of ReplayGain tags in the stream, this is set according to one of
* these simple formulas:
*
* <itemizedlist>
* <listitem><link linkend="GstRgVolume--pre-amp">pre-amp</link> + album gain
* of the stream</listitem>
* <listitem><link linkend="GstRgVolume--pre-amp">pre-amp</link> + track gain
* of the stream</listitem>
* <listitem><link linkend="GstRgVolume--pre-amp">pre-amp</link> + <link
* linkend="GstRgVolume--fallback-gain">fallback gain</link></listitem>
* <listitem>#GstRgVolume:pre-amp + album gain of the stream</listitem>
* <listitem>#GstRgVolume:pre-amp + track gain of the stream</listitem>
* <listitem>#GstRgVolume:pre-amp + #GstRgVolume:fallback-gain</listitem>
* </itemizedlist>
*/
g_object_class_install_property (gobject_class, PROP_TARGET_GAIN,
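
A hedged sketch of the pre-amp adjustment suggested above; the property names are the documented ones, but the policy of when and by how much to lower the pre-amp is purely illustrative:

static void
maybe_lower_preamp (GstElement * rgvolume)
{
  gdouble target, result, pre_amp;

  g_object_get (rgvolume,
      "target-gain", &target,
      "result-gain", &result,
      "pre-amp", &pre_amp,
      NULL);

  /* If clipping prevention reduced the applied gain, lower the pre-amp by the
   * difference so subsequent streams can receive the ideal replay gain. */
  if (result < target)
    g_object_set (rgvolume, "pre-amp", pre_amp - (target - result), NULL);
}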