doc: fix element section documentation

Element sections were no longer rendered after the hotdoc
port; fixing this revealed a few incorrect links.
Mathieu Duponchelle 2019-05-25 16:56:32 +02:00
parent 4e0bdca3f0
commit d704790519
13 changed files with 29 additions and 29 deletions

@@ -48,7 +48,7 @@
  * This element is meant for easy no-hassle video snapshotting. It is not
  * suitable for video playback or video display at high framerates. Use
  * ximagesink, xvimagesink or some other suitable video sink in connection
- * with the #GstXOverlay interface instead if you want to do video playback.
+ * with the #GstVideoOverlay interface instead if you want to do video playback.
  *
  * ## Message details
  *
@@ -60,7 +60,7 @@
  *
  * * `pixbuf`: the #GdkPixbuf object
  * * `pixel-aspect-ratio`: the pixel aspect ratio (PAR) of the input image
- * (this field contains a #GstFraction); the
+ * (this field contains a value of type #GST_TYPE_FRACTION); the
  * PAR is usually 1:1 for images, but is often something non-1:1 in the case
  * of video input. In this case the image may be distorted and you may need
  * to rescale it accordingly before saving it to file or displaying it. This
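For illustration beyond the patch itself, here is a minimal sketch of consuming these element messages from an application's bus watch. The structure name "pixbuf" is an assumption based on the field list quoted above, and error handling is elided:

```c
#include <gst/gst.h>
#include <gdk-pixbuf/gdk-pixbuf.h>

/* Sketch: watch the bus for gdkpixbufsink's element messages and save
 * the snapshot. The structure name "pixbuf" is assumed from the field
 * list documented above. */
static gboolean
on_bus_message (GstBus * bus, GstMessage * msg, gpointer user_data)
{
  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_ELEMENT) {
    const GstStructure *s = gst_message_get_structure (msg);

    if (s != NULL && gst_structure_has_name (s, "pixbuf")) {
      GdkPixbuf *pixbuf = NULL;

      /* The `pixbuf` field carries the snapshot as a GdkPixbuf. */
      gst_structure_get (s, "pixbuf", GDK_TYPE_PIXBUF, &pixbuf, NULL);
      if (pixbuf != NULL) {
        gdk_pixbuf_save (pixbuf, "snapshot.png", "png", NULL, NULL);
        g_object_unref (pixbuf);
      }
    }
  }
  return TRUE;  /* keep the watch installed */
}
```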

@@ -31,7 +31,7 @@
  * If the server is not an Icecast server, it will behave as if the
  * #GstSoupHTTPSrc:iradio-mode property were not set. If it is, souphttpsrc will
  * output data with a media type of application/x-icy, in which case you will
- * need to use the #ICYDemux element as follow-up element to extract the Icecast
+ * need to use the #GstICYDemux element as follow-up element to extract the Icecast
  * metadata and to determine the underlying media type.
  *
  * ## Example launch line
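To make the icydemux hand-off concrete (this is not the patch's own launch-line example), a hypothetical receive pipeline might look like the following; the URL is a placeholder and iradio-mode is set explicitly only for clarity:

```c
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline;

  gst_init (&argc, &argv);

  /* Hypothetical Icecast client: souphttpsrc outputs application/x-icy,
   * icydemux strips the metadata and exposes the underlying stream. */
  pipeline = gst_parse_launch (
      "souphttpsrc location=http://example.org/stream iradio-mode=true "
      "! icydemux ! decodebin ! audioconvert ! autoaudiosink", NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  /* ... run a main loop, then shut down and unref the pipeline ... */
  return 0;
}
```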

@@ -30,10 +30,10 @@
  * </ulink>. It's the successor of On2 VP3, which was the base of the
  * Theora video codec.
  *
- * To control the quality of the encoding, the #GstVP8Enc::target-bitrate,
- * #GstVP8Enc::min-quantizer, #GstVP8Enc::max-quantizer or #GstVP8Enc::cq-level
+ * To control the quality of the encoding, the #GstVP8Enc:target-bitrate,
+ * #GstVP8Enc:min-quantizer, #GstVP8Enc:max-quantizer or #GstVP8Enc:cq-level
  * properties can be used. Which one is used depends on the mode selected by
- * the #GstVP8Enc::end-usage property.
+ * the #GstVP8Enc:end-usage property.
  * See <ulink url="http://www.webmproject.org/docs/encoder-parameters/">Encoder Parameters</ulink>
  * for explanation, examples for useful encoding parameters and more details
  * on the encoding parameters.
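As a sketch of the property interplay described above: with end-usage in its bitrate-driven mode, target-bitrate becomes the governing knob and the quantizer bounds cap how far the encoder may drift. The numeric enum value and all figures below are illustrative assumptions:

```c
#include <gst/gst.h>

/* Sketch: bitrate-driven vp8enc configuration (values illustrative). */
GstElement *enc = gst_element_factory_make ("vp8enc", NULL);

g_object_set (enc,
    "end-usage", 1,             /* assumed to select the cbr mode */
    "target-bitrate", 512000,   /* bits per second */
    "min-quantizer", 4,
    "max-quantizer", 40,
    NULL);
```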

@@ -30,10 +30,10 @@
  * </ulink>. It's the successor of On2 VP3, which was the base of the
  * Theora video codec.
  *
- * To control the quality of the encoding, the #GstVP9Enc::target-bitrate,
- * #GstVP9Enc::min-quantizer, #GstVP9Enc::max-quantizer or #GstVP9Enc::cq-level
+ * To control the quality of the encoding, the #GstVP9Enc:target-bitrate,
+ * #GstVP9Enc:min-quantizer, #GstVP9Enc:max-quantizer or #GstVP9Enc:cq-level
  * properties can be used. Which one is used depends on the mode selected by
- * the #GstVP9Enc::end-usage property.
+ * the #GstVP9Enc:end-usage property.
  * See <ulink url="http://www.webmproject.org/docs/encoder-parameters/">Encoder Parameters</ulink>
  * for explanation, examples for useful encoding parameters and more details
  * on the encoding parameters.
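The same pattern applies to vp9enc; as a complementary sketch, here is the constrained-quality mode, where cq-level is the knob that end-usage selects (the enum value is an assumption, the figures illustrative):

```c
#include <gst/gst.h>

/* Sketch: constrained-quality vp9enc configuration. A lower cq-level
 * means higher quality. */
GstElement *enc = gst_element_factory_make ("vp9enc", NULL);

g_object_set (enc,
    "end-usage", 2,   /* assumed to select constrained quality */
    "cq-level", 20,
    NULL);
```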

@@ -66,20 +66,20 @@
  * The fragmented file features defined (only) in ISO Base Media are used by
  * ISMV files making up (among others) Smooth Streaming (ismlmux).
  *
- * A few properties (#GstMp4Mux:movie-timescale, #GstMp4Mux:trak-timescale)
+ * A few properties (#GstMP4Mux:movie-timescale, #GstMP4Mux:trak-timescale)
  * allow adjusting some technical parameters, which might be useful in (rare)
  * cases to resolve compatibility issues in some situations.
  *
  * Some other properties influence the result more fundamentally.
  * A typical mov/mp4 file's metadata (aka moov) is located at the end of the
  * file, somewhat contrary to this usually being called "the header".
- * However, a #GstMp4Mux:faststart file will (with some effort) arrange this to
+ * However, a #GstMP4Mux:faststart file will (with some effort) arrange this to
  * be located near the start of the file, which then allows it e.g. to be played
  * while downloading. Alternatively, rather than having one chunk of metadata at
  * the start (or end), there can be some metadata at the start and most of the other
- * data can be spread out into fragments of #GstMp4Mux:fragment-duration.
+ * data can be spread out into fragments of #GstMP4Mux:fragment-duration.
  * If such a fragmented layout is intended for streaming purposes, then
- * #GstMp4Mux:streamable allows foregoing to add index metadata (at the end of
+ * #GstMP4Mux:streamable allows forgoing the index metadata (at the end of the
  * file).
  *
  * ## Example pipelines
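For illustration (these are not the patch's own example pipelines), a sketch of the two layouts just described, with purely illustrative values:

```c
#include <gst/gst.h>

GstElement *mux = gst_element_factory_make ("mp4mux", NULL);

/* Layout 1: moov rewritten to the front so the file can be played
 * while it is still downloading. */
g_object_set (mux, "faststart", TRUE, NULL);

/* Layout 2 (alternative): fragmented output for streaming; setting
 * streamable skips the index at the end of the file.
 * fragment-duration is in milliseconds. */
g_object_set (mux,
    "fragment-duration", 2000,
    "streamable", TRUE,
    NULL);
```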

@@ -90,12 +90,12 @@
  * is interrupted uncleanly by a crash. Robust muxing mode requires a seekable
  * output, such as filesink, because it needs to rewrite the start of the file.
  *
- * To enable robust muxing mode, set the #GstQTMux::reserved-moov-update-period
- * and #GstQTMux::reserved-max-duration property. Also present is the
- * #GstQTMux::reserved-bytes-per-sec property, which can be increased if
+ * To enable robust muxing mode, set the #GstQTMux:reserved-moov-update-period
+ * and #GstQTMux:reserved-max-duration properties. Also present is the
+ * #GstQTMux:reserved-bytes-per-sec property, which can be increased if
  * for some reason the default is not large enough and the initial reserved
  * space for headers is too small. Applications can monitor the
- * #GstQTMux::reserved-duration-remaining property to see how close to full
+ * #GstQTMux:reserved-duration-remaining property to see how close to full
  * the reserved space is becoming.
@@ -104,7 +104,7 @@
  * completely valid header from the start for all tracks (i.e. it appears as
  * though the file is "reserved-max-duration" long with all samples
  * present). This mode can be enabled by setting the
- * #GstQTMux::reserved-moov-update-period and #GstQTMux::reserved-prefill
+ * #GstQTMux:reserved-moov-update-period and #GstQTMux:reserved-prefill
  * properties. Note that this mode is only possible with input streams that have
  * a fixed sample size (such as raw audio and Prores Video) and that don't
  * have reordered samples.
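A sketch of enabling the robust-muxing mode described above; the durations are GstClockTime values in nanoseconds, and the concrete numbers are illustrative only:

```c
#include <gst/gst.h>

/* Sketch: robust muxing. The muxer periodically rewrites the moov so
 * roughly at most one second of data is lost on a crash; header space
 * for up to ten minutes is reserved up front. Requires a seekable
 * sink such as filesink. */
GstElement *mux = gst_element_factory_make ("qtmux", NULL);

g_object_set (mux,
    "reserved-moov-update-period", (guint64) 1 * GST_SECOND,
    "reserved-max-duration", (guint64) 600 * GST_SECOND,
    NULL);
```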

@@ -29,7 +29,7 @@
  * after the first picture. We also need a videorate element to set timestamps
  * on all buffers after the first one in accordance with the framerate.
  *
- * File names are created by replacing "\%d" with the index using printf().
+ * File names are created by replacing "\%d" with the index using `printf()`.
  *
  * ## Example launch line
  * |[

@@ -25,8 +25,8 @@
  * @title: rtprtxqueue
  *
  * rtprtxqueue maintains a queue of transmitted RTP packets, up to a
- * configurable limit (see #GstRTPRtxQueue::max-size-time,
- * #GstRTPRtxQueue::max-size-packets), and retransmits them upon request
+ * configurable limit (see #GstRTPRtxQueue:max-size-time,
+ * #GstRTPRtxQueue:max-size-packets), and retransmits them upon request
  * from the downstream rtpsession (GstRTPRetransmissionRequest event).
  *
  * This element is similar to rtprtxsend, but it has differences:

@@ -40,7 +40,7 @@
  * * Support for multiple sender SSRC.
  *
  * The rtpsession will not demux packets based on SSRC or payload type, nor will
- * it correct for packet reordering and jitter. Use #GstRtpsSrcDemux,
+ * it correct for packet reordering and jitter. Use #GstRtpSsrcDemux,
  * #GstRtpPtDemux and GstRtpJitterBuffer in addition to #GstRtpSession to
  * perform these tasks. It is usually a good idea to use #GstRtpBin, which
  * combines all these features in one element.
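As a hypothetical illustration of this division of labour, here is a bare receive pipeline that uses rtpjitterbuffer for the reordering and jitter correction rtpsession itself does not do; rtpbin would bundle the session, SSRC/PT demuxing and jitterbuffering in one element. Port and caps are illustrative:

```c
#include <gst/gst.h>

/* Hypothetical PCMU receiver: rtpjitterbuffer handles reordering and
 * jitter before depayloading and decoding. */
GstElement *pipeline = gst_parse_launch (
    "udpsrc port=5000 caps=\"application/x-rtp,media=(string)audio,"
    "clock-rate=(int)8000,encoding-name=(string)PCMU\" "
    "! rtpjitterbuffer ! rtppcmudepay ! mulawdec ! autoaudiosink",
    NULL);
```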

@@ -73,7 +73,7 @@
  *
  * The message's structure contains three fields:
  *
- * #GstRTSPSrcTimeoutCause `cause`: the cause of the timeout.
+ * GstRTSPSrcTimeoutCause `cause`: the cause of the timeout.
  *
  * #gint `stream-number`: an internal identifier of the stream that timed out.
  *
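A hypothetical sketch of reading the two fields shown in this excerpt from such a message; the handler name is invented, and the structure-name check is omitted because the name is not shown here:

```c
#include <gst/gst.h>

/* Hypothetical handler for rtspsrc's timeout element message; only
 * the fields quoted above are read. */
static void
handle_timeout_message (GstMessage * msg)
{
  const GstStructure *s = gst_message_get_structure (msg);
  gint stream_number = 0;

  if (s != NULL && gst_structure_get_int (s, "stream-number", &stream_number)) {
    /* `cause` is an enum; fetch it generically since the exact GType
     * macro is not shown in this excerpt. */
    const GValue *cause = gst_structure_get_value (s, "cause");

    g_print ("stream %d timed out%s\n", stream_number,
        cause != NULL ? " (cause field present)" : "");
  }
}
```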

@@ -25,7 +25,7 @@
  * framerate. The two incoming buffers are blended together using an effect
  * specific alpha mask.
  *
- * The #GstSmpte:depth property defines the precision in bits of the mask. A
+ * The #GstSMPTE:depth property defines the precision in bits of the mask. A
  * higher precision will create a mask with smoother gradients in order to avoid
  * banding.
  *

@@ -25,12 +25,12 @@
  * using an effect specific SMPTE mask in the I420 input case. In the AYUV case,
  * the alpha channel is modified using the effect specific SMPTE mask.
  *
- * The #GstSmpteAlpha:position property is a controllable double between 0.0 and
+ * The #GstSMPTEAlpha:position property is a controllable double between 0.0 and
  * 1.0 that specifies the position in the transition. 0.0 is the start of the
  * transition, with the alpha channel completely opaque, while 1.0 has the alpha
  * channel set to completely transparent.
  *
- * The #GstSmpteAlpha:depth property defines the precision in bits of the mask.
+ * The #GstSMPTEAlpha:depth property defines the precision in bits of the mask.
  * A higher precision will create a mask with smoother gradients in order to
  * avoid banding.
  *
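Since position is controllable, it can be animated; a minimal sketch that simply pokes it by hand (values illustrative; a GstControlSource could drive it instead):

```c
#include <gst/gst.h>

/* Sketch: stepping the transition manually. position runs from
 * 0.0 (opaque, transition not started) to 1.0 (fully transparent). */
GstElement *alpha = gst_element_factory_make ("smptealpha", NULL);

g_object_set (alpha,
    "depth", 16,       /* finer mask gradients to reduce banding */
    "position", 0.0,
    NULL);

/* ... later, e.g. from a periodic callback ... */
g_object_set (alpha, "position", 0.5, NULL);   /* halfway */
g_object_set (alpha, "position", 1.0, NULL);   /* done */
```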

@@ -36,15 +36,15 @@
  * * #GstClockTime `duration`: the duration of the buffer.
  * * #GstClockTime `endtime`: the end time of the buffer that triggered the message as stream time (this
  * is deprecated, as it can be calculated from stream-time + duration)
- * * #GstValueList of #gfloat `magnitude`: the level for each frequency band in dB.
+ * * A #GST_TYPE_LIST value of #gfloat `magnitude`: the level for each frequency band in dB.
  * All values below the value of the
  * #GstSpectrum:threshold property will be set to the threshold. Only present
  * if the #GstSpectrum:message-magnitude property is %TRUE.
- * * #GstValueList of #gfloat `phase`: The phase for each frequency band. The value is between -pi and pi. Only
+ * * A #GST_TYPE_LIST value of #gfloat `phase`: the phase for each frequency band. The value is between -pi and pi. Only
  * present if the #GstSpectrum:message-phase property is %TRUE.
  *
  * If the #GstSpectrum:multi-channel property is set to true, the magnitude and phase
- * fields will each be a nested #GstValueArray. The first dimension is the
+ * fields will each be a nested #GST_TYPE_ARRAY value. The first dimension is the
  * channels and the second dimension is the values.
  *
  * ## Example application
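For illustration (this is not the section's own example application), a sketch of walking the single-channel magnitude list described above; the structure name "spectrum" is an assumption:

```c
#include <gst/gst.h>

/* Sketch: read the per-band magnitudes from a spectrum element
 * message, assuming the single-channel layout (a flat value list). */
static void
print_magnitudes (GstMessage * msg)
{
  const GstStructure *s = gst_message_get_structure (msg);
  const GValue *list;
  guint i, n;

  if (s == NULL || !gst_structure_has_name (s, "spectrum"))
    return;

  list = gst_structure_get_value (s, "magnitude");
  if (list == NULL)
    return;

  n = gst_value_list_get_size (list);
  for (i = 0; i < n; i++) {
    const GValue *v = gst_value_list_get_value (list, i);

    g_print ("band %u: %.2f dB\n", i, g_value_get_float (v));
  }
}
```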