docstrings: port ulinks to markdown links

Mathieu Duponchelle 2019-08-23 19:56:35 +02:00
parent c5738c6125
commit 42adb02a10
30 changed files with 74 additions and 65 deletions
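The conversion itself is mechanical: `<ulink url="URL">text</ulink>` becomes the markdown inline link `[text](URL)`, and where the link text is just the URL itself (as in the WebRTC headers) the bare autolink form `<URL>` is used instead. As a rough illustration of that pattern, and not something shipped with this commit, a pass like the following could be scripted in Python; the script name, its handling of wrapped tags, and everything else about it are assumptions for illustration only.

```python
#!/usr/bin/env python3
"""Illustrative sketch (not part of this commit): rewrite DocBook <ulink>
tags in GTK-Doc comments to the markdown link styles used below."""
import re
import sys

# Handles tags that sit on a single source line; tags wrapped across
# comment lines (e.g. the one in pcapparse.c) still need manual porting.
ULINK = re.compile(r'<ulink url="(?P<url>[^"\n]+)">(?P<text>[^<\n]*)</ulink>')

def port(match):
    url = match.group("url")
    text = match.group("text").strip()
    # Bare URLs become autolinks, everything else an inline markdown link.
    return f"<{url}>" if text == url else f"[{text}]({url})"

for path in sys.argv[1:]:
    with open(path, encoding="utf-8") as f:
        source = f.read()
    ported = ULINK.sub(port, source)
    if ported != source:
        with open(path, "w", encoding="utf-8") as f:
            f.write(ported)
```

Judging by the hunks below, the accompanying re-wrapping of long comment lines (for example in dfbvideosink and kateenc) was done by hand rather than by any such script.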

View file

@@ -27,8 +27,8 @@
*
* The chromaprint element calculates an acoustic fingerprint for an
* audio stream which can be used to identify a song and look up
* further metadata from the <ulink url="http://acoustid.org/">Acoustid</ulink>
* and Musicbrainz databases.
* further metadata from the [Acoustid](http://acoustid.org/) and Musicbrainz
* databases.
*
* ## Example launch line
* |[

View file

@@ -23,19 +23,21 @@
* @title: dfbvideosink
*
* DfbVideoSink renders video frames using the
* <ulink url="http://www.directfb.org/">DirectFB</ulink> library.
* [DirectFB](http://www.directfb.org/) library.
* Rendering can happen in two different modes :
*
* * Standalone: this mode will take complete control of the monitor forcing
* <ulink url="http://www.directfb.org/">DirectFB</ulink> to fullscreen layout.
* DirectFB to fullscreen layout.
*
* This is convenient to test using the gst-launch-1.0 command line tool or
* other simple applications. It is possible to interrupt playback while
* being in this mode by pressing the Escape key.
* This mode handles navigation events for every input device supported by
* the <ulink url="http://www.directfb.org/">DirectFB</ulink> library, it will
* look for available video modes in the fb.modes file and try to switch
* the framebuffer video mode to the most suitable one. Depending on
* hardware acceleration capabilities the element will handle scaling or not.
* the DirectFB library, it will look for available video modes in the fb.modes
* file and try to switch the framebuffer video mode to the most suitable one.
* Depending on hardware acceleration capabilities the element will handle
* scaling or not.
*
* If no acceleration is available it will do clipping or centering of the
* video frames respecting the original aspect ratio.
*
@@ -43,7 +45,8 @@
* #GstDfbVideoSink:surface provided by the
* application developer. This is a more advanced usage of the element and
* it is required to integrate video playback in existing
* <ulink url="http://www.directfb.org/">DirectFB</ulink> applications.
* DirectFB applications.
*
* When using this mode the element just renders to the
* #GstDfbVideoSink:surface provided by the
* application, that means it won't handle navigation events and won't resize

View file

@@ -25,7 +25,7 @@
* @see_also: timidity, wildmidi
*
* This element renders midi-events as audio streams using
* <ulink url="http://fluidsynth.sourceforge.net//">Fluidsynth</ulink>.
* [Fluidsynth](http://fluidsynth.sourceforge.net/).
* It offers better sound quality compared to the timidity or wildmidi element.
*
* ## Example pipeline

View file

@@ -48,8 +48,9 @@
* @title: katedec
* @see_also: oggdemux
*
* This element decodes Kate streams
* <ulink url="http://libkate.googlecode.com/">Kate</ulink> is a free codec
* This element decodes Kate streams.
*
* [Kate](http://libkate.googlecode.com/) is a free codec
* for text based data, such as subtitles. Any number of kate streams can be
* embedded in an Ogg stream.
*

View file

@@ -49,10 +49,11 @@
* @title: kateenc
* @see_also: oggmux
*
* This element encodes Kate streams
* <ulink url="http://libkate.googlecode.com/">Kate</ulink> is a free codec
* for text based data, such as subtitles. Any number of kate streams can be
* embedded in an Ogg stream.
* This element encodes Kate streams.
*
* [Kate](http://libkate.googlecode.com/) is a free codec for text based data,
* such as subtitles. Any number of kate streams can be embedded in an Ogg
* stream.
*
* libkate (see above url) is needed to build this plugin.
*

View file

@@ -48,12 +48,12 @@
* @title: tiger
* @see_also: katedec
*
* This element decodes and renders Kate streams
* <ulink url="http://libkate.googlecode.com/">Kate</ulink> is a free codec
* for text based data, such as subtitles. Any number of kate streams can be
* embedded in an Ogg stream.
* This element decodes and renders Kate streams.
* [Kate](http://libkate.googlecode.com/) is a free codec for text based data,
* such as subtitles. Any number of kate streams can be embedded in an Ogg
* stream.
*
* libkate (see above url) and <ulink url="http://libtiger.googlecode.com/">libtiger</ulink>
* libkate (see above url) and [libtiger](http://libtiger.googlecode.com/)
* are needed to build this element.
*
* ## Example pipeline

View file

@@ -27,7 +27,8 @@
* @see_also: #GstAudioConvert #GstAudioResample, #GstAudioTestSrc, #GstAutoAudioSink
*
* The LADSPA (Linux Audio Developer's Simple Plugin API) element is a bridge
* for plugins using the <ulink url="http://www.ladspa.org/">LADSPA</ulink> API.
* for plugins using the [LADSPA](http://www.ladspa.org/) API.
*
* It scans all installed LADSPA plugins and registers them as gstreamer
* elements. If available it can also parse LRDF files and use the metadata for
* element classification. The functionality you get depends on the LADSPA plugins

View file

@@ -30,8 +30,8 @@
* a successor to LADSPA (Linux Audio Developer's Simple Plugin API).
*
* The LV2 element is a bridge for plugins using the
* <ulink url="http://www.lv2plug.in/">LV2</ulink> API. It scans all
* installed LV2 plugins and registers them as gstreamer elements.
* [LV2](http://www.lv2plug.in/) API. It scans all installed LV2 plugins and
* registers them as gstreamer elements.
*/
#ifdef HAVE_CONFIG_H

View file

@@ -28,8 +28,8 @@
/**
* SECTION:element-modplug
*
* Modplug uses the <ulink url="http://modplug-xmms.sourceforge.net/">modplug</ulink>
* library to decode tracked music in the MOD/S3M/XM/IT and related formats.
* Modplug uses the [modplug](http://modplug-xmms.sourceforge.net/) library to
* decode tracked music in the MOD/S3M/XM/IT and related formats.
*
* ## Example pipeline
*

View file

@@ -25,9 +25,10 @@
* @see_also: mpeg2dec
*
* This element encodes raw video into an MPEG-1/2 elementary stream using the
* <ulink url="http://mjpeg.sourceforge.net/">mjpegtools</ulink> library.
* [mjpegtools](http://mjpeg.sourceforge.net/) library.
*
* Documentation on MPEG encoding in general can be found in the
* <ulink url="https://sourceforge.net/docman/display_doc.php?docid=3456&group_id=5776">MJPEG Howto</ulink>
* [MJPEG Howto](https://sourceforge.net/docman/display_doc.php?docid=3456&group_id=5776)
* and on the various available parameters in the documentation
* of the mpeg2enc tool in particular, which shares options with this element.
*

View file

@@ -26,9 +26,9 @@
*
* This element is an audio/video multiplexer for MPEG-1/2 video streams
* and (un)compressed audio streams such as AC3, MPEG layer I/II/III.
* It is based on the <ulink url="http://mjpeg.sourceforge.net/">mjpegtools</ulink> library.
* It is based on the [mjpegtools](http://mjpeg.sourceforge.net/) library.
* Documentation on creating MPEG videos in general can be found in the
* <ulink url="https://sourceforge.net/docman/display_doc.php?docid=3456&group_id=5776">MJPEG Howto</ulink>
* [MJPEG Howto](https://sourceforge.net/docman/display_doc.php?docid=3456&group_id=5776)
* and the man-page of the mplex tool documents the properties of this element,
* which are shared with the mplex tool.
*

View file

@@ -23,8 +23,8 @@
* @see_also: #GstOpenMptDec
*
* openmpdec decodes module music formats, such as S3M, MOD, XM, IT.
* It uses the <ulink url="https://lib.openmpt.org">OpenMPT library</ulink>
* for this purpose. It can be autoplugged and therefore works with decodebin.
* It uses the [OpenMPT library](https://lib.openmpt.org) for this purpose.
* It can be autoplugged and therefore works with decodebin.
*
* ## Example launch line
*

View file

@@ -23,7 +23,7 @@
* SECTION:element-srtsink
* @title: srtsink
*
* srtsink is a network sink that sends <ulink url="http://www.srtalliance.org/">SRT</ulink>
* srtsink is a network sink that sends [SRT](http://www.srtalliance.org/)
* packets to the network.
*
* ## Examples</title>

View file

@@ -23,7 +23,7 @@
* SECTION:element-srtsrc
* @title: srtsrc
*
* srtsrc is a network source that reads <ulink url="http://www.srtalliance.org/">SRT</ulink>
* srtsrc is a network source that reads [SRT](http://www.srtalliance.org/)
* packets from the network.
*
* ## Examples

View file

@@ -21,8 +21,9 @@
* SECTION:element-voaacenc
* @title: voaacenc
*
* AAC audio encoder based on vo-aacenc library
* <ulink url="http://sourceforge.net/projects/opencore-amr/files/vo-aacenc/">vo-aacenc library source file</ulink>.
* AAC audio encoder based on vo-aacenc library.
*
* [vo-aacenc library source file](http://sourceforge.net/projects/opencore-amr/files/vo-aacenc/)
*
* ## Example launch line
* |[

View file

@@ -23,7 +23,7 @@
* @see_also: #GstAmrWbDec, #GstAmrWbParse
*
* AMR wideband encoder based on the
* <ulink url="http://www.penguin.cz/~utx/amr">reference codec implementation</ulink>.
* [reference codec implementation](http://www.penguin.cz/~utx/amr).
*
* ## Example launch line
* |[

View file

@@ -27,8 +27,9 @@
*
* The waylandsink is creating its own window and render the decoded video frames to that.
* Setup the Wayland environment as described in
* <ulink url="http://wayland.freedesktop.org/building.html">Wayland</ulink> home page.
* The current implementaion is based on weston compositor.
* [Wayland](http://wayland.freedesktop.org/building.html) home page.
*
* The current implementation is based on weston compositor.
*
* ## Example pipelines
* |[

View file

@@ -23,7 +23,7 @@
* @title: GstWebRTCDataChannel
* @see_also: #GstWebRTCRTPTransceiver
*
* <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcsctptransport">http://w3c.github.io/webrtc-pc/#dom-rtcsctptransport</ulink>
* <http://w3c.github.io/webrtc-pc/#dom-rtcsctptransport>
*/
#ifdef HAVE_CONFIG_H

View file

@@ -23,8 +23,9 @@
* @see_also: #GstWildmidiDec
*
* wildmididec decodes MIDI files.
* It uses <ulink url="https://www.mindwerks.net/projects/wildmidi/">WildMidi</ulink>
* for this purpose. It can be autoplugged and therefore works with decodebin.
*
* It uses [WildMidi](https://www.mindwerks.net/projects/wildmidi/) for this
* purpose. It can be autoplugged and therefore works with decodebin.
*
* ## Example launch line
*

View file

@@ -23,7 +23,7 @@
* @title: GstWebRTCDTLSTransport
* @see_also: #GstWebRTCRTPSender, #GstWebRTCRTPReceiver, #GstWebRTCICETransport
*
* <ulink url="https://www.w3.org/TR/webrtc/#rtcdtlstransport">https://www.w3.org/TR/webrtc/#rtcdtlstransport</ulink>
* <https://www.w3.org/TR/webrtc/#rtcdtlstransport>
*/
#ifdef HAVE_CONFIG_H

View file

@@ -23,7 +23,7 @@
* @title: GstWebRTCICETransport
* @see_also: #GstWebRTCRTPSender, #GstWebRTCRTPReceiver, #GstWebRTCDTLSTransport
*
* <ulink url="https://www.w3.org/TR/webrtc/#rtcicetransport">https://www.w3.org/TR/webrtc/#rtcicetransport</ulink>
* <https://www.w3.org/TR/webrtc/#rtcicetransport>
*/
#ifdef HAVE_CONFIG_H

View file

@@ -22,7 +22,7 @@
* @short_description: RTCSessionDescription object
* @title: GstWebRTCSessionDescription
*
* <ulink url="https://www.w3.org/TR/webrtc/#rtcsessiondescription-class">https://www.w3.org/TR/webrtc/#rtcsessiondescription-class</ulink>
* <https://www.w3.org/TR/webrtc/#rtcsessiondescription-class>
*/
#ifdef HAVE_CONFIG_H

View file

@@ -38,7 +38,7 @@ GType gst_webrtc_session_description_get_type (void);
* @type: the #GstWebRTCSDPType of the description
* @sdp: the #GstSDPMessage of the description
*
* See <ulink url="https://www.w3.org/TR/webrtc/#rtcsessiondescription-class">https://www.w3.org/TR/webrtc/#rtcsessiondescription-class</ulink>
* See <https://www.w3.org/TR/webrtc/#rtcsessiondescription-class>
*/
struct _GstWebRTCSessionDescription
{

View file

@@ -23,7 +23,7 @@
* @title: GstWebRTCRTPReceiver
* @see_also: #GstWebRTCRTPSender, #GstWebRTCRTPTransceiver
*
* <ulink url="https://www.w3.org/TR/webrtc/#rtcrtpreceiver-interface">https://www.w3.org/TR/webrtc/#rtcrtpreceiver-interface</ulink>
* <https://www.w3.org/TR/webrtc/#rtcrtpreceiver-interface>
*/
#ifdef HAVE_CONFIG_H

View file

@@ -23,7 +23,7 @@
* @title: GstWebRTCRTPSender
* @see_also: #GstWebRTCRTPReceiver, #GstWebRTCRTPTransceiver
*
* <ulink url="https://www.w3.org/TR/webrtc/#rtcrtpsender-interface">https://www.w3.org/TR/webrtc/#rtcrtpsender-interface</ulink>
* <https://www.w3.org/TR/webrtc/#rtcrtpsender-interface>
*/
#ifdef HAVE_CONFIG_H

View file

@@ -23,7 +23,7 @@
* @title: GstWebRTCRTPTransceiver
* @see_also: #GstWebRTCRTPSender, #GstWebRTCRTPReceiver
*
* <ulink url="https://www.w3.org/TR/webrtc/#rtcrtptransceiver-interface">https://www.w3.org/TR/webrtc/#rtcrtptransceiver-interface</ulink>
* <https://www.w3.org/TR/webrtc/#rtcrtptransceiver-interface>
*/
#ifdef HAVE_CONFIG_H

View file

@@ -82,7 +82,7 @@ typedef enum /*< underscore_name=gst_webrtc_dtls_transport_state >*/
* @GST_WEBRTC_ICE_GATHERING_STATE_GATHERING: gathering
* @GST_WEBRTC_ICE_GATHERING_STATE_COMPLETE: complete
*
* See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcicegatheringstate">http://w3c.github.io/webrtc-pc/#dom-rtcicegatheringstate</ulink>
* See <http://w3c.github.io/webrtc-pc/#dom-rtcicegatheringstate>
*/
typedef enum /*< underscore_name=gst_webrtc_ice_gathering_state >*/
{
@@ -101,7 +101,7 @@ typedef enum /*< underscore_name=gst_webrtc_ice_gathering_state >*/
* @GST_WEBRTC_ICE_CONNECTION_STATE_DISCONNECTED: disconnected
* @GST_WEBRTC_ICE_CONNECTION_STATE_CLOSED: closed
*
* See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtciceconnectionstate">http://w3c.github.io/webrtc-pc/#dom-rtciceconnectionstate</ulink>
* See <http://w3c.github.io/webrtc-pc/#dom-rtciceconnectionstate>
*/
typedef enum /*< underscore_name=gst_webrtc_ice_connection_state >*/
{
@@ -123,7 +123,7 @@ typedef enum /*< underscore_name=gst_webrtc_ice_connection_state >*/
* @GST_WEBRTC_SIGNALING_STATE_HAVE_LOCAL_PRANSWER: have-local-pranswer
* @GST_WEBRTC_SIGNALING_STATE_HAVE_REMOTE_PRANSWER: have-remote-pranswer
*
* See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcsignalingstate">http://w3c.github.io/webrtc-pc/#dom-rtcsignalingstate</ulink>
* See <http://w3c.github.io/webrtc-pc/#dom-rtcsignalingstate>
*/
typedef enum /*< underscore_name=gst_webrtc_signaling_state >*/
{
@@ -144,7 +144,7 @@ typedef enum /*< underscore_name=gst_webrtc_signaling_state >*/
* @GST_WEBRTC_PEER_CONNECTION_STATE_FAILED: failed
* @GST_WEBRTC_PEER_CONNECTION_STATE_CLOSED: closed
*
* See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcpeerconnectionstate">http://w3c.github.io/webrtc-pc/#dom-rtcpeerconnectionstate</ulink>
* See <http://w3c.github.io/webrtc-pc/#dom-rtcpeerconnectionstate>
*/
typedef enum /*< underscore_name=gst_webrtc_peer_connection_state >*/
{
@@ -185,7 +185,7 @@ typedef enum /*< underscore_name=gst_webrtc_ice_component >*/
* @GST_WEBRTC_SDP_TYPE_ANSWER: answer
* @GST_WEBRTC_SDP_TYPE_ROLLBACK: rollback
*
* See <ulink url="http://w3c.github.io/webrtc-pc/#rtcsdptype">http://w3c.github.io/webrtc-pc/#rtcsdptype</ulink>
* See <http://w3c.github.io/webrtc-pc/#rtcsdptype>
*/
typedef enum /*< underscore_name=gst_webrtc_sdp_type >*/
{
@@ -282,7 +282,7 @@ typedef enum /*< underscore_name=gst_webrtc_fec_type >*/
* GST_WEBRTC_SCTP_TRANSPORT_STATE_CONNECTED: connected
* GST_WEBRTC_SCTP_TRANSPORT_STATE_CLOSED: closed
*
* See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcsctptransportstate">http://w3c.github.io/webrtc-pc/#dom-rtcsctptransportstate</ulink>
* See <http://w3c.github.io/webrtc-pc/#dom-rtcsctptransportstate>
*
* Since: 1.16
*/
@@ -301,7 +301,7 @@ typedef enum /*< underscore_name=gst_webrtc_sctp_transport_state >*/
* GST_WEBRTC_PRIORITY_TYPE_MEDIUM: medium
* GST_WEBRTC_PRIORITY_TYPE_HIGH: high
*
* See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcprioritytype">http://w3c.github.io/webrtc-pc/#dom-rtcprioritytype</ulink>
* See <http://w3c.github.io/webrtc-pc/#dom-rtcprioritytype>
*
* Since: 1.16
*/
@@ -321,7 +321,7 @@ typedef enum /*< underscore_name=gst_webrtc_priority_type >*/
* GST_WEBRTC_DATA_CHANNEL_STATE_CLOSING: closing
* GST_WEBRTC_DATA_CHANNEL_STATE_CLOSED: closed
*
* See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcdatachannelstate">http://w3c.github.io/webrtc-pc/#dom-rtcdatachannelstate</ulink>
* See <http://w3c.github.io/webrtc-pc/#dom-rtcdatachannelstate>
*
* Since: 1.16
*/

View file

@@ -38,8 +38,8 @@
*
* The accurip element calculates a CRC for an audio stream which can be used
* to match the audio stream to a database hosted on
* <ulink url="http://accuraterip.com/">AccurateRip</ulink>. This database
* is used to check for a CD rip accuracy.
* [AccurateRip](http://accuraterip.com/). This database is used to check for a
* CD rip accuracy.
*
* ## Example launch line
* |[

View file

@@ -65,9 +65,9 @@
* @title: festival
*
* This element connects to a
* <ulink url="http://www.festvox.org/festival/index.html">festival</ulink>
* server process and uses it to synthesize speech. Festival need to run already
* in server mode, started as `festival --server`
* [festival](http://www.festvox.org/festival/index.html) server process and
* uses it to synthesize speech. Festival need to run already in server mode,
* started as `festival --server`
*
* ## Example pipeline
* |[

View file

@@ -26,9 +26,8 @@
* #GstPcapParse:src-port and #GstPcapParse:dst-port to restrict which packets
* should be included.
*
* The supported data format is the classical <ulink
* url="https://wiki.wireshark.org/Development/LibpcapFileFormat">libpcap file
* format</ulink>.
* The supported data format is the classical
* [libpcap file format](https://wiki.wireshark.org/Development/LibpcapFileFormat)
*
* ## Example pipelines
* |[