2004-02-13 13:04:54 +00:00

What we are trying to achieve:

The build setup should satisfy:
- patching of the CVS checkout using our patch files placed in our CVS
- passing of:
  - make
  - make distcheck
  - non-srcdir build (ie, mkdir build; cd build; ../configure; make)

How it works:

* configure checks whether or not it should update ffmpeg from CVS by looking
  at the nano version number:
  - if it's 1, we're in CVS mode, and it should check it out
  - if it's not 1, we're in prerel or rel mode, and the code should already
    be on disk
  FIXME: we could change this to really check out the source code if some
  required files aren't there, just in case someone checks out from CVS
  but CVS is not at nano 1

* patching of the checked-out copy happens at

Axioms under which we work:

- the dist tarball needs to include either:
  - the pristine ffmpeg checkout + our patches + a patch mechanism on make,
  or
  - the ffmpeg checkout with patches already applied

- configure/make is not allowed to touch files that already live in the source
  tree; if they need to, they need to be copied first and cleaned up
  afterward

- it would be very nice if, on update of either the Tag file or the patch set,
  make would know exactly what to do with it.

HACKING: Add some basic documentation on how our wrapping works.
Original commit message from CVS:
* HACKING:
Add some basic documentation on how our wrapping works.
* TODO:
Add a list of things that could be worked on or that need doing.
* configure.ac:
Update snapshot.
* ext/ffmpeg/Makefile.am:
Change .la links. See below (autotools patch).
* ext/ffmpeg/gstffmpeg.c: (plugin_init):
Enable demuxers. See below (gstffmpegdemux.c).
* ext/ffmpeg/gstffmpegcodecmap.c: (gst_ffmpeg_formatid_to_caps):
Realmedia caused a crash - fix that.
* ext/ffmpeg/gstffmpegdemux.c: (gst_ffmpegdemux_averror),
(gst_ffmpegdemux_base_init), (gst_ffmpegdemux_init),
(gst_ffmpegdemux_close), (gst_ffmpegdemux_dispose),
(gst_ffmpegdemux_stream_from_pad),
(gst_ffmpegdemux_src_event_mask), (gst_ffmpegdemux_src_event),
(gst_ffmpegdemux_src_format_list),
(gst_ffmpegdemux_src_query_list), (gst_ffmpegdemux_src_query),
(gst_ffmpegdemux_src_convert), (gst_ffmpegdemux_add),
(gst_ffmpegdemux_open), (gst_ffmpegdemux_loop),
(gst_ffmpegdemux_change_state), (gst_ffmpegdemux_register):
Right. OK, so I fixed up the demuxing and have it basically-working,
and the best way to get some more people to test it is to actually
enable it. I'm not sure if we want this for 0.8.0, but we can at
least give it a try. I've tested avi, matroska and mpeg, all appear
to work. The cool thing is that this gives us instant support for
several exotic formats that we'd never care about ourselves. Again,
this needs more testing for it to still be enabled in 0.8.0, but I
want to give it a try...
* ext/ffmpeg/gstffmpegmux.c: (gst_ffmpegmux_base_init),
(gst_ffmpegmux_init), (gst_ffmpegmux_request_new_pad),
(gst_ffmpegmux_connect), (gst_ffmpegmux_loop),
(gst_ffmpegmux_register):
Add some fixups that I use locally. Make it work in the case of
MPEG encoding, but the muxer is still not in shape to be enabled.
* ext/ffmpeg/gstffmpegprotocol.c: (gst_ffmpegdata_open),
(gst_ffmpegdata_read), (gst_ffmpegdata_write),
(gst_ffmpegdata_seek), (gst_ffmpegdata_close):
Some small fixups that crept into it while it was disabled for the
last few years. Basically works.
* gst-libs/ext/ffmpeg/Makefile.am:
Instead of having our local-autotoolized version, I patch the ffmpeg
source to be fully autotoolized. That means a simple SUBDIRS here
is now enough.
* gst-libs/ext/ffmpeg/Tag:
Version update.
* gst-libs/ext/ffmpeg/patch/autotools.diff:
Autotoolize ffmpeg. Needs to be sent to ffmpeg-devel@...
* gst-libs/ext/ffmpeg/patch/disableinstalllibs.diff:
Don't install their libs.
* gst-libs/ext/ffmpeg/patch/disablemmx.diff:
Don't use MMX. It cannot compile using PIC.
* gst-libs/ext/ffmpeg/patch/disabletools.diff:
Don't compile/install their tools, we don't use them.
* gst-libs/ext/ffmpeg/patch/functions.diff:
Prevent symbol conflicts.
* gst-libs/ext/ffmpeg/patch/matroska.diff:
Add a matroska demuxer. Needs to be sent to ffmpeg-devel@...
2004-03-01 04:59:17 +00:00

Some notes on how ffmpeg wrapping inside GStreamer currently works:

* gstffmpeg{dec,enc,demux,mux}.c are wrappers for specific element types from
  their ffmpeg counterpart. If you want to wrap a new type of element in
  ffmpeg (e.g. the URLProtocol things), then you'd need to write a new
  wrapper file.

* gstffmpegcolorspace.c is a wrapper for one specific function in ffmpeg:
  colorspace conversion. This works differently from the previously mentioned
  ones, and we'll come to that in the next item. If you want to wrap one
  specific function, then that, too, belongs in a new wrapper file.

* the important difference between all those is that the colorspace wrapper
  results in exactly one element, so there is a 1<->1 mapping. This makes for
  a fairly basic element implementation. gstffmpegcolorspace.c, therefore,
  doesn't differ much from other colorspace elements. The ffmpeg element
  types, however, define a whole *list* of elements (in GStreamer, each
  decoder etc. needs to be its own element). We use a set of tricks for that
  to keep coding simple: codec mapping and dynamic type creation.

* ffmpeg uses CODEC_ID_* enumerations for its codecs. GStreamer uses caps,
  which consist of a mimetype and a defined set of properties. In ffmpeg,
  these properties live in an AVCodecContext struct, which contains anything
  that could configure any codec (which makes it rather messy, but oh well).
  To convert from one to the other, we use codec mapping, which is done in
  gstffmpegcodecmap.[ch]. This is the most important file in the whole
  ffmpeg wrapping process! It contains functions to go from a codec type
  (video or audio - used as the output format for decoding or the input
  format for encoding), a codec id (to identify each format) or a format id
  (a string identifying a file format - usually the file format extension)
  to a GstCaps, and the other way around.

* to define multiple elements in one source file (which all behave similarly),
  we dynamically create types for each plugin and let all of them operate on
  the same struct (GstFFMpegDec, GstFFMpegEnc, ...). The functions in
  gstffmpeg{dec,enc,demux,mux}.c called gst_ffmpeg*_register() do this.
  The magic is as follows: for each codec or format, ffmpeg has a single
  AVCodec or AV{Input,Output}Format, which are packed together in a list of
  supported codecs/formats. We simply walk through the list; for each of
  those, we check whether gstffmpegcodecmap.c knows about this single one.
  If it does, we get the GstCaps for each pad template that belongs to it,
  and register a type for all of those together. We also keep this in a
  caching struct that will later be used by the base_init() function to
  fill in information about this specific codec in the class struct of this
  element (pad templates and codec/format information). Since the actual
  codec information is the only thing that really makes each codec/format
  different (they all behave the same through the ffmpeg API), we don't
  really need to do anything else that is codec-specific, so all other
  functions are rather simple.

* one particular thing that needs mention is how gstffmpeg{mux,demux}.c and
  gstffmpegprotocol.c interoperate. ffmpeg uses URLProtocols for data input
  and output. Now, of course, we want to use the *GStreamer* way of doing
  input and output (filesrc, ...) rather than the ffmpeg way. Therefore, we
  wrap up a GstPad as a URLProtocol and register this with ffmpeg. This is
  what gstffmpegprotocol.c does. The URL is called gstreamer://%p, where %p
  is the address of a GstPad. gstffmpeg{mux,demux}.c then open a file called
  gstreamer://%p, with %p being their source/sink pad, respectively. This
  way, we use GStreamer for data input/output through the ffmpeg API. It's
  rather ugly, but it has worked quite well so far.

* there are lots of things that still need doing. See the TODO file for more
  information.