Added CVS ignore, and two documents..

Wim Taymans 2001-02-06 20:06:22 +00:00
parent d88779134d
commit 5e1257437d
3 changed files with 194 additions and 0 deletions

docs/random/.gitignore (new file)

docs/random/wtay/CORBA (new file)

@@ -0,0 +1,63 @@
CORBA and bonobo ramblings
==========================
This is about something I know nothing about, so I want
your views on the subject :)
Statements in this doc might contain plain nonsense and
utter ignorance of the subject, so feel free to correct
me, no matter how many brown paper bags I need to put
over my head.
CORBA
-----
The Common Object Request Broker Architecture. It basically
allows you to declare object interfaces using a language
called IDL. It has the nice benefit of allowing objects to
live in other contexts, languages and even on other machines
over the network.
Wrapping the GStreamer objects in CORBA objects doesn't look
like a problem. It would immediately allow us to create objects
across the network and use the framework in a distributed
environment.
You will end up with a lot of CORBA objects using this method.
Is this the way to do it? Or do we only need to CORBA-ify some
of the core objects instead?
I see a CORBA wrapper as something that exposes the API of
GStreamer. If I wanted to do distributed media processing I
would build up an app with CORBA calls. The point is that you
use the low-level CORBA API to create a distributed media app.
Bonobo
------
A component model built on top of CORBA. Bonobo has provisions
for creating embeddable objects. As I understand it, this means
that it has something visible to embed. I know you can also use
Bonobo without the GUI parts, but why would we prefer Bonobo
over CORBA to handle that?
Bonobo has a framework to create toolbars, menus and other
neat stuff. It also has a serialisation mechanism that allows
you to, for example, merge a pipeline with a document in one
single stream.
I see Bonobo as a high-level service provider: you create a
media player component that can be embedded into a document and
stuff like that. The point here is that you use Bonobo to
create services out of user apps built with GStreamer.
Comments?
Wim

docs/random/wtay/autoplug2 (new file)

@@ -0,0 +1,131 @@
Autoplugger V2
==============
The current autoplugger as described in autoplug1 has some
serious shortcomings:
- it is embedded in GstPipeline and cannot be used through a
generic interface. A lot of complexity is inside the
pipeline, more specifically the creation of threads and
subbins and the pad connections.
- it is not (easily) pluggable.
1) definition
-------------
We want to define a pluggable framework for autoplugging. This
includes:
- autoplugging algorithms can be added and removed at run time.
- autoplugging algorithms can be defined by plugins.
The autoplugger is built to handle simple media playback but
could also be used to create more complex pipelines.
The main focus will be on creating an element (can be a bin)
that has *one* input pad and *one or more* output pads.
It must be possible for a user app to insert its own elements in
the autogenerated element.
The autoplugger will be an interface to the low-level plugin
system based on the functional requirements of the user app.
The app will, for example, request an object that can convert
media type X to media type Y; the user app is not interested in
any intermediate steps needed to accomplish this conversion.
2) the API
----------
The API for the user apps should be no more than this:
GstElement* gst_autoplug_construct (GstAutoplug *autoplug,
                                    GstCaps *incaps,
                                    GstCaps *outcaps, ...);
autoplug is a reference to the autoplug implementation,
incaps is a GstCaps handle for the source pad, and the last
set of arguments is a va_list of destination caps.
A handle to the autoplugger implementation can be obtained
with
GList* gst_autoplug_get_list (void);
which will return a GList* of autopluggers.
GstAutoplug* gst_autoplug_get ("name");
is used to get an autoplugger.
3) the plugins API
------------------
Plugins can add their own autoplugger implementation by
subclassing an abstract autoplugger class and implementing/
overriding various methods.
The autoplugger can be registered with:
gst_plugin_add_autoplugger (GstPlugin *plugin,
                            GstAutoplug *autoplug);
This will allow us to load the autoplugger only when needed.
4) implementation
-----------------
We will implement two autopluggers:
- a static autoplugger. This autoplugger recursively adds
elements to the target element until all of the possible
pads are connected to something. The static autoplugger
only operates on padtemplates and ALWAYS pads. The pipeline
is not started before all elements are connected, hence
the 'static' autoplugger. This autoplugger will be a rework
of the current autoplugger.
- a dynamic autoplugger. This autoplugger configures the
pipeline at runtime based on the pad capabilities as they
become available. This allows for more fine-grained
autoplugging than can be achieved with the static one because
it can be based on the actual media stream you are handling.
The autopluggers will be implemented in separate plugins,
outside of the core libraries, and are therefore optional.
5) the autoplugger object
-------------------------
The autoplugger object will be an abstract class with the
following properties:
- name, description, more text to identify the autoplugger.
- a class method autoplug_construct that has to be implemented by
the real autoplugger.
- possible PadTemplates that this autoplugger can handle well?
Optionally, the core autoplugger code can provide convenience
functions to implement custom autopluggers. The shortest-path
algorithm with pluggable weighting and list functions comes to
mind.
Signals will be added to the autoplugger so that user apps can
modify the constructed pipeline by adding extra objects.
A possible use case would be to let gstmediaplay perform an
autoplug on the media stream and insert a custom sound or video
effect into the pipeline when an appropriate element is created.
Comments?
Wim