Merging gst-devtools

Thibault Saunier 2021-09-24 16:15:34 -03:00
commit 86a93d746d
260 changed files with 81119 additions and 0 deletions

5
.arcconfig Normal file

@@ -0,0 +1,5 @@
{
"phabricator.uri" : "https://phabricator.freedesktop.org/",
"repository.callsign" : "GSTDEV",
"project": "GStreamer Validate"
}

3
.gitignore vendored Normal file

@@ -0,0 +1,3 @@
*.bak
build*
mesonbuild*

43
.gitlab-ci.yml Normal file

@@ -0,0 +1,43 @@
include: "https://gitlab.freedesktop.org/gstreamer/gst-ci/raw/master/gitlab/ci_template.yml"
.local-rules: &local-rules
rules:
- changes:
- validate/launcher/
# Run valgrind if we changed the check.py testsuite
local valgrind core:
extends: '.valgrind fedora x86_64'
variables:
TEST_SUITE: "check.gstreamer\\..*"
<<: *local-rules
local valgrind base:
extends: '.valgrind fedora x86_64'
variables:
TEST_SUITE: "check.gst-plugins-base\\..*"
<<: *local-rules
local valgrind good:
extends: '.valgrind fedora x86_64'
variables:
TEST_SUITE: "check.gst-plugins-good\\..*"
<<: *local-rules
local valgrind ugly:
extends: '.valgrind fedora x86_64'
variables:
TEST_SUITE: "check.gst-plugins-ugly\\..*"
<<: *local-rules
local valgrind bad:
extends: '.valgrind fedora x86_64'
variables:
TEST_SUITE: "check.gst-plugins-bad\\..*"
<<: *local-rules
local valgrind ges:
extends: '.valgrind fedora x86_64'
variables:
TEST_SUITE: "check.gst-editing-services\\..*"
<<: *local-rules

20278
ChangeLog Normal file

File diff suppressed because it is too large

299
NEWS Normal file

@@ -0,0 +1,299 @@
GStreamer 1.20 Release Notes
GStreamer 1.20 has not been released yet. It is scheduled for release
around October/November 2021.
1.19.x is the unstable development version that is being developed in
the git main branch and which will eventually result in 1.20, and 1.19.2
is the current development release in that series.
It is expected that feature freeze will be in early October 2021,
followed by one or two 1.19.9x pre-releases and the new 1.20 stable
release around October/November 2021.
1.20 will be backwards-compatible to the stable 1.18, 1.16, 1.14, 1.12,
1.10, 1.8, 1.6, 1.4, 1.2 and 1.0 release series.
See https://gstreamer.freedesktop.org/releases/1.20/ for the latest
version of this document.
Last updated: Wednesday 22 September 2021, 18:00 UTC (log)
Introduction
The GStreamer team is proud to announce a new major feature release in
the stable 1.x API series of your favourite cross-platform multimedia
framework!
As always, this release is again packed with many new features, bug
fixes and other improvements.
Highlights
- this section will be completed in due course
Major new features and changes
Noteworthy new features and API
- this section will be filled in in due course
New elements
- this section will be filled in in due course
New element features and additions
- this section will be filled in in due course
Plugin and library moves
- this section will be filled in in due course
- There were no plugin moves or library moves in this cycle.
Plugin removals
The following elements or plugins have been removed:
- this section will be filled in in due course
Miscellaneous API additions
- this section will be filled in in due course
Miscellaneous performance, latency and memory optimisations
- this section will be filled in in due course
Miscellaneous other changes and enhancements
- this section will be filled in in due course
Tracing framework and debugging improvements
- this section will be filled in in due course
Tools
- this section will be filled in in due course
GStreamer RTSP server
- this section will be filled in in due course
GStreamer VAAPI
- this section will be filled in in due course
GStreamer OMX
- this section will be filled in in due course
GStreamer Editing Services and NLE
- this section will be filled in in due course
GStreamer validate
- this section will be filled in in due course
GStreamer Python Bindings
- this section will be filled in in due course
GStreamer C# Bindings
- this section will be filled in in due course
GStreamer Rust Bindings and Rust Plugins
The GStreamer Rust bindings are released separately with a different
release cadence that's tied to gtk-rs, but the latest release has
already been updated for the upcoming new GStreamer 1.20 API.
gst-plugins-rs, the module containing GStreamer plugins written in Rust,
has also seen lots of activity with many new elements and plugins.
What follows is a list of elements and plugins available in
gst-plugins-rs, so people don't miss out on all those potentially useful
elements that have no C equivalent.
- FIXME: add new elements
Rust audio plugins
- audiornnoise: New element for audio denoising which implements the
noise removal algorithm of the Xiph RNNoise library, in Rust
- rsaudioecho: Port of the audioecho element from gst-plugins-good
- rsaudioloudnorm: Live audio loudness normalization element based on
the FFmpeg af_loudnorm filter
- claxondec: FLAC lossless audio codec decoder element based on the
pure-Rust claxon implementation
- csoundfilter: Audio filter that can use any filter defined via the
Csound audio programming language
- lewtondec: Vorbis audio decoder element based on the pure-Rust
lewton implementation
Rust video plugins
- cdgdec/cdgparse: Decoder and parser for the CD+G video codec based
on a pure-Rust CD+G implementation, used for example by karaoke CDs
- cea608overlay: CEA-608 Closed Captions overlay element
- cea608tott: CEA-608 Closed Captions to timed-text (e.g. VTT or SRT
subtitles) converter
- tttocea608: CEA-608 Closed Captions from timed-text converter
- mccenc/mccparse: MacCaption Closed Caption format encoder and parser
- sccenc/sccparse: Scenarist Closed Caption format encoder and parser
- dav1dec: AV1 video decoder based on the dav1d decoder implementation
by the VLC project
- rav1enc: AV1 video encoder based on the fast and pure-Rust rav1e
encoder implementation
- rsflvdemux: Alternative to the flvdemux FLV demuxer element from
gst-plugins-good, not feature-equivalent yet
- rsgifenc/rspngenc: GIF/PNG encoder elements based on the pure-Rust
implementations by the image-rs project
Rust text plugins
- textwrap: Element for line-wrapping timed text (e.g. subtitles) for
better screen-fitting, including hyphenation support for some
languages
Rust network plugins
- reqwesthttpsrc: HTTP(S) source element based on the Rust
reqwest/hyper HTTP implementations and almost feature-equivalent
with the main GStreamer HTTP source souphttpsrc
- s3src/s3sink: Source/sink element for the Amazon S3 cloud storage
- awstranscriber: Live audio to timed text transcription element using
the Amazon AWS Transcribe API
Generic Rust plugins
- sodiumencrypter/sodiumdecrypter: Encryption/decryption element based
on libsodium/NaCl
- togglerecord: Recording element that allows pausing/resuming
recordings easily and considers keyframe boundaries
- fallbackswitch/fallbacksrc: Elements for handling potentially
failing (network) sources, restarting them on errors/timeout and
showing a fallback stream instead
- threadshare: Set of elements that provide alternatives for various
existing GStreamer elements but can share their streaming threads
with each other to reduce the overall number of threads
- rsfilesrc/rsfilesink: File source/sink elements as replacements for
the existing filesrc/filesink elements
Build and Dependencies
- this section will be filled in in due course
gst-build
- this section will be filled in in due course
Cerbero
Cerbero is a meta build system used to build GStreamer plus dependencies
on platforms where dependencies are not readily available, such as
Windows, Android, iOS and macOS.
General improvements
- this section will be filled in in due course
macOS / iOS
- this section will be filled in in due course
Windows
- this section will be filled in in due course
Windows MSI installer
- this section will be filled in in due course
Linux
- this section will be filled in in due course
Android
- this section will be filled in in due course
Platform-specific changes and improvements
Android
- this section will be filled in in due course
macOS and iOS
- this section will be filled in in due course
Windows
- this section will be filled in in due course
Linux
- this section will be filled in in due course
Documentation improvements
- this section will be filled in in due course
Possibly Breaking Changes
- this section will be filled in in due course
- MPEG-TS SCTE-35 API changes (FIXME: flesh out)
- gst_parse_launch() and friends now error out on non-existing
properties on top-level bins, where previously they would silently fail
and ignore those properties.
Known Issues
- this section will be filled in in due course
- There are a couple of known WebRTC-related regressions/blockers:
- webrtc: DTLS setup with Chrome is broken
- webrtcbin: First keyframe is usually lost
Contributors
- this section will be filled in in due course
… and many others who have contributed bug reports, translations, sent
suggestions or helped testing.
Stable 1.20 branch
After the 1.20.0 release there will be several 1.20.x bug-fix releases
which will contain bug fixes which have been deemed suitable for a
stable branch, but no new features or intrusive changes will be added to
a bug-fix release usually. The 1.20.x bug-fix releases will be made from
the git 1.20 branch, which will be a stable branch.
1.20.0
1.20.0 is scheduled to be released around October/November 2021.
Schedule for 1.22
Our next major feature release will be 1.22, and 1.21 will be the
unstable development version leading up to the stable 1.22 release. The
development of 1.21/1.22 will happen in the git main branch.
The plan for the 1.22 development cycle is yet to be confirmed.
1.22 will be backwards-compatible to the stable 1.20, 1.18, 1.16, 1.14,
1.12, 1.10, 1.8, 1.6, 1.4, 1.2 and 1.0 release series.
------------------------------------------------------------------------
These release notes have been prepared by Tim-Philipp Müller with
contributions from …
License: CC BY-SA 4.0

96
RELEASE Normal file

@@ -0,0 +1,96 @@
This is GStreamer gst-devtools 1.19.2.
GStreamer 1.19 is the development branch leading up to the next major
stable version which will be 1.20.
The 1.19 development series adds new features on top of the 1.18 series and is
part of the API and ABI-stable 1.x release series of the GStreamer multimedia
framework.
Full release notes will one day be found at:
https://gstreamer.freedesktop.org/releases/1.20/
Binaries for Android, iOS, Mac OS X and Windows will usually be provided
shortly after the release.
This module will not be very useful by itself and should be used in conjunction
with other GStreamer modules for a complete multimedia experience.
- gstreamer: provides the core GStreamer libraries and some generic plugins
- gst-plugins-base: a basic set of well-supported plugins and additional
media-specific GStreamer helper libraries for audio,
video, rtsp, rtp, tags, OpenGL, etc.
- gst-plugins-good: a set of well-supported plugins under our preferred
license
- gst-plugins-ugly: a set of well-supported plugins which might pose
problems for distributors
- gst-plugins-bad: a set of plugins of varying quality that have not made
their way into one of core/base/good/ugly yet, for one
reason or another. Many of these are production quality
elements, but may still be missing documentation or unit
tests; others haven't passed the rigorous quality testing
we expect yet.
- gst-libav: a set of codecs plugins based on the ffmpeg library. This is
where you can find audio and video decoders and encoders
for a wide variety of formats including H.264, AAC, etc.
- gstreamer-vaapi: hardware-accelerated video decoding and encoding using
VA-API on Linux. Primarily for Intel graphics hardware.
- gst-omx: hardware-accelerated video decoding and encoding, primarily for
embedded Linux systems that provide an OpenMax
implementation layer such as the Raspberry Pi.
- gst-rtsp-server: library to serve files or streaming pipelines via RTSP
- gst-editing-services: library and plugins for non-linear editing
==== Download ====
You can find source releases of gstreamer in the download
directory: https://gstreamer.freedesktop.org/src/gstreamer/
The git repository and details on how to clone it can be found at
https://gitlab.freedesktop.org/gstreamer/
==== Homepage ====
The project's website is https://gstreamer.freedesktop.org/
==== Support and Bugs ====
We have recently moved from GNOME Bugzilla to GitLab on freedesktop.org
for bug reports and feature requests:
https://gitlab.freedesktop.org/gstreamer
Please submit patches via GitLab as well, in form of Merge Requests. See
https://gstreamer.freedesktop.org/documentation/contribute/
for more details.
For help and support, please subscribe to and send questions to the
gstreamer-devel mailing list (see below for details).
There is also a #gstreamer IRC channel on the Freenode IRC network.
==== Developers ====
GStreamer source code repositories can be found on GitLab on freedesktop.org:
https://gitlab.freedesktop.org/gstreamer
and can also be cloned from there and this is also where you can submit
Merge Requests or file issues for bugs or feature requests.
Interested developers of the core library, plugins, and applications should
subscribe to the gstreamer-devel list:
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

14
debug-viewer/.gitignore vendored Normal file

@@ -0,0 +1,14 @@
*.pyc
*.pyo
*.glade.bak
*.gladep
*.gladep.bak
/build
/dist
/MANIFEST
po/*.pot
po/mo


@@ -0,0 +1,74 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Development Utilities
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Development Utilities Common Data module."""
import gi
from gi.repository import GObject
class Dispatcher (object):
def __call__(self, iterator):
raise NotImplementedError("derived classes must override this method")
def cancel(self):
pass
class DefaultDispatcher (Dispatcher):
def __call__(self, iterator):
for x in iterator:
pass
class GSourceDispatcher (Dispatcher):
def __init__(self):
Dispatcher.__init__(self)
self.source_id = None
def __call__(self, iterator):
if self.source_id is not None:
GObject.source_remove(self.source_id)
def iteration():
r = iterator.__next__()
if not r:
self.source_id = None
return r
self.source_id = GObject.idle_add(
iteration, priority=GObject.PRIORITY_LOW)
def cancel(self):
if self.source_id is None:
return
GObject.source_remove(self.source_id)
self.source_id = None
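The dispatcher protocol above is driven by generators that yield a true value while there is more work and a false value when done. A minimal usage sketch, assuming the (hypothetical) work generator below; only the two dispatcher classes come from this module:

from GstDebugViewer.Common import Data

def chunked_work():
    # Hypothetical work generator, not part of this module.
    for chunk in range(10):
        # ... process one chunk of work here ...
        yield True      # more iterations needed
    yield False         # signal completion so the idle source is removed

Data.DefaultDispatcher()(chunked_work())    # runs the whole generator synchronously
# Data.GSourceDispatcher()(chunked_work())  # schedules it in GLib idle callbacks (needs a running main loop)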


@@ -0,0 +1,528 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Development Utilities
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Development Utilities Common GUI module."""
import os
import logging
import gi
gi.require_version('Gtk', '3.0')
from gi.repository import GObject
from gi.repository import Gtk
from gi.repository import Gdk
from gi.types import GObjectMeta
import GstDebugViewer
from GstDebugViewer.Common import utils
from .generictreemodel import GenericTreeModel
def widget_add_popup_menu(widget, menu, button=3):
def popup_callback(widget, event):
if event.button == button:
menu.popup(
None, None, None, None, event.button, event.get_time())
return False
widget.connect("button-press-event", popup_callback)
class Actions (dict):
def __init__(self):
dict.__init__(self)
self.groups = {}
def __getattr__(self, name):
try:
return self[name]
except KeyError:
if "_" in name:
try:
return self[name.replace("_", "-")]
except KeyError:
pass
raise AttributeError("no action with name %r" % (name,))
def add_group(self, group):
name = group.props.name
if name in self.groups:
raise ValueError("already have a group named %s", name)
self.groups[name] = group
for action in group.list_actions():
self[action.props.name] = action
class Widgets (dict):
def __init__(self, builder):
widgets = (obj for obj in builder.get_objects()
if isinstance(obj, Gtk.Buildable))
# Gtk.Widget.get_name() shadows out the GtkBuildable interface method
# of the same name, hence calling the unbound interface method here:
items = ((Gtk.Buildable.get_name(w), w,) for w in widgets)
dict.__init__(self, items)
def __getattr__(self, name):
try:
return self[name]
except KeyError:
if "_" in name:
try:
return self[name.replace("_", "-")]
except KeyError:
pass
raise AttributeError("no widget with name %r" % (name,))
class WidgetFactory (object):
def __init__(self, directory):
self.directory = directory
def get_builder(self, filename):
builder_filename = os.path.join(self.directory, filename)
builder = Gtk.Builder()
builder.set_translation_domain(GstDebugViewer.GETTEXT_DOMAIN)
builder.add_from_file(builder_filename)
return builder
def make(self, filename, widget_name, autoconnect=None):
builder = self.get_builder(filename)
if autoconnect is not None:
builder.connect_signals(autoconnect)
return Widgets(builder)
def make_one(self, filename, widget_name):
builder = self.get_builder(filename)
return builder.get_object(widget_name)
class UIFactory (object):
def __init__(self, ui_filename, actions=None):
self.filename = ui_filename
if actions:
self.action_groups = actions.groups
else:
self.action_groups = ()
def make(self, extra_actions=None):
ui_manager = Gtk.UIManager()
for action_group in list(self.action_groups.values()):
ui_manager.insert_action_group(action_group, 0)
if extra_actions:
for action_group in extra_actions.groups:
ui_manager.insert_action_group(action_group, 0)
ui_manager.add_ui_from_file(self.filename)
ui_manager.ensure_update()
return ui_manager
class MetaModel (GObjectMeta):
"""Meta class for easy setup of gtk tree models.
Looks for a class attribute named `columns' which must be set to a
sequence of the form name1, type1, name2, type2, ..., where the
names are strings. This metaclass adds the following attributes
to created classes:
cls.column_types = (type1, type2, ...)
cls.column_ids = (0, 1, ...)
cls.name1 = 0
cls.name2 = 1
...
Example: A Gtk.ListStore derived model can use
columns = ("COL_NAME", str, "COL_VALUE", str)
and use this in __init__:
GObject.GObject.__init__ (self, *self.column_types)
Then insert data like this:
self.set (self.append (),
self.COL_NAME, "spam",
self.COL_VALUE, "ham")
"""
def __init__(cls, name, bases, dict):
super(MetaModel, cls).__init__(name, bases, dict)
spec = tuple(cls.columns)
column_names = spec[::2]
column_types = spec[1::2]
column_indices = list(range(len(column_names)))
for col_index, col_name, in zip(column_indices, column_names):
setattr(cls, col_name, col_index)
cls.column_types = column_types
cls.column_ids = tuple(column_indices)
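# A minimal sketch assembling the docstring fragments above into a complete
# model; the class name and column names are illustrative only and are not
# used elsewhere in this module:
class ExampleListStore (Gtk.ListStore, metaclass=MetaModel):

    columns = ("COL_NAME", str, "COL_VALUE", str)

    def __init__(self):
        GObject.GObject.__init__(self, *self.column_types)

# model = ExampleListStore()
# model.set(model.append(), model.COL_NAME, "spam", model.COL_VALUE, "ham")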
class Manager (object):
"""GUI Manager base class."""
@classmethod
def iter_item_classes(cls):
msg = "%s class does not support manager item class access"
raise NotImplementedError(msg % (cls.__name__,))
@classmethod
def find_item_class(self, **kw):
return self.__find_by_attrs(self.iter_item_classes(), kw)
def iter_items(self):
msg = "%s object does not support manager item access"
raise NotImplementedError(msg % (type(self).__name__,))
def find_item(self, **kw):
return self.__find_by_attrs(self.iter_items(), kw)
@staticmethod
def __find_by_attrs(i, kw):
from operator import attrgetter
if len(kw) != 1:
raise ValueError("need exactly one keyword argument")
attr, value = list(kw.items())[0]
getter = attrgetter(attr)
for item in i:
if getter(item) == value:
return item
else:
raise KeyError("no item such that item.%s == %r" % (attr, value,))
class StateString (object):
"""Descriptor for binding to StateSection classes."""
def __init__(self, option, default=None):
self.option = option
self.default = default
def __get__(self, section, section_class=None):
import configparser
if section is None:
return self
try:
return self.get(section)
except (configparser.NoSectionError,
configparser.NoOptionError,):
return self.get_default(section)
def __set__(self, section, value):
import configparser
self.set(section, value)
def get(self, section):
return section.get(self)
def get_default(self, section):
return self.default
def set(self, section, value):
if value is None:
value = ""
section.set(self, str(value))
class StateBool (StateString):
"""Descriptor for binding to StateSection classes."""
def get(self, section):
return section.state._parser.getboolean(section._name, self.option)
class StateInt (StateString):
"""Descriptor for binding to StateSection classes."""
def get(self, section):
return section.state._parser.getint(section._name, self.option)
class StateInt4 (StateString):
"""Descriptor for binding to StateSection classes. This implements storing
a tuple of 4 integers."""
def get(self, section):
value = StateString.get(self, section)
try:
l = value.split(",")
if len(l) != 4:
return None
else:
return tuple((int(v) for v in l))
except (AttributeError, TypeError, ValueError,):
return None
def set(self, section, value):
if value is None:
svalue = ""
elif len(value) != 4:
raise ValueError("value needs to be a 4-sequence, or None")
else:
svalue = ", ".join((str(v) for v in value))
return StateString.set(self, section, svalue)
class StateItem (StateString):
"""Descriptor for binding to StateSection classes. This implements storing
a class controlled by a Manager class."""
def __init__(self, option, manager_class, default=None):
StateString.__init__(self, option, default=default)
self.manager = manager_class
def get(self, section):
value = StateString.get(self, section)
if not value:
return None
return self.parse_item(value)
def set(self, section, value):
if value is None:
svalue = ""
else:
svalue = value.name
StateString.set(self, section, svalue)
def parse_item(self, value):
name = value.strip()
try:
return self.manager.find_item_class(name=name)
except KeyError:
return None
class StateItemList (StateItem):
"""Descriptor for binding to StateSection classes. This implements storing
an ordered set of Manager items."""
def get(self, section):
value = StateString.get(self, section)
if not value:
return []
classes = []
for name in value.split(","):
item_class = self.parse_item(name)
if item_class is None:
continue
if item_class not in classes:
classes.append(item_class)
return classes
def get_default(self, section):
default = StateItem.get_default(self, section)
if default is None:
return []
else:
return default
def set(self, section, value):
if value is None:
svalue = ""
else:
svalue = ", ".join((v.name for v in value))
StateString.set(self, section, svalue)
class StateSection (object):
_name = None
def __init__(self, state):
self.state = state
if self._name is None:
raise NotImplementedError(
"subclasses must override the _name attribute")
def get(self, state_string):
return self.state._parser.get(self._name, state_string.option)
def set(self, state_string, value):
import configparser
parser = self.state._parser
try:
parser.set(self._name, state_string.option, value)
except configparser.NoSectionError:
parser.add_section(self._name)
parser.set(self._name, state_string.option, value)
class State (object):
def __init__(self, filename, old_filenames=()):
import configparser
self.sections = {}
self._filename = filename
self._parser = configparser.RawConfigParser()
success = self._parser.read([filename])
if not success:
for old_filename in old_filenames:
success = self._parser.read([old_filename])
if success:
break
def add_section_class(self, section_class):
self.sections[section_class._name] = section_class(self)
def save(self):
with utils.SaveWriteFile(self._filename, "wt") as fp:
self._parser.write(fp)
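# A minimal usage sketch; the section name, option names and file path below
# are hypothetical. Options are declared as descriptors on a StateSection
# subclass, then read and written through a State object backed by an
# INI-style file.
def _state_usage_sketch():

    class ExampleSectionState (StateSection):

        _name = "example"

        last_directory = StateString("last-directory")
        zoom_level = StateInt("zoom-level", default=100)

    state = State("/tmp/example-state.ini")
    state.add_section_class(ExampleSectionState)
    section = state.sections["example"]
    level = section.zoom_level      # falls back to the default (100) until a value is stored
    section.zoom_level = level + 25
    state.save()                    # written atomically via utils.SaveWriteFile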
class WindowState (object):
def __init__(self):
self.logger = logging.getLogger("ui.window-state")
self.is_maximized = False
def attach(self, window, state):
self.window = window
self.state = state
self.window.connect("window-state-event",
self.handle_window_state_event)
geometry = self.state.geometry
if geometry:
self.window.move(*geometry[:2])
self.window.set_default_size(*geometry[2:])
if self.state.maximized:
self.logger.debug("initially maximized")
self.window.maximize()
def detach(self):
window = self.window
self.state.maximized = self.is_maximized
if not self.is_maximized:
position = tuple(window.get_position())
size = tuple(window.get_size())
self.state.geometry = position + size
self.window.disconnect_by_func(self.handle_window_state_event)
self.window = None
def handle_window_state_event(self, window, event):
if not event.changed_mask & Gdk.WindowState.MAXIMIZED:
return
if event.new_window_state & Gdk.WindowState.MAXIMIZED:
self.logger.debug("maximized")
self.is_maximized = True
else:
self.logger.debug("unmaximized")
self.is_maximized = False


@@ -0,0 +1,366 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Development Utilities
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Development Utilities Common Main module."""
import sys
import os
import traceback
from operator import attrgetter
import logging
import locale
import gettext
from gettext import gettext as _, ngettext
import gi
from gi.repository import GLib
from gi.repository import GObject
from gi.repository import Gtk
class ExceptionHandler (object):
exc_types = (Exception,)
priority = 50
inherit_fork = True
_handling_exception = False
def __call__(self, exc_type, exc_value, exc_traceback):
raise NotImplementedError(
"derived classes need to override this method")
class DefaultExceptionHandler (ExceptionHandler):
exc_types = (BaseException,)
priority = 0
inherit_fork = True
def __init__(self, excepthook):
ExceptionHandler.__init__(self)
self.excepthook = excepthook
def __call__(self, *exc_info):
return self.excepthook(*exc_info)
class ExitOnInterruptExceptionHandler (ExceptionHandler):
exc_types = (KeyboardInterrupt,)
priority = 100
inherit_fork = False
exit_status = 2
def __call__(self, *args):
print("Interrupt caught, exiting.", file=sys.stderr)
sys.exit(self.exit_status)
class MainLoopWrapper (ExceptionHandler):
priority = 95
inherit_fork = False
def __init__(self, enter, exit):
ExceptionHandler.__init__(self)
self.exc_info = (None,) * 3
self.enter = enter
self.exit = exit
def __call__(self, *exc_info):
self.exc_info = exc_info
self.exit()
def run(self):
ExceptHookManager.register_handler(self)
try:
self.enter()
finally:
ExceptHookManager.unregister_handler(self)
if self.exc_info != (None,) * 3:
# Re-raise unhandled exception that occurred while running the loop.
exc_type, exc_value, exc_tb = self.exc_info
raise exc_value
class ExceptHookManagerClass (object):
def __init__(self):
self._in_forked_child = False
self.handlers = []
def setup(self):
if sys.excepthook == self.__excepthook:
raise ValueError("already set up")
hook = sys.excepthook
self.__instrument_excepthook()
self.__instrument_fork()
self.register_handler(DefaultExceptionHandler(hook))
def shutdown(self):
if sys.excepthook != self.__excepthook:
raise ValueError("not set up")
self.__restore_excepthook()
self.__restore_fork()
def __instrument_excepthook(self):
hook = sys.excepthook
self._original_excepthook = hook
sys.excepthook = self.__excepthook
def __restore_excepthook(self):
sys.excepthook = self._original_excepthook
def __instrument_fork(self):
try:
fork = os.fork
except AttributeError:
# System has no fork() system call.
self._original_fork = None
else:
self._original_fork = fork
os.fork = self.__fork
def __restore_fork(self):
if not hasattr(os, "fork"):
return
os.fork = self._original_fork
def entered_forked_child(self):
self._in_forked_child = True
for handler in tuple(self.handlers):
if not handler.inherit_fork:
self.handlers.remove(handler)
def register_handler(self, handler):
if self._in_forked_child and not handler.inherit_fork:
return
self.handlers.append(handler)
def unregister_handler(self, handler):
self.handlers.remove(handler)
def __fork(self):
pid = self._original_fork()
if pid == 0:
# Child process.
self.entered_forked_child()
return pid
def __excepthook(self, exc_type, exc_value, exc_traceback):
for handler in sorted(self.handlers,
key=attrgetter("priority"),
reverse=True):
if handler._handling_exception:
continue
for type_ in handler.exc_types:
if issubclass(exc_type, type_):
break
else:
continue
handler._handling_exception = True
handler(exc_type, exc_value, exc_traceback)
# Not using try...finally on purpose here. If the handler itself
# fails with an exception, this prevents recursing into it again.
handler._handling_exception = False
return
else:
from warnings import warn
warn("ExceptHookManager: unhandled %r" % (exc_value,),
RuntimeWarning,
stacklevel=2)
ExceptHookManager = ExceptHookManagerClass()
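# A small sketch of a custom handler; the class below is illustrative only.
# Handlers are matched against the raised exception type in descending
# priority order, and the first matching handler deals with the exception.
class _LoggingValueErrorHandler (ExceptionHandler):

    exc_types = (ValueError,)
    priority = 60
    inherit_fork = True

    def __call__(self, exc_type, exc_value, exc_traceback):
        logging.getLogger("example").error("unhandled ValueError: %s", exc_value)

# ExceptHookManager.setup()
# ExceptHookManager.register_handler(_LoggingValueErrorHandler())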
class PathsBase (object):
data_dir = None
icon_dir = None
locale_dir = None
@classmethod
def setup_installed(cls, data_prefix):
"""Set up paths for running from a regular installation."""
pass
@classmethod
def setup_uninstalled(cls, source_dir):
"""Set up paths for running 'uninstalled' (i.e. directly from the
source dist)."""
pass
@classmethod
def ensure_setup(cls):
"""If paths are still not set up, try to set from a fallback."""
if cls.data_dir is None:
source_dir = os.path.dirname(
os.path.dirname(os.path.abspath(__file__)))
cls.setup_uninstalled(source_dir)
def __new__(cls):
raise RuntimeError("do not create instances of this class -- "
"use the class object directly")
class PathsProgramBase (PathsBase):
program_name = None
@classmethod
def setup_installed(cls, data_prefix):
if cls.program_name is None:
raise NotImplementedError(
"derived classes need to set program_name attribute")
cls.data_dir = os.path.join(data_prefix, cls.program_name)
cls.icon_dir = os.path.join(data_prefix, "icons")
cls.locale_dir = os.path.join(data_prefix, "locale")
@classmethod
def setup_uninstalled(cls, source_dir):
"""Set up paths for running 'uninstalled' (i.e. directly from the
source dist)."""
# This is essential: The GUI module needs to find the .glade file.
cls.data_dir = os.path.join(source_dir, "data")
# The locale data might be missing if "setup.py build" wasn't run.
cls.locale_dir = os.path.join(source_dir, "build", "mo")
# Not setting icon_dir. It is not useful since we don't employ the
# needed directory structure in the source dist.
def _init_excepthooks():
ExceptHookManager.setup()
ExceptHookManager.register_handler(ExitOnInterruptExceptionHandler())
def _init_paths(paths):
paths.ensure_setup()
def _init_locale(gettext_domain=None):
if Paths.locale_dir and gettext_domain is not None:
try:
locale.setlocale(locale.LC_ALL, "")
except locale.Error as exc:
from warnings import warn
warn("locale error: %s" % (exc,),
RuntimeWarning,
stacklevel=2)
Paths.locale_dir = None
else:
gettext.bindtextdomain(gettext_domain, Paths.locale_dir)
gettext.textdomain(gettext_domain)
gettext.bind_textdomain_codeset(gettext_domain, "UTF-8")
def _init_logging(level):
if level == "none":
return
mapping = {"debug": logging.DEBUG,
"info": logging.INFO,
"warning": logging.WARNING,
"error": logging.ERROR,
"critical": logging.CRITICAL}
logging.basicConfig(level=mapping[level],
format='%(asctime)s.%(msecs)03d %(levelname)8s %(name)20s: %(message)s',
datefmt='%H:%M:%S')
logger = logging.getLogger("main")
logger.debug("logging at level %s", logging.getLevelName(level))
logger.info("using Python %i.%i.%i %s %i", *sys.version_info)
def _init_log_option(parser):
choices = ["none", "debug", "info", "warning", "error", "critical"]
parser.add_option("--log-level", "-l",
type="choice",
choices=choices,
action="store",
dest="log_level",
default="none",
help=_("Enable logging, possible values: ") + ", ".join(choices))
return parser
def main(main_function, option_parser, gettext_domain=None, paths=None):
# FIXME:
global Paths
Paths = paths
_init_excepthooks()
_init_paths(paths)
_init_locale(gettext_domain)
parser = _init_log_option(option_parser)
options, args = option_parser.parse_args()
_init_logging(options.log_level)
try:
main_function(args)
finally:
logging.shutdown()


@@ -0,0 +1,25 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Development Utilities
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Development Utilities Common package."""
from . import Data
from . import GUI
from . import Main
from . import utils


@@ -0,0 +1,420 @@
# -*- Mode: Python; py-indent-offset: 4 -*-
# generictreemodel - GenericTreeModel implementation for pygtk compatibility.
# Copyright (C) 2013 Simon Feltman
#
# generictreemodel.py: GenericTreeModel implementation for pygtk compatibility
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, see <http://www.gnu.org/licenses/>.
# System
import sys
import random
import collections
import ctypes
# GObject
from gi.repository import GObject
from gi.repository import Gtk
class _CTreeIter(ctypes.Structure):
_fields_ = [('stamp', ctypes.c_int),
('user_data', ctypes.c_void_p),
('user_data2', ctypes.c_void_p),
('user_data3', ctypes.c_void_p)]
@classmethod
def from_iter(cls, iter):
offset = sys.getsizeof(object()) # size of PyObject_HEAD
return ctypes.POINTER(cls).from_address(id(iter) + offset)
def _get_user_data_as_pyobject(iter):
citer = _CTreeIter.from_iter(iter)
return ctypes.cast(citer.contents.user_data, ctypes.py_object).value
def handle_exception(default_return):
"""Returns a function which can act as a decorator for wrapping exceptions and
returning "default_return" upon an exception being thrown.
This is used to wrap Gtk.TreeModel "do_" method implementations so we can return
a proper value from the override upon an exception occurring with client code
implemented by the "on_" methods.
"""
def decorator(func):
def wrapped_func(*args, **kargs):
try:
return func(*args, **kargs)
except BaseException:
# Use excepthook directly to avoid any printing to the screen
# if someone installed an except hook.
sys.excepthook(*sys.exc_info())
return default_return
return wrapped_func
return decorator
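# Tiny illustration; the function below is only an example. An exception
# raised inside the wrapped function is reported via sys.excepthook and the
# supplied default is returned to the caller instead.
@handle_exception(0)
def _int_or_zero(text):
    return int(text)

# _int_or_zero("42") -> 42;  _int_or_zero("not a number") -> 0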
class GenericTreeModel(GObject.GObject, Gtk.TreeModel):
"""A base implementation of a Gtk.TreeModel for python.
The GenericTreeModel eases implementing the Gtk.TreeModel interface in Python.
The class can be subclassed to provide a TreeModel implementation which works
directly with Python objects instead of iterators.
All of the on_* methods should be overridden by subclasses to provide the
underlying implementation a way to access custom model data. For the purposes of
this API, all custom model data supplied or handed back through the overridable
API will use the argument names: node, parent, and child in regards to user data
python objects.
The create_tree_iter, set_user_data, invalidate_iters, iter_is_valid methods are
available to help manage Gtk.TreeIter objects and their Python object references.
GenericTreeModel manages a pool of user data nodes that have been used with iters.
This pool stores references to user data nodes as a dictionary value with the
key being the integer id of the data. This id is what the Gtk.TreeIter objects
use to reference data in the pool.
References will be removed from the pool when the model is deleted or explicitly
by using the optional "node" argument to the "row_deleted" method when notifying
the model of row deletion.
"""
leak_references = GObject.Property(default=True, type=bool,
blurb="If True, strong references to user data attached to iters are "
"stored in a dictionary pool (default). Otherwise the user data is "
"stored as a raw pointer to a python object without a reference.")
#
# Methods
#
def __init__(self):
"""Initialize. Make sure to call this from derived classes if overridden."""
super(GenericTreeModel, self).__init__()
self.stamp = 0
#: Dictionary of (id(user_data): user_data), used when leak_references is enabled
self._held_refs = dict()
# Set initial stamp
self.invalidate_iters()
def iter_depth_first(self):
"""Depth-first iteration of the entire TreeModel yielding the python nodes."""
stack = collections.deque([None])
while stack:
it = stack.popleft()
if it is not None:
yield self.get_user_data(it)
children = [self.iter_nth_child(it, i)
for i in range(self.iter_n_children(it))]
stack.extendleft(reversed(children))
def invalidate_iter(self, iter):
"""Clear user data and its reference from the iter and this model."""
iter.stamp = 0
if iter.user_data:
if iter.user_data in self._held_refs:
del self._held_refs[iter.user_data]
iter.user_data = None
def invalidate_iters(self):
"""
This method invalidates all TreeIter objects associated with this custom tree model
and frees their locally pooled references.
"""
self.stamp = random.randint(-2147483648, 2147483647)
self._held_refs.clear()
def iter_is_valid(self, iter):
"""
:Returns:
True if the gtk.TreeIter specified by iter is valid for the custom tree model.
"""
return iter.stamp == self.stamp
def get_user_data(self, iter):
"""Get the user_data associated with the given TreeIter.
GenericTreeModel stores arbitrary Python objects mapped to instances of Gtk.TreeIter.
This method retrieves the Python object held by the given iterator.
"""
if self.leak_references:
return self._held_refs[iter.user_data]
else:
return _get_user_data_as_pyobject(iter)
def set_user_data(self, iter, user_data):
"""Applies user_data and stamp to the given iter.
If the models "leak_references" property is set, a reference to the
user_data is stored with the model to ensure we don't run into bad
memory problems with the TreeIter.
"""
iter.user_data = id(user_data)
if user_data is None:
self.invalidate_iter(iter)
else:
iter.stamp = self.stamp
if self.leak_references:
self._held_refs[iter.user_data] = user_data
def create_tree_iter(self, user_data):
"""Create a Gtk.TreeIter instance with the given user_data specific for this model.
Use this method to create Gtk.TreeIter instances instead of directly calling
Gtk.TreeIter(); this will ensure proper reference management of the wrapped user_data.
"""
iter = Gtk.TreeIter()
self.set_user_data(iter, user_data)
return iter
def _create_tree_iter(self, data):
"""Internal creation of a (bool, TreeIter) pair for returning directly
back to the view interfacing with this model."""
if data is None:
return (False, None)
else:
it = self.create_tree_iter(data)
return (True, it)
def row_deleted(self, path, node=None):
"""Notify the model a row has been deleted.
Use the node parameter to ensure the user_data reference associated
with the path is properly freed by this model.
:Parameters:
path : Gtk.TreePath
Path to the row that has been deleted.
node : object
Python object used as the node returned from "on_get_iter". This is
optional but ensures the model will not leak references to this object.
"""
super(GenericTreeModel, self).row_deleted(path)
node_id = id(node)
if node_id in self._held_refs:
del self._held_refs[node_id]
#
# GtkTreeModel Interface Implementation
#
@handle_exception(0)
def do_get_flags(self):
"""Internal method."""
return self.on_get_flags()
@handle_exception(0)
def do_get_n_columns(self):
"""Internal method."""
return self.on_get_n_columns()
@handle_exception(GObject.TYPE_INVALID)
def do_get_column_type(self, index):
"""Internal method."""
return self.on_get_column_type(index)
@handle_exception((False, None))
def do_get_iter(self, path):
"""Internal method."""
return self._create_tree_iter(self.on_get_iter(path))
@handle_exception(False)
def do_iter_next(self, iter):
"""Internal method."""
if iter is None:
next_data = self.on_iter_next(None)
else:
next_data = self.on_iter_next(self.get_user_data(iter))
self.set_user_data(iter, next_data)
return next_data is not None
@handle_exception(None)
def do_get_path(self, iter):
"""Internal method."""
path = self.on_get_path(self.get_user_data(iter))
if path is None:
return None
else:
return Gtk.TreePath(path)
@handle_exception(None)
def do_get_value(self, iter, column):
"""Internal method."""
return self.on_get_value(self.get_user_data(iter), column)
@handle_exception((False, None))
def do_iter_children(self, parent):
"""Internal method."""
data = self.get_user_data(parent) if parent else None
return self._create_tree_iter(self.on_iter_children(data))
@handle_exception(False)
def do_iter_has_child(self, parent):
"""Internal method."""
return self.on_iter_has_child(self.get_user_data(parent))
@handle_exception(0)
def do_iter_n_children(self, iter):
"""Internal method."""
if iter is None:
return self.on_iter_n_children(None)
return self.on_iter_n_children(self.get_user_data(iter))
@handle_exception((False, None))
def do_iter_nth_child(self, parent, n):
"""Internal method."""
if parent is None:
data = self.on_iter_nth_child(None, n)
else:
data = self.on_iter_nth_child(self.get_user_data(parent), n)
return self._create_tree_iter(data)
@handle_exception((False, None))
def do_iter_parent(self, child):
"""Internal method."""
return self._create_tree_iter(self.on_iter_parent(self.get_user_data(child)))
@handle_exception(None)
def do_ref_node(self, iter):
self.on_ref_node(self.get_user_data(iter))
@handle_exception(None)
def do_unref_node(self, iter):
self.on_unref_node(self.get_user_data(iter))
#
# Python Subclass Overridables
#
def on_get_flags(self):
"""Overridable.
:Returns Gtk.TreeModelFlags:
The flags for this model. See: Gtk.TreeModelFlags
"""
raise NotImplementedError
def on_get_n_columns(self):
"""Overridable.
:Returns:
The number of columns for this model.
"""
raise NotImplementedError
def on_get_column_type(self, index):
"""Overridable.
:Returns:
The column type for the given index.
"""
raise NotImplementedError
def on_get_iter(self, path):
"""Overridable.
:Returns:
A python object (node) for the given TreePath.
"""
raise NotImplementedError
def on_iter_next(self, node):
"""Overridable.
:Parameters:
node : object
Node at current level.
:Returns:
A python object (node) following the given node at the current level.
"""
raise NotImplementedError
def on_get_path(self, node):
"""Overridable.
:Returns:
A TreePath for the given node.
"""
raise NotImplementedError
def on_get_value(self, node, column):
"""Overridable.
:Parameters:
node : object
column : int
Column index to get the value from.
:Returns:
The value of the column for the given node."""
raise NotImplementedError
def on_iter_children(self, parent):
"""Overridable.
:Returns:
The first child of parent or None if parent has no children.
If parent is None, return the first node of the model.
"""
raise NotImplementedError
def on_iter_has_child(self, node):
"""Overridable.
:Returns:
True if the given node has children.
"""
raise NotImplementedError
def on_iter_n_children(self, node):
"""Overridable.
:Returns:
The number of children for the given node. If node is None,
return the number of top level nodes.
"""
raise NotImplementedError
def on_iter_nth_child(self, parent, n):
"""Overridable.
:Parameters:
parent : object
n : int
Index of child within parent.
:Returns:
The child of the given parent at index "n", with indices starting at 0.
If parent is None, return the top level node corresponding to "n".
If "n" is larger than the number of available nodes, return None.
"""
raise NotImplementedError
def on_iter_parent(self, child):
"""Overridable.
:Returns:
The parent node of child or None if child is a top level node."""
raise NotImplementedError
def on_ref_node(self, node):
pass
def on_unref_node(self, node):
pass
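As a rough sketch of how the overridables fit together, the following read-only list model (a hypothetical example that is not part of the original module, using plain row indices as nodes) exposes a Python list of (name, value) pairs through the GenericTreeModel machinery:

# Illustrative example only:
class ExampleListModel (GenericTreeModel):

    """Flat, read-only model over a list of (name, value) pairs."""

    def __init__(self, rows):
        GenericTreeModel.__init__(self)
        self.rows = list(rows)

    def on_get_flags(self):
        return Gtk.TreeModelFlags.LIST_ONLY

    def on_get_n_columns(self):
        return 2

    def on_get_column_type(self, index):
        return str

    def on_get_iter(self, path):
        index = path.get_indices()[0]
        return index if index < len(self.rows) else None

    def on_iter_next(self, node):
        return node + 1 if node + 1 < len(self.rows) else None

    def on_get_path(self, node):
        return (node,)

    def on_get_value(self, node, column):
        return str(self.rows[node][column])

    def on_iter_children(self, parent):
        return 0 if parent is None and self.rows else None

    def on_iter_has_child(self, node):
        return False

    def on_iter_n_children(self, node):
        return len(self.rows) if node is None else 0

    def on_iter_nth_child(self, parent, n):
        return n if parent is None and n < len(self.rows) else None

    def on_iter_parent(self, child):
        return None

# model = ExampleListModel([("spam", "1"), ("ham", "2")])
# view = Gtk.TreeView(model=model)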


@@ -0,0 +1,333 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Development Utilities
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program; if not, see <http://www.gnu.org/licenses/>.
"""GStreamer Development Utilities Common utils module."""
import os
import logging
import subprocess as _subprocess
class SingletonMeta (type):
def __init__(cls, name, bases, dict_):
from weakref import WeakValueDictionary
super(SingletonMeta, cls).__init__(name, bases, dict_)
cls._singleton_instances = WeakValueDictionary()
def __call__(cls, *a, **kw):
kw_key = tuple(sorted(kw.items()))
try:
obj = cls._singleton_instances[a + kw_key]
except KeyError:
obj = super(SingletonMeta, cls).__call__(*a, **kw)
cls._singleton_instances[a + kw_key] = obj
return obj
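# A small illustration; the class below is hypothetical. For identical
# constructor arguments the metaclass hands back the same instance, for as
# long as that instance is still referenced somewhere (the cache only keeps
# weak references).
class ExampleConnection (object, metaclass=SingletonMeta):

    def __init__(self, host, port=80):
        self.host = host
        self.port = port

# a = ExampleConnection("example.org", port=80)
# b = ExampleConnection("example.org", port=80)
# assert a is b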
def gettext_cache():
"""Return a callable object that operates like gettext.gettext, but is much
faster when a string is looked up more than once. This is very useful in
loops, where calling gettext.gettext can quickly become a major performance
bottleneck."""
from gettext import gettext
d = {}
def gettext_cache_access(s):
if s not in d:
d[s] = gettext(s)
return d[s]
return gettext_cache_access
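# Intended use, sketched as a small helper (purely illustrative): resolve the
# cached lookup function once outside a hot loop and call it like gettext.
def _translate_all(messages):
    _ = gettext_cache()
    return [_(m) for m in messages]   # each distinct string is translated only once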
class ClassProperty (property):
"Like the property class, but also invokes the getter for class access."
def __init__(self, fget=None, fset=None, fdel=None, doc=None):
property.__init__(self, fget, fset, fdel, doc)
self.__fget = fget
def __get__(self, obj, obj_class=None):
ret = property.__get__(self, obj, obj_class)
if ret == self:
return self.__fget(None)
else:
return ret
class _XDGClass (object):
"""Partial implementation of the XDG Base Directory specification v0.6.
http://standards.freedesktop.org/basedir-spec/basedir-spec-0.6.html"""
def __init__(self):
self._add_base_dir("DATA_HOME", "~/.local/share")
self._add_base_dir("CONFIG_HOME", "~/.config")
self._add_base_dir("CACHE_HOME", "~/.cache")
def _add_base_dir(self, name, default):
dir = os.environ.get("XDG_%s" % (name,))
if not dir:
dir = os.path.expanduser(os.path.join(*default.split("/")))
setattr(self, name, dir)
XDG = _XDGClass()
class SaveWriteFile (object):
def __init__(self, filename, mode="wt"):
from tempfile import mkstemp
self.logger = logging.getLogger("tempfile")
dir = os.path.dirname(filename)
base_name = os.path.basename(filename)
temp_prefix = "%s-tmp" % (base_name,)
if dir:
# Destination dir differs from current directory, ensure that it
# exists:
try:
os.makedirs(dir)
except OSError:
pass
self.clean_stale(dir, temp_prefix)
fd, temp_name = mkstemp(dir=dir, prefix=temp_prefix)
self.target_name = filename
self.temp_name = temp_name
self.real_file = os.fdopen(fd, mode)
def __enter__(self):
return self
def __exit__(self, *exc_args):
if exc_args == (None, None, None,):
self.close()
else:
self.discard()
def __del__(self):
try:
self.discard()
except AttributeError:
# If __init__ failed, self has no real_file attribute.
pass
def __close_real(self):
if self.real_file:
self.real_file.close()
self.real_file = None
def clean_stale(self, dir, temp_prefix):
from time import time
from glob import glob
now = time()
pattern = os.path.join(dir, "%s*" % (temp_prefix,))
for temp_filename in glob(pattern):
mtime = os.stat(temp_filename).st_mtime
if now - mtime > 3600:
self.logger.info("deleting stale temporary file %s",
temp_filename)
try:
os.unlink(temp_filename)
except EnvironmentError as exc:
self.logger.warning("deleting stale temporary file "
"failed: %s", exc)
def tell(self, *a, **kw):
return self.real_file.tell(*a, **kw)
def write(self, *a, **kw):
return self.real_file.write(*a, **kw)
def close(self):
self.__close_real()
if self.temp_name:
try:
os.rename(self.temp_name, self.target_name)
except OSError as exc:
import errno
if exc.errno == errno.EEXIST:
# We are probably on windows.
os.unlink(self.target_name)
os.rename(self.temp_name, self.target_name)
self.temp_name = None
def discard(self):
self.__close_real()
if self.temp_name:
try:
os.unlink(self.temp_name)
except EnvironmentError as exc:
self.logger.warning("deleting temporary file failed: %s", exc)
self.temp_name = None
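# A minimal usage sketch; the path and content are hypothetical. Data is
# written to a temporary file next to the target and only renamed over it on
# a clean close, so readers never see a partially written file.
def _save_write_sketch():
    with SaveWriteFile("/tmp/example-output.txt", "wt") as fp:
        fp.write("all or nothing\n")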
class TeeWriteFile (object):
# TODO Py2.5: Add context manager methods.
def __init__(self, *file_objects):
self.files = list(file_objects)
def close(self):
for file in self.files:
file.close()
def flush(self):
for file in self.files:
file.flush()
def write(self, string):
for file in self.files:
file.write(string)
def writelines(self, lines):
for file in self.files:
file.writelines(lines)
class FixedPopen (_subprocess.Popen):
def __init__(self, args, **kw):
# Unconditionally specify all descriptors as redirected, to
# work around Python bug #1358527 (which is triggered for
# console-less applications on Windows).
close = []
for name in ("stdin", "stdout", "stderr",):
target = kw.get(name)
if not target:
kw[name] = _subprocess.PIPE
close.append(name)
_subprocess.Popen.__init__(self, args, **kw)
for name in close:
fp = getattr(self, name)
fp.close()
setattr(self, name, None)
class DevhelpError (EnvironmentError):
pass
class DevhelpUnavailableError (DevhelpError):
pass
class DevhelpClient (object):
def available(self):
try:
self.version()
except DevhelpUnavailableError:
return False
else:
return True
def version(self):
return self._invoke("--version")
def search(self, entry):
self._invoke_no_interact("-s", entry)
def _check_os_error(self, exc):
import errno
if exc.errno == errno.ENOENT:
raise DevhelpUnavailableError()
def _invoke(self, *args):
from subprocess import PIPE
try:
proc = FixedPopen(("devhelp",) + args,
stdout=PIPE)
except OSError as exc:
self._check_os_error(exc)
raise
out, err = proc.communicate()
if proc.returncode is not None and proc.returncode != 0:
raise DevhelpError("devhelp exited with status %i"
% (proc.returncode,))
return out
def _invoke_no_interact(self, *args):
from subprocess import PIPE
try:
proc = FixedPopen(("devhelp",) + args)
except OSError as exc:
self._check_os_error(exc)
raise
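A short usage sketch of the Devhelp wrapper above (the search term is arbitrary): availability is probed by invoking devhelp --version, and searches are fired off without waiting for the browser.

from GstDebugViewer.Common import utils

client = utils.DevhelpClient()
if client.available():
    client.search("GstElement")   # illustrative search term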


@@ -0,0 +1,482 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer Data module."""
import os
import logging
import re
import sys
# Nanosecond resolution (like Gst.SECOND)
SECOND = 1000000000
def time_args(ts):
secs = ts // SECOND
return "%i:%02i:%02i.%09i" % (secs // 60 ** 2,
secs // 60 % 60,
secs % 60,
ts % SECOND,)
def time_diff_args(time_diff):
if time_diff >= 0:
sign = "+"
else:
sign = "-"
secs = abs(time_diff) // SECOND
return "%s%02i:%02i.%09i" % (sign,
secs // 60,
secs % 60,
abs(time_diff) % SECOND,)
def time_args_no_hours(ts):
secs = ts // SECOND
return "%02i:%02i.%09i" % (secs // 60,
secs % 60,
ts % SECOND,)
def parse_time(st):
"""Parse time strings that look like "0:00:00.0000000"."""
h, m, s = st.split(":")
secs, subsecs = s.split(".")
return int((int(h) * 60 ** 2 + int(m) * 60) * SECOND) + \
int(secs) * SECOND + int(subsecs)
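# Round-trip sketch for the helpers above: one and a half seconds in
# nanoseconds renders as "0:00:01.500000000" and parses back unchanged.
def _time_helpers_sketch():
    ts = SECOND + SECOND // 2
    assert time_args(ts) == "0:00:01.500000000"
    assert parse_time("0:00:01.500000000") == ts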
class DebugLevel (int):
__names = ["NONE", "ERROR", "WARN", "FIXME",
"INFO", "DEBUG", "LOG", "TRACE", "MEMDUMP"]
__instances = {}
def __new__(cls, level):
try:
level_int = int(level)
except (ValueError, TypeError,):
try:
level_int = cls.__names.index(level.upper())
except ValueError:
raise ValueError("no debug level named %r" % (level,))
if level_int in cls.__instances:
return cls.__instances[level_int]
else:
new_instance = int.__new__(cls, level_int)
new_instance.name = cls.__names[level_int]
cls.__instances[level_int] = new_instance
return new_instance
def __repr__(self):
return "<%s %s (%i)>" % (type(self).__name__, self.__names[self], self,)
def higher_level(self):
if self == len(self.__names) - 1:
raise ValueError("already the highest debug level")
return DebugLevel(self + 1)
def lower_level(self):
if self == 0:
raise ValueError("already the lowest debug level")
return DebugLevel(self - 1)
debug_level_none = DebugLevel("NONE")
debug_level_error = DebugLevel("ERROR")
debug_level_warning = DebugLevel("WARN")
debug_level_info = DebugLevel("INFO")
debug_level_debug = DebugLevel("DEBUG")
debug_level_log = DebugLevel("LOG")
debug_level_fixme = DebugLevel("FIXME")
debug_level_trace = DebugLevel("TRACE")
debug_level_memdump = DebugLevel("MEMDUMP")
debug_levels = [debug_level_none,
debug_level_trace,
debug_level_fixme,
debug_level_log,
debug_level_debug,
debug_level_info,
debug_level_warning,
debug_level_error,
debug_level_memdump]
# For stripping color codes:
_escape = re.compile(b"\x1b\\[[0-9;]*m")
def strip_escape(s):
# FIXME: This can be optimized further!
while b"\x1b" in s:
s = _escape.sub(b"", s)
return s
def default_log_line_regex_():
# "DEBUG "
LEVEL = r"([A-Z]+)\s*"
# "0x8165430 "
THREAD = r"(0x[0-9a-f]+)\s+" # r"\((0x[0-9a-f]+) - "
# "0:00:00.777913000 "
TIME = r"(\d+:\d\d:\d\d\.\d+)\s+"
CATEGORY = r"([A-Za-z0-9_-]+)\s+" # "GST_REFCOUNTING ", "flacdec "
# " 3089 "
PID = r"(\d+)\s*"
FILENAME = r"([^:]*):"
LINE = r"(\d+):"
FUNCTION = r"(~?[A-Za-z0-9_\s\*,\(\)]*):"
# FIXME: When non-g(st)object stuff is logged with *_OBJECT (like
# buffers!), the address is printed *without* <> brackets!
OBJECT = "(?:<([^>]+)>)?"
MESSAGE = "(.+)"
ANSI = "(?:\x1b\\[[0-9;]*m\\s*)*\\s*"
# New log format:
expressions = [TIME, ANSI, PID, ANSI, THREAD, ANSI, LEVEL, ANSI,
CATEGORY, FILENAME, LINE, FUNCTION, ANSI,
OBJECT, ANSI, MESSAGE]
# Old log format:
# expressions = [LEVEL, THREAD, TIME, CATEGORY, PID, FILENAME, LINE,
# FUNCTION, OBJECT, MESSAGE]
return expressions
def default_log_line_regex():
return re.compile("".join(default_log_line_regex_()))
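# A minimal parsing sketch; the log line is made up but representative of the
# new format. Capture groups are, in order: time, pid, thread, level,
# category, file, line, function, object and message.
def _parse_line_sketch():
    line = ("0:00:00.777913000  3089 0x8165430 DEBUG "
            "GST_REFCOUNTING gstobject.c:112:gst_object_init:<fakesink0> setting parent")
    match = default_log_line_regex().match(line)
    ts, pid, thread, level, category = match.groups()[:5]
    return parse_time(ts), level, category   # (777913000, 'DEBUG', 'GST_REFCOUNTING')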
class Producer (object):
def __init__(self):
self.consumers = []
def have_load_started(self):
for consumer in self.consumers:
consumer.handle_load_started()
def have_load_finished(self):
for consumer in self.consumers:
consumer.handle_load_finished()
class SortHelper (object):
def __init__(self, fileobj, offsets):
self._gen = self.__gen(fileobj, offsets)
next(self._gen)
# Override in the instance, for performance (this gets called in an
# inner loop):
self.find_insert_position = self._gen.send
@staticmethod
def find_insert_position(insert_time_string):
# Stub for documentary purposes.
pass
@staticmethod
def __gen(fileobj, offsets):
from math import floor
tell = fileobj.tell
seek = fileobj.seek
read = fileobj.read
time_len = len(time_args(0))
# We remember the previous insertion point. This gives a nice speed up
# for larger bubbles which are already sorted. TODO: In practice, log
# lines only get out of order across threads. Need to check if it pays
# to parse the thread here and maintain multiple insertion points for
# heavily interleaved parts of the log.
pos = 0
pos_time_string = ""
insert_pos = None
while True:
insert_time_string = (yield insert_pos)
save_offset = tell()
if pos_time_string <= insert_time_string:
lo = pos
hi = len(offsets)
else:
lo = 0
hi = pos
# This is a bisection search, except we don't cut the range in the
# middle each time, but at the 90th percentile. This is because
# logs are "mostly sorted", so the insertion point is much more
# likely to be at the end anyways:
while lo < hi:
mid = int(floor(lo * 0.1 + hi * 0.9))
seek(offsets[mid])
mid_time_string = read(time_len)
if insert_time_string.encode('utf8') < mid_time_string:
hi = mid
else:
lo = mid + 1
pos = lo
# Caller will replace row at pos with the new one, so this is
# correct:
pos_time_string = insert_time_string
insert_pos = pos
seek(save_offset)
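# Illustrative example: LineCache.__process below calls
# find_insert_position(line) with a complete, timestamp-prefixed log line; the
# returned index is where the matching offset must be inserted to keep
# `offsets` sorted by timestamp.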
class LineCache (Producer):
"""
offsets: file position for each line
levels: the debug level for each line
"""
_lines_per_iteration = 50000
def __init__(self, fileobj, dispatcher):
Producer.__init__(self)
self.logger = logging.getLogger("linecache")
self.dispatcher = dispatcher
self.__fileobj = fileobj
self.__fileobj.seek(0, 2)
self.__file_size = self.__fileobj.tell()
self.__fileobj.seek(0)
self.offsets = []
self.levels = [] # FIXME
def start_loading(self):
self.logger.debug("dispatching load process")
self.have_load_started()
self.dispatcher(self.__process())
def get_progress(self):
return float(self.__fileobj.tell()) / self.__file_size
def __process(self):
offsets = self.offsets
levels = self.levels
dict_levels = {"T": debug_level_trace, "F": debug_level_fixme,
"L": debug_level_log, "D": debug_level_debug,
"I": debug_level_info, "W": debug_level_warning,
"E": debug_level_error, " ": debug_level_none,
"M": debug_level_memdump, }
ANSI = "(?:\x1b\\[[0-9;]*m)?"
ANSI_PATTERN = r"\d:\d\d:\d\d\.\d+ " + ANSI + \
r" *\d+" + ANSI + \
r" +0x[0-9a-f]+ +" + ANSI + \
r"([TFLDIEWM ])"
BARE_PATTERN = ANSI_PATTERN.replace(ANSI, "")
rexp_bare = re.compile(BARE_PATTERN)
rexp_ansi = re.compile(ANSI_PATTERN)
rexp = rexp_bare
# Moving attribute lookups out of the loop:
readline = self.__fileobj.readline
tell = self.__fileobj.tell
rexp_match = rexp.match
levels_append = levels.append
offsets_append = offsets.append
dict_levels_get = dict_levels.get
self.__fileobj.seek(0)
limit = self._lines_per_iteration
last_line = ""
i = 0
sort_helper = SortHelper(self.__fileobj, offsets)
find_insert_position = sort_helper.find_insert_position
while True:
i += 1
if i >= limit:
i = 0
yield True
offset = tell()
line = readline().decode('utf-8', errors='replace')
if not line:
break
match = rexp_match(line)
if match is None:
if rexp is rexp_ansi or "\x1b" not in line:
continue
match = rexp_ansi.match(line)
if match is None:
continue
# Switch to slower ANSI parsing:
rexp = rexp_ansi
rexp_match = rexp.match
# Timestamp is in the very beginning of the row, and can be sorted
# by lexical comparison. That's why we don't bother parsing the
# time to integer. We also don't have to take a substring here,
# which would be a useless memcpy.
if line >= last_line:
levels_append(
dict_levels_get(match.group(1), debug_level_none))
offsets_append(offset)
last_line = line
else:
pos = find_insert_position(line)
levels.insert(
pos, dict_levels_get(match.group(1), debug_level_none))
offsets.insert(pos, offset)
self.have_load_finished()
yield False
class LogLine (list):
_line_regex = default_log_line_regex()
@classmethod
def parse_full(cls, line_string):
match = cls._line_regex.match(line_string.decode('utf8', errors='replace'))
if match is None:
# raise ValueError ("not a valid log line (%r)" % (line_string,))
groups = [0, 0, 0, 0, "", "", 0, "", "", 0]
return cls(groups)
line = cls(match.groups())
# Timestamp.
line[0] = parse_time(line[0])
# PID.
line[1] = int(line[1])
# Thread.
line[2] = int(line[2], 16)
# Level (this is handled in LineCache).
line[3] = 0
# Line.
line[6] = int(line[6])
# Message start offset.
line[9] = match.start(9 + 1)
for col_id in (4, # COL_CATEGORY
5, # COL_FILENAME
7, # COL_FUNCTION,
8,): # COL_OBJECT
line[col_id] = sys.intern(line[col_id] or "")
return line
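# Illustrative example: a parsed line is a plain list laid out as
# [time_ns, pid, thread, level, category, filename, line_number, function,
#  object, message_offset]; LogLines.__getitem__ below swaps the trailing
# offset for the actual message bytes.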
class LogLines (object):
def __init__(self, fileobj, line_cache):
self.__fileobj = fileobj
self.__line_cache = line_cache
def __len__(self):
return len(self.__line_cache.offsets)
def __getitem__(self, line_index):
offset = self.__line_cache.offsets[line_index]
self.__fileobj.seek(offset)
line_string = self.__fileobj.readline()
line = LogLine.parse_full(line_string)
msg = line_string[line[-1]:]
line[-1] = msg
return line
def __iter__(self):
size = len(self)
i = 0
while i < size:
yield self[i]
i += 1
class LogFile (Producer):
def __init__(self, filename, dispatcher):
import mmap
Producer.__init__(self)
self.logger = logging.getLogger("logfile")
self.path = os.path.normpath(os.path.abspath(filename))
self.__real_fileobj = open(filename, "rb")
self.fileobj = mmap.mmap(
self.__real_fileobj.fileno(), 0, access=mmap.ACCESS_READ)
self.line_cache = LineCache(self.fileobj, dispatcher)
self.line_cache.consumers.append(self)
def start_loading(self):
self.logger.debug("starting load")
self.line_cache.start_loading()
def get_load_progress(self):
return self.line_cache.get_progress()
def handle_load_started(self):
# Chain up to our consumers:
self.have_load_started()
def handle_load_finished(self):
self.logger.debug("finish loading")
self.lines = LogLines(self.fileobj, self.line_cache)
# Chain up to our consumers:
self.have_load_finished()

View file

@ -0,0 +1,44 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer GUI module."""
__author__ = u"René Stadler <mail@renestadler.de>"
__version__ = "0.1"
import gi
from GstDebugViewer.GUI.app import App
def main(args):
app = App()
# TODO: Once we support more than one window, open one window for each
# supplied filename.
window = app.windows[0]
if len(args) > 0:
window.set_log_file(args[0])
app.run()
if __name__ == "__main__":
    import sys
    main(sys.argv[1:])

View file

@ -0,0 +1,158 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer GUI module."""
import os.path
import gi
gi.require_version('Gdk', '3.0')
gi.require_version('Gtk', '3.0')
from gi.repository import GObject
from gi.repository import Gdk
from gi.repository import Gtk
from GstDebugViewer import Common
from GstDebugViewer.GUI.columns import ViewColumnManager
from GstDebugViewer.GUI.window import Window
class AppStateSection (Common.GUI.StateSection):
_name = "state"
geometry = Common.GUI.StateInt4("window-geometry")
maximized = Common.GUI.StateBool("window-maximized")
column_order = Common.GUI.StateItemList("column-order", ViewColumnManager)
columns_visible = Common.GUI.StateItemList(
"columns-visible", ViewColumnManager)
zoom_level = Common.GUI.StateInt("zoom-level")
class AppState (Common.GUI.State):
def __init__(self, *a, **kw):
Common.GUI.State.__init__(self, *a, **kw)
self.add_section_class(AppStateSection)
class App (object):
def __init__(self):
self.attach()
def load_plugins(self):
from GstDebugViewer import Plugins
plugin_classes = list(
Plugins.load([os.path.dirname(Plugins.__file__)]))
self.plugins = []
for plugin_class in plugin_classes:
plugin = plugin_class(self)
self.plugins.append(plugin)
def iter_plugin_features(self):
for plugin in self.plugins:
for feature in plugin.features:
yield feature
def attach(self):
config_home = Common.utils.XDG.CONFIG_HOME
state_filename = os.path.join(
config_home, "gst-debug-viewer", "state")
self.state = AppState(state_filename)
self.state_section = self.state.sections["state"]
self.load_plugins()
self.windows = []
        # Apply custom widget styling
# TODO: check for dark theme
css = b"""
@define-color normal_bg_color #FFFFFF;
@define-color shade_bg_color shade(@normal_bg_color, 0.95);
#log_view row:nth-child(even) {
background-color: @normal_bg_color;
}
#log_view row:nth-child(odd) {
background-color: @shade_bg_color;
}
#log_view row:selected {
background-color: #4488FF;
}
#log_view {
-GtkTreeView-horizontal-separator: 0;
-GtkTreeView-vertical-separator: 1;
outline-width: 0;
outline-offset: 0;
}
"""
style_provider = Gtk.CssProvider()
style_provider.load_from_data(css)
Gtk.StyleContext.add_provider_for_screen(
Gdk.Screen.get_default(),
style_provider,
Gtk.STYLE_PROVIDER_PRIORITY_APPLICATION
)
self.open_window()
def detach(self):
# TODO: If we take over deferred saving from the inspector, specify now
# = True here!
self.state.save()
def run(self):
try:
Common.Main.MainLoopWrapper(Gtk.main, Gtk.main_quit).run()
except BaseException:
raise
else:
self.detach()
def open_window(self):
self.windows.append(Window(self))
def close_window(self, window):
self.windows.remove(window)
if not self.windows:
# GtkTreeView takes some time to go down for large files. Let's block
# until the window is hidden:
GObject.idle_add(Gtk.main_quit)
Gtk.main()
Gtk.main_quit()

View file

@ -0,0 +1,162 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer GUI module."""
from gi.repository import Gtk
from gi.repository import Gdk
from GstDebugViewer import Data
class Color (object):
def __init__(self, hex_24):
if hex_24.startswith("#"):
s = hex_24[1:]
else:
s = hex_24
self._fields = tuple((int(hs, 16) for hs in (s[:2], s[2:4], s[4:],)))
def gdk_color(self):
return Gdk.color_parse(self.hex_string())
def hex_string(self):
return "#%02x%02x%02x" % self._fields
def float_tuple(self):
return tuple((float(x) / 255 for x in self._fields))
def byte_tuple(self):
return self._fields
def short_tuple(self):
return tuple((x << 8 for x in self._fields))
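# Illustrative example: Color("#ff8000").byte_tuple() == (255, 128, 0) and
# Color("ff8000").hex_string() == "#ff8000"; the leading "#" is optional.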
class ColorPalette (object):
@classmethod
def get(cls):
try:
return cls._instance
except AttributeError:
cls._instance = cls()
return cls._instance
class TangoPalette (ColorPalette):
def __init__(self):
for name, r, g, b in [("black", 0, 0, 0,),
("white", 255, 255, 255,),
("butter1", 252, 233, 79),
("butter2", 237, 212, 0),
("butter3", 196, 160, 0),
("chameleon1", 138, 226, 52),
("chameleon2", 115, 210, 22),
("chameleon3", 78, 154, 6),
("orange1", 252, 175, 62),
("orange2", 245, 121, 0),
("orange3", 206, 92, 0),
("skyblue1", 114, 159, 207),
("skyblue2", 52, 101, 164),
("skyblue3", 32, 74, 135),
("plum1", 173, 127, 168),
("plum2", 117, 80, 123),
("plum3", 92, 53, 102),
("chocolate1", 233, 185, 110),
("chocolate2", 193, 125, 17),
("chocolate3", 143, 89, 2),
("scarletred1", 239, 41, 41),
("scarletred2", 204, 0, 0),
("scarletred3", 164, 0, 0),
("aluminium1", 238, 238, 236),
("aluminium2", 211, 215, 207),
("aluminium3", 186, 189, 182),
("aluminium4", 136, 138, 133),
("aluminium5", 85, 87, 83),
("aluminium6", 46, 52, 54)]:
setattr(self, name, Color("%02x%02x%02x" % (r, g, b,)))
class ColorTheme (object):
def __init__(self):
self.colors = {}
def add_color(self, key, *colors):
self.colors[key] = colors
class LevelColorTheme (ColorTheme):
pass
class LevelColorThemeTango (LevelColorTheme):
def __init__(self):
LevelColorTheme.__init__(self)
p = TangoPalette.get()
self.add_color(Data.debug_level_none, None, None, None)
self.add_color(Data.debug_level_trace, p.black, p.aluminium2)
self.add_color(Data.debug_level_fixme, p.black, p.butter3)
self.add_color(Data.debug_level_log, p.black, p.plum1)
self.add_color(Data.debug_level_debug, p.black, p.skyblue1)
self.add_color(Data.debug_level_info, p.black, p.chameleon1)
self.add_color(Data.debug_level_warning, p.black, p.orange1)
self.add_color(Data.debug_level_error, p.white, p.scarletred1)
self.add_color(Data.debug_level_memdump, p.white, p.aluminium3)
class ThreadColorTheme (ColorTheme):
pass
class ThreadColorThemeTango (ThreadColorTheme):
def __init__(self):
ThreadColorTheme.__init__(self)
t = TangoPalette.get()
for i, color in enumerate([t.butter2,
t.orange2,
t.chocolate3,
t.chameleon2,
t.skyblue1,
t.plum1,
t.scarletred1,
t.aluminium6]):
self.add_color(i, color)

View file

@ -0,0 +1,741 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer GUI module."""
import logging
from gi.repository import Gtk, GLib
from GstDebugViewer import Common, Data
from GstDebugViewer.GUI.colors import LevelColorThemeTango
from GstDebugViewer.GUI.models import LazyLogModel, LogModelBase
def _(s):
return s
# Sync with gst-inspector!
class Column (object):
"""A single list view column, managed by a ColumnManager instance."""
name = None
id = None
label_header = None
get_modify_func = None
get_data_func = None
get_sort_func = None
def __init__(self):
view_column = Gtk.TreeViewColumn(self.label_header)
view_column.props.reorderable = True
self.view_column = view_column
class SizedColumn (Column):
default_size = None
def compute_default_size(self):
return None
# Sync with gst-inspector?
class TextColumn (SizedColumn):
font_family = None
def __init__(self):
Column.__init__(self)
column = self.view_column
cell = Gtk.CellRendererText()
column.pack_start(cell, True)
cell.props.yalign = 0.
cell.props.ypad = 0
if self.font_family:
cell.props.family = self.font_family
cell.props.family_set = True
if self.get_data_func:
data_func = self.get_data_func()
assert data_func
id_ = self.id
if id_ is not None:
def cell_data_func(column, cell, model, tree_iter, user_data):
data_func(cell.props, model.get_value(tree_iter, id_))
else:
cell_data_func = data_func
column.set_cell_data_func(cell, cell_data_func)
elif not self.get_modify_func:
column.add_attribute(cell, "text", self.id)
else:
self.update_modify_func(column, cell)
column.props.resizable = True
def update_modify_func(self, column, cell):
modify_func = self.get_modify_func()
id_ = self.id
def cell_data_func(column, cell, model, tree_iter, user_data):
cell.props.text = modify_func(model.get_value(tree_iter, id_))
column.set_cell_data_func(cell, cell_data_func)
def compute_default_size(self):
values = self.get_values_for_size()
if not values:
return SizedColumn.compute_default_size(self)
cell = self.view_column.get_cells()[0]
if self.get_modify_func is not None:
format = self.get_modify_func()
else:
def identity(x):
return x
format = identity
max_width = 0
for value in values:
cell.props.text = format(value)
x, y, w, h = self.view_column.cell_get_size()
max_width = max(max_width, w)
return max_width
def get_values_for_size(self):
return ()
class TimeColumn (TextColumn):
name = "time"
label_header = _("Time")
id = LazyLogModel.COL_TIME
font_family = "monospace"
def __init__(self, *a, **kw):
self.base_time = 0
TextColumn.__init__(self, *a, **kw)
def get_modify_func(self):
if self.base_time:
time_diff_args = Data.time_diff_args
base_time = self.base_time
def format_time(value):
return time_diff_args(value - base_time)
else:
time_args = Data.time_args
def format_time(value):
# TODO: This is hard coded to omit hours.
return time_args(value)[2:]
return format_time
def get_values_for_size(self):
values = [0]
return values
def set_base_time(self, base_time):
self.base_time = base_time
column = self.view_column
cell = column.get_cells()[0]
self.update_modify_func(column, cell)
class LevelColumn (TextColumn):
name = "level"
label_header = _("L")
id = LazyLogModel.COL_LEVEL
def __init__(self):
TextColumn.__init__(self)
cell = self.view_column.get_cells()[0]
cell.props.xalign = .5
@staticmethod
def get_modify_func():
def format_level(value):
return value.name[0]
return format_level
@staticmethod
def get_data_func():
theme = LevelColorThemeTango()
colors = dict((level, tuple((c.gdk_color()
for c in theme.colors[level])),)
for level in Data.debug_levels
if level != Data.debug_level_none)
def level_data_func(cell_props, level):
cell_props.text = level.name[0]
if level in colors:
cell_colors = colors[level]
else:
cell_colors = (None, None, None,)
cell_props.foreground_gdk = cell_colors[0]
cell_props.background_gdk = cell_colors[1]
return level_data_func
def get_values_for_size(self):
values = [Data.debug_level_log, Data.debug_level_debug,
Data.debug_level_info, Data.debug_level_warning,
Data.debug_level_error, Data.debug_level_memdump]
return values
class PidColumn (TextColumn):
name = "pid"
label_header = _("PID")
id = LazyLogModel.COL_PID
font_family = "monospace"
@staticmethod
def get_modify_func():
return str
def get_values_for_size(self):
return ["999999"]
class ThreadColumn (TextColumn):
name = "thread"
label_header = _("Thread")
id = LazyLogModel.COL_THREAD
font_family = "monospace"
@staticmethod
def get_modify_func():
def format_thread(value):
return "0x%07x" % (value,)
return format_thread
def get_values_for_size(self):
return [int("ffffff", 16)]
class CategoryColumn (TextColumn):
name = "category"
label_header = _("Category")
id = LazyLogModel.COL_CATEGORY
def get_values_for_size(self):
return ["GST_LONG_CATEGORY", "somelongelement"]
class CodeColumn (TextColumn):
name = "code"
label_header = _("Code")
id = None
@staticmethod
def get_data_func():
filename_id = LogModelBase.COL_FILENAME
line_number_id = LogModelBase.COL_LINE_NUMBER
def filename_data_func(column, cell, model, tree_iter, user_data):
args = model.get(tree_iter, filename_id, line_number_id)
cell.props.text = "%s:%i" % args
return filename_data_func
def get_values_for_size(self):
return ["gstsomefilename.c:1234"]
class FunctionColumn (TextColumn):
name = "function"
label_header = _("Function")
id = LazyLogModel.COL_FUNCTION
def get_values_for_size(self):
return ["gst_this_should_be_enough"]
class ObjectColumn (TextColumn):
name = "object"
label_header = _("Object")
id = LazyLogModel.COL_OBJECT
def get_values_for_size(self):
return ["longobjectname00"]
class MessageColumn (TextColumn):
name = "message"
label_header = _("Message")
id = None
def __init__(self, *a, **kw):
self.highlighters = {}
TextColumn.__init__(self, *a, **kw)
def get_data_func(self):
highlighters = self.highlighters
id_ = LazyLogModel.COL_MESSAGE
def message_data_func(column, cell, model, tree_iter, user_data):
msg = model.get_value(tree_iter, id_).decode("utf8", errors="replace")
if not highlighters:
cell.props.text = msg
return
if len(highlighters) > 1:
raise NotImplementedError("FIXME: Support more than one...")
highlighter = list(highlighters.values())[0]
row = model[tree_iter]
ranges = highlighter(row)
if not ranges:
cell.props.text = msg
else:
tags = []
prev_end = 0
end = None
for start, end in ranges:
if prev_end < start:
tags.append(
GLib.markup_escape_text(msg[prev_end:start]))
msg_escape = GLib.markup_escape_text(msg[start:end])
tags.append("<span foreground=\'#FFFFFF\'"
" background=\'#0000FF\'>%s</span>" % (msg_escape,))
prev_end = end
if end is not None:
tags.append(GLib.markup_escape_text(msg[end:]))
cell.props.markup = "".join(tags)
return message_data_func
def get_values_for_size(self):
values = ["Just some good minimum size"]
return values
class ColumnManager (Common.GUI.Manager):
column_classes = ()
@classmethod
def iter_item_classes(cls):
return iter(cls.column_classes)
def __init__(self):
self.view = None
self.actions = None
self.zoom = 1.0
self.__columns_changed_id = None
self.columns = []
self.column_order = list(self.column_classes)
self.action_group = Gtk.ActionGroup("ColumnActions")
def make_entry(col_class):
return ("show-%s-column" % (col_class.name,),
None,
col_class.label_header,
None,
None,
None,
True,)
entries = [make_entry(cls) for cls in self.column_classes]
self.action_group.add_toggle_actions(entries)
def iter_items(self):
return iter(self.columns)
def attach(self):
for col_class in self.column_classes:
action = self.get_toggle_action(col_class)
if action.props.active:
self._add_column(col_class())
action.connect("toggled",
self.__handle_show_column_action_toggled,
col_class.name)
self.__columns_changed_id = self.view.connect("columns-changed",
self.__handle_view_columns_changed)
def detach(self):
if self.__columns_changed_id is not None:
self.view.disconnect(self.__columns_changed_id)
self.__columns_changed_id = None
def attach_sort(self):
sort_model = self.view.get_model()
# Inform the sorted tree model of any custom sorting functions.
for col_class in self.column_classes:
if col_class.get_sort_func:
sort_func = col_class.get_sort_func()
sort_model.set_sort_func(col_class.id, sort_func)
def enable_sort(self):
sort_model = self.view.get_model()
if sort_model:
self.logger.debug("activating sort")
sort_model.set_sort_column_id(*self.default_sort)
self.default_sort = None
else:
self.logger.debug("not activating sort (no model set)")
def disable_sort(self):
self.logger.debug("deactivating sort")
sort_model = self.view.get_model()
self.default_sort = tree_sortable_get_sort_column_id(sort_model)
sort_model.set_sort_column_id(TREE_SORTABLE_UNSORTED_COLUMN_ID,
Gtk.SortType.ASCENDING)
def set_zoom(self, scale):
for column in self.columns:
cell = column.view_column.get_cells()[0]
cell.props.scale = scale
column.view_column.queue_resize()
self.zoom = scale
def set_base_time(self, base_time):
try:
time_column = self.find_item(name=TimeColumn.name)
except KeyError:
return
time_column.set_base_time(base_time)
self.size_column(time_column)
def get_toggle_action(self, column_class):
action_name = "show-%s-column" % (column_class.name,)
return self.action_group.get_action(action_name)
def get_initial_column_order(self):
return tuple(self.column_classes)
def _add_column(self, column):
name = column.name
pos = self.__get_column_insert_position(column)
if self.view.props.fixed_height_mode:
column.view_column.props.sizing = Gtk.TreeViewColumnSizing.FIXED
cell = column.view_column.get_cells()[0]
cell.props.scale = self.zoom
self.columns.insert(pos, column)
self.view.insert_column(column.view_column, pos)
def _remove_column(self, column):
self.columns.remove(column)
self.view.remove_column(column.view_column)
def __get_column_insert_position(self, column):
col_class = self.find_item_class(name=column.name)
pos = self.column_order.index(col_class)
before = self.column_order[:pos]
shown_names = [col.name for col in self.columns]
for col_class in before:
if col_class.name not in shown_names:
pos -= 1
return pos
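    # Illustrative example for the computation above: with column_order
    # [Time, Level, Pid, ...] and only the Time column currently shown,
    # adding the Pid column starts at pos 2 and is reduced by 1 for the
    # hidden Level column, so Pid ends up at view position 1, right after
    # Time.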
def __iter_next_hidden(self, column_class):
pos = self.column_order.index(column_class)
rest = self.column_order[pos + 1:]
for next_class in rest:
try:
self.find_item(name=next_class.name)
except KeyError:
# No instance -- the column is hidden.
yield next_class
else:
break
def __handle_show_column_action_toggled(self, toggle_action, name):
if toggle_action.props.active:
try:
# This should fail.
column = self.find_item(name=name)
except KeyError:
col_class = self.find_item_class(name=name)
self._add_column(col_class())
else:
# Out of sync for some reason.
return
else:
try:
column = self.find_item(name=name)
except KeyError:
# Out of sync for some reason.
return
else:
self._remove_column(column)
def __handle_view_columns_changed(self, element_view):
view_columns = element_view.get_columns()
new_visible = [self.find_item(view_column=column)
for column in view_columns]
# We only care about reordering here.
if len(new_visible) != len(self.columns):
return
if new_visible != self.columns:
new_order = []
for column in new_visible:
col_class = self.find_item_class(name=column.name)
new_order.append(col_class)
new_order.extend(self.__iter_next_hidden(col_class))
names = (column.name for column in new_visible)
self.logger.debug("visible columns reordered: %s",
", ".join(names))
self.columns[:] = new_visible
self.column_order[:] = new_order
class ViewColumnManager (ColumnManager):
column_classes = (
TimeColumn, LevelColumn, PidColumn, ThreadColumn, CategoryColumn,
CodeColumn, FunctionColumn, ObjectColumn, MessageColumn,)
default_column_classes = (
TimeColumn, LevelColumn, CategoryColumn, CodeColumn,
FunctionColumn, ObjectColumn, MessageColumn,)
def __init__(self, state):
ColumnManager.__init__(self)
self.logger = logging.getLogger("ui.columns")
self.state = state
def attach(self, view):
self.view = view
view.connect("notify::model", self.__handle_notify_model)
order = self.state.column_order
if len(order) == len(self.column_classes):
self.column_order[:] = order
visible = self.state.columns_visible
if not visible:
visible = self.default_column_classes
for col_class in self.column_classes:
action = self.get_toggle_action(col_class)
action.props.active = (col_class in visible)
ColumnManager.attach(self)
self.columns_sized = False
def detach(self):
self.state.column_order = self.column_order
self.state.columns_visible = self.columns
return ColumnManager.detach(self)
def set_zoom(self, scale):
ColumnManager.set_zoom(self, scale)
if self.view is None:
return
# Timestamp and log level columns are pretty much fixed size, so resize
# them back to default on zoom change:
names = (TimeColumn.name,
LevelColumn.name,
PidColumn.name,
ThreadColumn.name)
for column in self.columns:
if column.name in names:
self.size_column(column)
def size_column(self, column):
if column.default_size is None:
default_size = column.compute_default_size()
else:
default_size = column.default_size
# FIXME: Abstract away fixed size setting in Column class!
if default_size is None:
# Dummy fallback:
column.view_column.props.fixed_width = 50
self.logger.warning(
"%s column does not implement default size", column.name)
else:
column.view_column.props.fixed_width = default_size
def _add_column(self, column):
result = ColumnManager._add_column(self, column)
self.size_column(column)
return result
def _remove_column(self, column):
column.default_size = column.view_column.props.fixed_width
return ColumnManager._remove_column(self, column)
def __handle_notify_model(self, view, gparam):
if self.columns_sized:
# Already sized.
return
model = self.view.get_model()
if model is None:
return
self.logger.debug("model changed, sizing columns")
for column in self.iter_items():
self.size_column(column)
self.columns_sized = True
class WrappingMessageColumn (MessageColumn):
def wrap_to_width(self, width):
col = self.view_column
col.props.max_width = width
col.get_cells()[0].props.wrap_width = width
col.queue_resize()
class LineViewColumnManager (ColumnManager):
column_classes = (TimeColumn, WrappingMessageColumn,)
def __init__(self):
ColumnManager.__init__(self)
def attach(self, window):
self.__size_update = None
self.view = window.widgets.line_view
self.view.set_size_request(0, 0)
self.view.connect_after("size-allocate", self.__handle_size_allocate)
ColumnManager.attach(self)
def __update_sizes(self):
view_width = self.view.get_allocation().width
if view_width == self.__size_update:
# Prevent endless recursion.
return
self.__size_update = view_width
col = self.find_item(name="time")
other_width = col.view_column.props.width
try:
col = self.find_item(name="message")
except KeyError:
return
width = view_width - other_width
col.wrap_to_width(width)
def __handle_size_allocate(self, self_, allocation):
self.__update_sizes()

View file

@ -0,0 +1,114 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer GUI module."""
from GstDebugViewer.GUI.models import LogModelBase
def get_comparison_function(all_but_this):
    if all_but_this:
return lambda x, y: x == y
else:
return lambda x, y: x != y
class Filter (object):
pass
class DebugLevelFilter (Filter):
only_this, all_but_this, this_and_above = range(3)
def __init__(self, debug_level, mode=0):
col_id = LogModelBase.COL_LEVEL
if mode == self.this_and_above:
def comparison_function(x, y):
return x < y
else:
comparison_function = get_comparison_function(
mode == self.all_but_this)
def filter_func(row):
return comparison_function(row[col_id], debug_level)
self.filter_func = filter_func
class CategoryFilter (Filter):
def __init__(self, category, all_but_this=False):
col_id = LogModelBase.COL_CATEGORY
comparison_function = get_comparison_function(all_but_this)
def category_filter_func(row):
return comparison_function(row[col_id], category)
self.filter_func = category_filter_func
class ObjectFilter (Filter):
def __init__(self, object_, all_but_this=False):
col_id = LogModelBase.COL_OBJECT
comparison_function = get_comparison_function(all_but_this)
def object_filter_func(row):
return comparison_function(row[col_id], object_)
self.filter_func = object_filter_func
class FunctionFilter (Filter):
def __init__(self, function_, all_but_this=False):
col_id = LogModelBase.COL_FUNCTION
comparison_function = get_comparison_function(all_but_this)
def function_filter_func(row):
return comparison_function(row[col_id], function_)
self.filter_func = function_filter_func
class ThreadFilter (Filter):
def __init__(self, thread_, all_but_this=False):
col_id = LogModelBase.COL_THREAD
comparison_function = get_comparison_function(all_but_this)
def thread_filter_func(row):
return comparison_function(row[col_id], thread_)
self.filter_func = thread_filter_func
class FilenameFilter (Filter):
def __init__(self, filename, all_but_this=False):
col_id = LogModelBase.COL_FILENAME
comparison_function = get_comparison_function(all_but_this)
def filename_filter_func(row):
return comparison_function(row[col_id], filename)
self.filter_func = filename_filter_func
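# Illustrative usage sketch (assuming a FilteredLogModel instance and a
# dispatcher such as Common.Data.GSourceDispatcher() are at hand): the filter
# names describe what gets hidden, and FilteredLogModel keeps the rows for
# which filter_func returns True. For example,
#
#   filtered_model.add_filter(CategoryFilter("GST_CAPS"), dispatcher)
#
# hides every row whose category is "GST_CAPS", whereas
# CategoryFilter("GST_CAPS", all_but_this=True) hides everything else.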

View file

@ -0,0 +1,498 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer GUI module."""
from array import array
from bisect import bisect_left
import logging
from gi.repository import GObject
from gi.repository import Gtk
from GstDebugViewer import Common, Data
class LogModelBase (Common.GUI.GenericTreeModel, metaclass=Common.GUI.MetaModel):
columns = ("COL_TIME", GObject.TYPE_UINT64,
"COL_PID", int,
"COL_THREAD", GObject.TYPE_UINT64,
"COL_LEVEL", object,
"COL_CATEGORY", str,
"COL_FILENAME", str,
"COL_LINE_NUMBER", int,
"COL_FUNCTION", str,
"COL_OBJECT", str,
"COL_MESSAGE", str,)
def __init__(self):
Common.GUI.GenericTreeModel.__init__(self)
# self.props.leak_references = False
self.line_offsets = array("I")
self.line_levels = [] # FIXME: Not so nice!
self.line_cache = {}
def ensure_cached(self, line_offset):
raise NotImplementedError("derived classes must override this method")
def access_offset(self, offset):
raise NotImplementedError("derived classes must override this method")
def iter_rows_offset(self):
ensure_cached = self.ensure_cached
line_cache = self.line_cache
line_levels = self.line_levels
COL_LEVEL = self.COL_LEVEL
COL_MESSAGE = self.COL_MESSAGE
access_offset = self.access_offset
for i, offset in enumerate(self.line_offsets):
ensure_cached(offset)
row = line_cache[offset]
# adjust special rows
row[COL_LEVEL] = line_levels[i]
msg_offset = row[COL_MESSAGE]
row[COL_MESSAGE] = access_offset(offset + msg_offset)
yield (row, offset,)
row[COL_MESSAGE] = msg_offset
def on_get_flags(self):
flags = Gtk.TreeModelFlags.LIST_ONLY | Gtk.TreeModelFlags.ITERS_PERSIST
return flags
def on_get_n_columns(self):
return len(self.column_types)
def on_get_column_type(self, col_id):
return self.column_types[col_id]
def on_get_iter(self, path):
if not path:
return
if len(path) > 1:
# Flat model.
return None
line_index = path[0]
if line_index > len(self.line_offsets) - 1:
return None
return line_index
def on_get_path(self, rowref):
line_index = rowref
return (line_index,)
def on_get_value(self, line_index, col_id):
last_index = len(self.line_offsets) - 1
if line_index > last_index:
return None
if col_id == self.COL_LEVEL:
return self.line_levels[line_index]
line_offset = self.line_offsets[line_index]
self.ensure_cached(line_offset)
value = self.line_cache[line_offset][col_id]
if col_id == self.COL_MESSAGE:
# strip whitespace + newline
value = self.access_offset(line_offset + value).strip()
elif col_id in (self.COL_TIME, self.COL_THREAD):
value = GObject.Value(GObject.TYPE_UINT64, value)
return value
def get_value_range(self, col_id, start, stop):
if col_id != self.COL_LEVEL:
raise NotImplementedError("XXX FIXME")
return self.line_levels[start:stop]
def on_iter_next(self, line_index):
last_index = len(self.line_offsets) - 1
if line_index >= last_index:
return None
else:
return line_index + 1
def on_iter_children(self, parent):
return self.on_iter_nth_child(parent, 0)
def on_iter_has_child(self, rowref):
return False
def on_iter_n_children(self, rowref):
if rowref is not None:
return 0
return len(self.line_offsets)
def on_iter_nth_child(self, parent, n):
last_index = len(self.line_offsets) - 1
if parent or n > last_index:
return None
return n
def on_iter_parent(self, child):
return None
# def on_ref_node (self, rowref):
# pass
# def on_unref_node (self, rowref):
# pass
class LazyLogModel (LogModelBase):
def __init__(self, log_obj=None):
LogModelBase.__init__(self)
self.__log_obj = log_obj
if log_obj:
self.set_log(log_obj)
def set_log(self, log_obj):
self.__fileobj = log_obj.fileobj
self.line_cache.clear()
self.line_offsets = log_obj.line_cache.offsets
self.line_levels = log_obj.line_cache.levels
def access_offset(self, offset):
# TODO: Implement using one slice access instead of seek+readline.
self.__fileobj.seek(offset)
return self.__fileobj.readline()
def ensure_cached(self, line_offset):
if line_offset in self.line_cache:
return
if len(self.line_cache) > 10000:
self.line_cache.clear()
self.__fileobj.seek(line_offset)
line = self.__fileobj.readline()
self.line_cache[line_offset] = Data.LogLine.parse_full(line)
class FilteredLogModelBase (LogModelBase):
def __init__(self, super_model):
LogModelBase.__init__(self)
self.logger = logging.getLogger("filter-model-base")
self.super_model = super_model
self.access_offset = super_model.access_offset
self.ensure_cached = super_model.ensure_cached
self.line_cache = super_model.line_cache
def line_index_to_super(self, line_index):
raise NotImplementedError("index conversion not supported")
def line_index_from_super(self, super_line_index):
raise NotImplementedError("index conversion not supported")
class FilteredLogModel (FilteredLogModelBase):
def __init__(self, super_model):
FilteredLogModelBase.__init__(self, super_model)
self.logger = logging.getLogger("filtered-log-model")
self.filters = []
self.reset()
self.__active_process = None
self.__filter_progress = 0.
def reset(self):
self.logger.debug("reset filter")
self.line_offsets = self.super_model.line_offsets
self.line_levels = self.super_model.line_levels
self.super_index = range(len(self.line_offsets))
del self.filters[:]
def __filter_process(self, filter):
YIELD_LIMIT = 10000
self.logger.debug("preparing new filter")
new_line_offsets = array("I")
new_line_levels = []
new_super_index = array("I")
level_id = self.COL_LEVEL
func = filter.filter_func
def enum():
i = 0
for row, offset in self.iter_rows_offset():
line_index = self.super_index[i]
yield (line_index, row, offset,)
i += 1
self.logger.debug("running filter")
progress = 0.
progress_full = float(len(self))
y = YIELD_LIMIT
for i, row, offset in enum():
if func(row):
new_line_offsets.append(offset)
new_line_levels.append(row[level_id])
new_super_index.append(i)
y -= 1
if y == 0:
progress += float(YIELD_LIMIT)
self.__filter_progress = progress / progress_full
y = YIELD_LIMIT
yield True
self.line_offsets = new_line_offsets
self.line_levels = new_line_levels
self.super_index = new_super_index
self.logger.debug("filtering finished")
self.__filter_progress = 1.
self.__handle_filter_process_finished()
yield False
def add_filter(self, filter, dispatcher):
if self.__active_process is not None:
raise ValueError("dispatched a filter process already")
self.logger.debug("adding filter")
self.filters.append(filter)
self.__dispatcher = dispatcher
self.__active_process = self.__filter_process(filter)
dispatcher(self.__active_process)
def abort_process(self):
if self.__active_process is None:
raise ValueError("no filter process running")
self.__dispatcher.cancel()
self.__active_process = None
self.__dispatcher = None
del self.filters[-1]
def get_filter_progress(self):
if self.__active_process is None:
raise ValueError("no filter process running")
return self.__filter_progress
def __handle_filter_process_finished(self):
self.__active_process = None
self.handle_process_finished()
def handle_process_finished(self):
pass
def line_index_from_super(self, super_line_index):
return bisect_left(self.super_index, super_line_index)
def line_index_to_super(self, line_index):
return self.super_index[line_index]
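    # Illustrative example: if only super model rows 0, 3 and 7 pass the
    # filters, super_index is [0, 3, 7], so line_index_to_super(1) == 3 and
    # line_index_from_super(3) == 1 (a bisect_left on the sorted index).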
def set_range(self, super_start, super_stop):
old_super_start = self.line_index_to_super(0)
old_super_stop = self.line_index_to_super(
len(self.super_index) - 1) + 1
self.logger.debug("set range (%i, %i), current (%i, %i)",
super_start, super_stop, old_super_start, old_super_stop)
if len(self.filters) == 0:
# Identity.
self.super_index = range(super_start, super_stop)
self.line_offsets = SubRange(self.super_model.line_offsets,
super_start, super_stop)
self.line_levels = SubRange(self.super_model.line_levels,
super_start, super_stop)
return
if super_start < old_super_start:
# TODO:
raise NotImplementedError("Only handling further restriction of the range"
" (start offset = %i)" % (super_start,))
if super_stop > old_super_stop:
# TODO:
raise NotImplementedError("Only handling further restriction of the range"
" (end offset = %i)" % (super_stop,))
start = self.line_index_from_super(super_start)
stop = self.line_index_from_super(super_stop)
self.super_index = SubRange(self.super_index, start, stop)
self.line_offsets = SubRange(self.line_offsets, start, stop)
self.line_levels = SubRange(self.line_levels, start, stop)
class SubRange (object):
__slots__ = ("size", "start", "stop",)
def __init__(self, size, start, stop):
if start > stop:
raise ValueError(
"need start <= stop (got %r, %r)" % (start, stop,))
if isinstance(size, type(self)):
# Another SubRange, don't stack:
start += size.start
stop += size.start
size = size.size
self.size = size
self.start = start
self.stop = stop
def __getitem__(self, i):
if isinstance(i, slice):
stop = i.stop
if stop >= 0:
stop += self.start
else:
stop += self.stop
return self.size[i.start + self.start:stop]
else:
return self.size[i + self.start]
def __len__(self):
return self.stop - self.start
def __iter__(self):
size = self.size
for i in range(self.start, self.stop):
yield size[i]
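# Illustrative example: SubRange(list(range(10)), 2, 5) behaves like the
# window [2, 3, 4]; wrapping a SubRange in another SubRange flattens the
# offsets instead of stacking lookups.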
class LineViewLogModel (FilteredLogModelBase):
def __init__(self, super_model):
FilteredLogModelBase.__init__(self, super_model)
self.line_offsets = []
self.line_levels = []
self.parent_indices = []
def reset(self):
del self.line_offsets[:]
del self.line_levels[:]
def line_index_to_super(self, line_index):
return self.parent_indices[line_index]
def insert_line(self, position, super_line_index):
if position == -1:
position = len(self.line_offsets)
li = super_line_index
self.line_offsets.insert(position, self.super_model.line_offsets[li])
self.line_levels.insert(position, self.super_model.line_levels[li])
self.parent_indices.insert(position, super_line_index)
path = (position,)
tree_iter = self.get_iter(path)
self.row_inserted(path, tree_iter)
def replace_line(self, line_index, super_line_index):
li = line_index
self.line_offsets[li] = self.super_model.line_offsets[super_line_index]
self.line_levels[li] = self.super_model.line_levels[super_line_index]
self.parent_indices[li] = super_line_index
path = (line_index,)
tree_iter = self.get_iter(path)
self.row_changed(path, tree_iter)
def remove_line(self, line_index):
for l in (self.line_offsets,
self.line_levels,
self.parent_indices,):
del l[line_index]
path = (line_index,)
self.row_deleted(path)

File diff suppressed because it is too large Load diff

View file

@ -0,0 +1,61 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer Main module."""
import sys
import optparse
from gettext import gettext as _, ngettext
from gi.repository import GLib
from GstDebugViewer import GUI
import GstDebugViewer.Common.Main
Common = GstDebugViewer.Common
GETTEXT_DOMAIN = "gst-debug-viewer"
def main_version(opt, value, parser, *args, **kwargs):
from GstDebugViewer import version
print("GStreamer Debug Viewer %s" % (version,))
sys.exit(0)
class Paths (Common.Main.PathsProgramBase):
program_name = "gst-debug-viewer"
def main():
parser = optparse.OptionParser(
_("%prog [OPTION...] [FILENAME]"),
description=_("Display and analyze GStreamer debug log files"))
parser.add_option("--version", "-v",
action="callback",
dest="version",
callback=main_version,
help=_("Display version and exit"))
Common.Main.main(main_function=GUI.main,
option_parser=parser,
gettext_domain=GETTEXT_DOMAIN,
paths=Paths)

View file

@ -0,0 +1,497 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer timeline widget plugin."""
import logging
from GstDebugViewer import Common, Data, GUI
from GstDebugViewer.Plugins import FeatureBase, PluginBase, _N
from gettext import gettext as _
from gi.repository import GObject, GLib
from gi.repository import Gtk
class SearchOperation (object):
def __init__(self, model, search_text, search_forward=True, start_position=None):
self.model = model
if isinstance(search_text, str):
self.search_text = search_text.encode('utf8')
else:
self.search_text = search_text
self.search_forward = search_forward
self.start_position = start_position
col_id = GUI.models.LogModelBase.COL_MESSAGE
len_search_text = len(self.search_text)
def match_func(model_row):
message = model_row[col_id]
if self.search_text in message:
ranges = []
start = 0
while True:
pos = message.find(self.search_text, start)
if pos == -1:
break
ranges.append((pos, pos + len_search_text,))
start = pos + len_search_text
return ranges
else:
return ()
self.match_func = match_func
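# Illustrative example: searching for b"ref" in a row whose message is
# b"refcount 1 -> 2" makes match_func return [(0, 3)]; an empty result means
# no match, which is what the message column highlighter expects.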
class SearchSentinel (object):
def __init__(self):
self.dispatcher = Common.Data.GSourceDispatcher()
self.cancelled = False
def run_for(self, operation):
self.dispatcher.cancel()
self.dispatcher(self.__process(operation))
self.cancelled = False
def abort(self):
self.dispatcher.cancel()
self.cancelled = True
def __process(self, operation):
model = operation.model
if operation.start_position is not None:
start_pos = operation.start_position
elif operation.search_forward:
start_pos = 0
else:
start_pos = len(model) - 1
start_iter = model.iter_nth_child(None, start_pos)
match_func = operation.match_func
if operation.search_forward:
iter_next = model.iter_next
else:
# FIXME: This is really ugly.
nth_child = model.iter_nth_child
def iter_next_():
for i in range(start_pos, -1, -1):
yield nth_child(None, i)
yield None
it_ = iter_next_()
def iter_next(it):
return it_.__next__()
YIELD_LIMIT = 1000
i = YIELD_LIMIT
tree_iter = start_iter
while tree_iter and not self.cancelled:
i -= 1
if i == 0:
yield True
i = YIELD_LIMIT
row = model[tree_iter]
if match_func(row):
self.handle_match_found(model, tree_iter)
tree_iter = iter_next(tree_iter)
if not self.cancelled:
self.handle_search_complete()
yield False
def handle_match_found(self, model, tree_iter):
pass
def handle_search_complete(self):
pass
class FindBarWidget (Gtk.HBox):
__status = {"no-match-found": _N("No match found"),
"searching": _N("Searching...")}
def __init__(self, action_group):
GObject.GObject.__init__(self)
label = Gtk.Label(label=_("Find:"))
self.pack_start(label, False, False, 2)
self.entry = Gtk.Entry()
self.pack_start(self.entry, True, True, 0)
prev_action = action_group.get_action("goto-previous-search-result")
prev_button = Gtk.Button()
prev_button.set_related_action(prev_action)
self.pack_start(prev_button, False, False, 0)
next_action = action_group.get_action("goto-next-search-result")
next_button = Gtk.Button()
next_button.set_related_action(next_action)
self.pack_start(next_button, False, False, 0)
self.status_label = Gtk.Label()
self.status_label.props.xalign = 0.
self.status_label.props.use_markup = True
self.pack_start(self.status_label, False, False, 6)
self.__compute_status_size()
self.status_label.connect("notify::style", self.__handle_notify_style)
self.show_all()
def __compute_status_size(self):
label = self.status_label
old_markup = label.props.label
label.set_size_request(-1, -1)
max_width = 0
try:
for status in self.__status.values():
self.__set_status(_(status))
req = label.size_request()
max_width = max(max_width, req.width)
label.set_size_request(max_width, -1)
finally:
label.props.label = old_markup
def __handle_notify_style(self, *a, **kw):
self.__compute_status_size()
def __set_status(self, text):
markup = "<b>%s</b>" % (GLib.markup_escape_text(text),)
self.status_label.props.label = markup
def status_no_match_found(self):
self.__set_status(_(self.__status["no-match-found"]))
def status_searching(self):
self.__set_status(_(self.__status["searching"]))
def clear_status(self):
self.__set_status("")
class FindBarFeature (FeatureBase):
def __init__(self, app):
FeatureBase.__init__(self, app)
self.logger = logging.getLogger("ui.findbar")
self.action_group = Gtk.ActionGroup("FindBarActions")
self.action_group.add_toggle_actions([("show-find-bar",
None,
_("Find Bar"),
"<Ctrl>F")])
self.action_group.add_actions([("goto-next-search-result",
None, _("Goto Next Match"),
"<Ctrl>G"),
("goto-previous-search-result",
None, _("Goto Previous Match"),
"<Ctrl><Shift>G")])
self.bar = None
self.operation = None
self.search_state = None
self.next_match = None
self.prev_match = None
self.scroll_match = False
self.sentinel = SearchSentinel()
self.sentinel.handle_match_found = self.handle_match_found
self.sentinel.handle_search_complete = self.handle_search_complete
def scroll_view_to_line(self, line_index):
view = self.log_view
path = Gtk.TreePath((line_index,))
start_path, end_path = view.get_visible_range()
if path >= start_path and path <= end_path:
self.logger.debug(
"line index %i already visible, not scrolling", line_index)
return
self.logger.debug("scrolling to line_index %i", line_index)
view.scroll_to_cell(path, use_align=True, row_align=.5)
def handle_attach_window(self, window):
self.window = window
ui = window.ui_manager
ui.insert_action_group(self.action_group, 0)
self.log_view = window.log_view
self.merge_id = ui.new_merge_id()
for name, action_name in [("ViewFindBar", "show-find-bar",),
("ViewNextResult",
"goto-next-search-result",),
("ViewPrevResult", "goto-previous-search-result",)]:
ui.add_ui(self.merge_id, "/menubar/ViewMenu/ViewMenuAdditions",
name, action_name, Gtk.UIManagerItemType.MENUITEM, False)
box = window.widgets.vbox_view
self.bar = FindBarWidget(self.action_group)
box.pack_end(self.bar, False, False, 0)
self.bar.hide()
action = self.action_group.get_action("show-find-bar")
handler = self.handle_show_find_bar_action_toggled
action.connect("toggled", handler)
action = self.action_group.get_action("goto-previous-search-result")
handler = self.handle_goto_previous_search_result_action_activate
action.props.sensitive = False
action.connect("activate", handler)
action = self.action_group.get_action("goto-next-search-result")
handler = self.handle_goto_next_search_result_action_activate
action.props.sensitive = False
action.connect("activate", handler)
self.bar.entry.connect("changed", self.handle_entry_changed)
def handle_detach_window(self, window):
self.window = None
window.ui_manager.remove_ui(self.merge_id)
self.merge_id = None
def handle_show_find_bar_action_toggled(self, action):
if action.props.active:
self.bar.show()
self.bar.entry.grab_focus()
self.update_search()
else:
try:
column = self.window.column_manager.find_item(
name="message")
del column.highlighters[self]
except KeyError:
pass
self.bar.clear_status()
self.bar.hide()
for action_name in ["goto-next-search-result",
"goto-previous-search-result"]:
self.action_group.get_action(
action_name).props.sensitive = False
def handle_goto_previous_search_result_action_activate(self, action):
if self.prev_match is None:
self.logger.warning("inconsistent action sensitivity")
return
self.scroll_view_to_line(self.prev_match)
self.prev_match = None
start_path = self.log_view.get_visible_range()[0]
new_position = start_path[0] - 1
self.start_search_operation(start_position=new_position,
forward=False)
# FIXME
def handle_goto_next_search_result_action_activate(self, action):
if self.next_match is None:
self.logger.warning("inconsistent action sensitivity")
return
self.scroll_view_to_line(self.next_match)
self.next_match = None
end_path = self.log_view.get_visible_range()[1]
new_position = end_path[0] + 1
self.start_search_operation(start_position=new_position,
forward=True)
# FIXME: Finish.
# model = self.log_view.get_model ()
# start_path, end_path = self.log_view.get_visible_range ()
# start_index, end_index = start_path[0], end_path[0]
# for line_index in self.matches:
# if line_index > end_index:
# break
# else:
# return
# self.scroll_view_to_line (line_index)
def handle_entry_changed(self, entry):
self.update_search()
def update_search(self):
model = self.log_view.get_model()
search_text = self.bar.entry.props.text
column = self.window.column_manager.find_item(name="message")
if search_text == "":
self.logger.debug("search string set to '', aborting search")
self.search_state = None
self.next_match = None
self.prev_match = None
self.update_sensitivity()
self.sentinel.abort()
try:
del column.highlighters[self]
except KeyError:
pass
else:
self.logger.debug("starting search for %r", search_text)
self.next_match = None
self.prev_match = None
self.update_sensitivity()
self.scroll_match = True
start_path = self.log_view.get_visible_range()[0]
self.start_search_operation(
search_text, start_position=start_path[0])
self.bar.status_searching()
column.highlighters[self] = self.operation.match_func
self.window.update_view()
def update_sensitivity(self):
for name, value in (("goto-next-search-result", self.next_match,),
("goto-previous-search-result", self.prev_match,),):
action = self.action_group.get_action(name)
action.props.sensitive = (value is not None)
def start_search_operation(self, search_text=None, forward=True, start_position=None):
model = self.log_view.get_model()
if forward:
self.search_state = "search-forward"
if start_position is None:
start_position = 0
else:
self.search_state = "search-backward"
if start_position is None:
start_position = len(model) - 1
if search_text is None:
operation = self.operation
if operation is None:
raise ValueError(
"search_text not given but have no previous search operation")
search_text = operation.search_text
self.operation = SearchOperation(model, search_text,
start_position=start_position,
search_forward=forward)
self.sentinel.run_for(self.operation)
def handle_match_found(self, model, tree_iter):
if self.search_state not in ("search-forward", "search-backward",):
self.logger.warning(
"inconsistent search state %r", self.search_state)
return
line_index = model.get_path(tree_iter)[0]
forward_search = (self.search_state == "search-forward")
if forward_search:
self.logger.debug("forward search for %r matches line %i",
self.operation.search_text, line_index)
else:
self.logger.debug("backward search for %r matches line %i",
self.operation.search_text, line_index)
self.sentinel.abort()
if self.scroll_match:
self.logger.debug("scrolling to matching line")
self.scroll_view_to_line(line_index)
# Now search for the next one:
self.scroll_match = False
# FIXME: Start with first line that is outside of the visible
# range.
self.start_search_operation(start_position=line_index + 1,
forward=forward_search)
else:
if forward_search:
self.next_match = line_index
self.search_state = "search-backward"
self.start_search_operation(forward=False,
start_position=line_index - 1)
else:
self.prev_match = line_index
self.update_sensitivity()
self.bar.clear_status()
def handle_search_complete(self):
if self.search_state == "search-forward":
self.logger.debug("forward search for %r reached last line",
self.operation.search_text)
self.next_match = None
elif self.search_state == "search-backward":
self.logger.debug("backward search for %r reached first line",
self.operation.search_text)
self.prev_match = None
else:
self.logger.warning("inconsistent search state %r",
self.search_state)
return
self.update_sensitivity()
if self.prev_match is None and self.next_match is None:
self.bar.status_no_match_found()
class Plugin (PluginBase):
features = (FindBarFeature,)

File diff suppressed because it is too large Load diff

View file

@ -0,0 +1,103 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer Plugins package."""
__all__ = ["_", "_N", "FeatureBase", "PluginBase"]
import os.path
def _N(s):
return s
def load(paths=()):
for path in paths:
for plugin_module in _load_plugins(path):
yield plugin_module.Plugin
def _load_plugins(path):
import imp
import glob
files = glob.glob(os.path.join(path, "*.py"))
for filename in files:
name = os.path.basename(os.path.splitext(filename)[0])
if name == "__init__":
continue
fp, pathname, description = imp.find_module(name, [path])
module = imp.load_module(name, fp, pathname, description)
yield module
class FeatureBase (object):
def __init__(self, app):
pass
def handle_attach_window(self, window):
"""
window: GstDebugViewer.GUI.window.Window
"""
pass
def handle_attach_log_file(self, window, log_file):
"""
window: GstDebugViewer.GUI.window.Window
log_file: GstDebugViewer.Data.LogFile
"""
pass
def handle_detach_log_file(self, window, log_file):
"""
window: GstDebugViewer.GUI.window.Window
log_file: GstDebugViewer.Data.LogFile
"""
pass
def handle_detach_window(self, window):
"""
window: GstDebugViewer.GUI.window.Window
"""
pass
class PluginBase (object):
"""
All plugins must implement a class called Plugin inheriting from PluginBase.
They should place a tuple of features they export into 'features'. Each
feature should be a subclass of FeatureBase.
"""
features = ()
def __init__(self, app):
pass
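# As a hedged illustration of the contract described in the PluginBase
# docstring (not part of this module): a plugin file dropped into one of the
# paths passed to load() could look roughly like the commented sketch below.
# The feature name and behaviour are made up for the example.
#
# from GstDebugViewer.Plugins import FeatureBase, PluginBase
#
# class ExampleFeature (FeatureBase):
#
#     def handle_attach_window(self, window):
#         # Called once the main window exists; hook up actions or UI here.
#         self.window = window
#
#     def handle_detach_window(self, window):
#         self.window = None
#
# class Plugin (PluginBase):
#     features = (ExampleFeature,)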

View file

@ -0,0 +1,29 @@
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer package."""
version = "@VERSION@"
if version.startswith('@'):
version = 'master'
__version__ = version
from GstDebugViewer.Main import Paths, GETTEXT_DOMAIN, main as run # noqa

View file

@ -0,0 +1,54 @@
#!/usr/bin/env python
def line_string(ts, pid, thread, level, category, filename, line, function,
object_, message):
# Replicates gstreamer/gst/gstinfo.c:gst_debug_log_default.
# FIXME: Regarding object_, this doesn't fully replicate the formatting!
return "%s %5d 0x%x %s %20s %s:%d:%s:<%s> %s" % (Data.time_args(ts), pid, thread,
level.name.ljust(
5), category,
filename, line, function,
object_, message,)
def main():
import sys
import os.path
sys.path.append(os.path.dirname(os.path.dirname(sys.argv[0])))
global Data
from GstDebugViewer import Data
count = 100000
ts = 0
pid = 12345
thread = int("89abcdef", 16)
level = Data.debug_level_log
category = "GST_DUMMY"
filename = "gstdummyfilename.c"
file_line = 1
function = "gst_dummy_function"
object_ = "dummyobj0"
message = "dummy message with no content"
levels = (Data.debug_level_log,
Data.debug_level_debug,
Data.debug_level_info,)
shift = 0
for i in range(count):
ts = i * 10000
shift += i % (count // 100)
level = levels[(i + shift) % 3]
print(line_string(ts, pid, thread, level, category, filename, file_line,
function, object_, message))
if __name__ == "__main__":
main()

View file

@ -0,0 +1,79 @@
#!/usr/bin/env python
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer performance test program."""
import sys
import os
import os.path
from glob import glob
import time
import gi
from gi.repository import GObject
from .. import Common, Data, GUI
class TestParsingPerformance (object):
def __init__(self, filename):
self.main_loop = GObject.MainLoop()
self.log_file = Data.LogFile(filename, Common.Data.DefaultDispatcher())
self.log_file.consumers.append(self)
def start(self):
self.log_file.start_loading()
def handle_load_started(self):
self.start_time = time.time()
def handle_load_finished(self):
diff = time.time() - self.start_time
print("line cache built in %0.1f ms" % (diff * 1000.,))
start_time = time.time()
model = GUI.LazyLogModel(self.log_file)
for row in model:
pass
diff = time.time() - start_time
print("model iterated in %0.1f ms" % (diff * 1000.,))
print("overall time spent: %0.1f s" % (time.time() - self.start_time,))
import resource
rusage = resource.getrusage(resource.RUSAGE_SELF)
print("time spent in user mode: %.2f s" % (rusage.ru_utime,))
print("time spent in system mode: %.2f s" % (rusage.ru_stime,))
def main():
if len(sys.argv) > 1:
test = TestParsingPerformance(sys.argv[1])
test.start()
if __name__ == "__main__":
main()

View file

@ -0,0 +1,281 @@
#!/usr/bin/env python
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer test suite for the custom tree models."""
import sys
import os
import os.path
from glob import glob
from unittest import TestCase, main as test_main
from .. import Common, Data
from .. GUI.filters import CategoryFilter, Filter
from .. GUI.models import (FilteredLogModel,
LogModelBase,
SubRange,)
class TestSubRange (TestCase):
def test_len(self):
values = list(range(20))
sr = SubRange(values, 0, 20)
self.assertEqual(len(sr), 20)
sr = SubRange(values, 10, 20)
self.assertEqual(len(sr), 10)
sr = SubRange(values, 0, 10)
self.assertEqual(len(sr), 10)
sr = SubRange(values, 5, 15)
self.assertEqual(len(sr), 10)
def test_iter(self):
values = list(range(20))
sr = SubRange(values, 0, 20)
self.assertEqual(list(sr), values)
sr = SubRange(values, 10, 20)
self.assertEqual(list(sr), list(range(10, 20)))
sr = SubRange(values, 0, 10)
self.assertEqual(list(sr), list(range(0, 10)))
sr = SubRange(values, 5, 15)
self.assertEqual(list(sr), list(range(5, 15)))
class Model (LogModelBase):
def __init__(self):
LogModelBase.__init__(self)
for i in range(20):
self.line_offsets.append(i * 100)
self.line_levels.append(Data.debug_level_debug)
def ensure_cached(self, line_offset):
pid = line_offset // 100
if pid % 2 == 0:
category = b"EVEN"
else:
category = b"ODD"
line_fmt = (b"0:00:00.000000000 %5i 0x0000000 DEBUG "
b"%20s dummy.c:1:dummy: dummy")
line_str = line_fmt % (pid, category,)
log_line = Data.LogLine.parse_full(line_str)
self.line_cache[line_offset] = log_line
def access_offset(self, line_offset):
return ""
class IdentityFilter (Filter):
def __init__(self):
def filter_func(row):
return True
self.filter_func = filter_func
class RandomFilter (Filter):
def __init__(self, seed):
import random
rand = random.Random()
rand.seed(seed)
def filter_func(row):
return rand.choice((True, False,))
self.filter_func = filter_func
class TestDynamicFilter (TestCase):
def test_unset_filter_rerange(self):
full_model = Model()
filtered_model = FilteredLogModel(full_model)
row_list = self.__row_list
self.assertEqual(row_list(full_model), list(range(20)))
self.assertEqual(row_list(filtered_model), list(range(20)))
filtered_model.set_range(5, 16)
self.assertEqual(row_list(filtered_model), list(range(5, 16)))
def test_identity_filter_rerange(self):
full_model = Model()
filtered_model = FilteredLogModel(full_model)
row_list = self.__row_list
self.assertEqual(row_list(full_model), list(range(20)))
self.assertEqual(row_list(filtered_model), list(range(20)))
filtered_model.add_filter(IdentityFilter(),
Common.Data.DefaultDispatcher())
filtered_model.set_range(5, 16)
self.assertEqual(row_list(filtered_model), list(range(5, 16)))
def test_filtered_range_refilter_skip(self):
full_model = Model()
filtered_model = FilteredLogModel(full_model)
row_list = self.__row_list
filtered_model.add_filter(CategoryFilter("EVEN"),
Common.Data.DefaultDispatcher())
self.__dump_model(filtered_model, "filtered")
self.assertEqual(row_list(filtered_model), list(range(1, 20, 2)))
self.assertEqual([filtered_model.line_index_from_super(i)
for i in range(1, 20, 2)],
list(range(10)))
self.assertEqual([filtered_model.line_index_to_super(i)
for i in range(10)],
list(range(1, 20, 2)))
filtered_model.set_range(1, 20)
self.__dump_model(filtered_model, "ranged (1, 20)")
self.__dump_model(filtered_model, "filtered range")
self.assertEqual([filtered_model.line_index_from_super(i)
for i in range(0, 19, 2)],
list(range(10)))
self.assertEqual([filtered_model.line_index_to_super(i)
for i in range(10)],
list(range(1, 20, 2)))
filtered_model.set_range(2, 20)
self.__dump_model(filtered_model, "ranged (2, 20)")
self.assertEqual(row_list(filtered_model), list(range(3, 20, 2)))
def test_filtered_range_refilter(self):
full_model = Model()
filtered_model = FilteredLogModel(full_model)
row_list = self.__row_list
rows = row_list(full_model)
rows_filtered = row_list(filtered_model)
self.__dump_model(full_model, "full model")
self.assertEqual(rows, rows_filtered)
self.assertEqual([filtered_model.line_index_from_super(i)
for i in range(20)],
list(range(20)))
self.assertEqual([filtered_model.line_index_to_super(i)
for i in range(20)],
list(range(20)))
filtered_model.set_range(5, 16)
self.__dump_model(filtered_model, "ranged model (5, 16)")
rows_ranged = row_list(filtered_model)
self.assertEqual(rows_ranged, list(range(5, 16)))
self.__dump_model(filtered_model, "filtered model (nofilter, 5, 15)")
filtered_model.add_filter(CategoryFilter("EVEN"),
Common.Data.DefaultDispatcher())
rows_filtered = row_list(filtered_model)
self.assertEqual(rows_filtered, list(range(5, 16, 2)))
self.__dump_model(filtered_model, "filtered model")
def test_random_filtered_range_refilter(self):
full_model = Model()
filtered_model = FilteredLogModel(full_model)
row_list = self.__row_list
self.assertEqual(row_list(full_model), list(range(20)))
self.assertEqual(row_list(filtered_model), list(range(20)))
filtered_model.add_filter(RandomFilter(538295943),
Common.Data.DefaultDispatcher())
random_rows = row_list(filtered_model)
self.__dump_model(filtered_model)
filtered_model = FilteredLogModel(full_model)
filtered_model.add_filter(RandomFilter(538295943),
Common.Data.DefaultDispatcher())
self.__dump_model(filtered_model, "filtered model")
self.assertEqual(row_list(filtered_model), random_rows)
filtered_model.set_range(1, 10)
self.__dump_model(filtered_model)
self.assertEqual(row_list(filtered_model), [
x for x in range(0, 10) if x in random_rows])
def __row_list(self, model):
return [row[Model.COL_PID] for row in model]
def __dump_model(self, model, comment=None):
# TODO: Provide a command line option to turn this on and off.
return
if not hasattr(model, "super_model"):
# Top model.
print("\t(%s)" % ("|".join([str(i).rjust(2)
for i in self.__row_list(model)]),), end=' ')
else:
top_model = model.super_model
if hasattr(top_model, "super_model"):
top_model = top_model.super_model
top_indices = self.__row_list(top_model)
positions = self.__row_list(model)
output = [" "] * len(top_indices)
for i, position in enumerate(positions):
output[position] = str(i).rjust(2)
print("\t(%s)" % ("|".join(output),), end=' ')
if comment is None:
print()
else:
print(comment)
if __name__ == "__main__":
test_main()

9
debug-viewer/MANIFEST.in Normal file
View file

@ -0,0 +1,9 @@
recursive-include GstDebugViewer *.py
recursive-include data *.glade *.ui *.svg *.png
recursive-include po *.po
recursive-include tests *.py
include gst-debug-viewer
include gst-debug-viewer.desktop.in
include org.freedesktop.GstDebugViewer.appdata.xml.in
include AUTHORS COPYING ChangeLog MANIFEST.in NEWS README TODO
include po/POTFILES.in

36
debug-viewer/README Normal file
View file

@ -0,0 +1,36 @@
# how to build #
./setup.py build; sudo ./setup.py install --prefix=/usr
sudo chmod a+r /usr/share/gst-debug-viewer/*.ui
# porting issues #
http://stackoverflow.com/questions/11025700/generictreemodel-with-pygobject-introspection-gtk-3
# tips #
OLD: prev_action.connect_proxy(prev_button)
NEW: prev_button.set_related_action (prev_action)
OLD: box.pack_start (widget)
NEW: box.pack_start (widget, True, True, 0)
OLD: column.pack_start (cell)
NEW: column.pack_start (cell, True)
OLD: view_column.get_cell_renderers ()
NEW: column.get_cells ()
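The NEW forms above, pulled together into one hedged PyGObject/GTK3 sketch
(the widget and action names are placeholders, not code from this tree):

    from gi.repository import Gtk

    prev_action = Gtk.Action(name="prev", label="Previous")
    prev_button = Gtk.Button()
    prev_button.set_related_action(prev_action)   # was: prev_action.connect_proxy(prev_button)

    box = Gtk.HBox()
    box.pack_start(prev_button, True, True, 0)    # expand, fill, padding now mandatory

    column = Gtk.TreeViewColumn("Message")
    cell = Gtk.CellRendererText()
    column.pack_start(cell, True)                 # expand argument now mandatory
    cells = column.get_cells()                    # was: column.get_cell_renderers()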
# porting resources #
https://www.xpra.org/trac/ticket/90?cversion=0&cnum_hist=3
https://mail.gnome.org/archives/commits-list/2013-October/msg05205.html
# profiling #
python -m profile -o output.pstats path/to/your/script arg1 arg2
gprof2dot.py -f pstats output.pstats | dot -Tpng -o output.png
~/projects/tools/gprof2dot/gprof2dot.py -f pstats output.pstats | dot -Tpng -o output.png
eog output.png
python -m cProfile -o output.pstats2 ./gst-debug-viewer debug.noansi.log
~/projects/tools/gprof2dot/gprof2dot.py -f pstats output.pstats2 | dot -Tpng -o output2.png
eog output2.png

View file

@ -0,0 +1,701 @@
<?xml version="1.0"?>
<interface>
<!-- interface-requires gtk+ 2.12 -->
<!-- interface-naming-policy toplevel-contextual -->
<object class="GtkAboutDialog" id="about_dialog">
<property name="visible">True</property>
<property name="border_width">5</property>
<property name="type_hint">dialog</property>
<property name="copyright" translatable="yes">Copyright &#xA9; 2007-2009 Ren&#xE9; Stadler</property>
<property name="comments" translatable="yes">View and analyze GStreamer debug files</property>
<property name="license"> GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. &lt;http://fsf.org/&gt;
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
&lt;one line to give the program's name and a brief idea of what it does.&gt;
Copyright (C) &lt;year&gt; &lt;name of author&gt;
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see &lt;http://www.gnu.org/licenses/&gt;.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
&lt;program&gt; Copyright (C) &lt;year&gt; &lt;name of author&gt;
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
&lt;http://www.gnu.org/licenses/&gt;.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
&lt;http://www.gnu.org/philosophy/why-not-lgpl.html&gt;.</property>
<property name="authors">Ren&#xE9; Stadler &lt;mail@renestadler.de&gt;</property>
<property name="translator_credits" translatable="yes" comments="TRANSLATORS: Replace this string with your names, one name per line.">translator-credits</property>
<property name="logo">gst-debug-viewer.png</property>
<child internal-child="vbox">
<object class="GtkVBox" id="dialog-vbox1">
<child internal-child="action_area">
<object class="GtkHButtonBox" id="dialog-action_area1"/>
<packing>
<property name="expand">False</property>
<property name="pack_type">end</property>
<property name="position">0</property>
</packing>
</child>
</object>
</child>
</object>
</interface>

Binary file not shown.


View file

@ -0,0 +1,399 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Generator: Adobe Illustrator 12.0.0, SVG Export Plug-In . SVG Version: 6.00 Build 51448) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://web.resource.org/cc/"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:xlink="http://www.w3.org/1999/xlink"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
version="1.0"
id="Layer_1"
width="48"
height="48"
viewBox="0 0 280.22 69.387"
overflow="visible"
enable-background="new 0 0 280.22 69.387"
xml:space="preserve"
sodipodi:version="0.32"
inkscape:version="0.45"
sodipodi:docname="gst-inspector.svg"
sodipodi:docbase="/home/cymacs/Desktop"
inkscape:output_extension="org.inkscape.output.svg.inkscape"
inkscape:export-filename="/home/cymacs/Desktop/gst-inspector.png"
inkscape:export-xdpi="120"
inkscape:export-ydpi="120"
sodipodi:modified="TRUE"><metadata
id="metadata30"><rdf:RDF><cc:Work
rdf:about=""><dc:format>image/svg+xml</dc:format><dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" /></cc:Work></rdf:RDF></metadata><defs
id="defs28"><linearGradient
id="linearGradient2846"><stop
id="stop2848"
offset="0.0000000"
style="stop-color:#8a8a8a;stop-opacity:1.0000000;" /><stop
id="stop2850"
offset="1.0000000"
style="stop-color:#484848;stop-opacity:1.0000000;" /></linearGradient><linearGradient
id="linearGradient2366"><stop
id="stop2368"
offset="0"
style="stop-color:#ffffff;stop-opacity:1;" /><stop
style="stop-color:#ffffff;stop-opacity:0.21904762;"
offset="0.50000000"
id="stop2374" /><stop
id="stop2370"
offset="1.0000000"
style="stop-color:#ffffff;stop-opacity:1.0000000;" /></linearGradient><linearGradient
inkscape:collect="always"
id="linearGradient4477"><stop
style="stop-color:#000000;stop-opacity:1;"
offset="0"
id="stop4479" /><stop
style="stop-color:#000000;stop-opacity:0;"
offset="1"
id="stop4481" /></linearGradient><linearGradient
id="linearGradient4467"><stop
style="stop-color:#ffffff;stop-opacity:1;"
offset="0"
id="stop4469" /><stop
style="stop-color:#ffffff;stop-opacity:0.24761905;"
offset="1.0000000"
id="stop4471" /></linearGradient><linearGradient
id="linearGradient4454"><stop
style="stop-color:#729fcf;stop-opacity:0.20784314;"
offset="0.0000000"
id="stop4456" /><stop
style="stop-color:#729fcf;stop-opacity:0.67619050;"
offset="1.0000000"
id="stop4458" /></linearGradient><linearGradient
id="linearGradient4440"><stop
style="stop-color:#7d7d7d;stop-opacity:1;"
offset="0"
id="stop4442" /><stop
id="stop4448"
offset="0.50000000"
style="stop-color:#b1b1b1;stop-opacity:1.0000000;" /><stop
style="stop-color:#686868;stop-opacity:1.0000000;"
offset="1.0000000"
id="stop4444" /></linearGradient><radialGradient
gradientUnits="userSpaceOnUse"
gradientTransform="matrix(1,0,0,0.284916,0,30.08928)"
r="15.821514"
fy="42.07798"
fx="24.306795"
cy="42.07798"
cx="24.306795"
id="radialGradient4548"
xlink:href="#linearGradient4542"
inkscape:collect="always" /><linearGradient
id="linearGradient259"><stop
style="stop-color:#fafafa;stop-opacity:1.0000000;"
offset="0.0000000"
id="stop260" /><stop
style="stop-color:#bbbbbb;stop-opacity:1.0000000;"
offset="1.0000000"
id="stop261" /></linearGradient><linearGradient
id="linearGradient269"><stop
style="stop-color:#a3a3a3;stop-opacity:1.0000000;"
offset="0.0000000"
id="stop270" /><stop
style="stop-color:#4c4c4c;stop-opacity:1.0000000;"
offset="1.0000000"
id="stop271" /></linearGradient><radialGradient
gradientUnits="userSpaceOnUse"
fy="114.5684"
fx="20.892099"
r="5.256"
cy="114.5684"
cx="20.892099"
id="aigrd2"><stop
id="stop15566"
style="stop-color:#F0F0F0"
offset="0" /><stop
id="stop15568"
style="stop-color:#9a9a9a;stop-opacity:1.0000000;"
offset="1.0000000" /></radialGradient><radialGradient
gradientUnits="userSpaceOnUse"
fy="64.567902"
fx="20.892099"
r="5.257"
cy="64.567902"
cx="20.892099"
id="aigrd3"><stop
id="stop15573"
style="stop-color:#F0F0F0"
offset="0" /><stop
id="stop15575"
style="stop-color:#9a9a9a;stop-opacity:1.0000000;"
offset="1.0000000" /></radialGradient><linearGradient
id="linearGradient15662"><stop
style="stop-color:#ffffff;stop-opacity:1.0000000;"
offset="0.0000000"
id="stop15664" /><stop
style="stop-color:#f8f8f8;stop-opacity:1.0000000;"
offset="1.0000000"
id="stop15666" /></linearGradient><linearGradient
id="linearGradient4542"
inkscape:collect="always"><stop
id="stop4544"
offset="0"
style="stop-color:#000000;stop-opacity:1;" /><stop
id="stop4546"
offset="1"
style="stop-color:#000000;stop-opacity:0;" /></linearGradient><linearGradient
id="linearGradient5048"><stop
id="stop5050"
offset="0"
style="stop-color:black;stop-opacity:0;" /><stop
style="stop-color:black;stop-opacity:1;"
offset="0.5"
id="stop5056" /><stop
id="stop5052"
offset="1"
style="stop-color:black;stop-opacity:0;" /></linearGradient><linearGradient
id="linearGradient5060"
inkscape:collect="always"><stop
id="stop5062"
offset="0"
style="stop-color:black;stop-opacity:1;" /><stop
id="stop5064"
offset="1"
style="stop-color:black;stop-opacity:0;" /></linearGradient><linearGradient
id="linearGradient3449"><stop
id="stop3451"
offset="0.0000000"
style="stop-color:#8a8a8a;stop-opacity:1.0000000;" /><stop
id="stop3453"
offset="1.0000000"
style="stop-color:#484848;stop-opacity:1.0000000;" /></linearGradient><linearGradient
id="linearGradient3441"><stop
id="stop3443"
offset="0"
style="stop-color:#ffffff;stop-opacity:1;" /><stop
style="stop-color:#ffffff;stop-opacity:0.21904762;"
offset="0.50000000"
id="stop3445" /><stop
id="stop3447"
offset="1.0000000"
style="stop-color:#ffffff;stop-opacity:1.0000000;" /></linearGradient><linearGradient
id="linearGradient3429"><stop
style="stop-color:#ffffff;stop-opacity:1;"
offset="0"
id="stop3431" /><stop
style="stop-color:#ffffff;stop-opacity:0.24761905;"
offset="1.0000000"
id="stop3433" /></linearGradient><linearGradient
id="linearGradient3423"><stop
style="stop-color:#729fcf;stop-opacity:0.20784314;"
offset="0.0000000"
id="stop3425" /><stop
style="stop-color:#729fcf;stop-opacity:0.67619050;"
offset="1.0000000"
id="stop3427" /></linearGradient><linearGradient
id="linearGradient3415"><stop
style="stop-color:#7d7d7d;stop-opacity:1;"
offset="0"
id="stop3417" /><stop
id="stop3419"
offset="0.50000000"
style="stop-color:#b1b1b1;stop-opacity:1.0000000;" /><stop
style="stop-color:#686868;stop-opacity:1.0000000;"
offset="1.0000000"
id="stop3421" /></linearGradient><radialGradient
gradientUnits="userSpaceOnUse"
gradientTransform="matrix(1,0,0,0.284916,0,30.08928)"
r="15.821514"
fy="42.07798"
fx="24.306795"
cy="42.07798"
cx="24.306795"
id="radialGradient3413"
xlink:href="#linearGradient4542"
inkscape:collect="always" /><linearGradient
id="linearGradient3402"><stop
style="stop-color:#fafafa;stop-opacity:1.0000000;"
offset="0.0000000"
id="stop3404" /><stop
style="stop-color:#bbbbbb;stop-opacity:1.0000000;"
offset="1.0000000"
id="stop3406" /></linearGradient><linearGradient
id="linearGradient3396"><stop
style="stop-color:#a3a3a3;stop-opacity:1.0000000;"
offset="0.0000000"
id="stop3398" /><stop
style="stop-color:#4c4c4c;stop-opacity:1.0000000;"
offset="1.0000000"
id="stop3400" /></linearGradient><radialGradient
gradientUnits="userSpaceOnUse"
fy="114.5684"
fx="20.892099"
r="5.256"
cy="114.5684"
cx="20.892099"
id="radialGradient3390"><stop
id="stop3392"
style="stop-color:#F0F0F0"
offset="0" /><stop
id="stop3394"
style="stop-color:#9a9a9a;stop-opacity:1.0000000;"
offset="1.0000000" /></radialGradient><radialGradient
gradientUnits="userSpaceOnUse"
fy="64.567902"
fx="20.892099"
r="5.257"
cy="64.567902"
cx="20.892099"
id="radialGradient3384"><stop
id="stop3386"
style="stop-color:#F0F0F0"
offset="0" /><stop
id="stop3388"
style="stop-color:#9a9a9a;stop-opacity:1.0000000;"
offset="1.0000000" /></radialGradient><linearGradient
id="linearGradient3378"><stop
style="stop-color:#ffffff;stop-opacity:1.0000000;"
offset="0.0000000"
id="stop3380" /><stop
style="stop-color:#f8f8f8;stop-opacity:1.0000000;"
offset="1.0000000"
id="stop3382" /></linearGradient><linearGradient
y2="609.50507"
x2="302.85715"
y1="366.64789"
x1="302.85715"
gradientTransform="matrix(2.774389,0,0,1.969706,-1892.179,-872.8854)"
gradientUnits="userSpaceOnUse"
id="linearGradient3370"
xlink:href="#linearGradient5048"
inkscape:collect="always" /><linearGradient
id="linearGradient3362"><stop
id="stop3364"
offset="0"
style="stop-color:black;stop-opacity:0;" /><stop
style="stop-color:black;stop-opacity:1;"
offset="0.5"
id="stop3366" /><stop
id="stop3368"
offset="1"
style="stop-color:black;stop-opacity:0;" /></linearGradient><radialGradient
r="117.14286"
fy="486.64789"
fx="605.71429"
cy="486.64789"
cx="605.71429"
gradientTransform="matrix(2.774389,0,0,1.969706,-1891.633,-872.8854)"
gradientUnits="userSpaceOnUse"
id="radialGradient3360"
xlink:href="#linearGradient5060"
inkscape:collect="always" /><radialGradient
r="117.14286"
fy="486.64789"
fx="605.71429"
cy="486.64789"
cx="605.71429"
gradientTransform="matrix(-2.774389,0,0,1.969706,112.7623,-872.8854)"
gradientUnits="userSpaceOnUse"
id="radialGradient3352"
xlink:href="#linearGradient5060"
inkscape:collect="always" /><radialGradient
inkscape:collect="always"
xlink:href="#linearGradient4477"
id="radialGradient3593"
gradientUnits="userSpaceOnUse"
gradientTransform="matrix(1,0,0,0.237968,0,28.93278)"
cx="24.130018"
cy="37.967922"
fx="24.130018"
fy="37.967922"
r="16.528622" /><linearGradient
inkscape:collect="always"
xlink:href="#linearGradient2846"
id="linearGradient3595"
gradientUnits="userSpaceOnUse"
x1="27.366341"
y1="26.580296"
x2="31.335964"
y2="30.557772" /><linearGradient
inkscape:collect="always"
xlink:href="#linearGradient4440"
id="linearGradient3597"
gradientUnits="userSpaceOnUse"
gradientTransform="matrix(1.334593,0,0,1.291292,-6.973842,-7.460658)"
x1="30.65625"
y1="34"
x2="33.21875"
y2="31.0625" /><linearGradient
inkscape:collect="always"
xlink:href="#linearGradient2366"
id="linearGradient3599"
gradientUnits="userSpaceOnUse"
x1="18.292673"
y1="13.602121"
x2="17.500893"
y2="25.743469" /><radialGradient
inkscape:collect="always"
xlink:href="#linearGradient4454"
id="radialGradient3601"
gradientUnits="userSpaceOnUse"
cx="18.240929"
cy="21.817987"
fx="18.240929"
fy="21.817987"
r="8.3085051" /><radialGradient
inkscape:collect="always"
xlink:href="#linearGradient4467"
id="radialGradient3603"
gradientUnits="userSpaceOnUse"
gradientTransform="matrix(2.592963,0,0,2.252104,-25.05975,-18.941)"
cx="15.414371"
cy="13.078408"
fx="15.414371"
fy="13.078408"
r="6.65625" /></defs><sodipodi:namedview
inkscape:cy="22.58868"
inkscape:cx="8"
inkscape:zoom="8"
inkscape:window-height="920"
inkscape:window-width="1280"
inkscape:pageshadow="2"
inkscape:pageopacity="0.0"
guidetolerance="10.0"
gridtolerance="10.0"
objecttolerance="10.0"
borderopacity="1.0"
bordercolor="#666666"
pagecolor="#ffffff"
id="base"
height="13.546667mm"
width="13.546667mm"
inkscape:window-x="0"
inkscape:window-y="25"
inkscape:current-layer="Layer_1"
units="mm" />
<g
id="g3605"
transform="matrix(1.0683169,0,0,1.0683169,-4.1837749,-0.3623247)"><path
style="fill:#ff3131"
id="path21"
d="M 146.01171,-12.604976 C 131.30382,-12.604976 109.78446,-24.588273 88.811657,-24.588273 C 67.838848,-24.588273 55.311721,-12.604976 53.130962,-10.970767 C 50.952923,-9.3365572 49.587909,-3.0743532 56.127467,-4.9804782 C 62.667024,-6.8866032 68.110763,-6.8866032 78.462569,-6.8866032 C 88.814376,-6.8866032 113.59944,7.2774532 138.65913,7.2774532 C 163.71882,7.2774532 184.68891,-12.058427 189.31962,-18.869899 C 193.95034,-25.681371 189.59154,-27.85941 186.0512,-26.222482 C 182.50815,-24.588273 159.63194,-12.604976 146.01171,-12.604976 z " /><path
style="fill:#319831"
id="path23"
d="M 209.90631,24.707208 C 194.45338,24.707208 171.84636,12.726631 149.81308,12.726631 C 127.77981,12.726631 114.61639,24.707208 112.32687,26.344137 C 110.03734,27.981065 108.60436,34.24055 115.47293,32.334425 C 122.3415,30.428301 128.06259,30.428301 138.93648,30.428301 C 149.81037,30.428301 175.85167,44.592357 202.17577,44.592357 C 228.49986,44.592357 250.53586,25.256477 255.40042,18.447724 C 260.26498,11.636251 255.68593,9.4554924 251.96613,11.092421 C 248.24634,12.729349 224.21448,24.707208 209.90631,24.707208 z " /><path
style="fill:#3232cc"
id="path25"
d="M 120.98193,65.880587 C 104.95797,65.880587 81.513456,52.93471 58.664434,52.93471 C 35.815411,52.93471 22.162557,65.880587 19.788738,67.645315 C 17.414919,69.410044 15.930264,76.175294 23.051719,74.114172 C 30.175893,72.05578 36.109079,72.05578 47.385397,72.05578 C 58.661714,72.05578 85.668318,87.353714 112.97131,87.353714 C 140.27158,87.353714 163.12061,66.467922 168.16735,59.112623 C 173.21409,51.757318 168.46374,49.405253 164.60526,51.169982 C 160.74679,52.93471 135.82034,65.880587 120.98193,65.880587 z " /></g>
</svg>

After

Width:  |  Height:  |  Size: 15 KiB

View file

@ -0,0 +1,88 @@
<?xml version="1.0"?>
<interface>
<!-- interface-requires gtk+ 2.12 -->
<!-- interface-naming-policy toplevel-contextual -->
<object class="GtkWindow" id="main_window">
<property name="title" translatable="yes">GStreamer Debug Viewer</property>
<property name="default_width">640</property>
<property name="default_height">480</property>
<signal name="destroy" handler="handle_main_window_destroy"/>
<child>
<object class="GtkVBox" id="vbox_main">
<property name="visible">True</property>
<child>
<object class="GtkVBox" id="vbox_view">
<property name="visible">True</property>
<child>
<object class="GtkVPaned" id="vpaned_view">
<property name="visible">True</property>
<property name="can_focus">True</property>
<child>
<object class="GtkHBox" id="hbox_view">
<property name="visible">True</property>
<child>
<object class="GtkScrolledWindow" id="log_view_scrolled_window">
<property name="visible">True</property>
<property name="can_focus">True</property>
<property name="hscrollbar_policy">automatic</property>
<property name="vscrollbar_policy">automatic</property>
<property name="shadow_type">in</property>
<child>
<object class="GtkTreeView" id="log_view">
<property name="name">log_view</property>
<property name="visible">True</property>
<property name="can_focus">True</property>
<property name="reorderable">True</property>
<property name="rules_hint">True</property>
<property name="enable_search">False</property>
<property name="fixed_height_mode">True</property>
</object>
</child>
</object>
<packing>
<property name="pack_type">end</property>
<property name="position">0</property>
</packing>
</child>
</object>
<packing>
<property name="resize">True</property>
<property name="shrink">True</property>
</packing>
</child>
<child>
<object class="GtkScrolledWindow" id="line_view_scrolled_window">
<property name="visible">True</property>
<property name="can_focus">True</property>
<property name="hscrollbar_policy">never</property>
<property name="vscrollbar_policy">automatic</property>
<property name="shadow_type">in</property>
<child>
<object class="GtkTreeView" id="line_view">
<property name="visible">True</property>
<property name="can_focus">True</property>
<property name="headers_visible">False</property>
<property name="rules_hint">True</property>
</object>
</child>
</object>
<packing>
<property name="resize">False</property>
<property name="shrink">True</property>
</packing>
</child>
</object>
<packing>
<property name="position">0</property>
</packing>
</child>
</object>
<packing>
<property name="pack_type">end</property>
<property name="position">0</property>
</packing>
</child>
</object>
</child>
</object>
</interface>

View file

@ -0,0 +1,82 @@
<!-- -*- mode: xml; -*- -->
<ui>
<menubar>
<menu name="AppMenu" action="AppMenuAction">
<menuitem name="AppNewWindow" action="new-window"/>
<menuitem name="WindowOpen" action="open-file"/>
<menuitem name="WindowReload" action="reload-file"/>
<separator/>
<menuitem name="ShowAbout" action="show-about"/>
<separator/>
<menuitem name="WindowClose" action="close-window"/>
</menu>
<menu name="ViewMenu" action="ViewMenuAction">
<menu name="ViewColumnsMenu" action="ViewColumnsMenuAction">
<menuitem name="ViewColumnsTime" action="show-time-column"/>
<menuitem name="ViewColumnsLevel" action="show-level-column"/>
<menuitem name="ViewColumnsPid" action="show-pid-column"/>
<menuitem name="ViewColumnsThread" action="show-thread-column"/>
<menuitem name="ViewColumnsCode" action="show-code-column"/>
<menuitem name="ViewColumnsCategory" action="show-category-column"/>
<menuitem name="ViewColumnsFunction" action="show-function-column"/>
<menuitem name="ViewColumnsObject" action="show-object-column"/>
<menuitem name="ViewColumnsMessage" action="show-message-column"/>
</menu>
<placeholder name="ViewMenuAdditions"/>
<separator/>
<menuitem name="ViewContextMenuHideLevel" action="hide-log-level"/>
<menuitem name="ViewContextMenuHideLevelAndAbove" action="hide-log-level-and-above"/>
<menuitem name="ViewContextMenuShowOnlyLevel" action="show-only-log-level"/>
<menuitem name="ViewContextMenuHideCategory" action="hide-log-category"/>
<menuitem name="ViewContextMenuShowOnlyCategory" action="show-only-log-category"/>
<menuitem name="ViewContextMenuHideThread" action="hide-thread"/>
<menuitem name="ViewContextMenuShowOnlyThread" action="show-only-thread"/>
<menuitem name="ViewContextMenuHideObject" action="hide-object"/>
<menuitem name="ViewContextMenuShowOnlyObject" action="show-only-object"/>
<menuitem name="ViewContextMenuHideFunction" action="hide-function"/>
<menuitem name="ViewContextMenuShowOnlyFunction" action="show-only-function"/>
<menuitem name="ViewContextMenuHideFilename" action="hide-filename"/>
<menuitem name="ViewContextMenuShowOnlyFilename" action="show-only-filename"/>
<menuitem name="ViewContextMenuHideBefore" action="hide-before-line"/>
<menuitem name="ViewContextMenuHideAfter" action="hide-after-line"/>
<menuitem name="ViewContextMenuShowHidden" action="show-hidden-lines"/>
<separator/>
<menuitem name="ViewContextMenuCopyMessage" action="edit-copy-message"/>
<menuitem name="ViewContextMenuCopyLine" action="edit-copy-line"/>
<separator/>
<menuitem name="ZoomIn" action="enlarge-text"/>
<menuitem name="ZoomOut" action="shrink-text"/>
<menuitem name="Zoom100" action="reset-text"/>
</menu>
</menubar>
<menubar name="context">
<menu name="LogViewContextMenu" action="ViewMenuAction">
<placeholder name="LogViewContextMenuAdditions"/>
<separator/>
<menuitem name="ViewContextMenuSetBaseTime" action="set-base-time"/>
<menuitem name="ViewContextMenuHideLevel" action="hide-log-level"/>
<menuitem name="ViewContextMenuHideLevelAndAbove" action="hide-log-level-and-above"/>
<menuitem name="ViewContextMenuShowOnlyLevel" action="show-only-log-level"/>
<menuitem name="ViewContextMenuHideCategory" action="hide-log-category"/>
<menuitem name="ViewContextMenuShowOnlyCategory" action="show-only-log-category"/>
<menuitem name="ViewContextMenuHideThread" action="hide-thread"/>
<menuitem name="ViewContextMenuShowOnlyThread" action="show-only-thread"/>
<menuitem name="ViewContextMenuHideObject" action="hide-object"/>
<menuitem name="ViewContextMenuShowOnlyObject" action="show-only-object"/>
<menuitem name="ViewContextMenuHideFunction" action="hide-function"/>
<menuitem name="ViewContextMenuShowOnlyFunction" action="show-only-function"/>
<menuitem name="ViewContextMenuHideFilename" action="hide-filename"/>
<menuitem name="ViewContextMenuShowOnlyFilename" action="show-only-filename"/>
<menuitem name="ViewContextMenuHideBefore" action="hide-before-line"/>
<menuitem name="ViewContextMenuHideAfter" action="hide-after-line"/>
<menuitem name="ViewContextMenuShowHidden" action="show-hidden-lines"/>
<separator/>
<menuitem name="ViewContextMenuCopyMessage" action="edit-copy-message"/>
<menuitem name="ViewContextMenuCopyLine" action="edit-copy-line"/>
</menu>
<menu name="LineViewContextMenu" action="LineViewContextMenuAction">
<menuitem name="LineViewContextMenuClear" action="clear-line-view"/>
<placeholder name="LineViewContextMenuAdditions"/>
</menu>
</menubar>
</ui>

View file

@ -0,0 +1,6 @@
install_data('about-dialog.ui', 'main-window.ui', 'menus.ui', 'gst-debug-viewer.png',
install_dir: join_paths(get_option('datadir'), 'gst-debug-viewer'))
install_data('gst-debug-viewer.png',
install_dir: join_paths(get_option('datadir'), 'icons/hicolor/48x48/apps'))
install_data('gst-debug-viewer.svg',
install_dir: join_paths(get_option('datadir'), 'icons/hicolor/scalable/apps'))

70
debug-viewer/gst-debug-viewer Executable file
View file

@ -0,0 +1,70 @@
#!/usr/bin/env python3
# -*- coding: utf-8; mode: python; -*-
#
# GStreamer Debug Viewer - View and analyze GStreamer debug log files
#
# Copyright (C) 2007 René Stadler <mail@renestadler.de>
#
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the Free
# Software Foundation; either version 3 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along with
# this program. If not, see <http://www.gnu.org/licenses/>.
"""GStreamer Debug Viewer program invocation."""
def main():

    import sys
    import os.path

    def substituted(s):
        if s.startswith("@") and s.endswith("@"):
            return None
        else:
            return s

    # These "$"-enclosed strings are substituted at install time by a custom
    # distutils extension (see setup.py). If you don't see any dollar signs at
    # all, you are looking at an installed version of this file.
    data_dir = substituted("@DATADIR@")
    lib_dir = substituted("@LIBDIR@")

    if data_dir:
        installed = True
    else:
        # Substitution has not been run, we are running uninstalled:
        lib_dir = os.path.dirname(os.path.realpath(sys.argv[0]))
        installed = False

    if lib_dir:
        if not os.path.normpath(lib_dir) in [os.path.normpath(p)
                                             for p in sys.path]:
            sys.path.insert(0, lib_dir)

    try:
        import GstDebugViewer
    except ImportError as exc:
        print(str(exc), file=sys.stderr)
        sys.exit(1)
    else:
        if installed:
            GstDebugViewer.Paths.setup_installed(data_dir)
        else:
            # Assume that we reside inside the source dist.
            source_dir = os.path.dirname(os.path.realpath(sys.argv[0]))
            GstDebugViewer.Paths.setup_uninstalled(source_dir)

    GstDebugViewer.run()


if __name__ == "__main__":
    main()

85
debug-viewer/meson.build Normal file
View file

@ -0,0 +1,85 @@
python3.install_sources (
'GstDebugViewer/Main.py',
'GstDebugViewer/Data.py',
subdir: 'GstDebugViewer')
python3.install_sources (
'GstDebugViewer/GUI/columns.py',
'GstDebugViewer/GUI/__init__.py',
'GstDebugViewer/GUI/models.py',
'GstDebugViewer/GUI/filters.py',
'GstDebugViewer/GUI/colors.py',
'GstDebugViewer/GUI/window.py',
'GstDebugViewer/GUI/app.py',
subdir: 'GstDebugViewer/GUI')
python3.install_sources (
'GstDebugViewer/Plugins/__init__.py',
'GstDebugViewer/Plugins/FindBar.py',
'GstDebugViewer/Plugins/Timeline.py',
subdir: 'GstDebugViewer/Plugins')
python3.install_sources (
'GstDebugViewer/Common/Main.py',
'GstDebugViewer/Common/utils.py',
'GstDebugViewer/Common/__init__.py',
'GstDebugViewer/Common/generictreemodel.py',
'GstDebugViewer/Common/Data.py',
'GstDebugViewer/Common/GUI.py',
subdir: 'GstDebugViewer/Common')
if find_program('msgfmt', required : get_option('nls')).found()
# Desktop launcher and description file.
desktop_file = i18n.merge_file(
input: 'org.freedesktop.GstDebugViewer.desktop.in',
output: 'org.freedesktop.GstDebugViewer.desktop',
type: 'desktop',
po_dir: 'po',
install: true,
install_dir: join_paths(get_option('datadir'), 'applications'),
)
# Appdata file.
appdata_file = i18n.merge_file(
input: 'org.freedesktop.GstDebugViewer.appdata.xml.in',
output: 'org.freedesktop.GstDebugViewer.appdata.xml',
po_dir: 'po',
install: true,
install_dir: join_paths(get_option('datadir'), 'metainfo'),
)
else
install_data('org.freedesktop.GstDebugViewer.desktop.in',
rename: 'org.freedesktop.GstDebugViewer.desktop',
install_dir: join_paths(get_option('datadir'), 'applications'))
install_data('org.freedesktop.GstDebugViewer.appdata.xml.in',
rename: 'org.freedesktop.GstDebugViewer.appdata.xml',
install_dir: join_paths(get_option('datadir'), 'metainfo'))
endif
cdata = configuration_data()
cdata.set('LIBDIR', join_paths(get_option('prefix'), get_option('libdir')))
cdata.set('DATADIR', join_paths(get_option('prefix'), get_option('datadir')))
cdata.set('VERSION', meson.project_version())
configure_file(input: 'gst-debug-viewer',
output: 'gst-debug-viewer',
configuration: cdata,
install_dir: get_option('bindir'))
init_file = configure_file(
input: 'GstDebugViewer/__init__.py',
output: '__init__.py',
configuration: cdata)
python3.install_sources (init_file, subdir: 'GstDebugViewer')
pkgdatadir = join_paths(get_option('datadir'), meson.project_name())
icondir = join_paths(get_option('datadir'), 'icons/hicolor')
subdir('data')
if run_command(python3,
'-c', 'import gi; gi.require_version("Gtk", "3.0")').returncode() == 0
test('gst-debug-viewer', python3, args: ['-m', 'unittest'],
workdir: meson.current_source_dir())
endif

View file

@ -0,0 +1,29 @@
<?xml version="1.0" encoding="UTF-8"?>
<component type="desktop-application">
<id>org.freedesktop.GstDebugViewer</id>
<launchable type="desktop-id">org.freedesktop.GstDebugViewer.desktop</launchable>
<metadata_license>CC-BY-3.0</metadata_license>
<project_license>GPL-3.0+</project_license>
<name>GStreamer Debug Viewer</name>
<summary>Examine GStreamer debug log information</summary>
<description>
<p>View and read GStreamer debug logs in an efficient way</p>
</description>
<url type="homepage">https://gstreamer.freedesktop.org/</url>
<url type="bugtracker">https://gitlab.freedesktop.org/gstreamer/gst-devtools/issues/</url>
<update_contact>tsaunier@gnome.org</update_contact>
<project_group>GStreamer</project_group>
<translation type="gettext">GStreamer</translation>
<developer_name>The GStreamer Team</developer_name>
<screenshots>
<screenshot type="default">
<caption>The main window</caption>
<image>https://gitlab.freedesktop.org/gstreamer/gst-devtools/-/raw/master/debug-viewer/screenshots/gst-debug-viewer.png?inline=false</image>
</screenshot>
</screenshots>
<content_rating type="oars-1.0" />
<releases>
<release version="1.16.2" date="2019-12-03" />
<release version="1.16.1" date="2019-09-23" />
</releases>
</component>

View file

@ -0,0 +1,9 @@
[Desktop Entry]
Name=GStreamer Debug Viewer
Comment=Examine GStreamer debug log information
StartupNotify=true
Exec=gst-debug-viewer
Icon=gst-debug-viewer
Type=Application
Categories=GNOME;Development

Binary file not shown.

After

Width:  |  Height:  |  Size: 2 KiB

0
debug-viewer/po/LINGUAS Normal file
View file

View file

Binary file not shown.

After

Width:  |  Height:  |  Size: 175 KiB

6
docs/api.md Normal file
View file

@ -0,0 +1,6 @@
# GstValidate API reference
This is the GstValidate API reference. Note that the GstValidate API is not
totally stable and might very well change even between minor versions.
The override API should still be mostly stable.

View file

@ -0,0 +1,8 @@
meta,
args = {
"fakesrc num-buffers=1 ! fakesink name=sink",
},
configs = {
"$(validateflow), pad=sink:sink, buffers-checksum=true",
}
# The validate tool will simply play the pipeline until EOS is reached.

View file

@ -0,0 +1 @@
fakesrc.simple.validatetest

View file

@ -0,0 +1,4 @@
event stream-start: GstEventStreamStart, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE, group-id=(uint)1;
event segment: format=BYTES, start=0, offset=0, stop=18446744073709551615, time=0, base=0, position=0
buffer: checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709, dts=0:00:00.000000000, flags=discont
event eos: (no structure)

File diff suppressed because it is too large Load diff

1
docs/gi-index.md Normal file
View file

@ -0,0 +1 @@
# GstValidate API reference

File diff suppressed because it is too large Load diff

110
docs/gst-validate-config.md Normal file
View file

@ -0,0 +1,110 @@
---
title: Configuration
short-description: GstValidate configuration
...
# GstValidate Configuration
GstValidate can be configured through configuration files
that set up its plugins and core behaviour. The config format is very similar
to the [scenario](gst-validate-scenarios.md) file format.
You can check the [ssim plugin](plugins/ssim.md)
and the [validate flow plugin](gst-validate-flow.md)
for examples.
## Core settings parameters
Config name should be `core`.
### `verbosity`
Default: `position`
See [GstValidateVerbosityFlags](GstValidateVerbosityFlags) for possible values.
### `action`
The [action type](gst-validate-action-types.md) to execute; the action type
must either be a CONFIG action or have an `as-config` argument. When the `action`
parameter is specified, a validate action is executed using the other parameters of the
config as the configuration for that validate scenario action.
#### Example:
```
GST_VALIDATE_CONFIG="core, action=set-property, target-element-name="videotestsrc0", property-name=pattern, property-value=blue" gst-validate-1.0 videotestsrc ! autovideosink
```
This will execute the `set-property, target-element-name="videotestsrc0",
property-name=pattern, property-value=blue` validate action directly from the
config file
### `scenario-action-execution-interval`
Default: `0`, meaning that actions are executed in `idle` callbacks.
Sets the interval between the execution of [GstValidateScenario](gst-validate-scenarios.md) actions.
### `max-latency`
Default: `GST_CLOCK_TIME_NONE` - disabled
Sets the maximum latency the pipeline is allowed to report; above that
latency the scenario will report a `config::latency-too-high` issue.
### `max-dropped`
Default: `GST_CLOCK_TIME_NONE` - disabled
The maximum number of dropped buffers; a `config::too-many-buffers-dropped` issue will be reported
if that limit is reached.
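As an illustration of these two settings (the threshold values below are arbitrary: nanoseconds for `max-latency` and a buffer count for `max-dropped`), both checks could be enabled from the command line with something like:
```
GST_VALIDATE_CONFIG="core, max-latency=2000000000, max-dropped=10" gst-validate-1.0 videotestsrc ! autovideosink
```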
### `fail-on-missing-plugin`
Default: `false`, meaning that tests are marked as skipped when a GStreamer plugin is missing.
## Variables
You can use variables in the configs the same way you can set them in
[gst-validate-scenarios](gst-validate-scenarios.md).
Default variables are:
- `$(TMPDIR)`: The default temporary directory as returned by `g_get_tmp_dir`.
- `$(CONFIG_PATH)`: The path of the running config file.
- `$(CONFIG_DIR)`: The directory the running config file is in.
- `$(CONFIG_NAME)`: The name of the config file.
- `$(LOGSDIR)`: The directory where to place log files. This uses the
  `GST_VALIDATE_LOGSDIR` environment variable if available or `$(TMPDIR)` if
  the variable hasn't been set. (Note that
  [gst-validate-launcher](gst-validate-launcher.md) sets this environment
  variable.)
You can also set your own variables by using the `set-vars=true` argument:
``` yaml
core, set-vars=true, log-path=$(CONFIG_DIR/../log)
```
It is also possible to set global variables (also usable from
[scenarios](gst-validate-scenarios.md)) with:
``` yaml
set-globals, TESTSUITE_ROOT_DIR=$(CONFIG_DIR)
```
## `change-issue-severity` settings parameters
You can change issues severity with the `change-issue-severity` configuration
with the following parameters:
* `issue-id`: The GQuark name of the issue, for example: `event::segment-has-wrong-start`,
You can use `gst-validate-1.0 --print-issue-types` to list all issue types.
* `new-severity`: The new [`severity`](GstValidateReportLevel) of the issue
* `element-name` (*optional*): The name of the element the severity
change applies to
* `element-factory-name` (*optional*): The element factory name of the elements the
severity change applies to
* `element-classification` (*optional*): The classification of the elements the
severity change applies to
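For example, a configuration entry that downgrades a specific issue for one element factory could look like the following sketch (the `new-severity` value and the factory name are illustrative; use `--print-issue-types` and the [`severity`](GstValidateReportLevel) documentation to pick real values):
```
change-issue-severity, issue-id=event::segment-has-wrong-start, new-severity=warning, element-factory-name=qtdemux
```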

View file

@ -0,0 +1,153 @@
---
title: Environment variables
short-description: Environment variables influencing runtime behaviour
...
# GstValidate Environment Variables
The runtime behaviour of GstValidate applications can be influenced by a
number of environment variables.
**GST_VALIDATE.**
This environment variable can be set to a list of debug options, which
cause GstValidate to print out different types of test result
information and to treat the level of reported issues differently
(an example invocation follows the list of options).
* `fatal-criticals`: Causes GstValidate to consider only critical issues as important enough
  to consider the test failed (default behaviour)
* `fatal-warnings`: Causes GstValidate to consider warnings and critical issues as
  important enough to consider the test failed
* `fatal-issues`: Causes GstValidate to consider issues, warnings, and critical issues
  as important enough to consider the test failed
* `print-issues`: Causes GstValidate to print issue, warning and critical issues in
the final reports (default behaviour)
* `print-warnings`: Causes GstValidate to only print warning and critical issues in the
final reports
* `print-criticals`: Causes GstValidate to only print critical issues in the final
reports
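For instance, assuming the options are comma-separated as with other GStreamer option lists, a run that treats warnings as fatal but only prints critical issues in the report could look like:
```
GST_VALIDATE=fatal-warnings,print-criticals gst-validate-1.0 videotestsrc num-buffers=30 ! fakesink
```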
**GST_VALIDATE_FILE.**
Set this variable to a colon-separated list of paths to redirect all
GstValidate messages to these files. If left unset, debug messages will be
output to standard error.
You can use the special names `stdout` and `stderr` to use those outputs.
**GST_VALIDATE_SCENARIOS_PATH.**
Set this variable to a colon-separated list of paths. GstValidate will
scan these paths for GstValidate scenario files. By default GstValidate
will look for scenarios in the user data directory as specified in the
[XDG standard]:
`.local/share/gstreamer-GST_API_VERSION/validate/scenarios` and the
system wide user data directory:
`/usr/lib/gstreamer-GST_API_VERSION/validate/scenarios`
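For example, to make scenarios from a project-local directory visible to GstValidate (the path is hypothetical):
```
GST_VALIDATE_SCENARIOS_PATH=$HOME/my-testsuite/scenarios gst-validate-1.0 --list-scenarios
```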
**GST_VALIDATE_CONFIG.**
Set this variable to a colon-separated list of paths to GstValidate
config files or directly as a string in the GstCaps serialization
format. The config file has a format similar to the scenario file. The
name of the configuration corresponds to the name of the plugin the
configuration applies to.
The special name "core" is used to configure GstValidate core
functionalities (monitors, scenarios, etc...).
If you want to make sure a property is set on all elements of a given type (for
example to disable QoS on all sinks) you can do:
```
core, action=set-property, target-element-klass=Sink
```
If you want the GstPipeline to get dumped when an issue of a certain
level (and higher) happens, you can do:
```
core, action=dot-pipeline, report-level=issue
```
Note that you will still need to set GST_DEBUG_DUMP_DOT_DIR.
For more examples you can look at the ssim GstValidate plugin
documentation to see how to configure that plugin.
You can also check that a src pad is pushing buffers at a minimum
frequency. For example to check if v4l2src is producing at least 60 frames
per second you can do:
``` yaml
core,min-buffer-frequency=60,target-element-factory-name=v4l2src
```
This config accepts the following fields (a combined example is shown after the list):
- `min-buffer-frequency`: the expected minimum rate, in buffers per
second, at which buffers are pushed on the pad
- `target-element-{factory-name,name,klass}`: the factory-name, object
name or class of the element to check
- `name`: (optional) only check the frequency if the src pad has this
name
- `buffer-frequency-start`: (optional) if defined, validate will
ignore the frequency of the pad during the time specified in this
field, in ns. This can be useful when testing live pipelines where
configuring and setting up elements can take some time slowing down
the first buffers until the pipeline reaches its cruising speed.
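A variant combining the optional fields could look like the following sketch (all values are illustrative):
``` yaml
core, min-buffer-frequency=30, target-element-factory-name=v4l2src, name=src, buffer-frequency-start=2000000000
```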
**GST_VALIDATE_OVERRIDE.**
Set this variable to a colon-separated list of dynamically linkable
files that GstValidate will scan looking for overrides. By default
GstValidate will look for scenarios in the user data directory as
specified in the [XDG standard]:
`.local/share/gstreamer-GST_API_VERSION/validate/scenarios` and the
system wide user data directory:
`/usr/lib/gstreamer-GST_API_VERSION/validate/scenarios`
**GST_VALIDATE_SCENARIO_WAIT_MULITPLIER.**
A decimal number to set as a multiplier for the wait actions. For
example if you set `GST_VALIDATE_SCENARIO_WAIT_MULITPLIER=0.5`, a
wait action with a duration of 2.0 will only wait for
1.0 second. If set to 0, wait actions will be ignored.
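For example, to speed up a scenario containing long wait actions while debugging (the multiplier, scenario name and pipeline are illustrative):
```
GST_VALIDATE_SCENARIO_WAIT_MULITPLIER=0.1 gst-validate-1.0 --set-scenario seek_forward videotestsrc ! autovideosink
```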
**GST_VALIDATE_REPORTING_DETAILS.**
The reporting level can be set through the
GST_VALIDATE_REPORTING_DETAILS environment variable, as a
comma-separated list of (optional) object categories / names and levels.
Omit the object category / name to set the global level.
Examples:
```
GST_VALIDATE_REPORTING_DETAILS=synthetic,h264parse:all
GST_VALIDATE_REPORTING_DETAILS=none,h264parse::sink_0:synthetic
```
Levels being:
* `none`: No debugging level specified or desired. Used to deactivate
debugging output.
* `synthetic`: Summary of the issues found, with no details.
* `subchain`: If set as the default level, similar issues can be reported multiple
times for different subchains. If set as the level for a particular
object (`my_object:subchain`), validate will report the issues where
the object is the first to report an issue for a subchain.
* `monitor`: If set as the default level, all the distinct issues for all the
monitors will be reported. If set as the level for a particular
object, all the distinct issues for this object will be reported.
Note that if the same issue happens twice on the same object, up
until this level that issue is only reported once.
* `all`: All the issues will be reported, even those that repeat themselves
inside the same object. This can be **very** verbose if set
globally.
Setting the reporting level allows you to control the way issues are
reported when calling [gst_validate_runner_printf()](gst_validate_runner_printf).
[XDG standard]: http://www.freedesktop.org/wiki/Software/xdg-user-dirs/

196
docs/gst-validate-flow.md Normal file
View file

@ -0,0 +1,196 @@
---
title: Validate Flow
short-description: Validate a pad data flow
...
# Validate Flow
Validate Flow — GStreamer validate component to record a log of buffers and
events flowing in a specified pad and compare it with an expectation file.
## Description
This component exists for the purpose of testing non-regular-playback use cases
where the test author specifies the full pipeline, a series of actions and needs
to check whether the generated buffers and events make sense.
The testing procedure goes like this:
1. The test author writes a [.validatetest](gst-validate-test-file.md) test
where validateflow is used. A pad where monitoring will occur is specified
and possibly a list of [actions](gst-validate-action-types.md) to run can
also be specified.
2. The test author runs the test with the desired pipeline, the configuration
and the actions. Since an expectation file does not exist at
this point, validateflow will create one. The author should check its
contents for any missing or unwanted events. No actual checking is done by
validateflow in this step, since there is nothing to compare to yet.
3. Further executions of the test will also record the produced buffers and
events, but now they will be compared to the previous log (expectation file).
Any difference will be reported as a test failure. The original expectation
file is never modified by validateflow. Any desired changes can be made by
editing the file manually or deleting it and running the test again.
## Example
### Simplest example
The following is an example of a `fakesrc.simple.validatetest` file using
validateflow.
{{ fakesrc.simple.validatetest.ini }}
Then generate the expectation file with:
``` bash
gst-validate-1.0 --set-test-file /path/to/fakesrc.simple.validatetest
```
This will generate the
`/path/to/fakesrc.simple/flow-expectations/log-sink-sink-expected` file
containing:
{{ plugins/fakesrc.simple/flow-expectations/log-sink-sink-expected.log }}
Note that the test will be marked as "SKIPPED" when we generate expectation
files.
The test can now be run with:
```
gst-validate-1.0 --set-test-file /path/to/fakesrc.simple.validatetest
```
### Example controlling the source
The following is an example of the `qtdemux_change_edit_list.validatetest` file using validateflow.
``` ini
set-globals, media_dir="$(test_dir)/../../../medias/"
meta,
seek=false,
handles-states=false,
args = {
"appsrc ! qtdemux ! fakesink async=false",
},
configs = {
"$(validateflow), pad=fakesink0:sink, record-buffers=false",
}
# Scenario action types
appsrc-push, target-element-name=appsrc0, file-name="$(media_dir)/fragments/car-20120827-85.mp4/init.mp4"
appsrc-push, target-element-name=appsrc0, file-name="$(media_dir)/fragments/car-20120827-85.mp4/media1.mp4"
checkpoint, text="A moov with a different edit list is now pushed"
appsrc-push, target-element-name=appsrc0, file-name="$(media_dir)/fragments/car-20120827-86.mp4/init.mp4"
appsrc-push, target-element-name=appsrc0, file-name="$(media_dir)/fragments/car-20120827-86.mp4/media2.mp4"
stop
```
This example shows the elements of a typical validate flow test (a pipeline, a
config and a scenario). Some actions typically used together with validateflow
can also be seen. Notice variable interpolation is used to fill absolute paths
for media files in the scenario (`$(test_dir)`). In the configuration,
`$(validateflow)` is expanded to something like this, containing proper paths
for expectations and actual results (these values are interpolated from the
`.validatetest` file location):
``` ini
validateflow, expectations-dir="/validate/test/file/path/validateqtdemux_change_edit_list/flow-expectations/", actual-results-dir="$(GST_VALIDATE_LOGSDIR)/logs/validate/launch_pipeline/qtdemux_change_edit_list"
```
The resulting log looks like this:
``` ini
event stream-start: GstEventStreamStart, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE, group-id=(uint)1;
event caps: video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)2.1, profile=(string)main, codec_data=(buffer)014d4015ffe10016674d4015d901b1fe4e1000003e90000bb800f162e48001000468eb8f20, width=(int)426, height=(int)240, pixel-aspect-ratio=(fraction)1/1;
event segment: format=TIME, start=0:00:00.000000000, offset=0:00:00.000000000, stop=none, time=0:00:00.000000000, base=0:00:00.000000000, position=0:00:00.000000000
event tag: GstTagList-stream, taglist=(taglist)"taglist\,\ video-codec\=\(string\)\"H.264\\\ /\\\ AVC\"\;";
event tag: GstTagList-global, taglist=(taglist)"taglist\,\ datetime\=\(datetime\)2012-08-27T01:00:50Z\,\ container-format\=\(string\)\"ISO\\\ fMP4\"\;";
event tag: GstTagList-stream, taglist=(taglist)"taglist\,\ video-codec\=\(string\)\"H.264\\\ /\\\ AVC\"\;";
event caps: video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)2.1, profile=(string)main, codec_data=(buffer)014d4015ffe10016674d4015d901b1fe4e1000003e90000bb800f162e48001000468eb8f20, width=(int)426, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)24000/1001;
CHECKPOINT: A moov with a different edit list is now pushed
event caps: video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)3, profile=(string)main, codec_data=(buffer)014d401effe10016674d401ee8805017fcb0800001f480005dc0078b168901000468ebaf20, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1;
event segment: format=TIME, start=0:00:00.041711111, offset=0:00:00.000000000, stop=none, time=0:00:00.000000000, base=0:00:00.000000000, position=0:00:00.041711111
event tag: GstTagList-stream, taglist=(taglist)"taglist\,\ video-codec\=\(string\)\"H.264\\\ /\\\ AVC\"\;";
event tag: GstTagList-stream, taglist=(taglist)"taglist\,\ video-codec\=\(string\)\"H.264\\\ /\\\ AVC\"\;";
event caps: video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)3, profile=(string)main, codec_data=(buffer)014d401effe10016674d401ee8805017fcb0800001f480005dc0078b168901000468ebaf20, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)24000/1001;
```
## Configuration
In order to use the plugin, a validate configuration must be provided,
containing a line starting with `validateflow` followed by a number of settings
(a combined example is shown after the list).
Every `validateflow` line creates a `ValidateFlowOverride`, which listens to a
given pad. A test may have several `validateflow` lines, therefore having
several overrides and listening to different pads with different settings.
* `pad`: Required. Name of the pad that will be monitored.
* `record-buffers`: Default: false. Whether buffers will be logged. By default
only events are logged.
* `buffers-checksum`: Default: 'none'. Defines the type of checksum to be used;
  valid values are:
* `none`: No checksum recorded
* `as-id`: Record checksum as 'ids' where the IDs are incremented on each new
checksum passed in
* `md5`: md5 checksum
* `sha1`: sha1 checksum
* `sha256`: sha256 checksum
* `sha512`: sha512 checksum
* *Note*: for backward compatibility reasons, this can be passed as a
boolean and it will default to 'sha1' if true, 'none' if false.
* `ignored-fields`: Default: `"stream-start={ stream-id }"` (as they are often
non reproducible). Key with a serialized GstValueList(str) of fields to not
record.
* `logged-fields`: Default: `NULL` Key with a serialized GstValueList(str) of
fields to record, eg. `logged-event-fields="stream-start={flags},
caps={width, height, framerate}, buffer={pts}"`. Overrides
`ignored-event-fields` for specified event types.
* `ignored-event-types`: Default: `{ }`. List of event type names not to record.
* `logged-event-types`: Default: `NULL`. List of event type names to record;
  if none is provided, all events are logged, except the ones defined in
  `ignored-event-types`.
* `expectations-dir`: Path to the directory where the expectations will be
written if they don't exist, relative to the current working directory. By
default the current working directory is used, but this setting is usually
set automatically as part of the `%(validateflow)s` expansion to a correct
path like `~/gst-validate/gst-integration-testsuites/flow-expectations/<test
name>`.
* `actual-results-dir`: Path to the directory where the events will be recorded.
The expectation file will be compared to this. By default the current working
directory is used, but this setting is usually set automatically as part of
the `%(validateflow)s` expansion to the test log directory, i.e.
`~/gst-validate/logs/validate/launch_pipeline/<test name>`.
* `generate-expectations`: Default: unset. When set to `true`, the expectation
  file will be written and no checking will be done; when set to `false`, the
  expectation file is required. If a validateflow config is used without
  specifying any other parameters, the validateflow plugin will consider that
  all validateflow overrides use that value.
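Putting several of these settings together, a hand-written configuration entry could look like the following sketch (the pad name and the choices are illustrative; in `.validatetest` files the `$(validateflow)` variable normally fills in the directories for you):
``` ini
validateflow, pad=fakesink0:sink, record-buffers=true, buffers-checksum=md5, ignored-event-types={ tag }
```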
## Scenario actions
Scenarios with validateflow work in the same way as other tests. Often
validatetests will use appsrc in order to control the flow of data precisely,
possibly interleaving events in between. The following is a list of useful
actions.
* `appsrc-push`: Pushes a buffer from an appsrc element and waits for the chain
operation to finish. A path to a file is provided, optionally with an offset
and/or size.
* `appsrc-eos`: Queues an EOS event from the appsrc. The action finishes
immediately at this point.
* `stop`: Tears down the pipeline and stops the test.
* `checkpoint`: Records a "checkpoint" message in all validateflow overrides,
with an optional explanation message. This is useful to check certain events
or buffers are sent at a specific moment in the scenario, and can also help
to the comprehension of the scenario.
More details on these actions can be queried from the command line, like this:
``` bash
gst-validate-1.0 --inspect-action-type appsrc-push
```

View file

@ -0,0 +1,166 @@
---
short-description: Integration testsuite builder and launcher
...
# gst-validate-launcher
`gst-validate-launcher` is an application to run unit or integration testsuites,
providing a set of options and features to help debug them.
## Run the GStreamer unit tests
Running GStreamer unit tests inside `gst-build` is as simple as doing:
```
gst-validate-launcher check.gst*
```
If you only want to run GStreamer core tests:
```
gst-validate-launcher check.gstreamer*
```
Or to run unit tests from gst-plugins-base
```
gst-validate-launcher check.gst-plugins-base
```
You can also run them inside valgrind with the `-vg` option or inside gdb with
`--gdb` for example.
## Run the GstValidate default testsuite
GstValidate comes with a default testsuite to be executed on a default
set of media samples. Those media samples are stored with `git-lfs` so
you will need it to be able to launch the default testsuite.
We recommend using `gst-build` to setup everything needed to run the testsuite
and you can simply do:
gst-validate-launcher validate
This will only launch the GstValidate tests and not other applications
that might be supported (currently `ges-launch` is also supported and
has its own default testsuite).
## Example of a testsuite implementation
To implement a testsuite, you will have to write some simple python code
that defines the tests to be launched by `gst-validate-launcher`.
In this example, we will assume that you want to write a whole new
testsuite based on your own media samples and [scenarios](GstValidateScenario). The
set of media files and the testsuite implementation file will be
structured as follows:
testsuite_folder/
|-> testsuite.py
|-> sample_files/
|-> file.mp4
|-> file1.mkv
|-> file2.ogv
|-> scenarios
|-> scenario.scenario
|-> scenario1.scenario
You should generate the `.media_info` files. To generate them for local
files, you can use:
gst-validate-launcher --medias-paths /path/to/sample_files/ --generate-media-info
For remote streams, you should use
`gst-validate-media-check-1.0`. For an http stream you can
for example do:
gst-validate-media-check-GST_API_VERSION http://someonlinestream.com/thestream \
--output-file /path/to/testsuite_folder/sample_files/thestream.stream_info
The `gst-validate-launcher` will use the generated `.media_info` and
`.stream_info` files to validate the tests as those contain the
necessary information.
Then you will need to write the `testsuite.py` file. You can for example
implement the following testsuite:
``` python
"""
The GstValidate custom testsuite
"""
import os
from launcher.baseclasses import MediaFormatCombination
from launcher.apps.gstvalidate import *
TEST_MANAGER = "validate"
KNOWN_ISSUES = {}
def setup_tests(test_manager, options):
print("Setting up the custom testsuite")
assets_dir = os.path.abspath(os.path.join(os.path.dirname(__file__), ".", "samples_files"))
options.add_paths(assets_dir)
# This step will register default data for the test manager:
# - scenarios such as `play_15s`, `reverse_playback` etc.
# - encoding formats such as "mp4,h264,mp3" etc.
# - blacklist such as dash.media_check.*
# - test generators:
# - GstValidatePlaybinTestsGenerator
# - GstValidateMediaCheckTestsGenerator
# - GstValidateTranscodingTestsGenerator
# This 'defaults' can be found in 'gst-devtools/validate/launcher/apps/gstvalidate.py#register_defaults'
# test_manager.register_defaults()
# Add scenarios
scenarios = []
scenarios.append("play_5s")
scenarios.append("seek_backward")
test_manager.set_scenarios(scenarios)
# Add encoding formats used by the transcoding generator
test_manager.add_encoding_formats([
MediaFormatCombination("mp4", "mp3", "h264"),])
# Add generators
# GstValidatePlaybinTestsGenerator needs at least one media file
test_manager.add_generators([GstValidateMediaCheckTestsGenerator(test_manager)])
# GstValidatePlaybinTestsGenerator needs at least one scenario
test_manager.add_generators([GstValidatePlaybinTestsGenerator(test_manager)])
# GstValidateTranscodingTestsGenerator needs at least one MediaFormatCombination
test_manager.add_generators([GstValidateTranscodingTestsGenerator(test_manager)])
# list of combo to blacklist tests. Here it blacklists all tests with playback.seek_backward
test_manager.set_default_blacklist([
("custom_testsuite.file.playback.seek_backward.*",
"Not supported by this testsuite."),])
# you can even pass known issues to bypass an existing error in your custom testsuite
test_manager.add_expected_issues(KNOWN_ISSUES)
return True
```
Once this is done, you've got a testsuite that will:
- Run playbin pipelines on `file.mp4`, `file1.mkv` and `file2.ogv`,
  executing the `play_5s` and `seek_backward` scenarios
- Transcode `file.mp4,` `file1.mkv` and `file2.ogv` to h264 and
mp3 in a MP4 container
The only thing to do to run the testsuite is:
gst-validate-launcher --testsuites-dir=/path/to/testsuite_folder/ testsuite
# Invocation
You can find detailed information about the launcher by launching it:
gst-validate-launcher --help
You can list all the tests with:
gst-validate-launcher --testsuites-dir=/path/to/testsuite_folder/ testsuite -L

View file

@ -0,0 +1,32 @@
---
short-description: Tool to test GStreamer media types discovery
...
# gst-validate-media-check
`gst-validate-media-check` is a command line tool for checking that media
file discovery works properly with `gst-discoverer` over multiple
runs. It needs a reference text file containing valid information about
a media file (which can be generated with the same tool); it can then
check that the reference matches what `gst-discoverer` reports
in subsequent runs.
For example, given that we have a valid `reference.media_info` file, we
can run:
gst-validate-media-check-GST_API_VERSION file:///./file.ogv --expected-results reference.media_info
It will then output any error encountered and return an exit code
different from 0 if any error is found.
# Invocation
`gst-validate-media-check` takes a URI to analyze and some extra
options to control the output (a combined example follows the options list).
## Options
* `-o`, `--output-file`: The output file to store the results.
* `-f`, `--full`: Fully analyze the file frame by frame.
* `-e`, `--expected-results`: Path to file containing the expected results (or the last results
found) for comparison with new results.
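For instance, generating a reference file for a local media file and then checking against it in later runs might look like this (the paths are illustrative):
```
gst-validate-media-check-GST_API_VERSION file:///path/to/file.ogv --output-file /path/to/file.media_info
gst-validate-media-check-GST_API_VERSION file:///path/to/file.ogv --expected-results /path/to/file.media_info
```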

View file

@ -0,0 +1,110 @@
---
title: Scenarios
short-description: The GstValidate Scenario format
...
# GstValidate Scenario File Format
To be able to define a list of actions to execute on a [`GstPipeline`],
a dedicated file format is used. The name of the scenario is the name of
the file without its `.scenario` extension. The scenario file format is
based on the [`GstStructure`] serialized format which is a basic, type
aware, key value format. It takes the type of the action in the first
comma separated field, and then some key value pairs in the form
`parameter=value` separated by commas. The value types will be guessed
unless explicitly cast as in `parameter=(string)value`. You can force the type
guessing system to know what type you want by giving it the
right hints. For example, to make sure the value is a double you should
add a decimal (i.e. `1` will be considered an `int`, but `1.0` will be
considered a `double` and `"1.0"` will be considered a `string`).
For example to represent a seek action, you should add the following
line in the `.scenario` file.
seek, playback-time=10.0, start=0.0, flags=accurate+flush
The files to be used as scenario should have a `.scenario` extension and
should be placed either in
`$USER_DATA_DIR/gstreamer-1.0/validate/scenarios` ,
`$GST_DATADIR/gstreamer-1.0/validate/scenarios` or in a path defined in
the \$GST\_VALIDATE\_SCENARIOS\_PATH environment variable.
Each line in the `.scenario` file represents an action (you can also use
`\ ` at the end of a line to write a single action over multiple lines).
Usually you should start your scenario with a `meta` structure
in order for the user to have more information about the
scenario. It can contain a `summary` field which is a string explaining
what the scenario does and then several info fields about the scenario.
You can find more info about it running:
gst-validate-1.0 --inspect-action-type action_type_name
So a basic scenario file that will seek three times and stop would look
like:
```
meta, summary="Seeks at 1.0 to 2.0 then at \
3.0 to 0.0 and then seeks at \
1.0 to 2.0 for 1.0 second (between 2.0 and 3.0).", \
seek=true, duration=5.0, min-media-duration=4.0
seek, playback-time=1.0, rate=1.0, start=2.0, flags=accurate+flush
seek, playback-time=3.0, rate=1.0, start=0.0, flags=accurate+flush
seek, playback-time=1.0, rate=1.0, start=2.0, stop=3.0, flags=accurate+flush
```
Many action types have been implemented to help users define their own
scenarios. For example there are:
- `seek`: Seeks into the stream.
- `play`: Set the pipeline state to `GST_STATE_PLAYING`.
- `pause`: Set the pipeline state to `GST_STATE_PAUSED`.
- `stop`: Stop the execution of the pipeline.
> **NOTE**: This action actually posts a [`GST_MESSAGE_REQUEST_STATE`]
> message requesting [`GST_STATE_NULL`] on the bus and the application
> should quit.
To get all the details about the registered action types, you can list
them all with:
```
gst-validate-1.0 --inspect-action-type
```
and to include transcoding specific action types:
```
gst-validate-transcoding-1.0 --inspect-action-type
```
Many scenarios are distributed with `gst-validate`, you can list them
all using:
```
gst-validate-1.0 --list-scenarios
```
You can find more information about the scenario implementation and
action types in the [`GstValidateScenario` section].
[`GstPipeline`]: GstPipeline
[`GstStructure`]: GstStructure
[`GST_MESSAGE_REQUEST_STATE`]: GST_MESSAGE_REQUEST_STATE
[`GST_STATE_NULL`]: GST_STATE_NULL
[`GstValidateScenario` section]: GstValidateScenario
## Default variables
Any action can use the default variables:
- `$(position)`: The current position in the pipeline as reported by
[gst_element_query_position()](gst_element_query_position)
- `$(duration)`: The current duration of the pipeline as reported by
[gst_element_query_duration()](gst_element_query_duration)
- `$(TMPDIR)`: The default temporary directory as returned by `g_get_tmp_dir`.
- `$(SCENARIO_PATH)`: The path of the running scenario.
- `$(SCENARIO_DIR)`: The directory the running scenario is in.
- `$(SCENARIO_NAME)`: The name of the running scenario.
It is also possible to set variables in a scenario with the `set-vars` action.
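As a sketch of how these variables can be referenced directly in an action (the action and its values are purely illustrative):
```
seek, playback-time=2.0, rate=1.0, start=0.0, stop=$(duration), flags=accurate+flush
```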

View file

@ -0,0 +1,118 @@
---
title: Test file
short-description: GstValidate test file
...
# GstValidate Test file
A `.validatetest` file describes a fully contained validate test case. It
includes the arguments of the tool to be used to run the test, as well
as possibly a [configuration](gst-validate-config.md) and a set of actions
describing the validate [scenario](gst-validate-scenarios.md).
# The file format
A validate test file requires a `meta` structure which contains the same
information as the [scenario](gst-validate-scenarios.md) `meta` with some
additional fields described below. The `meta` structure should be either the
first structure or the one following the `set-globals` structure. The `set-globals`
structure allows you to set global variables for the rest of the
`.validatetest` file and is a free-form variable setter. For example you can
do:
``` yaml
set-globals, media_dir=$(test_dir)/../../media
```
## Tool arguments
In the case of [`gst-validate`](gst-validate.md) it **has to** contain an
`args` field with `gst-validate` argv arguments like:
``` yaml
# This is the default tool so it is not mandatory for the `gst-validate` tool
tool = "gst-validate-$(gst_api_version)",
args = {
# pipeline description
videotestsrc num-buffers=2 ! $(videosink),
# Random extra argument
--set-media-info $(test-dir)/some.media_info
}
```
## configs
The `configs` field is an array of structures containing the same content as
usual [configs](gst-validate-config.md) files.
For example:
``` yaml
configs = {
# Set videotestsrc0 pattern value to `blue`
"core, action=set-property, target-element-name=videotestsrc0, property-name=pattern, property-value=blue",
"$(validateflow), pad=sink1:sink, caps-properties={ width, height };",
}
```
Note: Since this is GstStructure syntax, we need to have the structures in the
array as strings/within quotes.
## expected-issues
The `expected-issues` field is an array of `expected-issue` structures containing
information about issues to expect (which can be known bugs or not).
Use `gst-validate-1.0 --print-issue-types` to print information about all issue types.
For example:
``` yaml
expected-issues = {
"expected-issue, issue-id=scenario::not-ended",
}
```
Note: Since this is GstStructure syntax, we need to have the structures in the
array as strings/within quotes.
### Fields:
* `issue-id`: (string): Issue ID - Mandatory if `summary` is not provided.
* `summary`: (string): Summary - Mandatory if `issue-id` is not provided.
* `details`: Regex string to match the issue details.
* `detected-on`: (string): The name of the element the issue happened on.
* `level`: (string): Issue level.
* `sometimes`: (boolean): Default: `false` - Whether the issue happens only
  sometimes; if `false` and the issue doesn't happen, an error will
  be issued.
* `issue-url`: (string): The url of the issue in the bug tracker if the issue is
  a bug.
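Building on the earlier example, an entry using the `sometimes` field could look like this (whether that flag is appropriate depends on the actual test):
``` yaml
expected-issues = {
    "expected-issue, issue-id=scenario::not-ended, sometimes=true",
}
```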
### Variables
In the same way as for scenarios, validate test files define some variables to make
those files relocatable:
* `$(test_dir)`: The directory where the `.validatetest` file is in.
* `$(test_name)`: The name of the test file (without extension).
* `$(test_name_dir)`: The name of the test directory (test_name with folder
separator instead of `.`).
* `$(validateflow)`: The validateflow structure name with the default/right
values for the `expectations-dir` and `actual-results-dir`
fields. See [validateflow](gst-validate-flow.md) for more
information.
* `$(videosink)`: The GStreamer videosink to use if the test can work with
different sinks for the video. It allows the tool to use
fakesinks when the user doesn't want to have visual feedback
for example.
* `$(audiosink)`: The GStreamer audiosink to use if the test can work with
different sinks for the audio. It allows the tool to use
fakesinks when the user doesn't want to have audio feedback
for example.

View file

@ -0,0 +1,126 @@
---
short-description: Tool to test GStreamer components
...
# gst-validate-transcoding
`gst-validate-transcoding` is a tool to create media file transcoding
pipelines running inside the GstValidate monitoring infrastructure.
You can for example transcode any media file to Vorbis audio + VP8 video
in a WebM container by doing:
gst-validate-transcoding-GST_API_VERSION file:///./file.ogg file:///.../transcoded.webm -o 'video/webm:video/x-vp8:audio/x-vorbis'
`gst-validate-transcoding` will list every issue encountered during the
execution of the transcoding operation in a human readable report like
the one below:
issue : buffer is out of the segment range Detected on theoradec0.srcpad at 0:00:00.096556426
Details : buffer is out of segment and shouldn't be pushed. Timestamp: 0:00:25.000 - duration: 0:00:00.040 Range: 0:00:00.000 - 0:00:04.520
Description : buffer being pushed is out of the current segment's start-stop range. Meaning it is going to be discarded downstream without any use
The return code of the process will be 18 in case a `CRITICAL` issue has
been found.
## The encoding profile serialization format
This is the serialization format of a [GstEncodingProfile](GstEncodingProfile).
Internally the transcoding application uses [GstEncodeBin](encodebin).
`gst-validate-transcoding-GST_API_VERSION` uses its own serialization
format to describe the [`GstEncodeBin.profile`](encodebin:profile) property of the
encodebin.
The simplest serialized profile looks like:
muxer_source_caps:videoencoder_source_caps:audioencoder_source_caps
For example to encode a stream into a WebM container, with an OGG audio
stream and a VP8 video stream, the serialized [GstEncodingProfile](GstEncodingProfile)
will look like:
video/webm:video/x-vp8:audio/x-vorbis
You can also set the preset name of the encoding profile using the
caps+preset\_name syntax as in:
video/webm:video/x-vp8+youtube-preset:audio/x-vorbis
Moreover, you can set the [presence](gst_encoding_profile_set_presence) property
of an encoding profile using the `|presence` syntax as in:
video/webm:video/x-vp8|1:audio/x-vorbis
This field allows you to specify the maximum number of times a
[GstEncodingProfile](GstEncodingProfile) can be used inside an encodebin.
You can also use the `restriction_caps->encoded_format_caps` syntax to
specify the [restriction caps](GstEncodingProfile:restriction-caps)
to be set on a [GstEncodingProfile](GstEncodingProfile). It
corresponds to the restriction [GstCaps](GstCaps) to apply before the encoder
that will be used in the profile. The fields present in restriction caps
are properties of the raw stream (that is, before encoding), such as
height and width for video and depth and sampling rate for audio. This
property does not make sense for muxers.
To force a video stream to be encoded with a Full HD resolution (using
WebM as the container format, VP8 as the video codec and Vorbis as the
audio codec), you should use:
video/webm:video/x-raw,width=1920,height=1080->video/x-vp8:audio/x-vorbis
### Some serialized encoding formats examples:
MP3 audio and H264 in MP4:
<div class="informalexample">
video/quicktime,variant=iso:video/x-h264:audio/mpeg,mpegversion=1,layer=3
</div>
Vorbis and theora in OGG:
<div class="informalexample">
application/ogg:video/x-theora:audio/x-vorbis
</div>
AC3 and H264 in MPEG-TS:
<div class="informalexample">
video/mpegts:video/x-h264:audio/x-ac3
</div>
# Invocation
`gst-validate-transcoding` takes an input URI and an output URI, plus a
few options to control how transcoding should be tested.
## Options
* `--set-scenario`: Lets you set a scenario. It can be a full path to a scenario file or
the name of the scenario (name of the file without the `.scenario`
extension).
* `-l`, `--list-scenarios`: List the available scenarios that can be run.
* `--scenarios-defs-output-file`: The output file to store scenarios details. Implies
`--list-scenarios`.
* `-t`, `--inspect-action-type`: Inspect the available action types with which to write scenarios. If
no parameter is passed, it will list all available action types,
otherwise it will print the full description of the wanted types.
* `--set-configs`: Lets you set a config scenario. The scenario needs to be set as
`config`. You can specify a list of scenarios separated by `:`. It
will override the GST\_VALIDATE\_SCENARIO environment variable.
* `-e`, `--eos-on-shutdown`: If an EOS event should be sent to the pipeline if an interrupt is
received, instead of forcing the pipeline to stop. Sending an EOS
will allow the transcoding to finish the files properly before
exiting.
* `-r`, `--force-reencoding`: Whether to force re-encoding instead of simply remuxing
when possible. Defaults to `TRUE`.

58
docs/gst-validate.md Normal file
View file

@ -0,0 +1,58 @@
---
short-description: Tool to test GStreamer components
...
# gst-validate
`gst-validate` is the simplest `gst-launch`-like pipeline launcher
running inside the GstValidate monitoring infrastructure. Monitors are added
to it to identify issues in the used elements. At the end it will print
a report with some information about all the issues encountered during
its run. To view issues as they are detected, set the environment
variable `GST_DEBUG=validate:2` and they will get printed in the
GStreamer debug log. You can basically run any [GstPipeline](GstPipeline) pipeline
using this tool. If you are not familiar with `gst-launch` syntax,
please refer to `gst-launch`'s documentation.
Simple playback pipeline:
gst-validate-1.0 playbin uri=file:///path/to/some/media/file
Transcoding pipeline:
gst-validate-1.0 filesrc location=/media/file/location ! qtdemux name=d ! queue \
! x264enc ! h264parse ! mpegtsmux name=m ! progressreport \
! filesink location=/root/test.ts d. ! queue ! faac ! m.
It will list each issue that has been encountered during the execution
of the specified pipeline in a human readable report like:
issue : buffer is out of the segment range Detected on theoradec0.srcpad at 0:00:00.096556426
Details : buffer is out of segment and shouldn't be pushed. Timestamp: 0:00:25.000 - duration: 0:00:00.040 Range: 0:00:00.000 - 0:00:04.520
Description : buffer being pushed is out of the current segment's start-stop range. Meaning it is going to be discarded downstream without any use
The return code of the process will be 18 in case a `CRITICAL` issue has
been found.
# Invocation
`gst-validate` takes a mandatory description of the pipeline to launch,
similar to `gst-launch`, and some extra options.
## Options
* `--set-scenario`: Lets you set a scenario. It can be a full path to a scenario file or
the name of the scenario (name of the file without the `.scenario`
extension).
* `-l`, `--list-scenarios`: List the available scenarios that can be run.
* `--scenarios-defs-output-file`: The output file to store scenarios details. Implies
`--list-scenarios`.
* `-t`, `--inspect-action-type`: Inspect the available action types with which to write scenarios. If
no parameter is passed, it will list all available action types,
otherwise it will print the full description of the wanted types.
* `--set-media-info`: Set a media\_info XML file descriptor to share information about the
media file that will be played back.
* `--set-configs`: Lets you set a config scenario. The scenario needs to be set as
`config`. You can specify a list of scenarios separated by "`:`". It
will override the GST\_VALIDATE\_SCENARIO environment variable.

26
docs/index.md Normal file
View file

@ -0,0 +1,26 @@
# GStreamer Validate
GstValidate is a tool that allows GStreamer developers to check that the
GstElements they write behave the way they are supposed to. It was first
started to provide plug-in developers with a tool to check that they
use the framework the proper way.
GstValidate implements a monitoring logic that allows the system to
check that the elements of a GstPipeline respect some rules GStreamer
components have to follow to make them properly interact together. For
example, a GstValidatePadMonitor will make sure that if we receive a
GstSegment from upstream, an equivalent segment is sent downstream
before any buffer gets out.
Then GstValidate implements a reporting system that allows users to get
detailed information about what was not properly handled by the
elements. The generated reports are ordered by level of importance from
"issue" to "critical".
Some tools have been implemented to help developers validate and test
their GstElements; see [gst-validate](gst-validate.md) for example.
On top of that, the notion of a [validation scenario](gst-validate-scenarios.md)
has been implemented so that developers can easily execute a set of actions on
pipelines to test real world interactive cases and reproduce existing
issues in a convenient way.

71
docs/meson.build Normal file
View file

@ -0,0 +1,71 @@
build_hotdoc = false
if meson.is_cross_build()
if get_option('doc').enabled()
error('Documentation enabled but building the doc while cross building is not supported yet.')
endif
message('Documentation not built as building it while cross building is not supported yet.')
subdir_done()
endif
hotdoc_p = find_program('hotdoc', required: get_option('doc'))
if not hotdoc_p.found()
message('Hotdoc not found, not building the documentation')
subdir_done()
endif
required_hotdoc_extensions = ['gi-extension']
if not build_gir
if get_option('doc').enabled()
error('Documentation enabled but introspection not built.')
endif
message('Introspection not built, can\'t build the documentation')
subdir_done()
endif
hotdoc = import('hotdoc')
foreach extension: required_hotdoc_extensions
if not hotdoc.has_extensions(extension)
if get_option('doc').enabled()
error('Documentation enabled but @0@ missing'.format(extension))
endif
message('@0@ extension not found, not building documentation'.format(extension))
subdir_done()
endif
endforeach
excludes = ['gettext.h',
'gst-validate-internal.h',
'gst-validate-i18n-lib.c'
]
build_hotdoc = true
validate_excludes = []
foreach f: excludes
validate_excludes += [join_paths(meson.current_source_dir(), '..',
'validate', 'gst', 'validate', f)]
endforeach
validate_sources = []
foreach f: gstvalidate_headers + gstvalidate_sources
validate_sources += [join_paths(meson.current_source_dir(), '..',
'validate', 'gst', 'validate', f)]
endforeach
hotdoc = import('hotdoc')
plugins_doc = []
libs_doc = [hotdoc.generate_doc('gst-devtools',
project_version: apiversion,
sitemap: 'sitemap.txt',
index: 'index.md',
gi_c_sources: validate_sources,
gi_c_source_filters: validate_excludes,
gi_index: 'gi-index.md',
gi_smart_index: true,
gi_sources: [validate_gir[0].full_path()],
disable_incremental_build: true,
dependencies : [validate_dep],
)]

3
docs/plugins/index.md Normal file
View file

@ -0,0 +1,3 @@
# GstValidate plugins
GstValidate offers a plugin system to extend the checks and add new functionality.

87
docs/plugins/ssim.md Normal file
View file

@ -0,0 +1,87 @@
---
title: SSIM plugin
short_description: GstValidate plugin to detect frame corruptions
...
# SSIM plugin
GstValidate plugin to run the ssim algorithm on the buffers flowing in the
pipeline to find regressions and detect frame corruptions.
It allows you to generate image files from the buffers flowing in the pipeline
(either as raw in the many formats supported by GStreamer or as png) and then
check them against pre-generated reference images.
The ssim algorithm will set a value of 1.0 when images are perfectly identical,
and -1.0 if they have nothing in common. By default we consider images as similar
if they have an ssim value of at least 0.95, but you can override the value
below which the test is considered as failed.
Errors are reported on the GstValidate reporting system. You can also ask
the plugin to generate grey scale output images. Those will be named in a way
that should let you see precisely where and how the test failed.
# Configuration
The configuration of the plugin is done through a validate configuration file,
specified with the %GST_VALIDATE_CONFIG environment variable. Each line starting
with 'ssim,' will configure the ssim plugin. In practice each configuration statement
will lead to the creation of a #GstValidateOverride object which will then dump
image files and, if wanted, compare those with a set of reference images.
The following parameters can be passed in the configuration file:
- element-classification: The target element classification as defined in
gst_element_class_set_metadata
- output-dir: The directory in which the image files will be saved
- min-avg-priority: (default 0.95): The minimum average similarity
under which we consider the test as failing
- min-lowest-priority: (default 1): The minimum 'lowest' similarity
under which we consider the test as failing
- reference-images-dir: Define the directory in which the files to be
compared can be found
- result-output-dir: The folder in which to store resulting grey scale
images when the test failed. In that folder you will find images
with the structural difference between the expected result and the actual
result.
- output-video-format: The format in which you want the images to be saved
- reference-video-format: The format in which the reference images are stored
- check-recurrence: The recurrence in seconds (as float) at which frames should
be dumped and checked. By default it is GST_CLOCK_TIME_NONE, meaning each
and every frame is checked. Note that in any case, after a discontinuity
in the stream (after a seek or a change in the video format for example)
a check is done. And if recurrence == 0, images will be checked only after
such a discontinuity
- is-config: Property letting the plugin know that the config line is exclusively
used to configure the following configuration expressions. In practice this
means that it will change the default values for the other configuration
expressions.
- framerate: (GstFraction): The framerate to use to compute the frame number from
the timestamp, allowing frames to be compared by 'frame number' instead of trying
to match timestamps between reference images and output images.
# Example #
Let's take a configuration where we want to compare the frames that are
output by a video decoder with the ones after an agingtv element we will
call my_agingtv. We only check one frame every 5.0 seconds (with
check-recurrence=5.0) so the test is fast.
The configuration file:
``` shell
core, action=set-property, target-element-klass=Sink, property-name=sync, property-value=false
validatessim, is-config=true, output-video-format="I420", reference-video-format="I420"
validatessim, element-classification="Video/Decoder", output-dir=/tmp/test/before-agingtv/
validatessim, element-name=my_agingtv, output-dir=/tmp/test/after-agingtv/, \
reference-images-dir=/tmp/test/before-agingtv/, \
result-output-dir=/tmp/test/failures, check-recurrence=5.0
```
Save that content in a file called check_agingtv_ssim.config
## Launch the pipeline
``` shell
GST_VALIDATE_CONFIG=check_agingtv_ssim.config gst-validate-1.0-debug uridecodebin uri=file://a/file ! videoconvert ! agingtv name=my_agingtv ! videoconvert ! autovideosink
```

View file

@ -0,0 +1,3 @@
---
redirect: gst-validate-flow.md
...

16
docs/sitemap.txt Normal file
View file

@ -0,0 +1,16 @@
index.md
gst-validate.md
gst-validate-transcoding.md
gst-validate-media-check.md
gst-validate-launcher.md
gst-validate-scenarios.md
gst-validate-test-file.md
gst-validate-config.md
gst-validate-environment-variables.md
gst-validate-action-types.md
ges-validate-action-types.md
gst-validate-flow.md
gi-index
plugins/index.md
plugins/ssim.md
plugins/validateflow.md

384
gst-devtools.doap Normal file
View file

@ -0,0 +1,384 @@
<Project
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#"
xmlns="http://usefulinc.com/ns/doap#"
xmlns:foaf="http://xmlns.com/foaf/0.1/"
xmlns:admin="http://webns.net/mvcb/">
<name>GStreamer development and validation tools</name>
<shortname>gstreamer</shortname>
<homepage rdf:resource="https://gstreamer.freedesktop.org/modules/gstreamer.html" />
<created>1999-10-31</created>
<shortdesc xml:lang="en">
GStreamer development and validation tools including GstValidate, a testing
framework aiming at providing GStreamer developers tools that check that the
GstElements they write behave the way they are supposed to.
</shortdesc>
<description xml:lang="en">
GstValidate is a tool that allows GStreamer developers to check that the
GstElements they write behave the way they are supposed to. It was first
started to provide plug-ins developers with a tool to check that they use the
framework the proper way.
GstValidate implements a monitoring logic that allows the system to check that
the elements of a GstPipeline respect some rules GStreamer components have to
follow so that elements can properly interact together. For example, a
GstValidatePadMonitor will make sure that if we receive a GstSegment from
upstream, an equivalent segment is sent downstream before any buffer gets out.
Then GstValidate implements a reporting system that allows users to get
detailed information about what was not properly handled in elements. The
reports are ordered by level of importance from "issue" to "critical".
Some tools have been implemented to help developers validate and test their
GstElements; you can have a look at the command line tools section to find more
information.
On top of those tools, the notion of scenario has been implemented so that
developers can easily execute a set of actions on pipelines and thus test real
world interactive cases and reproduce existing issues in a convenient way.
</description>
<category></category>
<bug-database rdf:resource="https://gitlab.freedesktop.org/gstreamer/gst-devtools/issues/" />
<screenshots></screenshots>
<mailing-list rdf:resource="https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel" />
<programming-language>C</programming-language>
<license rdf:resource="http://usefulinc.com/doap/licenses/lgpl" />
<download-page rdf:resource="https://gstreamer.freedesktop.org/download/" />
<repository>
<GitRepository>
<location rdf:resource="https://gitlab.freedesktop.org/gstreamer/gst-devtools"/>
<browse rdf:resource="https://gitlab.freedesktop.org/gstreamer/gst-devtools/"/>
</GitRepository>
</repository>
<release>
<Version>
<revision>1.19.2</revision>
<branch>master</branch>
<name></name>
<created>2021-09-23</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-devtools/gst-devtools-1.19.2.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.19.1</revision>
<branch>master</branch>
<name></name>
<created>2021-06-01</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-devtools/gst-devtools-1.19.1.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.18.0</revision>
<branch>master</branch>
<name></name>
<created>2020-09-08</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-devtools/gst-devtools-1.18.0.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.17.90</revision>
<branch>master</branch>
<name></name>
<created>2020-08-20</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-devtools/gst-devtools-1.17.90.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.17.2</revision>
<branch>master</branch>
<name></name>
<created>2020-07-03</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-devtools/gst-devtools-1.17.2.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.17.1</revision>
<branch>master</branch>
<name></name>
<created>2020-06-19</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-devtools/gst-devtools-1.17.1.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.16.0</revision>
<branch>master</branch>
<name></name>
<created>2019-04-19</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.16.0.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.15.90</revision>
<branch>master</branch>
<name></name>
<created>2019-04-11</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.15.90.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.15.2</revision>
<branch>master</branch>
<name></name>
<created>2019-02-26</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.15.2.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.15.1</revision>
<branch>master</branch>
<name></name>
<created>2019-01-17</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.15.1.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.14.0</revision>
<branch>master</branch>
<name></name>
<created>2018-03-19</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.14.0.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.13.91</revision>
<branch>master</branch>
<name></name>
<created>2018-03-13</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.13.91.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.13.90</revision>
<branch>master</branch>
<name></name>
<created>2018-03-03</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.13.90.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.13.1</revision>
<branch>master</branch>
<name></name>
<created>2018-02-15</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.13.1.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.12.4</revision>
<branch>1.12</branch>
<name></name>
<created>2017-12-07</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.12.4.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.12.3</revision>
<branch>1.12</branch>
<name></name>
<created>2017-09-18</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.12.3.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.12.2</revision>
<branch>1.12</branch>
<name></name>
<created>2017-07-14</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.12.2.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.12.1</revision>
<branch>1.12</branch>
<name></name>
<created>2017-06-20</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.12.1.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.12.0</revision>
<branch>master</branch>
<name></name>
<created>2017-05-04</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.12.0.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.11.91</revision>
<branch>master</branch>
<name></name>
<created>2017-04-27</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.11.91.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.11.90</revision>
<branch>master</branch>
<name></name>
<created>2017-04-07</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.11.90.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.11.2</revision>
<branch>master</branch>
<name></name>
<created>2017-02-24</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.11.2.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.11.1</revision>
<branch>master</branch>
<name></name>
<created>2017-01-12</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.11.1.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.10.0</revision>
<branch>master</branch>
<created>2016-11-01</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.10.0.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.9.90</revision>
<branch>master</branch>
<created>2016-09-30</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.9.90.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.9.2</revision>
<branch>master</branch>
<created>2016-09-01</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.9.2.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.9.1</revision>
<branch>master</branch>
<created>2016-06-06</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.9.1.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.8.0</revision>
<branch>master</branch>
<created>2016-03-24</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.8.0.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.7.91</revision>
<branch>master</branch>
<created>2016-03-15</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.7.91.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.7.90</revision>
<branch>master</branch>
<created>2016-03-01</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.7.90.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.6.0</revision>
<branch>1.6</branch>
<created>2015-09-25</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.6.0.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.5.90</revision>
<branch>1.5</branch>
<created>2015-08-20</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.5.90.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.5.2</revision>
<branch>1.5</branch>
<name></name>
<created>2014-09-29</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.5.2.tar.xz" />
</Version>
</release>
<release>
<Version>
<revision>1.4.0</revision>
<branch>1.4</branch>
<name></name>
<created>2014-09-29</created>
<file-release rdf:resource="https://gstreamer.freedesktop.org/src/gst-validate/gst-validate-1.4.0.tar.xz" />
</Version>
</release>
<maintainer>
<foaf:Person>
<foaf:name>Thibault Saunier</foaf:name>
</foaf:Person>
</maintainer>
</Project>

43
hooks/multi-pre-commit.hook Executable file
View file

@ -0,0 +1,43 @@
#!/bin/sh
# Git pre-commit hook that runs multiple hooks specified in $HOOKS.
# Make sure this script is executable. Bypass hooks with git commit --no-verify.
# This file is inspired by a set of unofficial pre-commit hooks available
# at github.
# Link: https://github.com/githubbrowser/Pre-commit-hooks
# Contact: David Martin, david.martin.mailbox@googlemail.com
###########################################################
# SETTINGS:
# pre-commit hooks to be executed. They should be in the same .git/hooks/ folder
# as this script. Hooks should return 0 if successful and nonzero to cancel the
# commit. They are executed in the order in which they are listed.
###########################################################
HOOKS="hooks/pre-commit.hook hooks/pre-commit-python.hook"
# exit on error
set -e
echo $PWD
for hook in $HOOKS
do
echo "Running hook: $hook"
# run hook if it exists
# if it returns with nonzero exit with 1 and thus abort the commit
if [ -f "$PWD/$hook" ]; then
"$PWD/$hook"
if [ $? != 0 ]; then
exit 1
fi
else
echo "Error: file $hook not found."
echo "Aborting commit. Make sure the hook is at $PWD/$hook and executable."
echo "You can disable it by removing it from the list"
echo "You can skip all pre-commit hooks with --no-verify (not recommended)."
exit 1
fi
done

81
hooks/pre-commit-python.hook Executable file
View file

@ -0,0 +1,81 @@
#!/usr/bin/env python3
import os
import subprocess
import sys
import tempfile
NOT_PYCODESTYLE_COMPLIANT_MESSAGE_PRE = \
"Your code is not fully pycodestyle compliant and contains"\
" the following coding style issues:\n\n"
NOT_PYCODESTYLE_COMPLIANT_MESSAGE_POST = \
"Please fix these errors and commit again, you can do so "\
"from the root directory automatically like this, assuming the whole "\
"file is to be commited:"
NO_PYCODESTYLE_MESSAGE = \
"You should install the pycodestyle style checker to be able"\
" to commit in this repo.\nIt allows us to garantee that "\
"anything that is commited respects the pycodestyle coding style "\
"standard.\nYou can install it:\n"\
" * on ubuntu, debian: $sudo apt-get install pycodestyle \n"\
" * on fedora: #yum install python3-pycodestyle \n"\
" * on arch: #pacman -S python-pycodestyle \n"\
" * or `pip install --user pycodestyle`"
def system(*args, **kwargs):
kwargs.setdefault('stdout', subprocess.PIPE)
proc = subprocess.Popen(args, **kwargs)
out, err = proc.communicate()
if isinstance(out, bytes):
out = out.decode()
return out
def copy_files_to_tmp_dir(files):
tempdir = tempfile.mkdtemp()
for name in files:
filename = os.path.join(tempdir, name)
filepath = os.path.dirname(filename)
if not os.path.exists(filepath):
os.makedirs(filepath)
with open(filename, 'w') as f:
system('git', 'show', ':' + name, stdout=f)
return tempdir
def main():
modified_files = system('git', 'diff-index', '--cached',
'--name-only', 'HEAD', '--diff-filter=ACMR').split("\n")[:-1]
non_compliant_files = []
output_message = None
for modified_file in modified_files:
try:
if not modified_file.endswith(".py"):
continue
pycodestyle_errors = system('pycodestyle', '--repeat', '--ignore', 'E402,E501,E128,W605,W503', modified_file)
if pycodestyle_errors:
if output_message is None:
output_message = NOT_PYCODESTYLE_COMPLIANT_MESSAGE_PRE
output_message += pycodestyle_errors
non_compliant_files.append(modified_file)
except OSError as e:
output_message = NO_PYCODESTYLE_MESSAGE
break
if output_message:
print(output_message)
if non_compliant_files:
print(NOT_PYCODESTYLE_COMPLIANT_MESSAGE_POST)
for non_compliant_file in non_compliant_files:
print("autopep8 -i ", non_compliant_file, "; git add ",
non_compliant_file)
print("git commit")
sys.exit(1)
if __name__ == '__main__':
main()

83
hooks/pre-commit.hook Executable file
View file

@ -0,0 +1,83 @@
#!/bin/sh
#
# Check that the code follows a consistent code style
#
# Check for existence of indent, and error out if not present.
# On some *bsd systems the binary seems to be called gnuindent,
# so check for that first.
version=`gnuindent --version 2>/dev/null`
if test "x$version" = "x"; then
version=`gindent --version 2>/dev/null`
if test "x$version" = "x"; then
version=`indent --version 2>/dev/null`
if test "x$version" = "x"; then
echo "GStreamer git pre-commit hook:"
echo "Did not find GNU indent, please install it before continuing."
exit 1
else
INDENT=indent
fi
else
INDENT=gindent
fi
else
INDENT=gnuindent
fi
case `$INDENT --version` in
GNU*)
;;
*)
echo "GStreamer git pre-commit hook:"
echo "Did not find GNU indent, please install it before continuing."
echo "(Found $INDENT, but it doesn't seem to be GNU indent)"
exit 1
;;
esac
INDENT_PARAMETERS="--braces-on-if-line \
--case-brace-indentation0 \
--case-indentation2 \
--braces-after-struct-decl-line \
--line-length80 \
--no-tabs \
--cuddle-else \
--dont-line-up-parentheses \
--continuation-indentation4 \
--honour-newlines \
--tab-size8 \
--indent-level2 \
--leave-preprocessor-space"
echo "--Checking style--"
for file in `git diff-index --cached --name-only HEAD --diff-filter=ACMR| grep "\.c$"` ; do
# nf is the temporary checkout. This makes sure we check against the
# revision in the index (and not the checked out version).
nf=`git checkout-index --temp ${file} | cut -f 1`
newfile=`mktemp /tmp/${nf}.XXXXXX` || exit 1
$INDENT ${INDENT_PARAMETERS} \
$nf -o $newfile 2>> /dev/null
# FIXME: Call indent twice as it tends to do line-breaks
# different for every second call.
$INDENT ${INDENT_PARAMETERS} \
$newfile 2>> /dev/null
diff -u -p "${nf}" "${newfile}"
r=$?
rm "${newfile}"
rm "${nf}"
if [ $r != 0 ] ; then
echo "================================================================================================="
echo " Code style error in: $file "
echo " "
echo " Please fix before committing. Don't forget to run git add before trying to commit again. "
echo " If the whole file is to be committed, this should work (run from the top-level directory): "
echo " "
echo " gst-indent $file; git add $file; git commit"
echo " "
echo "================================================================================================="
exit 1
fi
done
echo "--Checking style pass--"

167
meson.build Normal file
View file

@ -0,0 +1,167 @@
project('gst-devtools', 'c',
version : '1.19.2',
meson_version : '>= 0.54',
default_options : [ 'warning_level=1',
'c_std=gnu99',
'buildtype=debugoptimized' ])
gst_version = meson.project_version()
version_arr = gst_version.split('.')
gst_version_major = version_arr[0].to_int()
gst_version_minor = version_arr[1].to_int()
gst_version_micro = version_arr[2].to_int()
if gst_version_minor.is_even()
TESTSUITE_VERSION = '@0@.@1@'.format(gst_version_major, gst_version_minor)
else
TESTSUITE_VERSION = 'master'
endif
apiversion = '1.0'
soversion = 0
# maintaining compatibility with the previous libtool versioning
# current = minor * 100 + micro
curversion = gst_version_minor * 100 + gst_version_micro
libversion = '@0@.@1@.0'.format(soversion, curversion)
osxversion = curversion + 1
prefix = get_option('prefix')
glib_req = '>= 2.56.0'
gst_req = '>= @0@.@1@.0'.format(gst_version_major, gst_version_minor)
cc = meson.get_compiler('c')
if cc.get_id() == 'msvc'
msvc_args = [
# Ignore several spurious warnings for things gstreamer does very commonly
# If a warning is completely useless and spammy, use '/wdXXXX' to suppress it
# If a warning is harmless but hard to fix, use '/woXXXX' so it's shown once
# NOTE: Only add warnings here if you are sure they're spurious
'/wd4018', # implicit signed/unsigned conversion
'/wd4146', # unary minus on unsigned (beware INT_MIN)
'/wd4244', # lossy type conversion (e.g. double -> int)
'/wd4305', # truncating type conversion (e.g. double -> float)
cc.get_supported_arguments(['/utf-8']), # set the input encoding to utf-8
# Enable some warnings on MSVC to match GCC/Clang behaviour
'/w14062', # enumerator 'identifier' in switch of enum 'enumeration' is not handled
'/w14101', # 'identifier' : unreferenced local variable
'/w14189', # 'identifier' : local variable is initialized but not referenced
]
add_project_arguments(msvc_args, language: 'c')
# Disable SAFESEH with MSVC for plugins and libs that use external deps that
# are built with MinGW
noseh_link_args = ['/SAFESEH:NO']
else
noseh_link_args = []
endif
# Symbol visibility
if cc.has_argument('-fvisibility=hidden')
add_project_arguments('-fvisibility=hidden', language: 'c')
endif
# Disable strict aliasing
if cc.has_argument('-fno-strict-aliasing')
add_project_arguments('-fno-strict-aliasing', language: 'c')
endif
gst_dep = dependency('gstreamer-' + apiversion, version : gst_req,
fallback : ['gstreamer', 'gst_dep'])
gstbase_dep = dependency('gstreamer-base-' + apiversion, version : gst_req,
fallback : ['gstreamer', 'gst_base_dep'])
gst_pbutils_dep = dependency('gstreamer-pbutils-' + apiversion, version : gst_req,
fallback : ['gst-plugins-base', 'pbutils_dep'])
gst_video_dep = dependency('gstreamer-video-' + apiversion, version : gst_req,
fallback : ['gst-plugins-base', 'video_dep'])
gst_controller_dep = dependency('gstreamer-controller-' + apiversion, version : gst_req,
fallback : ['gstreamer', 'gst_controller_dep'])
gst_check_dep = dependency('gstreamer-check-1.0', version : gst_req,
required : get_option('validate'),
fallback : ['gstreamer', 'gst_check_dep'])
glib_dep = dependency('glib-2.0', version : '>=2.32.0',
fallback: ['glib', 'libglib_dep'])
gmodule_dep = dependency('gmodule-2.0',
fallback: ['glib', 'libgmodule_dep'])
gio_dep = dependency('gio-2.0',
fallback: ['glib', 'libgio_dep'])
gtk_dep = dependency('gtk+-3.0', required: false)
mathlib = cc.find_library('m', required : false)
dl = cc.find_library('dl', required : false)
json_dep = dependency('json-glib-1.0',
fallback : ['json-glib', 'json_glib_dep'])
gst_c_args = ['-DHAVE_CONFIG_H', '-DGST_USE_UNSTABLE_API']
gir_init_section = [ '--add-init-section=extern void gst_init(gint*,gchar**);' + \
'g_setenv("GST_REGISTRY_1.0", "/no/way/this/exists.reg", TRUE);' + \
'g_setenv("GST_PLUGIN_PATH_1_0", "", TRUE);' + \
'g_setenv("GST_PLUGIN_SYSTEM_PATH_1_0", "", TRUE);' + \
'gst_init(NULL,NULL);', '--quiet']
gir = find_program('g-ir-scanner', required : get_option('introspection'))
build_gir = gir.found() and (not meson.is_cross_build() or get_option('introspection').enabled())
gnome = import('gnome')
if gst_dep.type_name() == 'internal'
gst_debug_disabled = not subproject('gstreamer').get_variable('gst_debug')
else
# We can't check that in the case of subprojects as we won't
# be able to build against an internal dependency (which is not built yet)
gst_debug_disabled = cc.has_header_symbol('gst/gstconfig.h', 'GST_DISABLE_GST_DEBUG', dependencies: gst_dep)
endif
if gst_debug_disabled and cc.has_argument('-Wno-unused')
add_project_arguments('-Wno-unused', language: 'c')
endif
warning_flags = [
'-Wmissing-declarations',
'-Wmissing-prototypes',
'-Wredundant-decls',
'-Wundef',
'-Wwrite-strings',
'-Wformat',
'-Wformat-security',
'-Winit-self',
'-Wmissing-include-dirs',
'-Waddress',
'-Wno-multichar',
'-Wdeclaration-after-statement',
'-Wvla',
'-Wpointer-arith',
]
foreach extra_arg : warning_flags
if cc.has_argument (extra_arg)
add_project_arguments([extra_arg], language: 'c')
endif
endforeach
pkgconfig = import('pkgconfig')
plugins_install_dir = join_paths(get_option('libdir'), 'gstreamer-1.0')
plugins_pkgconfig_install_dir = join_paths(plugins_install_dir, 'pkgconfig')
if get_option('default_library') == 'shared'
# If we don't build static plugins there is no need to generate pc files
plugins_pkgconfig_install_dir = disabler()
endif
pkgconfig_subdirs = ['gstreamer-1.0']
plugins_doc_dep = []
plugins = []
i18n = import('i18n')
python_mod = import('python')
python3 = python_mod.find_installation()
if not get_option('validate').disabled()
subdir('validate')
endif
if not get_option('debug_viewer').disabled()
subdir('debug-viewer')
endif
subdir('docs')
run_command(python3, '-c', 'import shutil; shutil.copy("hooks/multi-pre-commit.hook", ".git/hooks/pre-commit")')

12
meson_options.txt Normal file
View file

@ -0,0 +1,12 @@
option('validate', type : 'feature', value : 'auto',
description : 'Build GstValidate')
option('debug_viewer', type : 'feature', value : 'disabled',
description : 'Build GstDebugViewer (GPLv3+)')
option('introspection', type : 'feature', value : 'auto', yield : true,
description : 'Generate gobject-introspection bindings')
option('tests', type : 'feature', value : 'auto', yield : true,
description : 'Build and enable unit tests')
option('nls', type : 'feature', value : 'auto', yield: true,
description : 'Enable native language support (translations)')
option('doc', type : 'feature', value : 'auto', yield: true,
description: 'Enable documentation.')

12
tracer/Makefile Normal file
View file

@ -0,0 +1,12 @@
TEST_DATA = \
logs/trace.latency.log
all:
logs/trace.latency.log:
mkdir -p logs; \
GST_DEBUG="GST_TRACER:7" GST_TRACERS=latency GST_DEBUG_FILE=$@ \
gst-launch-1.0 -q audiotestsrc num-buffers=10 wave=silence ! audioconvert ! autoaudiosink
check: $(TEST_DATA)
python3 -m unittest discover tracer "*_test.py"

117
tracer/README Normal file
View file

@ -0,0 +1,117 @@
# Add a python api for tracer analyzers
The python framework will parse the tracer log and aggregate information.
The tool writer will subclass from the Analyzer class and override methods:
'handle_tracer_class(self, entry)'
'handle_tracer_entry(self, entry)'
Each of those is optional. The entry field is the parsed log line. In most cases
the tools will parse the structure contained in event[Parser.F_MESSAGE].
TODO: maybe do apply_tracer_entry() and revert_tracer_entry() - 'apply' will
patch the shared state forward and 'revert' will 'apply' the inverse. This would
let us go back from a state. An application should still take snapshots to allow
for efficient jumping around. If that is the case we could also always go forward
from a snapshot.
A tool will use an AnalysisRunner to chain one or more analyzers and iterate the
log. A tool can also replay the log multiple times. If it does, it won't work in
'streaming' mode though (streaming mode can offer live stats).
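For illustration only, a minimal analyzer could look like the sketch below; the
class name, the focus on `latency` records and the `trace.log` file name are
made-up examples, not part of the framework:

``` python
from tracer.analysis_runner import AnalysisRunner
from tracer.analyzer import Analyzer
from tracer.parser import Parser
from tracer.structure import Structure


class LatencyCounter(Analyzer):
    """Example analyzer: count latency records and track the largest value."""

    def __init__(self):
        super(LatencyCounter, self).__init__()
        self.count = 0
        self.max_time = 0

    def handle_tracer_entry(self, event):
        msg = event[Parser.F_MESSAGE]
        # only look at 'latency' records, e.g.
        # latency, src=(string)source_src, sink=(string)pulsesink0_sink, time=(guint64)47091349;
        if not msg.startswith('latency,'):
            return
        s = Structure(msg)
        self.count += 1
        self.max_time = max(self.max_time, int(s.values['time']))


if __name__ == '__main__':
    with Parser('trace.log') as log:  # log generated with GST_TRACERS=latency
        analyzer = LatencyCounter()
        runner = AnalysisRunner(log)
        runner.add_analyzer(analyzer)
        runner.run()
    print('latency records: %d, max: %d ns' % (analyzer.count, analyzer.max_time))
```

Running it over a log produced with `GST_TRACERS=latency` then prints the aggregate.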
## TODO
### gst shadow types
Do we want to provide classes like GstBin, GstElement, GstPad, ... to aggregate
info. One way to get them would be to have a GstLogAnalyzer that knows
about data from the log tracer and populates the classes. Tools then can
do e.g.
pad.name() # pad name
pad.parent().name() # element name
pad.peer().parent() # peer element
pad.parent().state() # element state
This would allow us to e.g. get a pipeline graph at any point in the log.
### improve class handling
We already parse the tracer classes. Add helpers for numeric values that
extract them and aggregate min/max/avg. Consider other statistical information
(std. deviation) and provide a rolling average for live view. See the sketch below.
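One possible shape for such a helper, purely as a sketch (the class name, the
`alpha` parameter and the exponentially weighted moving average are assumptions,
nothing like this exists in the code yet):

``` python
class NumericValueAggregate(object):
    """Sketch: aggregate min/max/avg for a numeric tracer value and keep an
    exponentially weighted moving average usable for a live view."""

    def __init__(self, alpha=0.1):
        self.num = 0
        self.sum = 0
        self.min = None
        self.max = None
        self.rolling = None
        self.alpha = alpha  # weight of the newest sample in the rolling average

    def add(self, value):
        self.num += 1
        self.sum += value
        self.min = value if self.min is None else min(self.min, value)
        self.max = value if self.max is None else max(self.max, value)
        self.rolling = value if self.rolling is None else (
            self.alpha * value + (1 - self.alpha) * self.rolling)

    @property
    def avg(self):
        return self.sum / self.num if self.num else None
```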
## Examples
### Sequence chart generator (mscgen)
1.) Write file header
2.) collect element order
Replay the log and use pad_link_pre to collect pad->peer_pad relationship.
Build a sequence of element names and write to msc file.
3.) collect event processing
Replay the log and use pad_push_event_pre to output message lines to mscfile.
4.) write footer and run the tool.
## Latency stats
1.) collect per sink-latencies and for each sink per source latencies
Calculate min, max, avg. Consider a streaming interface, where we update the stats
e.g. once a second.
2.) in non-streaming mode write final statistic
## cpu load stats
Like latency stats, for cpu load. Process cpu load + per thread cpu load.
## top
Combine various stats tools into one.
# todo
## all tools
* need some (optional) progress reporting
## structure parser
* add an optional compiled regexp matcher as a constructor param
* then we'll parse the whole structure with a single regexp
* this will only parse the top-level structure, we'd then check if there are
nested substructures and handle them
# Improve tracers
## log
* the log tracer logs args and results into misc categories
* issues
* not easy/reliable to detect its output among other trace output
* not easy to match pre/post lines
* uses own do_log method, instead of gst_tracer_record_log
* if we also log structures, we need to log the 'function' as the
structure-name, also fields would be key=(type)val, instead of key=value
* if we switch to gst_tracer_record_log, we'd need to register 27 formats :/
## object ids
When logging GstObjects in PTR_FORMAT, we log the name. Unfortunately the name
is not necessarily unique over time. Same goes for the object address.
When logging a tracer record we need a way for the scope fields to uniquely
relate to objects.
a) parse object creation and destruction and build <name:id>-maps in the tracer
tools:
new-element message: gst_util_seqnum_next() and assoc with name
<new stats>: get id by name and get data record via id
if we go this way, the stats tracer would log the name in regular records (which
makes them more readable). A sketch of such a map is shown below.
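A sketch of what the tool-side part of approach a) could look like (names and
structure are made up for illustration, this is not existing code):

``` python
import itertools


class ObjectIdMap(object):
    """Sketch for approach a): give every (re)created object a unique id so
    later records can be related to the right object even if a name or an
    address is reused over time."""

    def __init__(self):
        self._seq = itertools.count()
        self._id_by_name = {}

    def on_new_object(self, name):
        # call this when a 'new-element' / 'new-pad' record is seen
        obj_id = next(self._seq)
        self._id_by_name[name] = obj_id
        return obj_id

    def lookup(self, name):
        # latest id associated with this name
        return self._id_by_name.get(name)
```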
FIXME:
- if we use stats or log and latency, do we log latency messages twice?
grep -c ":: latency, " logs/trace.all.log
8365
grep ":: event, " logs/trace.all.log | grep -c "name=(string)latency"
63
seems to not happen, regardless of order in GST_TRACERS="latency;stats"
- why do we log element-ix for buffer, event, ... log-entries in the stats
tracer? We log new-pad when the pad gets added to a parent, so we should know
the element already

217
tracer/gsttr-stats.py Normal file
View file

@ -0,0 +1,217 @@
#!/usr/bin/env python3
'''
Aggregate values for each tracer event type and print them with some statistics.
How to run:
1) generate some log
GST_DEBUG="GST_TRACER:7" GST_TRACERS="stats;rusage;latency" GST_DEBUG_FILE=trace.log <application>
2) print everything
python3 gsttr-stats.py trace.log
3) print selected entries only
python3 gsttr-stats.py -c latency trace.log
'''
# TODO:
# more options
# - live-update interval (for file=='-')
#
# - for values like timestamps, we only want min/max but no average
import logging
from fnmatch import fnmatch
from tracer.analysis_runner import AnalysisRunner
from tracer.analyzer import Analyzer
from tracer.parser import Parser
from tracer.structure import Structure
logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger('gsttr-stats')
_SCOPE_RELATED_TO = {
'GST_TRACER_VALUE_SCOPE_PAD': 'Pad',
'GST_TRACER_VALUE_SCOPE_ELEMENT': 'Element',
'GST_TRACER_VALUE_SCOPE_THREAD': 'Thread',
'GST_TRACER_VALUE_SCOPE_PROCESS': 'Process',
}
_NUMERIC_TYPES = ('int', 'uint', 'gint', 'guint', 'gint64', 'guint64')
class Stats(Analyzer):
def __init__(self, classes):
super(Stats, self).__init__()
self.classes = classes
self.records = {}
self.data = {}
def handle_tracer_class(self, event):
s = Structure(event[Parser.F_MESSAGE])
# TODO only for debugging
# print("tracer class:", repr(s))
name = s.name[:-len('.class')]
record = {
'class': s,
'scope': {},
'value': {},
}
self.records[name] = record
for k, v in s.values.items():
if v.name == 'scope':
# TODO only for debugging
# print("scope: [%s]=%s" % (k, v))
record['scope'][k] = v
elif v.name == 'value':
# skip non numeric and those without min/max
if v.values['type'] in _NUMERIC_TYPES and 'min' in v.values and 'max' in v.values:
# TODO only for debugging
# print("value: [%s]=%s" % (k, v))
record['value'][k] = v
# else:
# TODO only for debugging
# print("skipping value: [%s]=%s" % (k, v))
def handle_tracer_entry(self, event):
# tracer entries have no function field; the first field of the message (the structure name) identifies the record
if event[Parser.F_FUNCTION]:
return
msg = event[Parser.F_MESSAGE]
p = msg.find(',')
if p == -1:
return
entry_name = msg[:p]
if self.classes:
if not any([fnmatch(entry_name, c) for c in self.classes]):
return
record = self.records.get(entry_name)
if not record:
return
try:
s = Structure(msg)
except ValueError:
logger.warning("failed to parse: '%s'", msg)
return
# aggregate event based on class
for sk, sv in record['scope'].items():
# look up bin by scope (or create new)
key = (_SCOPE_RELATED_TO[sv.values['related-to']] + ":" + str(s.values[sk]))
scope = self.data.get(key)
if not scope:
scope = {}
self.data[key] = scope
for vk, vv in record['value'].items():
# skip optional fields
if vk not in s.values:
continue
if not s.values.get('have-' + vk, True):
continue
key = entry_name + "/" + vk
data = scope.get(key)
if not data:
data = {'num': 0}
if '_FLAGS_AGGREGATED' not in vv.values.get('flags', ''):
data['sum'] = 0
if 'max' in vv.values and 'min' in vv.values:
data['min'] = int(vv.values['max'])
data['max'] = int(vv.values['min'])
else:
# aggregated: don't average, collect first value
data['min'] = int(s.values[vk])
scope[key] = data
# update min/max/sum and count via value
dv = int(s.values[vk])
data['num'] += 1
if 'sum' in data:
data['sum'] += dv
if 'min' in data:
data['min'] = min(dv, data['min'])
if 'max' in data:
data['max'] = max(dv, data['max'])
else:
# aggregated: collect last value
data['max'] = dv
def report(self):
# headline
print("%-45s: %30s: %16s/%16s/%16s" % (
'scope', 'value', 'min', 'avg', 'max'))
# iterate scopes
for sk, sv in self.data.items():
# iterate tracers
for tk, tv in sv.items():
mi = tv.get('min', '-')
ma = tv.get('max', '-')
if 'sum' in tv:
avg = tv['sum'] / tv['num']
else:
avg = '-'
if mi == ma:
mi = ma = '-'
if is_time_field(tk):
if mi != '-':
mi = format_ts(mi)
if ma != '-':
ma = format_ts(ma)
if avg != '-':
avg = format_ts(avg)
print("%-45s: %30s: %16s/%16s/%16s" % (sk, tk, mi, avg, ma))
class ListClasses(Analyzer):
def __init__(self):
super(ListClasses, self).__init__()
def handle_tracer_class(self, event):
s = Structure(event[Parser.F_MESSAGE])
print(s.name)
def handle_tracer_entry(self, event):
raise StopIteration
def format_ts(ts):
sec = 1e9
h = int(ts // (sec * 60 * 60))
m = int((ts // (sec * 60)) % 60)
s = (ts / sec)
return '{:02d}:{:02d}:{:010.7f}'.format(h, m, s)
def is_time_field(f):
# TODO: need proper units
return (f.endswith('/time') or f.endswith('-dts') or f.endswith('-pts')
or f.endswith('-duration'))
if __name__ == '__main__':
import argparse
parser = argparse.ArgumentParser()
parser.add_argument('file', nargs='?', default='debug.log')
parser.add_argument('-c', '--class', action='append', dest='classes',
help='tracer class selector (default: all)')
parser.add_argument('-l', '--list-classes', action='store_true',
help='show tracer classes')
args = parser.parse_args()
analyzer = None
if args.list_classes:
analyzer = ListClasses()
else:
analyzer = stats = Stats(args.classes)
with Parser(args.file) as log:
runner = AnalysisRunner(log)
runner.add_analyzer(analyzer)
runner.run()
if not args.list_classes:
stats.report()

287
tracer/gsttr-tsplot.py Normal file
View file

@ -0,0 +1,287 @@
#!/usr/bin/env python3
'''
Plot buffer pts and events in relation to the wall clock. The plots can be used
to spot anomalies, such as processing gaps.
How to run:
1) generate a log
GST_DEBUG="GST_TRACER:7" GST_TRACERS=stats GST_DEBUG_FILE=trace.log <application>
2) generate the images
python3 gsttr-tsplot.py trace.log <outdir>
eog <outdir>/*.png
'''
# TODO:
# - improve event plot
# - ideally each event is a vertical line
# http://stackoverflow.com/questions/35105672/vertical-lines-from-data-in-file-in-time-series-plot-using-gnuplot
# - this won't work well if the event is e.g. 'qos'
# - we could sort them by event type and separate them by double new-lines,
# we'd then use 'index <x>' to plot them in different colors with
# - buffer-pts should be ahead of clock time of the pipeline
# - we don't have the clock ts in the log though
import logging
import os
from subprocess import Popen, PIPE, DEVNULL
from string import Template
from tracer.analysis_runner import AnalysisRunner
from tracer.analyzer import Analyzer
from tracer.parser import Parser
from tracer.structure import Structure
logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger('gsttr-tsplot')
_HANDLED_CLASSES = ('buffer', 'event', 'new-pad', 'new-element')
_GST_BUFFER_FLAG_DISCONT = (1 << 6)
_PLOT_SCRIPT_HEAD = Template(
'''
set term pngcairo truecolor size $width,$height font "Helvetica,14"
set style line 1 lc rgb '#8b1a0e' pt 1 ps 1 lt 1 lw 1 # --- red
set style line 2 lc rgb '#5e9c36' pt 6 ps 1 lt 1 lw 1 # --- green
set style line 100 lc rgb '#999999' lt 0 lw 1
set grid back ls 100
set key font ",10"
set label font ",10"
set tics font ",10"
set xlabel font ",10"
set ylabel font ",10"
''')
_PLOT_SCRIPT_BODY = Template(
'''
set output '$png_file_name'
set multiplot layout 3,1 title "$title\\n$subtitle"
set xlabel ""
set xrange [*:*] writeback
set xtics format ""
set ylabel "Buffer Time (sec.msec)" offset 1,0
set yrange [*:*]
set ytics
plot '$buf_file_name' using 1:2 with linespoints ls 1 notitle
set xrange restore
set ylabel "Duration (sec.msec)" offset 1,0
plot '$buf_file_name' using 1:3 with linespoints ls 1 title "cycle", \
'' using 1:4 with linespoints ls 2 title "duration"
set xrange restore
set xtics format "%g" scale .5 offset 0,.5
set xlabel "Clock Time (sec.msec)" offset 0,1
set ylabel "Events" offset 1,0
set yrange [$ypos_max:10]
set ytics format ""
plot '$ev_file_name' using 1:4:3:(0) with vectors heads size screen 0.008,90 ls 1 notitle, \
'' using 2:4 with points ls 1 notitle, \
'' using 2:4:5 with labels font ',7' offset char 0,-0.5 notitle
unset multiplot
''')
class TsPlot(Analyzer):
'''Generate a timestamp plots from a tracer log.
These show the buffer pts on the y-axis and the wall-clock time the buffer
was produced on the x-axis. This helps to spot timing issues, such as
stalled elements.
'''
def __init__(self, outdir, show_ghost_pads, size):
super(TsPlot, self).__init__()
self.outdir = outdir
self.show_ghost_pads = show_ghost_pads
self.params = {
'width': size[0],
'height': size[1],
}
self.buf_files = {}
self.buf_cts = {}
self.ev_files = {}
self.element_names = {}
self.element_info = {}
self.pad_names = {}
self.pad_info = {}
self.ev_labels = {}
self.ev_data = {}
self.ev_ypos = {}
def _get_data_file(self, files, key, name_template):
data_file = files.get(key)
if not data_file:
pad_name = self.pad_names.get(key)
if pad_name:
file_name = name_template % (self.outdir, key, pad_name)
data_file = open(file_name, 'w')
files[key] = data_file
return data_file
def _log_event_data(self, pad_file, ix):
data = self.ev_data.get(ix)
if not data:
return
line = self.ev_labels[ix]
ct = data['ct']
x1 = data['first-ts']
# TODO: scale 'y' according to max-y of buf or do a multiplot
y = (1 + data['ypos']) * -10
if ct == 1:
pad_file.write('%f %f %f %f "%s"\n' % (x1, x1, 0.0, y, line))
else:
x2 = data['last-ts']
xd = (x2 - x1)
xm = x1 + xd / 2
pad_file.write('%f %f %f %f "%s (%d)"\n' % (x1, xm, xd, y, line, ct))
def _log_event(self, s):
# build a [ts, event-name] data file
ix = int(s.values['pad-ix'])
pad_file = self._get_data_file(self.ev_files, ix, '%s/ev_%d_%s.dat')
if not pad_file:
return
# convert timestamps to seconds
x = int(s.values['ts']) / 1e9
# some events fire often, labeling each would be unreadable
# so we aggregate a series of events of the same type
line = s.values['name']
if line == self.ev_labels.get(ix):
# count lines and track last ts
data = self.ev_data[ix]
data['ct'] += 1
data['last-ts'] = x
else:
self._log_event_data(pad_file, ix)
# start new data, assign a -y coord by event type
if ix not in self.ev_ypos:
ypos = {}
self.ev_ypos[ix] = ypos
else:
ypos = self.ev_ypos[ix]
if line in ypos:
y = ypos[line]
else:
y = len(ypos)
ypos[line] = y
self.ev_labels[ix] = line
self.ev_data[ix] = {
'ct': 1,
'first-ts': x,
'ypos': y,
}
def _log_buffer(self, s):
if not int(s.values['have-buffer-pts']):
return
# build a [ts, buffer-pts] data file
ix = int(s.values['pad-ix'])
pad_file = self._get_data_file(self.buf_files, ix, '%s/buf_%d_%s.dat')
if not pad_file:
return
flags = int(s.values['buffer-flags'])
if flags & _GST_BUFFER_FLAG_DISCONT:
pad_file.write('\n')
# convert timestamps to e.g. seconds
cts = int(s.values['ts']) / 1e9
pts = int(s.values['buffer-pts']) / 1e9
dur = int(s.values['buffer-duration']) / 1e9
if ix not in self.buf_cts:
dcts = 0
else:
dcts = cts - self.buf_cts[ix]
self.buf_cts[ix] = cts
pad_file.write('%f %f %f %f\n' % (cts, pts, dcts, dur))
def handle_tracer_entry(self, event):
if event[Parser.F_FUNCTION]:
return
msg = event[Parser.F_MESSAGE]
p = msg.find(',')
if p == -1:
return
entry_name = msg[:p]
if entry_name not in _HANDLED_CLASSES:
return
try:
s = Structure(msg)
except ValueError:
logger.warning("failed to parse: '%s'", msg)
return
if entry_name == 'new-element':
ix = int(s.values['ix'])
self.element_names[ix] = s.values['name']
self.element_info[ix] = 'Element Type: %s' % s.values['type']
elif entry_name == 'new-pad':
pad_type = s.values['type']
if self.show_ghost_pads or pad_type not in ['GstGhostPad', 'GstProxyPad']:
parent_ix = int(s.values['parent-ix'])
parent_name = self.element_names.get(parent_ix, '')
ix = int(s.values['ix'])
self.pad_names[ix] = '%s.%s' % (parent_name, s.values['name'])
self.pad_info[ix] = '(%s, Pad Type: %s)' % (
self.element_info.get(parent_ix, ''), pad_type)
elif entry_name == 'event':
self._log_event(s)
else: # 'buffer'
self._log_buffer(s)
def report(self):
for ix, pad_file in self.ev_files.items():
self._log_event_data(pad_file, ix)
pad_file.close()
script = _PLOT_SCRIPT_HEAD.substitute(self.params)
for ix, pad_file in self.buf_files.items():
pad_file.close()
name = self.pad_names[ix]
buf_file_name = '%s/buf_%d_%s.dat' % (self.outdir, ix, name)
ev_file_name = '%s/ev_%d_%s.dat' % (self.outdir, ix, name)
png_file_name = '%s/%d_%s.png' % (self.outdir, ix, name)
sub_title = self.pad_info[ix]
ypos_max = (2 + len(self.ev_ypos[ix])) * -10
script += _PLOT_SCRIPT_BODY.substitute(self.params, title=name,
subtitle=sub_title, buf_file_name=buf_file_name,
ev_file_name=ev_file_name, png_file_name=png_file_name,
ypos_max=ypos_max)
# plot PNGs
p = Popen(['gnuplot'], stdout=DEVNULL, stdin=PIPE)
p.communicate(input=script.encode('utf-8'))
# cleanup
for ix, pad_file in self.buf_files.items():
name = self.pad_names[ix]
buf_file_name = '%s/buf_%d_%s.dat' % (self.outdir, ix, name)
os.unlink(buf_file_name)
for ix, pad_file in self.ev_files.items():
name = self.pad_names[ix]
ev_file_name = '%s/ev_%d_%s.dat' % (self.outdir, ix, name)
os.unlink(ev_file_name)
if __name__ == '__main__':
import argparse
parser = argparse.ArgumentParser()
parser.add_argument('file', nargs='?', default='debug.log')
parser.add_argument('outdir', nargs='?', default='tsplot')
parser.add_argument('-g', '--ghost-pads', action='store_true',
help='also plot data for ghost-pads')
parser.add_argument('-s', '--size', action='store', default='1600x600',
help='graph size as WxH')
args = parser.parse_args()
os.makedirs(args.outdir, exist_ok=True)
size = [int(s) for s in args.size.split('x')]
with Parser(args.file) as log:
tsplot = TsPlot(args.outdir, args.ghost_pads, size)
runner = AnalysisRunner(log)
runner.add_analyzer(tsplot)
runner.run()
tsplot.report()
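Usage note (editor's addition, not part of the merged file): the plotting script above consumes a GStreamer tracer debug log and renders one PNG per pad via gnuplot. A minimal sketch of producing such a log with the stats tracer and then plotting it, assuming gnuplot is installed and using placeholder file names, could look like this:
GST_TRACERS=stats GST_DEBUG="GST_TRACER:7" GST_DEBUG_FILE=debug.log \
    gst-launch-1.0 playbin uri=file:///path/to/media.mp4
python3 tsplot.py debug.log tsplot -s 1600x600   # 'tsplot.py' stands in for this script's file name, which is not shown in this diff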

View file

@ -0,0 +1,48 @@
try:
from tracer.parser import Parser
except ImportError:
from parser import Parser
class AnalysisRunner(object):
"""
Runs several Analyzers over a log.
Iterates the log using a Parser and dispatches events to a set of analyzers.
"""
def __init__(self, log):
self.log = log
self.analyzers = []
def add_analyzer(self, analyzer):
self.analyzers.append(analyzer)
def handle_tracer_class(self, event):
for analyzer in self.analyzers:
analyzer.handle_tracer_class(event)
def handle_tracer_entry(self, event):
for analyzer in self.analyzers:
analyzer.handle_tracer_entry(event)
def is_tracer_class(self, event):
return (event[Parser.F_FILENAME] == 'gsttracerrecord.c'
and event[Parser.F_CATEGORY] == 'GST_TRACER'
and '.class' in event[Parser.F_MESSAGE])
def is_tracer_entry(self, event):
return (not event[Parser.F_LINE] and not event[Parser.F_FILENAME])
def run(self):
try:
for event in self.log:
# check if it is a tracer.class or tracer event
if self.is_tracer_entry(event):
self.handle_tracer_entry(event)
elif self.is_tracer_class(event):
self.handle_tracer_class(event)
# else:
# print("unhandled:", repr(event))
except StopIteration:
pass

View file

@ -0,0 +1,26 @@
import unittest
from tracer.analysis_runner import AnalysisRunner
TRACER_CLASS = (
'0:00:00.036373170', 1788, '0x23bca70', 'TRACE', 'GST_TRACER',
'gsttracerrecord.c', 110, 'gst_tracer_record_build_format', None,
r'latency.class, src=(structure)"scope\\,\\ type\\=\\(type\\)gchararray\\,\\ related-to\\=\\(GstTracerValueScope\\)GST_TRACER_VALUE_SCOPE_PAD\\;", sink=(structure)"scope\\,\\ type\\=\\(type\\)gchararray\\,\\ related-to\\=\\(GstTracerValueScope\\)GST_TRACER_VALUE_SCOPE_PAD\\;", time=(structure)"value\\,\\ type\\=\\(type\\)guint64\\,\\ description\\=\\(string\\)\\"time\\\\\\ it\\\\\\ took\\\\\\ for\\\\\\ the\\\\\\ buffer\\\\\\ to\\\\\\ go\\\\\\ from\\\\\\ src\\\\\\ to\\\\\\ sink\\\\\\ ns\\"\\,\\ flags\\=\\(GstTracerValueFlags\\)GST_TRACER_VALUE_FLAGS_AGGREGATED\\,\\ min\\=\\(guint64\\)0\\,\\ max\\=\\(guint64\\)18446744073709551615\\;";'
)
TRACER_ENTRY = (
'0:00:00.142391137', 1788, '0x7f8a201056d0', 'TRACE', 'GST_TRACER',
'', 0, '', None,
r'latency, src=(string)source_src, sink=(string)pulsesink0_sink, time=(guint64)47091349;'
)
class TestAnalysisRunner(unittest.TestCase):
def test_detect_tracer_class(self):
a = AnalysisRunner(None)
self.assertTrue(a.is_tracer_class(TRACER_CLASS))
def test_detect_tracer_entry(self):
a = AnalysisRunner(None)
self.assertTrue(a.is_tracer_entry(TRACER_ENTRY))

15
tracer/tracer/analyzer.py Normal file
View file

@ -0,0 +1,15 @@
class Analyzer(object):
"""
Base class for a gst tracer analyzer.
Will be used in conjunction with an AnalysisRunner.
"""
def __init__(self):
pass
def handle_tracer_class(self, event):
pass
def handle_tracer_entry(self, event):
pass

82
tracer/tracer/parser.py Normal file
View file

@ -0,0 +1,82 @@
import os
import re
import sys
def _log_line_regex():
# "0:00:00.777913000 "
TIME = r"(\d+:\d\d:\d\d\.\d+)\s+"
# "DEBUG "
# LEVEL = "([A-Z]+)\s+"
LEVEL = "(TRACE)\s+"
# "0x8165430 "
THREAD = r"(0x[0-9a-f]+)\s+"
# "GST_REFCOUNTING ", "flacdec "
CATEGORY = "([A-Za-z0-9_-]+)\s+"
# " 3089 "
PID = r"(\d+)\s*"
FILENAME = r"([^:]*):"
LINE = r"(\d+):"
FUNCTION = r"([A-Za-z0-9_]*):"
# FIXME: When non-g(st)object stuff is logged with *_OBJECT (like
# buffers!), the address is printed *without* <> brackets!
OBJECT = "(?:<([^>]+)>)?"
MESSAGE = "(.+)"
ANSI = "(?:\x1b\\[[0-9;]*m\\s*)*\\s*"
return [TIME, ANSI, PID, ANSI, THREAD, ANSI, LEVEL, ANSI, CATEGORY,
FILENAME, LINE, FUNCTION, ANSI, OBJECT, ANSI, MESSAGE]
class Parser(object):
"""
Helper to parse a tracer log.
Implements context manager and iterator.
"""
# record fields
F_TIME = 0
F_PID = 1
F_THREAD = 2
F_LEVEL = 3
F_CATEGORY = 4
F_FILENAME = 5
F_LINE = 6
F_FUNCTION = 7
F_OBJECT = 8
F_MESSAGE = 9
def __init__(self, filename):
self.filename = filename
self.log_regex = re.compile(''.join(_log_line_regex()))
self.file = None
def __enter__(self):
if self.filename != '-':
self.file = open(self.filename, 'rt')
else:
self.file = sys.stdin
return self
def __exit__(self, *args):
if self.filename != '-':
self.file.close()
self.file = None
def __iter__(self):
return self
def __next__(self):
log_regex = self.log_regex
data = self.file
while True:
line = next(data)
match = log_regex.match(line)
if match:
g = list(match.groups())
g[Parser.F_PID] = int(g[Parser.F_PID])
g[Parser.F_LINE] = int(g[Parser.F_LINE])
return g
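Putting the Parser, AnalysisRunner and Analyzer modules above together (editor's sketch, assuming the tracer.* package layout used by the tests and a placeholder log file name), a custom analyzer plugs into the runner like this:
from tracer.parser import Parser
from tracer.analysis_runner import AnalysisRunner
from tracer.analyzer import Analyzer

class EntryCounter(Analyzer):
    """Illustrative analyzer: counts tracer entries per thread id."""
    def __init__(self):
        super().__init__()
        self.per_thread = {}

    def handle_tracer_entry(self, event):
        # 'event' is the list of fields produced by Parser.__next__()
        thread = event[Parser.F_THREAD]
        self.per_thread[thread] = self.per_thread.get(thread, 0) + 1

counter = EntryCounter()
with Parser('debug.log') as log:   # 'debug.log' is a placeholder path
    runner = AnalysisRunner(log)
    runner.add_analyzer(counter)
    runner.run()
print(counter.per_thread)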

View file

@ -0,0 +1,13 @@
from analysis_runner import AnalysisRunner
from parser import Parser
if __name__ == '__main__':
import argparse
parser = argparse.ArgumentParser()
parser.add_argument('file', nargs='?', default='debug.log')
args = parser.parse_args()
with Parser(args.file) as log:
runner = AnalysisRunner(log)
runner.run()

View file

@ -0,0 +1,47 @@
import sys
import unittest
from tracer.parser import Parser
TESTFILE = './logs/trace.latency.log'
TEXT_DATA = ['first line', 'second line']
TRACER_LOG_DATA = [
'0:00:00.079422574 7664 0x238ac70 TRACE GST_TRACER :0:: thread-rusage, thread-id=(guint64)37268592, ts=(guint64)79416000, average-cpuload=(uint)1000, current-cpuload=(uint)1000, time=(guint64)79418045;'
]
TRACER_CLASS_LOG_DATA = [
'0:00:00.041536066 1788 0x14b2150 TRACE GST_TRACER gsttracerrecord.c:110:gst_tracer_record_build_format: latency.class, src=(structure)"scope\,\ type\=\(type\)gchararray\,\ related-to\=\(GstTracerValueScope\)GST_TRACER_VALUE_SCOPE_PAD\;", sink=(structure)"scope\,\ type\=\(type\)gchararray\,\ related-to\=\(GstTracerValueScope\)GST_TRACER_VALUE_SCOPE_PAD\;", time=(structure)"value\,\ type\=\(type\)guint64\,\ description\=\(string\)\"time\\\ it\\\ took\\\ for\\\ the\\\ buffer\\\ to\\\ go\\\ from\\\ src\\\ to\\\ sink\\\ ns\"\,\ flags\=\(GstTracerValueFlags\)GST_TRACER_VALUE_FLAGS_AGGREGATED\,\ min\=\(guint64\)0\,\ max\=\(guint64\)18446744073709551615\;";'
]
class TestParser(unittest.TestCase):
def test___init__(self):
log = Parser(TESTFILE)
self.assertIsNone(log.file)
def test___enter___with_file(self):
with Parser(TESTFILE) as log:
self.assertIsNotNone(log.file)
def test___enter___with_stdin(self):
sys.stdin = iter(TEXT_DATA)
with Parser('-') as log:
self.assertIsNotNone(log.file)
def test_random_text_reports_none(self):
sys.stdin = iter(TEXT_DATA)
with Parser('-') as log:
with self.assertRaises(StopIteration):
next(log)
def test_log_file_reports_trace_log(self):
with Parser(TESTFILE) as log:
self.assertIsNotNone(next(log))
def test_trace_log_parsed(self):
sys.stdin = iter(TRACER_LOG_DATA)
with Parser('-') as log:
event = next(log)
self.assertEqual(len(event), 10)

View file

@ -0,0 +1,99 @@
import logging
import re
logger = logging.getLogger('structure')
UNESCAPE = re.compile(r'(?<!\\)\\(.)')
INT_TYPES = "".join(
("int", "uint", "int8", "uint8", "int16", "uint16", "int32", "uint32", "int64", "uint64")
)
class Structure(object):
"""
Gst Structure parser.
Has publicly accessible members representing the structure data:
name -- the structure name
types -- a dictionary keyed by the field name
values -- a dictionary keyed by the field name
"""
def __init__(self, text):
self.text = text
self.name, self.types, self.values = Structure._parse(text)
def __repr__(self):
return self.text
@staticmethod
def _find_eos(s):
# find next '"' without preceding '\'
i = 0
# logger.debug("find_eos: '%s'", s)
while True: # faster than regexp for '[^\\]\"'
p = s.index('"')
i += p + 1
if s[p - 1] != '\\':
# logger.debug("... ok : '%s'", s[p:])
return i
s = s[(p + 1):]
# logger.debug("... : '%s'", s)
return -1
@staticmethod
def _parse(s):
types = {}
values = {}
scan = True
# logger.debug("===: '%s'", s)
# parse id
p = s.find(',')
if p == -1:
p = s.index(';')
scan = False
name = s[:p]
# parse fields
while scan:
s = s[(p + 2):] # skip 'name, ' / 'value, '
# logger.debug("...: '%s'", s)
p = s.index('=')
k = s[:p]
if not s[p + 1] == '(':
raise ValueError
s = s[(p + 2):] # skip 'key=('
p = s.index(')')
t = s[:p]
s = s[(p + 1):] # skip 'type)'
if s[0] == '"':
s = s[1:] # skip '"'
p = Structure._find_eos(s)
if p == -1:
raise ValueError
v = s[:(p - 1)]
if s[p] == ';':
scan = False
# unescape \., but not \\. (using a backref)
# need a reverse for re.escape()
v = v.replace('\\\\', '\\')
v = UNESCAPE.sub(r'\1', v)
else:
p = s.find(',')
if p == -1:
p = s.index(';')
scan = False
v = s[:p]
if t == 'structure':
v = Structure(v)
elif t == 'string' and v[0] == '"':
v = v[1:-1]
elif t == 'boolean':
v = (v == '1')
elif t in INT_TYPES:
v = int(v)
types[k] = t
values[k] = v
return (name, types, values)
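A minimal usage sketch for the structure parser above (editor's addition), reusing field names from the latency entry in the tests elsewhere in this diff:
from tracer.structure import Structure

s = Structure(r'latency, src=(string)source_src, time=(guint64)47091349;')
print(s.name)            # 'latency'
print(s.types['src'])    # 'string'
print(s.values['src'])   # 'source_src'
print(s.values['time'])  # '47091349' -- left as a string, since 'guint64' is not
                         # a substring of INT_TYPES; callers int() such values themselves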

View file

@ -0,0 +1,79 @@
import timeit
from structure import Structure
from gi.repository import Gst
Gst.init(None)
PLAIN_STRUCTURE = r'thread-rusage, thread-id=(guint64)37268592, ts=(guint64)79416000, average-cpuload=(uint)1000, current-cpuload=(uint)1000, time=(guint64)79418045;'
NESTED_STRUCTURE = r'latency.class, src=(structure)"scope\,\ type\=\(type\)gchararray\,\ related-to\=\(GstTracerValueScope\)GST_TRACER_VALUE_SCOPE_PAD\;", sink=(structure)"scope\,\ type\=\(type\)gchararray\,\ related-to\=\(GstTracerValueScope\)GST_TRACER_VALUE_SCOPE_PAD\;", time=(structure)"value\,\ type\=\(type\)guint64\,\ description\=\(string\)\"time\\\ it\\\ took\\\ for\\\ the\\\ buffer\\\ to\\\ go\\\ from\\\ src\\\ to\\\ sink\\\ ns\"\,\ flags\=\(GstTracerValueFlags\)GST_TRACER_VALUE_FLAGS_AGGREGATED\,\ min\=\(guint64\)0\,\ max\=\(guint64\)18446744073709551615\;";'
NAT_STRUCTURE = Structure(PLAIN_STRUCTURE)
GI_STRUCTURE = Gst.Structure.from_string(PLAIN_STRUCTURE)[0]
# native python impl
def nat_parse_plain():
s = Structure(PLAIN_STRUCTURE)
def nat_parse_nested():
s = Structure(NESTED_STRUCTURE)
def nat_get_name():
return NAT_STRUCTURE.name
def nat_get_value():
return NAT_STRUCTURE.values['thread-id']
# gstreamer impl via gi
def gi_parse_plain():
s = Gst.Structure.from_string(PLAIN_STRUCTURE)[0]
def gi_parse_nested():
s = Gst.Structure.from_string(NESTED_STRUCTURE)[0]
def gi_get_name():
return GI_STRUCTURE.get_name()
def gi_get_value():
return GI_STRUCTURE.get_value('thread-id')
# perf test
def perf(method, n, flavor):
t = timeit.timeit(method + '()', 'from __main__ import ' + method, number=n)
print("%6s: %lf s, (%lf calls/s)" % (flavor, t, (n / t)))
if __name__ == '__main__':
import argparse
parser = argparse.ArgumentParser()
parser.add_argument('-i', '--iterations', default=10000, type=int,
help='number of iterations (default: 10000)')
args = parser.parse_args()
n = args.iterations
print("parse_plain:")
t = perf('nat_parse_plain', n, 'native')
t = perf('gi_parse_plain', n, 'gi')
print("parse_nested:")
t = perf('nat_parse_nested', n, 'native')
t = perf('gi_parse_nested', n, 'gi')
print("get_name:")
t = perf('nat_get_name', n, 'native')
t = perf('gi_get_name', n, 'gi')
print("get_value:")
t = perf('nat_get_value', n, 'native')
t = perf('gi_get_value', n, 'gi')

View file

@ -0,0 +1,96 @@
import logging
import unittest
from tracer.structure import Structure
logging.basicConfig(level=logging.INFO)
BAD_NAME = r'foo bar'
BAD_KEY = r'foo, bar'
BAD_TYPE1 = r'foo, bar=['
BAD_TYPE2 = r'foo, bar=(int'
EMPTY_STRUCTURE = r'foo;'
SINGLE_VALUE_STRUCTURE = r'foo, key=(string)"value";'
MISC_TYPES_STRUCTURE = r'foo, key1=(string)"value", key2=(int)5, key3=(boolean)1;'
NESTED_STRUCTURE = r'foo, nested=(structure)"bar\,\ key1\=\(int\)0\,\ key2\=\(int\)5\;";'
REGRESSIONS = [
r'query, thread-id=(guint64)139839438879824, ts=(guint64)220860464, pad-ix=(uint)8, element-ix=(uint)9, peer-pad-ix=(uint)9, peer-element-ix=(uint)8, name=(string)accept-caps, structure=(structure)"GstQueryAcceptCaps\,\ caps\=\(GstCaps\)\"audio/mpeg\\\,\\\ mpegversion\\\=\\\(int\\\)4\\\,\\\ framed\\\=\\\(boolean\\\)true\\\,\\\ stream-format\\\=\\\(string\\\)raw\\\,\\\ level\\\=\\\(string\\\)2\\\,\\\ base-profile\\\=\\\(string\\\)lc\\\,\\\ profile\\\=\\\(string\\\)lc\\\,\\\ codec_data\\\=\\\(buffer\\\)1210\\\,\\\ rate\\\=\\\(int\\\)44100\\\,\\\ channels\\\=\\\(int\\\)2\"\,\ result\=\(boolean\)false\;", have-res=(boolean)0, res=(boolean)0;',
r'message, thread-id=(guint64)139838900680560, ts=(guint64)1000451258, element-ix=(uint)2, name=(string)tag, structure=(structure)"GstMessageTag\,\ taglist\=\(taglist\)\"taglist\\\,\\\ datetime\\\=\\\(datetime\\\)2009-03-05T12:57:08Z\\\,\\\ private-qt-tag\\\=\\\(sample\\\)\\\{\\\ 00000019677373740000001164617461000000010000000030:None:R3N0U2VnbWVudCwgZmxhZ3M9KEdzdFNlZ21lbnRGbGFncylHU1RfU0VHTUVOVF9GTEFHX05PTkUsIHJhdGU9KGRvdWJsZSkxLCBhcHBsaWVkLXJhdGU9KGRvdWJsZSkxLCBmb3JtYXQ9KEdzdEZvcm1hdClHU1RfRk9STUFUX1RJTUUsIGJhc2U9KGd1aW50NjQpMCwgb2Zmc2V0PShndWludDY0KTAsIHN0YXJ0PShndWludDY0KTAsIHN0b3A9KGd1aW50NjQpMTg0NDY3NDQwNzM3MDk1NTE2MTUsIHRpbWU9KGd1aW50NjQpMCwgcG9zaXRpb249KGd1aW50NjQpMCwgZHVyYXRpb249KGd1aW50NjQpMTg0NDY3NDQwNzM3MDk1NTE2MTU7AA__:YXBwbGljYXRpb24veC1nc3QtcXQtZ3NzdC10YWcsIHN0eWxlPShzdHJpbmcpaXR1bmVzOwA_\\\,\\\ 0000001e6773746400000016646174610000000100000000313335353130:None:R3N0U2VnbWVudCwgZmxhZ3M9KEdzdFNlZ21lbnRGbGFncylHU1RfU0VHTUVOVF9GTEFHX05PTkUsIHJhdGU9KGRvdWJsZSkxLCBhcHBsaWVkLXJhdGU9KGRvdWJsZSkxLCBmb3JtYXQ9KEdzdEZvcm1hdClHU1RfRk9STUFUX1RJTUUsIGJhc2U9KGd1aW50NjQpMCwgb2Zmc2V0PShndWludDY0KTAsIHN0YXJ0PShndWludDY0KTAsIHN0b3A9KGd1aW50NjQpMTg0NDY3NDQwNzM3MDk1NTE2MTUsIHRpbWU9KGd1aW50NjQpMCwgcG9zaXRpb249KGd1aW50NjQpMCwgZHVyYXRpb249KGd1aW50NjQpMTg0NDY3NDQwNzM3MDk1NTE2MTU7AA__:YXBwbGljYXRpb24veC1nc3QtcXQtZ3N0ZC10YWcsIHN0eWxlPShzdHJpbmcpaXR1bmVzOwA_\\\,\\\ 0000003867737364000000306461746100000001000000004244354241453530354d4d313239353033343539373733353435370000000000:None:R3N0U2VnbWVudCwgZmxhZ3M9KEdzdFNlZ21lbnRGbGFncylHU1RfU0VHTUVOVF9GTEFHX05PTkUsIHJhdGU9KGRvdWJsZSkxLCBhcHBsaWVkLXJhdGU9KGRvdWJsZSkxLCBmb3JtYXQ9KEdzdEZvcm1hdClHU1RfRk9STUFUX1RJTUUsIGJhc2U9KGd1aW50NjQpMCwgb2Zmc2V0PShndWludDY0KTAsIHN0YXJ0PShndWludDY0KTAsIHN0b3A9KGd1aW50NjQpMTg0NDY3NDQwNzM3MDk1NTE2MTUsIHRpbWU9KGd1aW50NjQpMCwgcG9zaXRpb249KGd1aW50NjQpMCwgZHVyYXRpb249KGd1aW50NjQpMTg0NDY3NDQwNzM3MDk1NTE2MTU7AA__:YXBwbGljYXRpb24veC1nc3QtcXQtZ3NzZC10YWcsIHN0eWxlPShzdHJpbmcpaXR1bmVzOwA_\\\,\\\ 0000009867737075000000906461746100000001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000:None:R3N0U2VnbWVudCwgZmxhZ3M9KEdzdFNlZ21lbnRGbGFncylHU1RfU0VHTUVOVF9GTEFHX05PTkUsIHJhdGU9KGRvdWJsZSkxLCBhcHBsaWVkLXJhdGU9KGRvdWJsZSkxLCBmb3JtYXQ9KEdzdEZvcm1hdClHU1RfRk9STUFUX1RJTUUsIGJhc2U9KGd1aW50NjQpMCwgb2Zmc2V0PShndWludDY0KTAsIHN0YXJ0PShndWludDY0KTAsIHN0b3A9KGd1aW50NjQpMTg0NDY3NDQwNzM3MDk1NTE2MTUsIHRpbWU9KGd1aW50NjQpMCwgcG9zaXRpb249KGd1aW50NjQpMCwgZHVyYXRpb249KGd1aW50NjQpMTg0NDY3NDQwNzM3MDk1NTE2MTU7AA__:YXBwbGljYXRpb24veC1nc3QtcXQtZ3NwdS10YWcsIHN0eWxlPShzdHJpbmcpaXR1bmVzOwA_\\\,\\\ 
000000986773706d000000906461746100000001000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000:None:R3N0U2VnbWVudCwgZmxhZ3M9KEdzdFNlZ21lbnRGbGFncylHU1RfU0VHTUVOVF9GTEFHX05PTkUsIHJhdGU9KGRvdWJsZSkxLCBhcHBsaWVkLXJhdGU9KGRvdWJsZSkxLCBmb3JtYXQ9KEdzdEZvcm1hdClHU1RfRk9STUFUX1RJTUUsIGJhc2U9KGd1aW50NjQpMCwgb2Zmc2V0PShndWludDY0KTAsIHN0YXJ0PShndWludDY0KTAsIHN0b3A9KGd1aW50NjQpMTg0NDY3NDQwNzM3MDk1NTE2MTUsIHRpbWU9KGd1aW50NjQpMCwgcG9zaXRpb249KGd1aW50NjQpMCwgZHVyYXRpb249KGd1aW50NjQpMTg0NDY3NDQwNzM3MDk1NTE2MTU7AA__:YXBwbGljYXRpb24veC1nc3QtcXQtZ3NwbS10YWcsIHN0eWxlPShzdHJpbmcpaXR1bmVzOwA_\\\,\\\ 0000011867736868000001106461746100000001000000007631302e6c736361636865332e632e796f75747562652e636f6d0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000:None:R3N0U2VnbWVudCwgZmxhZ3M9KEdzdFNlZ21lbnRGbGFncylHU1RfU0VHTUVOVF9GTEFHX05PTkUsIHJhdGU9KGRvdWJsZSkxLCBhcHBsaWVkLXJhdGU9KGRvdWJsZSkxLCBmb3JtYXQ9KEdzdEZvcm1hdClHU1RfRk9STUFUX1RJTUUsIGJhc2U9KGd1aW50NjQpMCwgb2Zmc2V0PShndWludDY0KTAsIHN0YXJ0PShndWludDY0KTAsIHN0b3A9KGd1aW50NjQpMTg0NDY3NDQwNzM3MDk1NTE2MTUsIHRpbWU9KGd1aW50NjQpMCwgcG9zaXRpb249KGd1aW50NjQpMCwgZHVyYXRpb249KGd1aW50NjQpMTg0NDY3NDQwNzM3MDk1NTE2MTU7AA__:YXBwbGljYXRpb24veC1nc3QtcXQtZ3NoaC10YWcsIHN0eWxlPShzdHJpbmcpaXR1bmVzOwA_\\\ \\\}\\\,\\\ container-format\\\=\\\(string\\\)\\\"ISO\\\\\\\ MP4/M4A\\\"\\\;\"\;";',
]
class TestStructure(unittest.TestCase):
def test_handles_bad_name(self):
structure = None
with self.assertRaises(ValueError):
structure = Structure(BAD_NAME)
def test_handles_bad_key(self):
structure = None
with self.assertRaises(ValueError):
structure = Structure(BAD_KEY)
def test_handles_bad_type1(self):
structure = None
with self.assertRaises(ValueError):
structure = Structure(BAD_TYPE1)
def test_handles_bad_type2(self):
structure = None
with self.assertRaises(ValueError):
structure = Structure(BAD_TYPE2)
def test_parses_empty_structure(self):
structure = Structure(EMPTY_STRUCTURE)
self.assertEqual(structure.text, EMPTY_STRUCTURE)
def test_parses_name_in_empty_structure(self):
structure = Structure(EMPTY_STRUCTURE)
self.assertEqual(structure.name, 'foo')
def test_parses_single_value_structure(self):
structure = Structure(SINGLE_VALUE_STRUCTURE)
self.assertEqual(structure.text, SINGLE_VALUE_STRUCTURE)
def test_parses_name(self):
structure = Structure(SINGLE_VALUE_STRUCTURE)
self.assertEqual(structure.name, 'foo')
def test_parses_key(self):
structure = Structure(SINGLE_VALUE_STRUCTURE)
self.assertIn('key', structure.types)
self.assertIn('key', structure.values)
def test_parses_type(self):
structure = Structure(SINGLE_VALUE_STRUCTURE)
self.assertEqual(structure.types['key'], 'string')
def test_parses_string_value(self):
structure = Structure(MISC_TYPES_STRUCTURE)
self.assertEqual(structure.values['key1'], 'value')
def test_parses_int_value(self):
structure = Structure(MISC_TYPES_STRUCTURE)
self.assertEqual(structure.values['key2'], 5)
def test_parses_boolean_value(self):
structure = Structure(MISC_TYPES_STRUCTURE)
self.assertEqual(structure.values['key3'], True)
def test_parses_nested_structure(self):
structure = Structure(NESTED_STRUCTURE)
self.assertEqual(structure.text, NESTED_STRUCTURE)
def test_nested_structure_has_sub_structure(self):
structure = Structure(NESTED_STRUCTURE)
self.assertEqual(structure.types['nested'], 'structure')
self.assertIsInstance(structure.values['nested'], Structure)
def test_regressions(self):
for s in REGRESSIONS:
structure = Structure(s)

2
validate/.gitignore vendored Normal file
View file

@ -0,0 +1,2 @@
*~
build*/

504
validate/COPYING Normal file
View file

@ -0,0 +1,504 @@
GNU LESSER GENERAL PUBLIC LICENSE
Version 2.1, February 1999
Copyright (C) 1991, 1999 Free Software Foundation, Inc.
51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
[This is the first released version of the Lesser GPL. It also counts
as the successor of the GNU Library Public License, version 2, hence
the version number 2.1.]
Preamble
The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
Licenses are intended to guarantee your freedom to share and change
free software--to make sure the software is free for all its users.
This license, the Lesser General Public License, applies to some
specially designated software packages--typically libraries--of the
Free Software Foundation and other authors who decide to use it. You
can use it too, but we suggest you first think carefully about whether
this license or the ordinary General Public License is the better
strategy to use in any particular case, based on the explanations below.
When we speak of free software, we are referring to freedom of use,
not price. Our General Public Licenses are designed to make sure that
you have the freedom to distribute copies of free software (and charge
for this service if you wish); that you receive source code or can get
it if you want it; that you can change the software and use pieces of
it in new free programs; and that you are informed that you can do
these things.
To protect your rights, we need to make restrictions that forbid
distributors to deny you these rights or to ask you to surrender these
rights. These restrictions translate to certain responsibilities for
you if you distribute copies of the library or if you modify it.
For example, if you distribute copies of the library, whether gratis
or for a fee, you must give the recipients all the rights that we gave
you. You must make sure that they, too, receive or can get the source
code. If you link other code with the library, you must provide
complete object files to the recipients, so that they can relink them
with the library after making changes to the library and recompiling
it. And you must show them these terms so they know their rights.
We protect your rights with a two-step method: (1) we copyright the
library, and (2) we offer you this license, which gives you legal
permission to copy, distribute and/or modify the library.
To protect each distributor, we want to make it very clear that
there is no warranty for the free library. Also, if the library is
modified by someone else and passed on, the recipients should know
that what they have is not the original version, so that the original
author's reputation will not be affected by problems that might be
introduced by others.
Finally, software patents pose a constant threat to the existence of
any free program. We wish to make sure that a company cannot
effectively restrict the users of a free program by obtaining a
restrictive license from a patent holder. Therefore, we insist that
any patent license obtained for a version of the library must be
consistent with the full freedom of use specified in this license.
Most GNU software, including some libraries, is covered by the
ordinary GNU General Public License. This license, the GNU Lesser
General Public License, applies to certain designated libraries, and
is quite different from the ordinary General Public License. We use
this license for certain libraries in order to permit linking those
libraries into non-free programs.
When a program is linked with a library, whether statically or using
a shared library, the combination of the two is legally speaking a
combined work, a derivative of the original library. The ordinary
General Public License therefore permits such linking only if the
entire combination fits its criteria of freedom. The Lesser General
Public License permits more lax criteria for linking other code with
the library.
We call this license the "Lesser" General Public License because it
does Less to protect the user's freedom than the ordinary General
Public License. It also provides other free software developers Less
of an advantage over competing non-free programs. These disadvantages
are the reason we use the ordinary General Public License for many
libraries. However, the Lesser license provides advantages in certain
special circumstances.
For example, on rare occasions, there may be a special need to
encourage the widest possible use of a certain library, so that it becomes
a de-facto standard. To achieve this, non-free programs must be
allowed to use the library. A more frequent case is that a free
library does the same job as widely used non-free libraries. In this
case, there is little to gain by limiting the free library to free
software only, so we use the Lesser General Public License.
In other cases, permission to use a particular library in non-free
programs enables a greater number of people to use a large body of
free software. For example, permission to use the GNU C Library in
non-free programs enables many more people to use the whole GNU
operating system, as well as its variant, the GNU/Linux operating
system.
Although the Lesser General Public License is Less protective of the
users' freedom, it does ensure that the user of a program that is
linked with the Library has the freedom and the wherewithal to run
that program using a modified version of the Library.
The precise terms and conditions for copying, distribution and
modification follow. Pay close attention to the difference between a
"work based on the library" and a "work that uses the library". The
former contains code derived from the library, whereas the latter must
be combined with the library in order to run.
GNU LESSER GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
0. This License Agreement applies to any software library or other
program which contains a notice placed by the copyright holder or
other authorized party saying it may be distributed under the terms of
this Lesser General Public License (also called "this License").
Each licensee is addressed as "you".
A "library" means a collection of software functions and/or data
prepared so as to be conveniently linked with application programs
(which use some of those functions and data) to form executables.
The "Library", below, refers to any such software library or work
which has been distributed under these terms. A "work based on the
Library" means either the Library or any derivative work under
copyright law: that is to say, a work containing the Library or a
portion of it, either verbatim or with modifications and/or translated
straightforwardly into another language. (Hereinafter, translation is
included without limitation in the term "modification".)
"Source code" for a work means the preferred form of the work for
making modifications to it. For a library, complete source code means
all the source code for all modules it contains, plus any associated
interface definition files, plus the scripts used to control compilation
and installation of the library.
Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running a program using the Library is not restricted, and output from
such a program is covered only if its contents constitute a work based
on the Library (independent of the use of the Library in a tool for
writing it). Whether that is true depends on what the Library does
and what the program that uses the Library does.
1. You may copy and distribute verbatim copies of the Library's
complete source code as you receive it, in any medium, provided that
you conspicuously and appropriately publish on each copy an
appropriate copyright notice and disclaimer of warranty; keep intact
all the notices that refer to this License and to the absence of any
warranty; and distribute a copy of this License along with the
Library.
You may charge a fee for the physical act of transferring a copy,
and you may at your option offer warranty protection in exchange for a
fee.
2. You may modify your copy or copies of the Library or any portion
of it, thus forming a work based on the Library, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
a) The modified work must itself be a software library.
b) You must cause the files modified to carry prominent notices
stating that you changed the files and the date of any change.
c) You must cause the whole of the work to be licensed at no
charge to all third parties under the terms of this License.
d) If a facility in the modified Library refers to a function or a
table of data to be supplied by an application program that uses
the facility, other than as an argument passed when the facility
is invoked, then you must make a good faith effort to ensure that,
in the event an application does not supply such function or
table, the facility still operates, and performs whatever part of
its purpose remains meaningful.
(For example, a function in a library to compute square roots has
a purpose that is entirely well-defined independent of the
application. Therefore, Subsection 2d requires that any
application-supplied function or table used by this function must
be optional: if the application does not supply it, the square
root function must still compute square roots.)
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Library,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Library, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote
it.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Library.
In addition, mere aggregation of another work not based on the Library
with the Library (or with a work based on the Library) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
3. You may opt to apply the terms of the ordinary GNU General Public
License instead of this License to a given copy of the Library. To do
this, you must alter all the notices that refer to this License, so
that they refer to the ordinary GNU General Public License, version 2,
instead of to this License. (If a newer version than version 2 of the
ordinary GNU General Public License has appeared, then you can specify
that version instead if you wish.) Do not make any other change in
these notices.
Once this change is made in a given copy, it is irreversible for
that copy, so the ordinary GNU General Public License applies to all
subsequent copies and derivative works made from that copy.
This option is useful when you wish to copy part of the code of
the Library into a program that is not a library.
4. You may copy and distribute the Library (or a portion or
derivative of it, under Section 2) in object code or executable form
under the terms of Sections 1 and 2 above provided that you accompany
it with the complete corresponding machine-readable source code, which
must be distributed under the terms of Sections 1 and 2 above on a
medium customarily used for software interchange.
If distribution of object code is made by offering access to copy
from a designated place, then offering equivalent access to copy the
source code from the same place satisfies the requirement to
distribute the source code, even though third parties are not
compelled to copy the source along with the object code.
5. A program that contains no derivative of any portion of the
Library, but is designed to work with the Library by being compiled or
linked with it, is called a "work that uses the Library". Such a
work, in isolation, is not a derivative work of the Library, and
therefore falls outside the scope of this License.
However, linking a "work that uses the Library" with the Library
creates an executable that is a derivative of the Library (because it
contains portions of the Library), rather than a "work that uses the
library". The executable is therefore covered by this License.
Section 6 states terms for distribution of such executables.
When a "work that uses the Library" uses material from a header file
that is part of the Library, the object code for the work may be a
derivative work of the Library even though the source code is not.
Whether this is true is especially significant if the work can be
linked without the Library, or if the work is itself a library. The
threshold for this to be true is not precisely defined by law.
If such an object file uses only numerical parameters, data
structure layouts and accessors, and small macros and small inline
functions (ten lines or less in length), then the use of the object
file is unrestricted, regardless of whether it is legally a derivative
work. (Executables containing this object code plus portions of the
Library will still fall under Section 6.)
Otherwise, if the work is a derivative of the Library, you may
distribute the object code for the work under the terms of Section 6.
Any executables containing that work also fall under Section 6,
whether or not they are linked directly with the Library itself.
6. As an exception to the Sections above, you may also combine or
link a "work that uses the Library" with the Library to produce a
work containing portions of the Library, and distribute that work
under terms of your choice, provided that the terms permit
modification of the work for the customer's own use and reverse
engineering for debugging such modifications.
You must give prominent notice with each copy of the work that the
Library is used in it and that the Library and its use are covered by
this License. You must supply a copy of this License. If the work
during execution displays copyright notices, you must include the
copyright notice for the Library among them, as well as a reference
directing the user to the copy of this License. Also, you must do one
of these things:
a) Accompany the work with the complete corresponding
machine-readable source code for the Library including whatever
changes were used in the work (which must be distributed under
Sections 1 and 2 above); and, if the work is an executable linked
with the Library, with the complete machine-readable "work that
uses the Library", as object code and/or source code, so that the
user can modify the Library and then relink to produce a modified
executable containing the modified Library. (It is understood
that the user who changes the contents of definitions files in the
Library will not necessarily be able to recompile the application
to use the modified definitions.)
b) Use a suitable shared library mechanism for linking with the
Library. A suitable mechanism is one that (1) uses at run time a
copy of the library already present on the user's computer system,
rather than copying library functions into the executable, and (2)
will operate properly with a modified version of the library, if
the user installs one, as long as the modified version is
interface-compatible with the version that the work was made with.
c) Accompany the work with a written offer, valid for at
least three years, to give the same user the materials
specified in Subsection 6a, above, for a charge no more
than the cost of performing this distribution.
d) If distribution of the work is made by offering access to copy
from a designated place, offer equivalent access to copy the above
specified materials from the same place.
e) Verify that the user has already received a copy of these
materials or that you have already sent this user a copy.
For an executable, the required form of the "work that uses the
Library" must include any data and utility programs needed for
reproducing the executable from it. However, as a special exception,
the materials to be distributed need not include anything that is
normally distributed (in either source or binary form) with the major
components (compiler, kernel, and so on) of the operating system on
which the executable runs, unless that component itself accompanies
the executable.
It may happen that this requirement contradicts the license
restrictions of other proprietary libraries that do not normally
accompany the operating system. Such a contradiction means you cannot
use both them and the Library together in an executable that you
distribute.
7. You may place library facilities that are a work based on the
Library side-by-side in a single library together with other library
facilities not covered by this License, and distribute such a combined
library, provided that the separate distribution of the work based on
the Library and of the other library facilities is otherwise
permitted, and provided that you do these two things:
a) Accompany the combined library with a copy of the same work
based on the Library, uncombined with any other library
facilities. This must be distributed under the terms of the
Sections above.
b) Give prominent notice with the combined library of the fact
that part of it is a work based on the Library, and explaining
where to find the accompanying uncombined form of the same work.
8. You may not copy, modify, sublicense, link with, or distribute
the Library except as expressly provided under this License. Any
attempt otherwise to copy, modify, sublicense, link with, or
distribute the Library is void, and will automatically terminate your
rights under this License. However, parties who have received copies,
or rights, from you under this License will not have their licenses
terminated so long as such parties remain in full compliance.
9. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Library or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Library (or any work based on the
Library), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Library or works based on it.
10. Each time you redistribute the Library (or any work based on the
Library), the recipient automatically receives a license from the
original licensor to copy, distribute, link with or modify the Library
subject to these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties with
this License.
11. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Library at all. For example, if a patent
license would not permit royalty-free redistribution of the Library by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Library.
If any portion of this section is held invalid or unenforceable under any
particular circumstance, the balance of the section is intended to apply,
and the section as a whole is intended to apply in other circumstances.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
12. If the distribution and/or use of the Library is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Library under this License may add
an explicit geographical distribution limitation excluding those countries,
so that distribution is permitted only in or among countries not thus
excluded. In such case, this License incorporates the limitation as if
written in the body of this License.
13. The Free Software Foundation may publish revised and/or new
versions of the Lesser General Public License from time to time.
Such new versions will be similar in spirit to the present version,
but may differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Library
specifies a version number of this License which applies to it and
"any later version", you have the option of following the terms and
conditions either of that version or of any later version published by
the Free Software Foundation. If the Library does not specify a
license version number, you may choose any version ever published by
the Free Software Foundation.
14. If you wish to incorporate parts of the Library into other free
programs whose distribution conditions are incompatible with these,
write to the author to ask for permission. For software which is
copyrighted by the Free Software Foundation, write to the Free
Software Foundation; we sometimes make exceptions for this. Our
decision will be guided by the two goals of preserving the free status
of all derivatives of our free software and of promoting the sharing
and reuse of software generally.
NO WARRANTY
15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO
WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.
EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR
OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY
KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE
LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME
THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN
WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY
AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU
FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR
CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE
LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING
RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A
FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF
SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH
DAMAGES.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Libraries
If you develop a new library, and you want it to be of the greatest
possible use to the public, we recommend making it free software that
everyone can redistribute and change. You can do so by permitting
redistribution under these terms (or, alternatively, under the terms of the
ordinary General Public License).
To apply these terms, attach the following notices to the library. It is
safest to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least the
"copyright" line and a pointer to where the full notice is found.
<one line to give the library's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This library is free software; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation; either
version 2.1 of the License, or (at your option) any later version.
This library is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Lesser General Public License for more details.
You should have received a copy of the GNU Lesser General Public
License along with this library; if not, write to the Free Software
Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
Also add information on how to contact you by electronic and paper mail.
You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the library, if
necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the
library `Frob' (a library for tweaking knobs) written by James Random Hacker.
<signature of Ty Coon>, 1 April 1990
Ty Coon, President of Vice
That's all there is to it!

42
validate/README Normal file
View file

@ -0,0 +1,42 @@
== Gst-Validate
The goal of GstValidate is to detect when elements are not behaving as
expected and to report it to the user, so they know how things are supposed
to work inside a GstPipeline. In the end, fixing the issues found by the
tool ensures that all elements behave together in the expected way.
The easiest way of using GstValidate is to use one of its command-line
tools, located in the tools/ directory. It is also possible to monitor
GstPipelines from any application by preloading the gstvalidate library
with LD_PRELOAD. The third way of using it is to write your own application
that links against and uses libgstvalidate.
== BUILDING
Getting the code:
Releases are available at <URL>; download and extract the tarball. If you
want to use the latest git version, run:
git clone <URI>
After cloning or extracting from a tarball, enter the gst-validate directory:
cd gst-validate
Build with:
meson build --prefix=<installation-prefix>
ninja -C build
sudo ninja -C build install (only if you want to install it)
Replace <installation-prefix> with your desired installation path. You can
omit the --prefix argument if you are not going to install it or if you want
the default /usr/local. It is possible to use the gst-validate CLI tools
without installing them.
== INSTRUCTIONS
If you are looking for information on how to use gst-validate -> docs/validate-usage.txt
If you are looking for information on gst-validate design -> docs/validate-design.txt
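As an illustration (editor's addition; the media URI is a placeholder and the --set-scenario option is assumed from gst-validate's usual command-line usage), running a pipeline under validate looks roughly like:
gst-validate-1.0 playbin uri=file:///path/to/media.mp4
gst-validate-1.0 --set-scenario=<scenario-name> playbin uri=file:///path/to/media.mp4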

View file

@ -0,0 +1,142 @@
# GStreamer
# Copyright (C) 2015 Mathieu Duponchelle <mathieu.duponchelle@opencreed.com>
# Copyright (C) 2021 Stéphane Cerveau <scerveau@collabora.com>
#
# bash/zsh completion support for gst-validate
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# Library General Public License for more details.
#
# You should have received a copy of the GNU Library General Public
# License along with this library; if not, write to the
# Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
# Boston, MA 02110-1301, USA.
_GST_HELPERDIR="${BASH_SOURCE[0]%/*}/../helpers"
if [[ ! -f $_GST_HELPERDIR/gst ]]; then
_GST_HELPERDIR="$(pkg-config --variable=bashhelpersdir gstreamer-1.0)"
else
_GST_HELPERDIR=`cd "$_GST_HELPERDIR"; pwd`
fi
# Common definitions
. "$_GST_HELPERDIR"/gst
_gst_validate_all_arguments ()
{
_gst_all_arguments gst-validate-1.0
}
_gst_complete_compatible_elements ()
{
COMPREPLY=( $(compgen -W "$($_GST_HELPER --compatible-with $previous_element)" -- $cur) )
}
_gst_complete_all_elements ()
{
COMPREPLY=( $(compgen -W "$($_GST_HELPER -l)" -- $cur) )
}
_gst_complete_element_properties ()
{
COMPREPLY=( $(compgen -W "$($_GST_HELPER --element-properties $previous_element)" -- $cur) )
}
_gstvalidate___exclude_ () { _gst_mandatory_argument gst-validate-1.0; }
_gst_validate_main ()
{
local i=1 command function_exists previous_element have_previous_element=0 completion_func
while [[ $i -ne $COMP_CWORD ]];
do
local var
var="${COMP_WORDS[i]}"
if [[ "$var" == "-"* ]]
then
command="$var"
fi
i=$(($i+1))
done
i=1
while [[ $i -ne $COMP_CWORD ]];
do
local var
var="${COMP_WORDS[i]}"
if [[ "$var" == "-"* ]]
then
i=$(($i+1))
continue
fi
gst-inspect-1.0 --exists "$var"
if [ $? -eq 0 ]
then
previous_element="$var"
have_previous_element=1
fi
i=$(($i+1))
done
if [[ "$command" == "--gst"* ]]; then
completion_func="_${command//-/_}"
else
completion_func="_gstlaunch_${command//-/_}"
fi
# Seems like bash doesn't like "exclude" in function names
if [[ "$completion_func" == "_gstlaunch___exclude" ]]
then
completion_func="_gstvalidate___exclude_"
fi
declare -f $completion_func >/dev/null 2>&1
function_exists=$?
if [[ "$cur" == "-"* ]]; then
_gst_validate_all_arguments
elif [ $function_exists -eq 0 ]
then
$completion_func
elif [ $have_previous_element -ne 0 ] && [[ "$prev" == "!" ]]
then
_gst_complete_compatible_elements
elif [ $have_previous_element -ne 0 ]
then
_gst_complete_element_properties
else
_gst_complete_all_elements
fi
}
_gst_validate_func_wrap ()
{
local cur prev
cur="${COMP_WORDS[COMP_CWORD]}"
prev="${COMP_WORDS[COMP_CWORD-1]}"
$1
}
# Setup completion for certain functions defined above by setting common
# variables and workarounds.
# This is NOT a public function; use at your own risk.
_gst_validate_complete ()
{
local wrapper="__launch_wrap${2}"
eval "$wrapper () { _gst_validate_func_wrap $2 ; }"
complete -o bashdefault -o default -o nospace -F $wrapper $1 2>/dev/null \
|| complete -o default -o nospace -F $wrapper $1
}
_gst_validate_complete gst-validate-1.0 _gst_validate_main

View file

@ -0,0 +1,241 @@
### This file contains either validate specific suppressions or bugs that we
### can't easily address because they are lower in the stack.
### All the other suppressions should be added to gstreamer/tests/check/gstreamer.supp
### Each set of suppression rules should be prefixed by either:
### - FIXED: if the bug/leak has been fixed upstream but we keep the rule
### because the fix may not be deployed yet (because it's lower in the
### stack and not in gst itself).
### - PENDING: if the bug/leak hasn't been fixed yet. In this case, please
### add an url to the bug report.
# PENDING: https://bugs.freedesktop.org/show_bug.cgi?id=90073
{
Leak in mesa fixed with http://lists.freedesktop.org/archives/mesa-dev/2015-April/082101.html
Memcheck:Leak
fun:malloc
fun:read_packet
fun:_xcb_in_read
fun:_xcb_conn_wait
fun:wait_for_reply
fun:xcb_wait_for_reply
fun:dri3_open
fun:dri3_create_screen
fun:AllocAndFetchScreenConfigs
fun:__glXInitialize
fun:glXQueryVersion
}
{
Leak in mesa fixed with http://lists.freedesktop.org/archives/mesa-dev/2015-April/082100.html
Memcheck:Leak
...
fun:get_render_node_from_id_path_tag
fun:loader_get_user_preferred_fd
fun:dri3_create_screen
fun:AllocAndFetchScreenConfigs
fun:__glXInitialize
fun:glXQueryVersion
}
# FIXED
{
Fixed in pixman master
Memcheck:Leak
fun:memalign
fun:allocate_and_init
fun:tls_get_addr_tail
}
# PENDING: https://bugzilla.gnome.org/show_bug.cgi?id=748417
{
Ignore all the invalid memory errors from libvpx
Memcheck:Cond
obj:/usr/lib64/libvpx.so*
}
{
Ignore all the invalid memory errors from libvpx
Memcheck:Value8
obj:/usr/lib64/libvpx.so*
}
# PENDING: https://bugzilla.gnome.org/show_bug.cgi?id=747110
{
https://bugzilla.gnome.org/show_bug.cgi?id=747110
Memcheck:Addr4
...
fun:aac_decode_frame_int
fun:aac_decode_frame
}
# PENDING: https://bugzilla.gnome.org/show_bug.cgi?id=752989
{
https://bugzilla.gnome.org/show_bug.cgi?id=752989
Memcheck:Value4
...
fun:aac_decode_frame_int
fun:aac_decode_frame
}
# PENDING: https://bugs.freedesktop.org/show_bug.cgi?id=90194
{
https://bugs.freedesktop.org/show_bug.cgi?id=90194
Memcheck:Param
ioctl(generic)
fun:ioctl
fun:drmIoctl
fun:drmPrimeHandleToFD
}
# PENDING: https://bugzilla.gnome.org/show_bug.cgi?id=749232
# x264 errors
{
<insert_a_suppression_name_here>
Memcheck:Cond
...
fun:x264_encoder_encode
}
{
<insert_a_suppression_name_here>
Memcheck:Value8
...
fun:x264_encoder_encode
}
{
<insert_a_suppression_name_here>
Memcheck:Cond
...
fun:x264_lookahead_thread
}
{
<insert_a_suppression_name_here>
Memcheck:Value8
...
fun:x264_lookahead_thread
}
{
<insert_a_suppression_name_here>
Memcheck:Cond
...
fun:x264_threadpool_thread
}
{
<insert_a_suppression_name_here>
Memcheck:Value8
obj:/usr/lib64/libx264.so.*
}
{
<insert_a_suppression_name_here>
Memcheck:Cond
obj:/usr/lib64/libx264.so.*
}
# PENDING: https://bugzilla.gnome.org/show_bug.cgi?id=749428
# Theora encoder
{
<insert_a_suppression_name_here>
Memcheck:Value8
...
fun:theora_enc_handle_frame
}
{
<insert_a_suppression_name_here>
Memcheck:Cond
...
fun:theora_enc_handle_frame
}
{
<insert_a_suppression_name_here>
Memcheck:Value8
fun:oc_enc_tokenize_ac
}
# FIXED
{
Fixed with mesa master
Memcheck:Cond
fun:lp_build_blend_factor_unswizzled
...
fun:gst_glimage_sink_on_draw
}
# FIXED
{
Fixed with mesa master
Memcheck:Leak
match-leak-kinds: definite
fun:calloc
...
fun:_do_convert_draw
fun:_do_convert_one_view
}
# FIXED
{
Fixed with mesa master
Memcheck:Leak
match-leak-kinds: definite
fun:malloc
...
fun:gst_gl_shader_compile
}
# FIXED
{
Fixed with mesa master
Memcheck:Leak
match-leak-kinds: definite
fun:calloc
...
fun:_draw_checker_background
fun:_draw_background
fun:gst_gl_video_mixer_callback
}
{
#https://bugs.freedesktop.org/show_bug.cgi?id=8215
#https://bugs.freedesktop.org/show_bug.cgi?id=8428
#FcPattern uses 'intptr_t elts_offset' instead of 'FcPatternEltPtr elts',
#which confuses valgrind.
font_config_bug_2
Memcheck:Leak
fun:*alloc
...
fun:Fc*Add*
}
{
#Same root cause as font_config_bug_2.
#The 'leak' here is a copy of rule values, as opposed to new values.
font_config_bug_3
Memcheck:Leak
fun:*alloc
...
fun:FcConfigValues
}
{
#Same root cause as font_config_bug_2.
#The 'leak' is copies of font or pattern values into returned pattern values.
font_config_bug_4
Memcheck:Leak
fun:*alloc
...
fun:FcValue*
fun:FcFontRenderPrepare
}
{
font_config_bug_6
Memcheck:Leak
fun:*alloc
...
obj:*/libfontconfig.so.*
}

View file

@ -0,0 +1 @@
subdir('scenarios')

View file

@ -0,0 +1,5 @@
description, duration=15.0
set-restriction, playback-time=5.0, restriction-caps="video/x-raw,framerate=(fraction)5/1"
set-restriction, playback-time=10.0, restriction-caps="video/x-raw,framerate=(fraction)30/1"
eos, playback-time=15.0
stop, playback-time=15.0

View file

@ -0,0 +1,7 @@
description, duration=25.0
set-restriction, playback-time=5.0, restriction-caps="video/x-raw,framerate=(fraction)5/1"
set-restriction, playback-time=10.0, restriction-caps="video/x-raw,height=20,width=20,framerate=(fraction)5/1"
set-restriction, playback-time=15.0, restriction-caps="video/x-raw,height=20,width=20,framerate=(fraction)30/1"
set-restriction, playback-time=20.0, restriction-caps="video/x-raw,height=720,width=1280,framerate=(fraction)30/1"
eos, playback-time=25.0
stop, playback-time=25.0

View file

@ -0,0 +1,5 @@
description, duration=15.0
set-restriction, playback-time=5.0, restriction-caps="video/x-raw,height=480,width=854"
set-restriction, playback-time=10.0, restriction-caps="video/x-raw,height=720,width=1280"
eos, playback-time=15.0
stop, playback-time=15.0

View file

@ -0,0 +1,14 @@
description, duration=55.0, min-media-duration=470.0, seek=true, reverse-playback=true
include,location=includes/default-seek-flags.scenario
seek, name=backward-seek, playback-time=0.0, rate=-1.0, start=0.0, stop=310.0, flags="$(default_flags)"
seek, name=forward-seek, playback-time=305.0, rate=1.0, start=305.0, flags="$(default_flags)"
seek, name=Fast-forward-seek, playback-time=310.0, rate=2.0, start=310.0, flags="$(default_flags)"
seek, name=Fast-backward-seek, playback-time=320.0, rate=-2.0, start=0.0, stop=320.0, flags="$(default_flags)"
seek, name=Fast-forward-seek, playback-time=310.0, rate=4.0, start=310.0, flags="$(default_flags)"
seek, name=Fast-backward-seek, playback-time=330.0, rate=-4.0, start=0.0, stop=330.0, flags="$(default_flags)"
seek, name=Fast-forward-seek, playback-time=310.0, rate=8.0, start=310.0, flags="$(default_flags)"
seek, name=Fast-backward-seek, playback-time=350.0, rate=-8.0, start=0.0, stop=350.0, flags="$(default_flags)"
seek, name=Fast-forward-seek, playback-time=310.0, rate=16.0, start=310.0, flags="$(default_flags)"
seek, name=Fast-backward-seek, playback-time=390.0, rate=-16.0, start=0.0, stop=390.0, flags="$(default_flags)"
seek, name=Fast-forward-seek, playback-time=310.0, rate=32.0, start=310.0, flags="$(default_flags)"
seek, name=Fast-backward-seek, playback-time=470.0, rate=-32.0, start=310.0, stop=470.0, flags="$(default_flags)"

View file

@ -0,0 +1,4 @@
description, duration=20.0
emit-signal, target-element-name=camerabin0, signal-name=start-capture, playback-time=10.0
eos, playback-time=20.0

View file

@ -0,0 +1,8 @@
description, duration=0, summary="Set state to NULL->PLAYING->NULL 20 times", need-clock-sync=true, min-media-duration=1.0, live_content_compatible=True, handles-states=true, ignore-eos=true
foreach, i=[0, 40],
actions = {
"set-state, state=playing",
"set-state, state=null",
}
stop;

View file

@ -0,0 +1,6 @@
description, summary="Disable subtitle track while pipeline is PAUSED", min-subtitle-track=2, duration=5.0, handles-states=true, needs_preroll=true
pause;
switch-track, name="Disable subtitle", type=text, disable=true
wait, duration=0.5
play;
stop, playback-time=2.0

Some files were not shown because too many files have changed in this diff