
Outline of test categories:

The conform/ tests should be non-interactive unit tests that verify a single
feature is behaving as documented. See conform/ADDING_NEW_TESTS for more
details.

The performance/ tests measure performance, ranging from focused tests of a
single metric to larger scenarios. These tests are used to report one or more
performance markers for the build of Clutter. Each performance marker is picked
up from the standard output of a test run, from strings of the form
"\n@ marker-name: 42.23", where 'marker-name' and '42.23' are the key/value
pair of a single metric. Each test can provide multiple key/value pairs. Note
that if framerate is the reported metric, the test should forcibly enable FPS
debugging itself. The file test-common.h contains utility functions that help
with FPS reporting.

The interactive/ tests are any tests whose status cannot be determined without
a user looking at some visual output or providing some manual input. This
covers most of the original Clutter tests. Ideally some of these tests will be
migrated into the conform/ directory so they can be used in automated
nightly tests.

The accessibility/ tests exercise the accessibility support of Clutter by
testing some of the ATK interfaces.

The data/ directory contains optional data (like images and ClutterScript
definitions) that can be referenced by a test.

Other notes:

• All tests should ideally include a detailed description in the source
explaining exactly what the test is for, how the test was designed to work,
and possibly a rationale for the approach taken for testing.

• When running tests under Valgrind, you should follow the instructions
available here:

        http://live.gnome.org/Valgrind

and also use the suppression file available inside the data/ directory.