mutter/tests
Commit b6b9ac0b85 by Neil Roberts (2012-08-06): Add a cogl-version header
This adds a version header which contains macros to define which
version of Cogl the application is being compiled against. This helps
applications that want to support multiple incompatible versions of
Cogl at compile time.

The macros are called COGL_VERSION_{MAJOR,MINOR,MICRO}. This does not
match Clutter, which names them CLUTTER_{MAJOR,MINOR,MICRO}_VERSION, but
I think the former is nicer and it at least matches Cairo and Pango.

The value of each macro is defined to the corresponding
COGL_VERSION_*_INTERNAL macro, which is generated by the configure
script into cogl-defines.h.

There is also a macro called COGL_VERSION_STRING which contains the
entire version as a string.

The internal utility macros for encoding a three-part version number
into a single integer have been moved into the new header so they can
be used publicly as a convenient way to check whether the version is
within a particular range. There is also a COGL_VERSION_CHECK macro for
the very common case of checking that the Cogl version being compiled
against is at least the version in which a given feature appeared. For
convenience, a macro called COGL_VERSION contains the pre-encoded
version of Cogl being compiled against.
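
As a rough illustration only (this snippet is not part of the patch;
the header path <cogl/cogl.h> and the use of these macros from an
application's own code are assumed, and the version numbers are just
examples), the macros might be used like this:

  #include <cogl/cogl.h>
  #include <stdio.h>

  int
  main (void)
  {
    /* Report the version of Cogl being compiled against. */
    printf ("Compiled against Cogl %d.%d.%d (%s)\n",
            COGL_VERSION_MAJOR,
            COGL_VERSION_MINOR,
            COGL_VERSION_MICRO,
            COGL_VERSION_STRING);

    /* COGL_VERSION contains the same version pre-encoded into a single
     * integer, which can be compared directly for range checks. */

  #if COGL_VERSION_CHECK (1, 10, 0)
    /* Code relying on a feature introduced in a particular Cogl
     * version would be guarded like this; 1.10.0 is just an example. */
    printf ("Cogl is at least 1.10.0\n");
  #endif

    return 0;
  }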

Unlike in Clutter, this patch does not add any runtime version
identification mechanism.

A test case is also added which just contains static asserts to sanity
check the macros.
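
A very rough sketch of the kind of compile-time check involved (the
actual assertions in the test are not reproduced here, and the
G_STATIC_ASSERT macro from GLib is assumed to be available):

  #include <glib.h>
  #include <cogl/cogl.h>

  /* These checks are evaluated entirely at compile time; the build
   * simply fails if the version macros are inconsistent. */
  G_STATIC_ASSERT (COGL_VERSION_MAJOR >= 1);
  G_STATIC_ASSERT (COGL_VERSION_CHECK (COGL_VERSION_MAJOR,
                                       COGL_VERSION_MINOR,
                                       COGL_VERSION_MICRO));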

Reviewed-by: Robert Bragg <robert@linux.intel.com>

(cherry picked from commit 3480cf140dc355fa87ab3fbcf0aeeb0124798a8f)

Outline of test categories:

The conform/ tests:
-------------------
These tests should be non-interactive unit tests that verify that a
single feature behaves as documented. See conform/ADDING_NEW_TESTS for
more details.

Although it may seem a bit awkward, all the tests are built into a
single binary because it makes building the tests *much* faster by
avoiding lots of linking.

A wrapper script is generated for each test, though, so running
individual tests is still convenient. Running a wrapper script also
prints out, for convenience, how the test can be run under gdb or
valgrind, for example:

  NOTE: For debugging purposes, you can run this single test as follows:
  $ libtool --mode=execute \
            gdb --eval-command="b test_cogl_depth_test" \
            --args ./test-conformance -p /conform/cogl/test_cogl_depth_test
  or:
  $ env G_SLICE=always-malloc \
    libtool --mode=execute \
            valgrind ./test-conformance -p /conform/cogl/test_cogl_depth_test

By default, the conformance tests are run offscreen. This makes the
tests run much faster, and it also stops them from interfering with
other work you may want to do by constantly stealing focus. CoglOnscreen
framebuffers obviously don't get tested this way, so it's important that
the tests are also run onscreen every once in a while, especially if
changes are being made to CoglFramebuffer-related code. Onscreen testing
can be enabled by setting COGL_TEST_ONSCREEN=1 in your environment.
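
For example, reusing the test-conformance binary shown above, the tests
could be run onscreen with:

  $ COGL_TEST_ONSCREEN=1 ./test-conformance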

The micro-bench/ tests:
-----------------------
These should be focused performance tests, ideally testing a single
metric. Never forget that these tests are synthetic, so if you use them,
make sure you understand what metric is being tested. They probably
don't reflect any real-world application loads; the intention is that
you use these tests once you have already determined the crux of your
problem and need focused feedback that your changes are indeed improving
matters. There are no exit-status requirements for these tests, but they
should give clear feedback as to their performance. If the framerate is
the feedback metric, then the test should forcibly enable FPS debugging.

The data/ directory:
--------------------
This contains optional data (like images) that can be referenced by a test.


Misc notes:
-----------
• All tests should ideally include a detailed description in the source
explaining exactly what the test is for, how the test was designed to work,
and possibly a rationale for the approach taken for testing.

• When running tests under Valgrind, you should follow the instructions
available here:

        http://live.gnome.org/Valgrind

and also use the suppression file available inside the data/ directory.
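
For example, following the valgrind invocation shown earlier (the
suppression file's actual name under data/ is not listed here, so
<suppressions-file> below is only a placeholder):

  $ env G_SLICE=always-malloc \
    libtool --mode=execute \
            valgrind --suppressions=./data/<suppressions-file> \
                     ./test-conformance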