The "editable" property is documented to default to TRUE, but is
initialized to FALSE in the _init() function.
Third party code would be affected if we changed the effective default
to TRUE, so instead we change the default value declared in the
GParamSpec to match the actual behaviour.
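A minimal sketch of what that amounts to, assuming the property is
installed with g_param_spec_boolean() (the nick and blurb strings here
are illustrative):

    GParamSpec *pspec;

    /* the documented default lives in the GParamSpec, so declare the
     * same FALSE value that _init() actually sets */
    pspec = g_param_spec_boolean ("editable",
                                  "Editable",
                                  "Whether the text is editable",
                                  FALSE,
                                  G_PARAM_READWRITE);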
https://bugzilla.gnome.org/show_bug.cgi?id=654726
When emitting a new-frame signal, priv->elapsed_time is passed as a
parameter. This is a gint64. The closure marshal uses an INT. On some
platforms, this is not received correctly by signal handlers (they
receive 0). One solution is to cast priv->elapsed_time to a gint when
emitting the signal.
We cannot change the signature of the signal without breaking ABI.
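A hedged sketch of the cast at the emission site (the signal array and
index names are illustrative, not taken from the actual code):

    /* priv->elapsed_time is a gint64, but the closure marshal collects
     * an int, so cast explicitly before the varargs promotion */
    g_signal_emit (timeline, timeline_signals[NEW_FRAME], 0,
                   (gint) priv->elapsed_time);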
https://bugzilla.gnome.org/show_bug.cgi?id=654066
If the meta for the animation property is not found, the property name
extracted from the token is still allocated, and we need to free that
memory.
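The shape of the fix, with hypothetical names for the looked-up meta
object and the token-derived property name:

    if (meta == NULL)
      {
        /* the name was allocated from the token above, so it has to be
         * released on this error path as well */
        g_free (name);
        return FALSE;
      }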
https://bugzilla.gnome.org/show_bug.cgi?id=654656
Clutter may be used together with GTK+, which indirectly may use
XInput2 too, so the cookie data must persist when both are handling
events.
In a nutshell, Clutter is now only guaranteed to allocate the cookie
itself after XNextEvent(), and it only frees the cookie if its own
XGetEventData() call allocated the cookie data.
The X[Get|Free]EventData() calls now happen in clutter-event-x11.c, as
hypothetically different event translators could also handle other sets
of X Generic Events, or other libraries could be handling events, for
that matter.
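The resulting pattern is roughly the following (illustrative,
simplified):

    XEvent xevent;
    XGenericEventCookie *cookie = &xevent.xcookie;
    Bool allocated;

    XNextEvent (xdisplay, &xevent);

    /* Clutter only fetches the cookie data itself here ... */
    allocated = XGetEventData (xdisplay, cookie);

    /* ... every event translator (and e.g. GTK+) gets to see the
     * event ... */

    /* ... and the cookie is only freed if our own XGetEventData() call
     * did the allocation, so data fetched by other libraries stays
     * valid */
    if (allocated)
      XFreeEventData (xdisplay, cookie);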
When picking we need to disable dithering to be sure that the hardware
will not modify the colors we use as actor identifiers. Clutter was
manually calling glEnable/Disable GL_DITHER to handle this, but that was
a layering violation since Cogl is intended to handle all interactions
with OpenGL. Since we are now striving for GL vs GLES to be a runtime
choice we need to remove this last direct usage of GL from Clutter so it
doesn't have to be linked with GL at build time.
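For reference, the direct GL usage being removed was essentially:

    /* make sure the hardware doesn't perturb the actor-ID colors */
    glDisable (GL_DITHER);
    /* ... paint the scene with solid ID colors and read the picked
     * pixel back ... */
    glEnable (GL_DITHER);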
Signed-off-by: Neil Roberts <neil@linux.intel.com>
This updates _clutter_paint_volume_get_stage_paint_box to try to
calculate more stable paint-box sizes for fixed-size paint-volumes by
not basing the size on the volume's sub-pixel position.
The aim is that, for a given rectangle defined with floating point
coordinates, we determine a stable quantized size in pixels that
doesn't vary with the original box's sub-pixel position.
This matters because effects will use this API to determine the size
of offscreen framebuffers, so for a fixed-size object that may be
animated across the screen we want to make sure that the stage
paint-box has an equally stable size, so that effects aren't made to
continuously re-allocate a corresponding fbo.
The other thing we consider is that the calculation of this box is
subject to floating point precision issues that might be slightly
different from the precision issues involved in actually painting the
actor, which might result in the painting slightly leaking outside the
user's calculated paint-volume. This patch now adds padding to account
for that too.
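A simplified sketch of the quantization (the exact padding amount here
is illustrative, not taken from the patch):

    float width  = box->x2 - box->x1;
    float height = box->y2 - box->y1;

    /* Flooring x1 and ceiling x2 directly makes a 10.0px wide box
     * flicker between 10 and 11 device pixels depending on its
     * sub-pixel position; instead, quantize the size itself, anchor it
     * at the floored origin, and pad it to absorb painting precision
     * errors. */
    box->x1 = floorf (box->x1);
    box->y1 = floorf (box->y1);
    box->x2 = box->x1 + ceilf (width) + 1;
    box->y2 = box->y1 + ceilf (height) + 1;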
Signed-off-by: Neil Roberts <neil@linux.intel.com>
Signed-off-by: Emmanuele Bassi <ebassi@linux.intel.com>
Instead of relying on C to round the floating point allocation to
integers by flooring the values, we now use CLUTTER_NEARBYINT to round
the allocation's position and size to the nearest integers. Using floor
leads to rather unstable rounding of the width and height when there
are tiny fluctuations in the floating point width/height.
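Roughly, assuming a floating point ClutterActorBox named box:

    /* round to the nearest integer instead of truncating toward zero */
    int x = CLUTTER_NEARBYINT (box.x1);
    int y = CLUTTER_NEARBYINT (box.y1);
    int width  = CLUTTER_NEARBYINT (box.x2 - box.x1);
    int height = CLUTTER_NEARBYINT (box.y2 - box.y1);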
Signed-off-by: Neil Roberts <neil@linux.intel.com>
Signed-off-by: Emmanuele Bassi <ebassi@linux.intel.com>
This macro is a replacement for the nearbyint function and always
rounds to the nearest integer. nearbyint is a C99 function, so it might
not always be available; it also seems that in glibc it is defined as a
function call, so this macro could end up faster anyway. We can't just
add 0.5 because that would break for negative numbers.
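A definition along these lines (a sketch of the idea rather than the
exact macro that landed):

    /* round half away from zero; a plain (x + 0.5) truncation would be
     * wrong for negative values */
    #define CLUTTER_NEARBYINT(x) \
      ((int) ((x) < 0.0f ? (x) - 0.5f : (x) + 0.5f))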
Signed-off-by: Neil Roberts <neil@linux.intel.com>
Signed-off-by: Emmanuele Bassi <ebassi@linux.intel.com>
The implementation of _clutter_actor_set_default_paint_volume, which
simply uses the actor's allocation to determine a paint-volume, was
needlessly using the allocation rounded to integers because it
internally used clutter_actor_get_allocation_geometry instead of
clutter_actor_get_allocation_box. This was introducing a lot of
instability into the paint-volume due to the way the rounding was done.
The code has now been updated to use clutter_actor_get_allocation_box
so we are dealing with the floating point allocation instead.
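In outline, the change is along these lines (simplified sketch, not
the literal patch):

    ClutterActorBox box;

    /* take the floating point allocation as-is ... */
    clutter_actor_get_allocation_box (self, &box);

    /* ... so the paint volume's size no longer jumps around with
     * rounding */
    clutter_paint_volume_set_width (volume, box.x2 - box.x1);
    clutter_paint_volume_set_height (volume, box.y2 - box.y1);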
Signed-off-by: Neil Roberts <neil@linux.intel.com>
Signed-off-by: Emmanuele Bassi <ebassi@linux.intel.com>
If we're building on/for Windows, set 'win32' as the default flavour; if
we're building on OS X, set 'osx' as the default flavour. For everything
else, use 'glx'.
This adds a public function to get the bounds of the current clipped
redraw on a stage. This should only be called while the stage is being
painted. The function diverts to a virtual function on the
ClutterStageWindow implementation. If the function isn't implemented,
or it returns FALSE, then the entire stage is reported. The clip bounds
are in integer pixel coordinates in the stage's coordinate space.
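Usage would look something like this (the function name and rectangle
type are assumptions based on the description above):

    cairo_rectangle_int_t bounds;

    /* only valid while the stage is being painted; if the backend
     * cannot report a clip, the bounds cover the entire stage */
    clutter_stage_get_redraw_clip_bounds (stage, &bounds);

    g_debug ("clipped redraw: %dx%d at %d,%d",
             bounds.width, bounds.height, bounds.x, bounds.y);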
http://bugzilla.clutter-project.org/show_bug.cgi?id=2421
Rather than using the #ifdefs and assuming that only one Cogl driver
is compiled in (which is no longer true), the test now calls
glGetString to check the GL_VERSION. This is kind of a hack but the
test is already calling GL functions directly anyway.
test-cogl-materials had a weird comment about glReadPixels using
inverted coordinates, but the test now uses cogl_read_pixels instead of
glReadPixels, so that comment is no longer relevant.
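The runtime check is along these lines (illustrative):

    const char *version = (const char *) glGetString (GL_VERSION);

    /* GLES implementations prefix the version string with "OpenGL ES" */
    gboolean is_gles = (strstr (version, "OpenGL ES") != NULL);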
Cogl recently renamed symbols with the form
cogl_onscreen_<platform>_blah to be consistent with other platform
specific APIs, so they are now named like cogl_<platform>_onscreen_blah.
This makes the corresponding change in Clutter.
Cogl has changed the name of cogl_context_egl_get_egl_context to
cogl_egl_context_get_egl_context to be consistent with other platform
specific symbols.
Since some experimental API in Cogl that Clutter uses has changed, this
bumps our dependency up to 1.7.3 before landing the corresponding build
fixes for Clutter to bring it in line with the Cogl changes.
The class is of dubious utility, now that we have a complex animation
API in ClutterAnimator and ClutterState, as opposed to a simple one
in ClutterBehaviour. The Score API also suffers from some naïve design
issues that made it far less useful than intended.
This adds a simple test which creates a material and a copy of
it. Both materials are then immediately unref'd. The test checks
whether the materials were actually freed by registering some user
data with a destroy callback. This should catch a bug that Cogl had
where it added an extra reference to the parent when a pipeline is
copied.
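In outline, the test works along these lines (a sketch with
illustrative names, not the actual test source):

    static CoglUserDataKey destroy_key;
    static int n_destroyed = 0;

    static void
    on_destroy (void *user_data)
    {
      n_destroyed++;
    }

    static void
    check_materials_freed (void)
    {
      CoglHandle material = cogl_material_new ();
      CoglHandle copy = cogl_material_copy (material);

      cogl_object_set_user_data ((CoglObject *) material,
                                 &destroy_key, &n_destroyed, on_destroy);
      cogl_object_set_user_data ((CoglObject *) copy,
                                 &destroy_key, &n_destroyed, on_destroy);

      cogl_object_unref (material);
      cogl_object_unref (copy);

      /* if the copy leaked a reference to its parent, at least one of
       * the destroy notifies will never have run */
      g_assert_cmpint (n_destroyed, ==, 2);
    }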
Signed-off-by: Robert Bragg <robert@linux.intel.com>
Commit 0ede622f51 inadvertently made it so that shaders are applied
during picking. This was making test-shader fail to respond to clicks.
The commit also makes it so that culling is applied during
picking. Presumably this is also unintentional because the commit
message does not mention it. However, I think it may make sense to do
culling during picking, so it might as well stay that way.
https://bugzilla.gnome.org/show_bug.cgi?id=653959