ClutterInputDeviceX11 has been made private, so we cannot access it from
outside of clutter-input-device-core-x11.c. We should have simple
accessors for the min/max keycode, which is the only detail that we use.
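A minimal sketch of what such accessors could look like (the function and
field names here are illustrative, not necessarily the ones that end up in
clutter-input-device-core-x11.c):

  gint
  _clutter_input_device_x11_get_min_keycode (ClutterInputDeviceX11 *device)
  {
    return device->min_keycode;
  }

  gint
  _clutter_input_device_x11_get_max_keycode (ClutterInputDeviceX11 *device)
  {
    return device->max_keycode;
  }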
Clutter-WARNING **: Unable to compile the GLSL
shader: Fragment shader failed to compile with the following errors:
The attached patch (against current git) should print out more
information, which makes it easier to answer user feedback.
https://bugzilla.gnome.org/show_bug.cgi?id=664252
While working through the Python3/pygobject bindings, I came across a missing
(allow-none) annotation in clutter_state_set_key(). Adding it allows callers
to pass None as the source_target.
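For reference, the annotation lives in the gtk-doc comment for the function;
a sketch of the relevant line, with an illustrative parameter name:

  /**
   * clutter_state_set_key:
   * @source_state_name: (allow-none): the source state key, or %NULL to
   *   apply the key to any source state
   * ...
   */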
https://bugzilla.gnome.org/show_bug.cgi?id=664996
The built-in effects ClutterColorizeEffect, ClutterDesaturateEffect and
ClutterShaderEffect all have properties that only affect the
rendering of the final texture, not its contents. When these
properties are updated we should queue a repaint of the effect, not of
the actor, so that we don't waste time repainting the contents of the
offscreen buffer.
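A sketch of the pattern with a hypothetical effect and property, assuming
clutter_effect_queue_repaint() as the way to invalidate just the effect:

  static void
  my_effect_set_factor (MyEffect *effect,
                        gdouble   factor)
  {
    if (effect->factor == factor)
      return;

    effect->factor = factor;

    /* the contents of the offscreen buffer are still valid; only the
     * final texture needs to be painted with the new factor */
    clutter_effect_queue_repaint (CLUTTER_EFFECT (effect));
  }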
https://bugzilla.gnome.org/show_bug.cgi?id=665052
Reviewed-by: Emmanuele Bassi <ebassi@linux.intel.com>
There was an #ifdef'd section of code for profiling that was using the
wrong variable name so it would not build.
Reviewed-by: Emmanuele Bassi <ebassi@linux.intel.com>
Previously the offscreen effect was keeping track of the size of the
texture so that it could detect when a different size is requested and
create a new texture. However this breaks if a subclass overrides
create_texture to make the texture bigger because in that case the
size of the texture will always be different from the calculated size
of the actor. This patch makes the effect also track the size of the FBO
that was requested before it is passed through create_texture(), and uses
that instead to detect when a new FBO is needed.
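Roughly, the idea is the following (field and helper names are hypothetical):

  /* compare against the size we asked for, not the size of the
   * texture that create_texture() actually gave us */
  if (priv->target_width != requested_width ||
      priv->target_height != requested_height)
    {
      priv->target_width = requested_width;
      priv->target_height = requested_height;

      /* only now do we need a new texture and FBO */
      update_fbo (effect, requested_width, requested_height);
    }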
https://bugzilla.gnome.org/show_bug.cgi?id=665040
Reviewed-by: Emmanuele Bassi <ebassi@linux.intel.com>
It should be possible to define markers in ClutterScript when
describing a ClutterTimeline.
The syntax is trivial:

  "markers" : [
    { "name" : <marker-name>, "time" : <msecs> }
  ]
While at it, we should document it in the API reference and flesh out
the ClutterTimeline description.
To allow language bindings to properly override Script.connect_signals()
they'll need access to Script.connect_signals_full().
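For context, a rough sketch of how a binding would use it (the callback body
is just a placeholder; a real binding would resolve handler_name in its own
scope):

  static void
  connect_func (ClutterScript *script,
                GObject       *object,
                const gchar   *signal_name,
                const gchar   *handler_name,
                GObject       *connect_object,
                GConnectFlags  flags,
                gpointer       user_data)
  {
    /* look up handler_name and g_signal_connect() it to object */
  }

  clutter_script_connect_signals_full (script, connect_func, NULL);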
Thanks to Jeremy Moles for reporting.
Since we have a _clutter_debug_message() function compiled in
unconditionally we have no further need for the equivalent conditional
version defined in clutter-profile.[ch]: we can simply do the work in
one function.
We still ship clutter_get_show_fps() and clutter_get_debug_enabled() as
public entry points. Yet another case of missing API review prior to the
1.0 release, so really the bucket stops around my desk.
Let's deprecate these two useless functions, and reduce the API
footprint of Clutter.
This function should have never been made public in the first place; its
output depends on a configuration option of Clutter, and it's basically
useful only for internal debugging.
Make it consistent across the various build options (with or without
profiling enabled), and add a timestamp using the monotonic clock to
every debug message.
The clutter_get_timestamp() output depends on whether Clutter was
compiled with debugging support — it's meant to be used only by the
debugging notes, and it should not be used for anything else.
Instead of calling cogl_set_depth_test_enabled and
cogl_set_backface_culling_enabled, ClutterDeformEffect now uses the
experimental CoglPipeline API. Those global state functions will soon
be deprecated in Cogl, and they are implemented by flushing a temporary
override pipeline, which isn't ideal.

Using the new culling API we can also avoid having a separate buffer
of indices for the back of the texture by just changing the culling
mode to cull front faces instead of back faces.
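A sketch of the pipeline-local state this amounts to, assuming the
experimental CoglPipeline API (the exact calls in the effect may differ):

  CoglPipeline *pipeline = cogl_pipeline_new ();
  CoglDepthState depth_state;

  cogl_depth_state_init (&depth_state);
  cogl_depth_state_set_test_enabled (&depth_state, TRUE);
  cogl_pipeline_set_depth_state (pipeline, &depth_state, NULL);

  /* cull the front faces when painting the back of the texture,
   * instead of keeping a second buffer of back-facing indices */
  cogl_pipeline_set_cull_face_mode (pipeline,
                                    COGL_PIPELINE_CULL_FACE_MODE_FRONT);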
https://bugzilla.gnome.org/show_bug.cgi?id=663636
This changes ClutterDeformEffect to use a CoglAttributeBuffer with a
CoglPrimitive instead of the old CoglVertexBuffer. The old vertex
buffer code is now implemented in terms of the attribute buffer code
and it will eventually be deprecated. Using CoglPrimitives should be
slightly more efficient.
This also changes the struct in which we store the vertices to
CoglVertexP3T2C4 instead of CoglTextureVertex. The latter is
technically not compatible with either vertex buffers or attribute
buffers because it contains a CoglColor whose internal members are
private, so it is not valid to assume it occupies 4 bytes and use it
as an attribute. It also contains padding, so it ends up redundantly
creating a larger buffer. CoglTextureVertex is in the public API for
the deform_vertex virtual so we still have to maintain that. Instead
of directly manipulating the array to upload, the application is now
passed a stack-allocated temporary struct which gets converted to a
CoglVertexP3T2C4. This also means that we can map the buffer as
write-only and still let the application read and write the vertex.
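The conversion step described above boils down to something like this
(a sketch of the idea, not necessarily the exact code in the effect):

  static void
  copy_vertex (const CoglTextureVertex *in,
               CoglVertexP3T2C4        *out)
  {
    out->x = in->x;
    out->y = in->y;
    out->z = in->z;
    out->s = in->tx;
    out->t = in->ty;
    out->r = cogl_color_get_red_byte (&in->color);
    out->g = cogl_color_get_green_byte (&in->color);
    out->b = cogl_color_get_blue_byte (&in->color);
    out->a = cogl_color_get_alpha_byte (&in->color);
  }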
The paint debug code to draw line strips for the deform mesh was
previously trying to set a red source material. However this wasn't
working because the material color was being overwritten by the color
attribute in the vertex buffer. This patch fixes that by creating a
separate primitive for the lines and not adding the color
attribute. The lines code was also drawing both the front and back
indices. I don't think that entirely makes sense so I've just changed
it to draw only the front indices. Maybe painting both would make more
sense if backface culling was still enabled.
https://bugzilla.gnome.org/show_bug.cgi?id=663636
When invalidating the deform effect, we are invalidating the vertices
shaping the deformation of an actor. Therefore, there is no need to
trigger a redraw of the associated actor; we can just repaint the
effect.
Signed-off-by: Lionel Landwerlin <lionel.g.landwerlin@linux.intel.com>
https://bugzilla.gnome.org/show_bug.cgi?id=663720
* deprecate-default-stage:
evdev: do not associate device with stage
evdev: don't even process events without a default stage
docs: Note default stage deprecation in README
docs: Remove clutter_stage_get_default()
stage: Deprecate the default stage
script: Do not use clutter_stage_get_default()
cally/actor: Do not use the default stage as a fallback
Try to mop up the default stage mess
performance/*: Do not use clutter_stage_get_default()
interactive/*: Do not use clutter_stage_get_default()
Merge with a11y
micro-bench/*: Do not use clutter_stage_get_default()
accessibility/*: Do not use clutter_stage_get_default()
conform/*: Do not use clutter_stage_get_default()
The CLUTTER_VBLANK environment variable is now handled universally in
clutter-main.c, as of commits e8562089 (main: Add a sync-to-vblank global
flag) and db211a21 (Remove per-backend CLUTTER_VBLANK envvar), so remove
the per-backend handling here as well.
https://bugzilla.gnome.org/show_bug.cgi?id=663999
The CLUTTER_VBLANK environment variable is now handled universally in
clutter-main.c, as of commits e8562089 (main: Add a sync-to-vblank global
flag) and db211a21 (Remove per-backend CLUTTER_VBLANK envvar), so remove
the per-backend handling here as well.
- Make the contents of config.h.win32.in more like config.h.in
- Define CLUTTER_INPUT_WIN32 accordingly (no GDK3 defines yet, until
  GDK3 on Windows is more stable)
The evdev system is a bit different from other input systems in
Clutter because it's completely decoupled from anything graphical.
In the case of embedded devices with no proper windowing system, you
might not want to implicitly create a default stage when receiving
the first input event.
This patch changes this behavior by not forwarding any events if you
don't have a default stage.
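A sketch of the early-out this implies, with a hypothetical function name
around the actual Clutter calls:

  static void
  forward_event (ClutterEvent *event)
  {
    ClutterStageManager *manager = clutter_stage_manager_get_default ();
    ClutterStage *stage = clutter_stage_manager_get_default_stage (manager);

    /* no default stage: there is nothing to deliver the event to, and
     * we must not implicitly create one */
    if (stage == NULL)
      {
        clutter_event_free (event);
        return;
      }

    clutter_event_set_stage (event, stage);
    clutter_event_put (event);
    clutter_event_free (event);
  }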
Signed-off-by: Lionel Landwerlin <lionel.g.landwerlin@linux.intel.com>
https://bugzilla.gnome.org/show_bug.cgi?id=651718
A lot of the example code in the cookbook and the API reference still
uses the default stage — sometimes as if it were a non-default one,
which once again demonstrates how the default stage was a flawed concept
that just confused people.
Using the default stage as a fallback is wrong in all circumstances.
In this specific case, if an actor is not associated with a stage then it
cannot possibly be the key focus.