We cannot deprecate ClutterAlpha yet. Nor can we implement
ClutterAlpha in terms of ClutterTimeline, because multiple Alpha
instances can be attached to the same Timeline. So we can start
with a "soft" deprecation: just a warning in the documentation
stating that ClutterAlpha will be deprecated, and removed, in the
future, and that newly-written code should use ClutterTimeline
instead.
We can use ClutterTimeline and its progress mode inside
ClutterAnimation; obviously, we have to maintain the invariants because
of the ClutterAnimation:alpha property, but if all that is set is the
:mode property, using one of the Clutter animation modes, then we can
skip the ClutterAlpha entirely.
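For illustration (the actor and property values here are made up), a
caller that only specifies an animation mode, e.g.:

clutter_actor_animate (actor, CLUTTER_EASE_OUT_CUBIC, 250,
                       "opacity", 128,
                       NULL);

would then be driven by the Timeline's progress mode directly, without
a ClutterAlpha being created behind the scenes.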
Instead of having the easing functions depend on ClutterAlpha, and
remain static to the clutter-alpha.c source file, we should make them
generic and move them to their own internal header and source files.
This will allow us to re-use them in the near future.
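As a rough sketch of what "generic" means here (the name and signature
below are illustrative, not the final internal api), an easing function
only needs the elapsed time and the duration:

static gdouble
ease_out_cubic (gdouble t, gdouble d)
{
  gdouble p = t / d - 1;

  /* classic cubic ease-out: 0 at t = 0, 1 at t = d */
  return p * p * p + 1;
}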
Since Cogl has started restricting what cogl 1.x api is exposed when
COGL_ENABLE_EXPERIMENTAL_2_0_API is defined, and since we build all
Clutter internals with COGL_ENABLE_EXPERIMENTAL_2_0_API defined, this
patch makes a first pass at reducing our internal use of the Cogl 1.x
api.
The most notable api that's no longer exposed to us internally is the
cogl_material_ api, so this switches all Clutter internals to use the
cogl_pipeline_ api instead. This patch also makes quite a bit of
progress removing internal uses of CoglHandle, although there is still
more to go.
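For example, a material-based setup along the lines of:

cogl_material_set_color4ub (material, 255, 0, 0, 255);
cogl_material_set_layer (material, 0, texture);

maps onto the pipeline api as (a representative sketch, not an
exhaustive list of the renames):

cogl_pipeline_set_color4ub (pipeline, 255, 0, 0, 255);
cogl_pipeline_set_layer_texture (pipeline, 0, texture);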
The experimental cogl_texture_pixmap_x11_new() api was recently changed
to take an explicit context argument and to report failures via a GError.
This updates Clutter's use of the api accordingly.
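The call sites now look roughly like this (sketch; the variable names,
the context, and the error handling are illustrative):

GError *error = NULL;
CoglHandle texture =
  cogl_texture_pixmap_x11_new (cogl_context, pixmap, FALSE, &error);

if (texture == NULL)
  {
    g_warning ("Failed to create texture for pixmap: %s", error->message);
    g_error_free (error);
  }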
We were only exposing clutter_backend_get_cogl_context() if
COGL_ENABLE_EXPERIMENTAL_2_0_API had been defined but the CoglContext
api is also available if COGL_ENABLE_EXPERIMENTAL_API has been defined.
As it was, code opting into the experimental Cogl api but not limiting
itself to the 2.0-only api would have had to #define
COGL_ENABLE_EXPERIMENTAL_2_0_API before including clutter.h, while
making sure it wasn't defined when including cogl.h, which was
particularly awkward.
Recently the cogl_framebuffer_swap_* apis were moved into the
cogl_onscreen_* namespace since only CoglOnscreen framebuffers can be
double buffered. This renames all uses of the cogl_framebuffer_swap_*
apis in Clutter.
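In practice this is a mechanical rename; for instance (sketch):

cogl_framebuffer_swap_buffers (COGL_FRAMEBUFFER (onscreen));

becomes:

cogl_onscreen_swap_buffers (onscreen);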
The experimental cogl_pipeline_new() api was recently changed so it
explicitly takes a CoglContext. This updates all calls to
cogl_pipeline_new() in clutter accordingly.
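The call sites now follow this shape (variable names are illustrative):

CoglContext *ctx =
  clutter_backend_get_cogl_context (clutter_get_default_backend ());
CoglPipeline *pipeline = cogl_pipeline_new (ctx);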
ATK_ROLE_CANVAS is not a suitable role, as the user (in general) can't
draw on the Stage. CallyStage implements AtkWindow, so the proper role
is ATK_ROLE_WINDOW.
Remove atkcomponent, focus_tracker, etc. Emit the focus state change
from the stage. Things are now simpler, and we stop using some of the
soon-to-be-deprecated ATK signals.
It should be possible to do:
clutter_stage_set_fullscreen (stage, TRUE);
clutter_actor_show (stage);
and have the stage be full screen as soon as it is shown.
Currently, we need to call clutter_actor_realize() prior to calling
set_fullscreen(), otherwise the backing X window will not be set,
and ClutterStageX11 will silently discard the change.
If set_fullscreen() was called prior to realization, ClutterStageX11
should delay setting the fullscreen hint until the realize() chain
has been successfully executed.
http://bugzilla.clutter-project.org/show_bug.cgi?id=2515
If you execute the following sequence:
stage = clutter_stage_new ();
clutter_actor_set_size (stage, 1280, 800);
clutter_actor_realize (stage);
Then you end up creating an onscreen buffer of size 1280x800 while
ClutterStageX11 still stores the stage size as 640x480.
This patch resyncs the two implementations by using the ClutterStage's
size in both classes when realizing.
Signed-off-by: Lionel Landwerlin <llandwerlin@gmail.com>
https://bugzilla.gnome.org/show_bug.cgi?id=667540
Unconditionally creating CoglPipeline and CoglSnippets inside the class
initialization functions does not seem to be enough when dealing with
headless builds.
Our last resort is to lazily create the base pipeline the first time we
try to copy it, during the instance initialization.
The class initialization function may be called when Clutter hasn't been
fully initialized — for instance, when scanning the source with gtk-doc
or with the introspection scanner.
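A minimal sketch of the resulting pattern (the type and helper names
here are made up, not the actual Clutter internals):

static CoglPipeline *base_pipeline = NULL;

static void
some_effect_init (SomeEffect *self)
{
  /* create the base pipeline lazily, the first time an instance needs
   * to copy it, instead of unconditionally in class_init */
  if (G_UNLIKELY (base_pipeline == NULL))
    base_pipeline = create_base_pipeline ();

  self->pipeline = cogl_pipeline_copy (base_pipeline);
}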
The allocation code for BoxLayout contains a sequence of brain farts
that have made it barely work since the layout algorithm was
synchronized with the one in GtkBox.
The origin of the layout is inverted, and it doesn't take into
consideration a modified allocation origin (for actors that provide
padding or margin).
The pack-start property is broken, and it only works because we walk the
children list backwards; this horribly breaks when a child changes
visibility. Plus, we count invisible children, which leads to
allocations getting insane origins (either close to -MAX_FLOAT or
MAX_FLOAT).
Finally, the allocation is applied twice even for non-animated cases.
https://bugzilla.gnome.org/show_bug.cgi?id=669291
New accessors for the underlying Wayland objects (usage sketched below
the list):
* clutter_wayland_input_device_get_wl_input_device for the input device
* clutter_wayland_stage_get_wl_surface for the Wayland surface
* clutter_wayland_stage_get_wl_shell_surface for the shell surface
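Typical usage would look along these lines (the argument types here are
assumptions based on the accessor names):

struct wl_surface *surface =
  clutter_wayland_stage_get_wl_surface (CLUTTER_STAGE (stage));
struct wl_shell_surface *shell_surface =
  clutter_wayland_stage_get_wl_shell_surface (CLUTTER_STAGE (stage));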
This converts the blur, colorize and desaturate effects to use
snippets instead of CoglPrograms. Cogl can handle the snippets much
more efficiently than programs so this should be a performance win. It
also fixes the problem that Cogl would end up recompiling the program
for every instance of the effects because Clutter was not reusing the
same program.
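As a rough illustration of the approach (this is not the actual shader
source used by the effects), a desaturate-style fragment snippet
attached to a pipeline looks like:

CoglSnippet *snippet =
  cogl_snippet_new (COGL_SNIPPET_HOOK_FRAGMENT,
                    "uniform float factor;",
                    "float gray = dot (cogl_color_out.rgb,\n"
                    "                  vec3 (0.299, 0.587, 0.114));\n"
                    "cogl_color_out.rgb = mix (cogl_color_out.rgb,\n"
                    "                          vec3 (gray), factor);\n");

cogl_pipeline_add_snippet (pipeline, snippet);
cogl_object_unref (snippet);

Since the snippet is part of the pipeline state, Cogl can hash it and
reuse the generated program across instances instead of recompiling it
every time.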
Reviewed-by: Emmanuele Bassi <ebassi@linux.intel.com>
The blur effect needs to pass a uniform to the GLSL shader so that it
can know the texture coordinate offset from one texel to another. To
calculate this the blur effect was previously using the allocation
size of the actor rounded up to the next power of two. Presumably the
assumption was that Cogl would round up the size of the texture to the
next power of two when allocating the texture. However, this is not
true if the driver supports NPOT textures. Also it doesn't take into
account the paint volume of the actor, which may cause the texture to
be a completely different size. This patch just changes the effect to
use the size of the texture directly.
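In other words, the texel step is now derived from the texture itself,
roughly like this (pixel_step_uniform stands for a uniform location
previously looked up with cogl_pipeline_get_uniform_location(); the
names are illustrative):

float pixel_step[2];

pixel_step[0] = 1.0f / cogl_texture_get_width (texture);
pixel_step[1] = 1.0f / cogl_texture_get_height (texture);

cogl_pipeline_set_uniform_float (pipeline, pixel_step_uniform,
                                 2 /* n_components */,
                                 1 /* count */,
                                 pixel_step);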
Reviewed-by: Emmanuele Bassi <ebassi@linux.intel.com>
Sometimes a subclass of ClutterOffscreenEffect wants to paint with a
completely custom material. In that case it is awkward to modify the
material owned by ClutterOffscreenEffect, so it makes more sense for the
subclass to just get the texture and manage its own material.
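A subclass's paint_target() could then do something along these lines
(sketch, assuming the new accessor is clutter_offscreen_effect_get_texture();
the custom pipeline handling is illustrative):

CoglHandle texture =
  clutter_offscreen_effect_get_texture (CLUTTER_OFFSCREEN_EFFECT (self));

cogl_pipeline_set_layer_texture (self->custom_pipeline, 0, texture);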
Reviewed-by: Emmanuele Bassi <ebassi@linux.intel.com>
All of the pipelines used for ClutterTexture actors share a common
pipeline ancestor created with cogl_pipeline_copy. Previously this
ancestor had a dummy 1x1 texture attached to it so that it would end
up with the same state as the child pipelines that will render with a
texture. Cogl now has a mechanism to specify that a texture will be
used with a pipeline layer without having to create an actual texture.
This patch makes it use that to avoid having an unused texture.
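The mechanism in question is presumably cogl_pipeline_set_layer_null_texture(),
used roughly like this (sketch):

CoglPipeline *ancestor = cogl_pipeline_new (ctx);

cogl_pipeline_set_layer_null_texture (ancestor,
                                      0, /* layer_index */
                                      COGL_TEXTURE_TYPE_2D);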
Reviewed-by: Emmanuele Bassi <ebassi@linux.intel.com>
If we have N children and the user passes N (or a number beyond N) to
clutter_actor_insert_child_at_index, we should respond by adding the
child at the end, not silently doing nothing.
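For example (sketch):

/* with index == number of children, this should append the child
 * rather than silently do nothing */
clutter_actor_insert_child_at_index (parent, child,
                                     clutter_actor_get_n_children (parent));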
This should avoid trying to fix the origin of a paint volume set from
the allocation's origin, and thus breaking everything.
A PaintVolume for an actor is defined to be relative to the actor's
modelview unless specifically modified by internal functions; the origin
of an actor's allocation is, on the other hand, parent-relative.
There are times when we don't want to remove all children and count on
the reference count dropping to 0 to ensure destruction; there are cases,
such as managed environments, where it's preferable to ensure that the
children of an actor get actually destroyed.
ClutterActor has a background-color property now; we should use it for
the Stage, re-implement the color property in terms of background-color,
and deprecate the Stage property.
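Users would then simply do (sketch):

ClutterColor black = { 0, 0, 0, 255 };

clutter_actor_set_background_color (CLUTTER_ACTOR (stage), &black);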
Being able to easily set the number of repeats has been a request for
the animation framework for some time now. The usual way to implement
this is: connect to the ::completed signal, use a static counter, and
stop the timeline when the counter hits a specific spot.
In the same light as the :auto-reverse property, we can make it easier
to implement this common functionality by adding a :repeat-count
property that, when set, limits the number of loops that a Timeline can
perform before stopping itself.
In fact, we can implement the :loop property in terms of the
:repeat-count property just by using a sentinel value mapping to
"infinity", and map loop=FALSE to repeat-count=0, and loop=TRUE to
repeat-count=-1.
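Something along these lines (sketch, assuming the property comes with
the usual setter):

/* repeat twice after the first run, i.e. three runs in total */
clutter_timeline_set_repeat_count (timeline, 2);

/* equivalent to the old loop=TRUE */
clutter_timeline_set_repeat_count (timeline, -1);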
The clutter_timeline_clone() method was a pretty dumb idea when it was
introduced, back when we still had the ClutterEffectTemplate and the
clutter_effect_* animation API. It has since become an API wart: we
cannot change or add new properties to be cloned without the risk of
breaking existing code. All in all, cloning a GObject is just a matter
of calling g_object_new() with the wanted properties.
Let's deprecate this throwback of the Olden Days™, so that we can remove
it for good once we break for 2.0.
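I.e. something like this (sketch) is all a caller needs in place of
clutter_timeline_clone():

ClutterTimeline *copy =
  g_object_new (CLUTTER_TYPE_TIMELINE,
                "duration", clutter_timeline_get_duration (timeline),
                "loop", clutter_timeline_get_loop (timeline),
                NULL);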