As all the relevant backends are expected to provide
ClutterPadButtonEvents, it makes no sense to split the information
apart; besides, all the other event fields remain available and might
be needed in the future.
https://bugzilla.gnome.org/show_bug.cgi?id=771098
Let the backend initialize the cursor tracker, and change all call
sites to get the cursor tracker from the backend instead of from the
screen. It wasn't associated with the screen anyway, so the API was
misleading.
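Roughly, the call sites change along these lines (a minimal sketch;
the accessor names are the ones assumed here):

    /* Before: the tracker was (misleadingly) fetched through the screen. */
    MetaCursorTracker *tracker = meta_cursor_tracker_get_for_screen (screen);

    /* After: ask the backend, which now owns and initializes the tracker. */
    MetaBackend *backend = meta_backend_get_default ();
    MetaCursorTracker *tracker = meta_backend_get_cursor_tracker (backend);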
https://bugzilla.gnome.org/show_bug.cgi?id=777732
And remove the wayland-specific handling. This works for both Wayland and
X11 (provided the compositor receives pad events through a passive grab
there).
https://bugzilla.gnome.org/show_bug.cgi?id=773779
Even without a compositor grab, key events may still be expected to
be processed by the compositor and not by applications, for instance
when using ctrl-alt-tab to keynav in the top bar. On X11, focus is
moved to the stage window in that case, so that events are processed
before they are dispatched by the window manager. On Wayland, we need
to handle this case ourselves, so make sure not to pass key events on
to Wayland clients in that case, and move the key focus back to the
stage when appropriate.
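Sketched out, the Wayland-side check looks roughly like this (the flag
and helper condition are illustrative assumptions, not the actual
code):

    gboolean bypass_wayland = FALSE;

    if (event->type == CLUTTER_KEY_PRESS || event->type == CLUTTER_KEY_RELEASE)
      {
        /* If the compositor itself should consume keyboard input, e.g.
         * while ctrl-alt-tab keynav is active in the top bar, don't hand
         * the event to Wayland clients. */
        if (compositor_wants_key_events)   /* illustrative condition */
          bypass_wayland = TRUE;
      }

    if (!bypass_wayland)
      meta_wayland_compositor_handle_event (compositor, event);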
https://bugzilla.gnome.org/show_bug.cgi?id=758167
Unsetting it in meta_display_handle_event() will make the pointer
emulation checks fail in TOUCH_END event handlers across clutter
actors; the sequence should still be considered pointer emulating
at that time.
As we don't have a way to hook in after Clutter's event handling,
unset/reset it lazily on the next pointer-emulating TOUCH_BEGIN
event instead; the checks would already fail on other sequences
anyway, even if the pointer-emulating touch ended earlier. The only
extra thing we need to take care of is sequence collision, at which
point it's safe to just unset the stored sequence if its new
incarnation isn't deemed pointer emulating.
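In rough terms, the lazy update on TOUCH_BEGIN would look like this
(the display field name is an illustrative assumption):

    case CLUTTER_TOUCH_BEGIN:
      sequence = clutter_event_get_event_sequence (event);

      if (clutter_event_is_pointer_emulated (event))
        display->pointer_emulating_sequence = sequence;   /* assumed field */
      else if (sequence == display->pointer_emulating_sequence)
        /* Sequence collision: the new incarnation of the stored sequence
         * is not pointer emulating, so drop the stale one. */
        display->pointer_emulating_sequence = NULL;
      break;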
https://bugzilla.gnome.org/show_bug.cgi?id=756754
They otherwise fall through paths that enable bypass_clutter; this
is necessary so they can be picked up by captured-event handlers
along the actor hierarchy.
https://bugzilla.gnome.org/show_bug.cgi?id=752248
When running as an X11 compositor, we do this for every event we see
on the X event stream. As a Wayland compositor we don't go through
that code path, but since we see all events we can easily do this on
motion events.
In fact, we don't even need this caching when we're a Wayland
compositor, since we can always find where the pointer is without a
round trip; but we're sharing the current monitor logic with the X
path, so let's keep it as is for now.
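The Wayland-side update amounts to something like this (the helper
name is a hypothetical placeholder for the shared current-monitor
logic):

    if (event->type == CLUTTER_MOTION)
      {
        gfloat x, y;

        clutter_event_get_coords (event, &x, &y);
        /* Refresh the cached "current monitor" from the pointer position;
         * hypothetical helper standing in for the shared logic. */
        update_current_monitor_from_position (screen, x, y);
      }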
https://bugzilla.gnome.org/show_bug.cgi?id=748478
The set/unset branches of meta_display_update_pointer_emulating_sequence()
have been split and put directly where they make sense. The pointer-emulated
sequence will be updated before processing CLUTTER_TOUCH_BEGIN, and
after processing CLUTTER_TOUCH_END; this way the checks on it hold
true during the whole lifetime of the sequence.
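The resulting ordering in the event path is roughly this (the set/unset
helper names are illustrative):

    /* Set before the event is processed, unset after it, so the
     * pointer-emulating checks hold for the entire sequence lifetime. */
    if (event->type == CLUTTER_TOUCH_BEGIN)
      maybe_set_pointer_emulating_sequence (display, event);   /* illustrative */

    ret = handle_event (display, event);                        /* existing path */

    if (event->type == CLUTTER_TOUCH_END)
      maybe_unset_pointer_emulating_sequence (display, event);  /* illustrative */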
https://bugzilla.gnome.org/show_bug.cgi?id=738411
We've long used a switch statement on the grab operation to determine
where events should go. The issue with MetaGrabOp is that it's a mixture
of a few different things, including event routing, state management,
and the behavior to choose during operations.
This leads to poorly defined event routing and hard-to-follow logic,
since it's sometimes unclear which events should go where, and our
utility methods for telling grab operations apart can be poorly named.
To fix this, establish the concept of an "event route", which describes
where events should be routed to.
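A sketch of what such an enum could look like; the exact set of routes
here is illustrative, not a definitive list:

    typedef enum {
      META_EVENT_ROUTE_NORMAL,          /* hand the event to Clutter / clients */
      META_EVENT_ROUTE_COMPOSITOR_GRAB, /* compositor-wide modal grab */
      META_EVENT_ROUTE_WINDOW_OP,       /* ongoing move/resize of a window */
      META_EVENT_ROUTE_WAYLAND_POPUP,   /* popup grab held by a Wayland client */
    } MetaEventRoute;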
Instead of returning a value based on whether or not we handled it, we
have this logic: either we have taken a grab on the window, in which
case we have a grab op and have handled it ourselves, or we did not take
a grab and *need* to replay the event to the window.
Handle this in events.c by checking the grab operation in the same way
that we check the other grab ops.
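On X11 this ends up as a choice between consuming and replaying the
event after the passive grab, roughly like the sketch below (the
surrounding variables are assumed to exist in the handler):

    if (took_grab)
      /* We started a grab op ourselves: keep the event and continue. */
      XAllowEvents (xdisplay, AsyncPointer, event_time);
    else
      /* No grab was taken: replay the event so the window receives it. */
      XAllowEvents (xdisplay, ReplayPointer, event_time);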
MetaGestureTracker has been separating the "did I handle the event?"
and "should the event be filtered out?" questions; merge these and
make handle_event() answer "should the event be handled only by me?".
If a sequence wasn't accepted yet by the gesture tracker, the event
will go through (i.e. not handled exclusively by the gesture tracker)
and will still be processed by Clutter, triggering gesture actions and
possibly moving the sequence into another state.
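At the call site the merged question reads roughly like this (sketch):

    /* TRUE means the gesture tracker claims the event exclusively, so it
     * must not be propagated any further; FALSE lets it go through so
     * Clutter can still run gesture actions on the sequence. */
    if (meta_gesture_tracker_handle_event (tracker, event))
      return;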
https://bugzilla.gnome.org/show_bug.cgi?id=733631
On X11 this function simply reports what the event stream already
tells us; on Wayland it implements a similar mechanism to determine
the "pointer emulating" sequence, i.e. the one to stick with when
implementing single-touch behavior.
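A typical caller implementing single-touch behavior would use it along
these lines (sketch):

    /* Only react to the pointer-emulating sequence; ignore extra touches. */
    sequence = clutter_event_get_event_sequence (event);

    if (!meta_display_is_pointer_emulating_sequence (display, sequence))
      return CLUTTER_EVENT_PROPAGATE;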
https://bugzilla.gnome.org/show_bug.cgi?id=733631
It isn't necessary. As an X11 compositor, we'll only see the event
if we have the grab on the window, anyway.
This was causing issues moving windows as a Wayland compositor.
It's been long enough. We can mandate support for these, at least
at build-time. The code doesn't actually compile without either
of these, so just consider that unsupported.
This doesn't particularly matter, since we fall through into a default
case that does nothing right below, but this matches the other paths
and it prevents us from falling into a trap if we add other event types
below.
If we start a grab op from a keybinding or menu, we'll handle the
ButtonPress and drop the grab right then, never giving the window a
chance to do what it needs to before the grab is dropped.
This means that if you use Alt+F7 to move a window around, move it
to a side-tiling or maximization area, and then left-click, it will
just hang there in the sky.
The reason we didn't simply use gdk_window_add_filter directly before
is some twisted idea that any GDK symbol being used from
core/ is a layer violation. While we certainly want to keep any
serious GDK code out of ui/, event handling is quite important
to have in core/, so simply use a GDK event filter directly.
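Hooking the filter up amounts to something like the following sketch
(the handler passed the raw XEvent is a hypothetical placeholder for
the code that lives in events.c):

    #include <gdk/gdk.h>
    #include <gdk/gdkx.h>

    static GdkFilterReturn
    xevent_filter (GdkXEvent *xevent, GdkEvent *event, gpointer data)
    {
      MetaDisplay *display = data;

      /* Hand the raw XEvent to our own handling; hypothetical name. */
      meta_display_handle_xevent (display, (XEvent *) xevent);

      return GDK_FILTER_CONTINUE;
    }

    static void
    init_x11_event_handling (MetaDisplay *display)
    {
      /* A NULL window installs the filter for all X events we see. */
      gdk_window_add_filter (NULL, xevent_filter, display);
    }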
We previously separated out MetaDisplay and MetaScreen. mutter
would only manage one screen, but we still kept a list of screens
for simplicity.
With Wayland support, we no longer care about the ability to
manage more than one screen at a time. Remove this by killing
the list of screens, in favor of having just one MetaScreen
in MetaDisplay.
We also kill off active_screen at the same time, since it's
not necessary anymore.
A future cleanup should merge MetaDisplay and MetaScreen. To avoid
breaking API, we should probably keep MetaScreen around as a dummy
type.
display.c is getting a bit crowded. Move most of the handling
out to another file, events.c.
The long-term goal is to have generic event handling here, with
backend-specific handling for the types of windows and such.