This function does the obvious thing on X11, and implements a similar
mechanism on Wayland to determine the "pointer emulating" sequence, i.e.
the one to stick with when implementing single-touch behavior.
https://bugzilla.gnome.org/show_bug.cgi?id=733631
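For illustration, a minimal sketch of the single-touch policy described
above, with hypothetical names (the sequence type and helpers are
illustrative, not mutter API): the first sequence that begins while none
is tracked becomes the pointer-emulating one until it ends.

    #include <stdbool.h>
    #include <stddef.h>

    /* Hypothetical opaque handle standing in for a touch sequence. */
    typedef struct _TouchSequence TouchSequence;

    static TouchSequence *pointer_emulating_sequence = NULL;

    /* On touch-begin: adopt the sequence if none is tracked yet. */
    static void
    touch_sequence_began (TouchSequence *sequence)
    {
      if (pointer_emulating_sequence == NULL)
        pointer_emulating_sequence = sequence;
    }

    /* On touch-end/cancel: stop tracking when our sequence ends. */
    static void
    touch_sequence_ended (TouchSequence *sequence)
    {
      if (pointer_emulating_sequence == sequence)
        pointer_emulating_sequence = NULL;
    }

    /* The check described above: is this the sequence to stick with? */
    static bool
    is_pointer_emulating_sequence (TouchSequence *sequence)
    {
      return sequence == pointer_emulating_sequence;
    }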
The current GNOME Shell Alt-F2 restart looks very messy and also
provides no indication to the user of what is going on. We need to
restart the compositor to switch in and out of stereo mode, so
add a framework for doing this more cleanly:
Additions:
- meta_restart(): restarts the compositor with a message
- MetaDisplay::show-restart-message: signals the embedding shell to
  show a message
- MetaDisplay::restart: signals the embedding shell to restart itself
- meta_is_restart(): indicates whether the current instance is a
  restart, so we can suppress login animations
A helper program meta-restart-helper holds the composite overlay
window up during the restart to avoid visual artifacts.
https://bugzilla.gnome.org/show_bug.cgi?id=733026
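A hedged sketch of how an embedding shell might use these entry points;
the exact signatures are assumptions based on the names above
(meta_restart() taking the message to show, meta_is_restart() returning
a boolean):

    #include <glib.h>

    /* Assumed prototypes matching the additions listed above. */
    void     meta_restart    (const char *message);
    gboolean meta_is_restart (void);

    static void
    switch_stereo_mode (void)
    {
      /* Restart the compositor; the MetaDisplay::show-restart-message
       * signal lets the shell display this message while it happens. */
      meta_restart ("Restarting to change stereo mode...");
    }

    static void
    maybe_run_login_animation (void)
    {
      /* Suppress the login animation when this instance is a restart. */
      if (meta_is_restart ())
        return;

      /* ... play the startup animation ... */
    }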
With get_input_region existing, get_input_rect is a misnomer. Really,
it's about the geometry of the output surface, and it's only used that
way in the compositor code.
Way back when in GNOME 3.2, get_input_rect was added when we added
invisible borders. get_outer_rect was always synonymous with server-side
geometry of the toplevel. get_outer_rect was used for both user-side
policy (the "frame rect") and to get the geometry of the window.
Invisible borders were meant to extend the input region of the frame
window silently. Since most users of get_outer_rect cared about the
frame rect, we kept that the same and added a new method, get_input_rect,
to get the full rect of the framed window with all the invisible borders
used for input included.
As time went on and CSD and Wayland became a reality, the relationship
between the server-side geometry and the "frame rect" became more
complicated, as evidenced by the recent commits. Since clients
don't tend to be framed anymore, they set their own input region.
get_buffer_rect is also a somewhat poor name, since X11 doesn't really
have buffers, but there aren't many better alternatives.
This doesn't change any of the code, nor the meaning. It will always
refer to the rectangle where the toplevel should be placed.
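For illustration, a hedged sketch of the distinction, assuming accessors
named meta_window_get_buffer_rect() and meta_window_get_frame_rect():
compositor code sizes the window actor from the buffer rect, while
user-visible policy works on the frame rect.

    #include <clutter/clutter.h>
    #include <meta/window.h>

    /* Compositor side: place the actor where the toplevel (frame window
     * or client buffer) is actually located. */
    static void
    sync_actor_geometry (MetaWindow *window, ClutterActor *actor)
    {
      MetaRectangle buffer_rect;

      meta_window_get_buffer_rect (window, &buffer_rect);
      clutter_actor_set_position (actor, buffer_rect.x, buffer_rect.y);
      clutter_actor_set_size (actor, buffer_rect.width, buffer_rect.height);
    }

    /* Policy side: user-visible geometry is the frame rect. */
    static gboolean
    window_is_half_screen_wide (MetaWindow *window, int screen_width)
    {
      MetaRectangle frame_rect;

      meta_window_get_frame_rect (window, &frame_rect);
      return frame_rect.width == screen_width / 2;
    }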
Avoid populating the *_VERSION constants through cflags in the pkg-config
file, as these could be overridden by the project using it. Properly
prefix the defines with META_ to make gi-scanner happy.
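A hedged sketch of the idea: the version numbers live in an installed
header rather than in the .pc file's Cflags, so a consumer's own CFLAGS
cannot override them (the file name and values are illustrative):

    /* meta-version.h -- generated at configure time; values illustrative. */
    #ifndef META_VERSION_H
    #define META_VERSION_H

    #define META_MAJOR_VERSION 3
    #define META_MINOR_VERSION 13
    #define META_MICRO_VERSION 0

    #endif /* META_VERSION_H */

Consumers then test META_MAJOR_VERSION and friends from the header
instead of relying on -D flags injected by pkg-config.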
When opening the window menu without an associated control - e.g.
by right-clicking the titlebar or by keyboard - using coordinates
for the menu position is appropriate. However, when the menu is
associated with a window button, the expected behavior in the shell
can be implemented much more easily with the full button geometry:
the menu will point to the center of the button's bottom edge
rather than align to the left/right side of the titlebar as it
does now, and the clickable area where a release event does not
dismiss the menu will match the actual clickable area in mutter.
So add an additional show_window_menu_for_rect() function and
use it when opening the menu from a button.
https://bugzilla.gnome.org/show_bug.cgi?id=731058
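A hedged sketch of the new entry point; the exact parameter list is an
assumption based on the description above:

    #include <meta/boxes.h>
    #include <meta/window.h>

    /* Hypothetical prototype: anchor the menu to a rectangle in
     * window frame coordinates instead of a single point. */
    void show_window_menu_for_rect (MetaWindow    *window,
                                    MetaRectangle *rect);

    /* When the menu is opened from a titlebar button, pass the button's
     * full geometry so the shell can point the menu at the center of
     * the button's bottom edge and match the clickable area. */
    static void
    open_menu_from_button (MetaWindow *window,
                           int x, int y, int width, int height)
    {
      MetaRectangle rect = { x, y, width, height };

      show_window_menu_for_rect (window, &rect);
    }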
The last commit added support for the "appmenu" button in decorations,
but didn't actually implement it. Add a new MetaWindowMenuType parameter
to the show_window_menu () functions and use it to ask the compositor
to display the app menu when the new button is activated.
https://bugzilla.gnome.org/show_bug.cgi?id=730752
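A hedged sketch of the new parameter; the enum values are assumptions
modelled on the two menus involved (the window management menu and the
fallback app menu):

    #include <meta/window.h>

    /* Which menu the compositor should display. */
    typedef enum
    {
      META_WINDOW_MENU_WM,   /* traditional window management menu */
      META_WINDOW_MENU_APP   /* fallback application menu */
    } MetaWindowMenuType;

    /* show_window_menu () then grows a menu-type argument, e.g.: */
    void show_window_menu (MetaWindow         *window,
                           MetaWindowMenuType  menu,
                           int                 x,
                           int                 y);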
We want to synchronize the button layouts of our server side
decorations and GTK+'s client side ones. However each currently
may contain buttons not supported by the other, which makes this
unnecessarily tricky.
So add support for a new "appmenu" button in the layout, to display
the fallback app menu in the decorations.
https://bugzilla.gnome.org/show_bug.cgi?id=730752
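For example, once the new button is recognized, the fallback app menu
can be requested through the usual button-layout setting; a minimal
sketch (the value string is illustrative):

    #include <gio/gio.h>

    int
    main (void)
    {
      GSettings *settings =
        g_settings_new ("org.gnome.desktop.wm.preferences");

      /* Tokens before the ':' go on the left edge of the titlebar,
       * tokens after it on the right edge. */
      g_settings_set_string (settings, "button-layout",
                             "appmenu:minimize,maximize,close");
      g_settings_sync ();

      g_object_unref (settings);
      return 0;
    }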
It looks weird to have the Alt+Space menu pop up under the cursor
instead of at the top-left corner of the window, and the Wayland request
will pass through the coordinates as well.
Add it to the compositor interface, and extend the
_GTK_SHOW_WINDOW_MENU ClientMessage to support it as well.
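A hedged sketch of the X11 side: a client asks the window manager to
show the menu at given root coordinates by sending the ClientMessage to
the root window. The data.l layout shown is an assumption for
illustration, not the documented protocol.

    #include <string.h>
    #include <X11/Xlib.h>

    static void
    request_window_menu (Display *xdisplay, Window root, Window window,
                         int device_id, int root_x, int root_y)
    {
      XClientMessageEvent xclient;

      memset (&xclient, 0, sizeof (xclient));
      xclient.type = ClientMessage;
      xclient.window = window;
      xclient.message_type = XInternAtom (xdisplay, "_GTK_SHOW_WINDOW_MENU",
                                          False);
      xclient.format = 32;
      xclient.data.l[0] = device_id;   /* assumed field layout */
      xclient.data.l[1] = root_x;      /* assumed field layout */
      xclient.data.l[2] = root_y;      /* assumed field layout */

      XSendEvent (xdisplay, root, False,
                  SubstructureRedirectMask | SubstructureNotifyMask,
                  (XEvent *) &xclient);
    }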
Grab operations are now always taken on the backend connection, and
this breaks GTK+'s event handling.
Instead of taking a grab op, just do the handling ourselves. The
GTK+ connection will get an implicit grab, which means pointer /
keyboard events won't be sent to the rest of mutter, which is good.
Compositors haven't been able to manage more than one screen for
quite a while. Merge MetaCompScreen into MetaCompositor, and update
the API to match.
We still keep MetaScreen in the public compositor API for compatibility
purposes.
We previously separated out MetaDisplay and MetaScreen. mutter
would only manage one screen, but we still kept a list of screens
for simplicity.
With Wayland support, we no longer care about the ability to
manage more than one screen at a time. Remove this by killing
the list of screens, in favor of having just one MetaScreen
in MetaDisplay.
We also kill off active_screen at the same time, since it's
not necessary anymore.
A future cleanup should merge MetaDisplay and MetaScreen. To avoid
breaking API, we should probably keep MetaScreen around as a dummy
type.
This is used for Wayland popup grabs.
The issue here is that we don't want the code that raises or focuses
windows based on mouse ops to run while a client has a grab.
We still keep the "old" grab infrastructure in place for now, but
ideally we'd replace it eventually with a better grab-op infrastructure.
... and individually. It turns out that updating the opaque region
was causing the shape region to be updated, which was causing a new
shape mask to be generated and uploaded to the GPU. Considering
GTK+ regenerates the opaque region on pretty much any focus change,
this is not good.
We need a MetaWaylandSurface to build a MetaSurfaceActor, but
we don't have one until we get the set_window_xid() call from
XWayland. On the other hand, plugins expect to see the window
actor right from when the window is created, so we need this
empty state.
Based on a patch by Jasper St. Pierre.
The input region was set on the shaped texture, but the shaped texture
was never picked properly, as it was never set to be reactive. Move the
pick implementation and reactivity to the MetaSurfaceActor, and update
the code everywhere else to expect a MetaSurfaceActor.
Traditionally, WMs unmap windows when minimizing them, and map them
when restoring them or wanting to show them for other reasons, like
upon creation.
However, as metacity morphed into mutter, we added a user option,
"live-window-previews", to keep windows mapped for the lifetime of the
window, so that previews of minimized windows can be shown in other
places, like Alt-Tab and Expose.
I removed this preference two years ago mechanically, by removing all
the if statements, but never went through and cleaned up the code so
that windows are simply mapped for the lifetime of the window -- the
"architecture" of the old code that maps and unmaps on show/hide was
still there.
Remove this now.
The one case we still need to be careful of is shaded windows, for which
we do still unmap the client window. In the future, we might want to
show previews of shaded windows in the overview and Alt-Tab. In that
case we'd also keep shaded windows mapped and could remove all the unmap
logic, but we'd need a more complex method of showing the shaded
titlebar, such as using a different actor.
At the same time, simplify the compositor interface by removing
meta_compositor_window_[un]mapped API, and instead adding/removing the
window on-demand.
https://bugzilla.gnome.org/show_bug.cgi?id=720631
We want to remove a bunch of auxiliary duties from the MetaWindowActor
and MetaSurfaceActor, including some details of how culling is done.
Move the unobscured region culling code to the MetaShapedTexture, which
helps the actor become "more independent".
https://bugzilla.gnome.org/show_bug.cgi?id=720631
We no longer unmap the toplevel windows during normal operation. The
toplevel state is tied to the window's lifetime.
Call meta_compositor_add_window / meta_compositor_remove_window instead...
Traditionally, WMs unmap windows when minimizing them, and map them
when restoring them or wanting to show them for other reasons, like
upon creation.
However, as metacity morphed into mutter, we added a user option,
"live-window-previews", to keep windows mapped for the lifetime of the
window, so that previews of minimized windows can be shown in other
places, like Alt-Tab and Expose.
I removed this preference two years ago mechanically, by removing all
the if statements, but never went through and cleaned up the code so
that windows are simply mapped for the lifetime of the window -- the
"architecture" of the old code that maps and unmaps on show/hide was
still there.
Remove this now.
The one case we still need to be careful of is shaded windows, for which
we do still unmap the client window. Theoretically, we might want to
show previews of shaded windows in the overview and Alt-Tab, so we leave
removing the complex unmap tracking for this to a later change.
Currently the only way to move a window to another monitor via
keyboard is to start a move operation and move it manually using
arrow keys. We do have all the bits of a dedicated keybinding in
place already, so offer it as a more comfortable alternative.
https://bugzilla.gnome.org/show_bug.cgi?id=671054
Make MetaWindowActor chain up to the generic default MetaCullable
implementation, and remove the helper methods for MetaSurfaceActor
and MetaShapedTexture.
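A hedged sketch of the chain-up, assuming a cull_out() vfunc on the
interface and a meta_cullable_cull_out_children() default helper (names
modelled on the MetaCullable interface mentioned above):

    #include <cairo.h>
    #include <meta/meta-cullable.h>   /* assumed header */

    static void
    meta_window_actor_cull_out (MetaCullable   *cullable,
                                cairo_region_t *unobscured_region,
                                cairo_region_t *clip_region)
    {
      /* No special cases anymore: let the generic default cull our
       * children (surface actor, shadow, ...) in stacking order. */
      meta_cullable_cull_out_children (cullable, unobscured_region,
                                       clip_region);
    }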
For clarity, rename meta_window_get_outer_rect() to match terminology
we use elsewhere. The old function is left as a deprecated
compatibility wrapper.
There are many places in the code where we convert between the client
rectangle and the frame rectangle. Instead of doing it manually, use new
helper functions on MetaWindow and the existing meta_window_get_outer_rect().
This fixes a number of bugs where the computation was being done
incorrectly; most of these bugs involve the recently added custom frame
extents, but some relate to invisible borders or simply to confusion
between the window and frame rectangles.
Switch the placement code to place the frame rectangle rather
than the client window - this simplifies things considerably.
https://bugzilla.gnome.org/show_bug.cgi?id=707194
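A hedged sketch of the kind of helpers meant here, assuming names along
the lines of meta_window_frame_rect_to_client_rect() and its inverse;
the point is that frame and invisible-border arithmetic lives in one
place:

    #include <meta/window.h>

    /* Assumed helper prototypes on MetaWindow. */
    void meta_window_frame_rect_to_client_rect (MetaWindow    *window,
                                                MetaRectangle *frame_rect,
                                                MetaRectangle *client_rect);
    void meta_window_client_rect_to_frame_rect (MetaWindow    *window,
                                                MetaRectangle *client_rect,
                                                MetaRectangle *frame_rect);

    /* Placement then operates purely on the frame rectangle and converts
     * once at the end, instead of open-coding border math everywhere. */
    static void
    place_window (MetaWindow *window, MetaRectangle frame_rect)
    {
      MetaRectangle client_rect;

      meta_window_frame_rect_to_client_rect (window, &frame_rect,
                                             &client_rect);
      /* ... configure the client window using client_rect ... */
    }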
Use the "hotplug_mode_update" connector property indicating that the
screen settings should be updated: get a new preferred mode on hotplug
events to handle dynamic guest resizing (where you resize the host
window and the guest resizes with it).
https://bugzilla.gnome.org/show_bug.cgi?id=711216
When X clients change the keyboard map, they also update a property on
the root window. We can notice that and rebuild our data structures with
the new values, as well as inform the Wayland clients.
This is a terrible hack, and it's not how we want to implement things
in 3.12, but it's enough to have the same keyboard layout in the
shell, in X clients and in Wayland clients in 3.10, until we decide
on the fate of the keyboard g-s-d plugin.
https://bugzilla.gnome.org/show_bug.cgi?id=707446
Switching meta/util.h to gi18n.h was wrong: mutter is a library and
needs gi18n-lib.h, but that cannot be included from a public header
(since it depends on config.h or command-line options), so split util.h
into a public and a private part.
https://bugzilla.gnome.org/show_bug.cgi?id=707897
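A hedged sketch of the resulting split (file contents abbreviated, names
illustrative): the installed header stays free of gettext macros, while
the private counterpart pulls in gi18n-lib.h.

    /* meta/util.h -- public, installed: no gettext macros here, since
     * gi18n-lib.h needs config.h / GETTEXT_PACKAGE to be defined. */
    void meta_warning (const char *format, ...);

    /* util-private.h -- not installed, only used inside mutter. */
    #include "config.h"
    #include <glib/gi18n-lib.h>   /* provides _() bound to GETTEXT_PACKAGE */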
clutter_stage_show_cursor()/hide_cursor() only work in the X11 backend
(where someone else is in charge of showing the cursor), and even then
they have confusing effects when running as a nested Wayland compositor,
so an abstraction layer is needed.
https://bugzilla.gnome.org/show_bug.cgi?id=707474
When we get a damage event we update the window by calling
meta_shaped_texture_update_area which queues a redraw on the actor.
We can avoid that for obscured regions by comparing the damage area to
our visible area.
This patch causes _NET_WM_FRAME_DRAWN messages not to be sent in some
cases where they should be; they will be added back in a later commit.
https://bugzilla.gnome.org/show_bug.cgi?id=703332
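A hedged sketch of the check using cairo regions (names illustrative):
only queue a redraw when the damaged rectangle actually touches the
unobscured part of the texture.

    #include <cairo.h>
    #include <glib.h>

    /* Returns TRUE if the damage intersects anything visible and a
     * redraw is therefore needed. */
    static gboolean
    damage_needs_redraw (cairo_region_t              *unobscured_region,
                         const cairo_rectangle_int_t *damage)
    {
      if (unobscured_region == NULL)
        return TRUE;   /* nothing known to obscure us; be conservative */

      return cairo_region_contains_rectangle (unobscured_region, damage)
             != CAIRO_REGION_OVERLAP_OUT;
    }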
When drawing entirely opaque regions, we traditionally kept blending on
simply because it made the code more convenient and obvious to handle.
However, this can cause serious performance issues on less powerful GPUs,
as they have to read back the buffer underneath.
Keep track of the opaque region set by windows (through _NET_WM_OPAQUE_REGION,
Wayland opaque_region hints, standard RGB32 frame masks or similar), and draw
those rectangles separately through a different path with blending turned off.
https://bugzilla.gnome.org/show_bug.cgi?id=707019
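A hedged sketch of the unblended path in Cogl: a second pipeline whose
blend string simply replaces the destination, used for the rectangles of
the opaque region (clipping to that region is omitted here).

    #include <glib.h>
    #include <cogl/cogl.h>

    static CoglPipeline *
    make_unblended_pipeline (CoglContext *ctx, CoglTexture *texture)
    {
      CoglPipeline *pipeline = cogl_pipeline_new (ctx);
      GError *error = NULL;

      cogl_pipeline_set_layer_texture (pipeline, 0, texture);

      /* Turn blending off: the fragment overwrites what is below, so
       * the GPU never has to read back the destination buffer. */
      if (!cogl_pipeline_set_blend (pipeline, "RGBA = ADD (SRC_COLOR, 0)",
                                    &error))
        {
          g_warning ("Failed to set blend string: %s", error->message);
          g_clear_error (&error);
        }

      return pipeline;
    }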
When running as a Wayland compositor, we can't use the X server's
IDLETIME, because that is only updated in response to X events. But we
see all the events ourselves, so we can just run the timer in-process.
https://bugzilla.gnome.org/show_bug.cgi?id=706005
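A hedged sketch of the in-process approach (names illustrative, not the
actual MetaIdleMonitor API): record a monotonic timestamp on every input
event and derive the idle time from it.

    #include <glib.h>

    static gint64 last_event_time_us;

    /* Called from the compositor's event handler for every input event. */
    static void
    note_user_activity (void)
    {
      last_event_time_us = g_get_monotonic_time ();
    }

    /* The moral equivalent of querying IDLETIME, fed by our own events. */
    static gint64
    get_idle_time_ms (void)
    {
      return (g_get_monotonic_time () - last_event_time_us) / 1000;
    }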
Remove the grab window and cursor from the API, and just always grab on
the stage window with no cursor.
This is mainly to remove the X11 usage in the public API, in preparation
for implementing this in Wayland.
https://bugzilla.gnome.org/show_bug.cgi?id=705917
Under X, we need to use XFixes to watch the cursor changing, while on
Wayland we're in charge of setting and painting the cursor ourselves.
MetaCursorTracker provides the abstraction layer for gnome-shell, which
can thus drop ShellXFixesCursor. In the future, it may grow the ability
to watch the pointer position too, especially if CursorEvents are added
to the next version of XInput2, in which case it would also replace the
PointerWatcher we use for gnome-shell's magnifier.
https://bugzilla.gnome.org/show_bug.cgi?id=705911
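On the X side, the tracking boils down to XFixes cursor-change
notifications; a minimal sketch (error handling and the Wayland path
omitted):

    #include <X11/Xlib.h>
    #include <X11/extensions/Xfixes.h>

    static void
    start_cursor_tracking (Display *xdisplay, Window root)
    {
      int event_base, error_base;

      if (!XFixesQueryExtension (xdisplay, &event_base, &error_base))
        return;

      /* Ask for an XFixesCursorNotify event whenever the cursor image
       * changes; on receipt, fetch the new image with
       * XFixesGetCursorImage() and hand it to interested parties. */
      XFixesSelectCursorInput (xdisplay, root,
                               XFixesDisplayCursorNotifyMask);
    }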
We want to show a dialog when a display change happens from the
control center. To do so, add a new vfunc to MetaPlugin and
call it when a configuration change is requested via DBus.
https://bugzilla.gnome.org/show_bug.cgi?id=705670
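A hedged sketch of how a plugin might implement the new vfunc; the vfunc
name and the completion call are assumptions based on the description
above.

    #include <meta/meta-plugin.h>

    /* Assumed vfunc: show a "keep these settings?" dialog, then report
     * the user's answer back to mutter (e.g. via a hypothetical
     * meta_plugin_complete_display_change (plugin, keep)). */
    static void
    my_plugin_confirm_display_change (MetaPlugin *plugin)
    {
      /* Pop up the confirmation dialog with a revert timeout here. */
    }

    static void
    my_plugin_class_init (MetaPluginClass *plugin_class)
    {
      /* Assumed member added by this change. */
      plugin_class->confirm_display_change = my_plugin_confirm_display_change;
    }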