If the client doesn't set a geometry using xdg_shell, we'll compute its
geometry based on its surface and subsurfaces.
Yet, we translate that into a window (re)size only when there is a pending
geometry, which we don't have when we computed the geometry ourselves.
Make sure we set the pending new geometry flag when the computed geometry
actually changed.
https://bugzilla.gnome.org/show_bug.cgi?id=782213
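A minimal sketch of the intended fallback (types and names here are
illustrative, not mutter's actual API): when no geometry was set via
xdg_shell, use the extents computed from the surface and its subsurfaces,
and raise the pending-new-geometry flag only when that result differs from
the current geometry.

  #include <glib.h>

  /* Illustrative types only; not mutter's real structures. */
  typedef struct {
    int x, y, width, height;
  } Rect;

  typedef struct {
    Rect     geometry;
    gboolean has_set_geometry;          /* geometry set via xdg_shell        */
    gboolean has_pending_new_geometry;  /* triggers the window (re)size path */
  } SurfaceState;

  static gboolean
  rect_equal (const Rect *a, const Rect *b)
  {
    return a->x == b->x && a->y == b->y &&
           a->width == b->width && a->height == b->height;
  }

  static void
  finish_geometry (SurfaceState *state, Rect computed_extents)
  {
    if (state->has_set_geometry)
      return;

    /* Fall back to the extents computed from the surface and subsurfaces,
     * flagging a pending new geometry only when it actually changed. */
    if (!rect_equal (&state->geometry, &computed_extents))
      {
        state->geometry = computed_extents;
        state->has_pending_new_geometry = TRUE;
      }
  }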
Commit 0fd9e38175 fixed setting the out parameter for the x coordinate
when using the X11 backend, but broke compilation when the backend is
not available ...
Really fix the issue by running the X11-specific code when the X11
backend is available and in use, and display the one-time warning
otherwise.
https://bugzilla.gnome.org/show_bug.cgi?id=781902
GLib now generates the prototypes for the generated marshallers, so it's
not necessary to include the header any more.
This fixes a build failure in GNOME Continuous with GLib master, caused
by -Werror=redundant-decls.
Due to an accidental swap of an else statement and a preprocessor #else,
the output x coordinate is currently only set when not using the X11
windowing system, whoops.
https://bugzilla.gnome.org/show_bug.cgi?id=781902
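Roughly, the broken arrangement looked like this (a contrived sketch with
made-up identifiers, not the actual mutter code); with the X11 backend
compiled in, the runtime else ends up owning the assignment that follows
the #endif:

  #include <glib.h>

  static void
  get_x_buggy (int *out_x, gboolean using_x11, int fallback_x)
  {
  #ifdef HAVE_X11
    if (using_x11)
      {
        /* X11-specific path: *out_x was supposed to be set here. */
      }
    else
  #else
      {
        g_warning ("Not implemented for this windowing system");
      }
  #endif
    /* With HAVE_X11 defined, this assignment becomes the body of the
     * `else` above, so *out_x is only set when NOT using X11. */
    *out_x = fallback_x;
  }

The follow-up fix runs the X11-specific code only when the X11 backend is
both compiled in and in use, and otherwise falls through to the one-time
warning with the out parameter always set.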
A client can still commit state to a destroyed subsurface. It won't
update anything on the screen, since the subsurface will not be
visible, but mutter should still handle it and not crash.
https://bugzilla.gnome.org/show_bug.cgi?id=781391
The ARB_robustness extension defined the following tokens as
returned by GetGraphicsResetStatusARB (see spec at [1]):
GUILTY_CONTEXT_RESET_ARB 0x8253
INNOCENT_CONTEXT_RESET_ARB 0x8254
UNKNOWN_CONTEXT_RESET_ARB 0x8255
These tokens might not be defined in some GL implementations,
such as Mesa 13's implementation of GLES 2.0, so we need to
define them ourselves so as not to break those builds.
[1] https://www.khronos.org/registry/OpenGL/extensions/ARB/ARB_robustness.txt
https://bugzilla.gnome.org/show_bug.cgi?id=781398
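In practice that just means providing fallback definitions with the values
from the spec (using the usual GL-header spellings) wherever the headers
don't already have them:

  /* Fallback definitions matching the ARB_robustness spec values, for GL
   * headers that don't provide these tokens. */
  #ifndef GL_GUILTY_CONTEXT_RESET_ARB
  #define GL_GUILTY_CONTEXT_RESET_ARB 0x8253
  #endif

  #ifndef GL_INNOCENT_CONTEXT_RESET_ARB
  #define GL_INNOCENT_CONTEXT_RESET_ARB 0x8254
  #endif

  #ifndef GL_UNKNOWN_CONTEXT_RESET_ARB
  #define GL_UNKNOWN_CONTEXT_RESET_ARB 0x8255
  #endif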
g_hash_table_insert() doesn't replace the key. This was a problem
because the key was owned by the value inserted into the hash table, so
when a value was removed, the key was freed, meaning that the key in
the hash table was now pointing to freed memory. Fix this by using
g_hash_table_replace() instead, which works the same except that it
replaces the key with the one passed. This means that the key of a
value in the hash table is always the key owned by the value.
https://bugzilla.gnome.org/show_bug.cgi?id=777732
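For reference, the difference in plain GLib terms (not mutter code): both
calls store the value, but only g_hash_table_replace() also swaps in the
new key, which matters when the key is owned by the value.

  #include <glib.h>

  typedef struct {
    char *id;   /* the key lives inside the value */
    int   data;
  } Entry;

  static void
  entry_free (gpointer data)
  {
    Entry *entry = data;

    g_free (entry->id);
    g_free (entry);
  }

  int
  main (void)
  {
    GHashTable *table =
      g_hash_table_new_full (g_str_hash, g_str_equal, NULL, entry_free);
    Entry *first = g_new0 (Entry, 1);
    Entry *second = g_new0 (Entry, 1);

    first->id = g_strdup ("output-1");
    second->id = g_strdup ("output-1");

    g_hash_table_replace (table, first->id, first);

    /* g_hash_table_insert() would keep first->id as the key while
     * entry_free() frees it together with `first`, leaving a dangling key.
     * g_hash_table_replace() stores second->id instead, so the key is
     * always the one owned by the stored value. */
    g_hash_table_replace (table, second->id, second);

    g_hash_table_destroy (table);
    return 0;
  }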
The guard for handling size differences between keys was broken; it
only checked whether the key passed as the second argument ended up being
shorter.
https://bugzilla.gnome.org/show_bug.cgi?id=777732
An inactive monitor will not be assigned to a logical monitor, so don't
try to match against those. This avoids dereferencing a NULL pointer when the
main output of an inactive monitor doesn't have an assigned CRTC.
https://bugzilla.gnome.org/show_bug.cgi?id=777732
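The guard boils down to something like this (illustrative types, not
mutter's real ones):

  #include <glib.h>

  typedef struct { int x, y; } Crtc;
  typedef struct { Crtc *crtc; } Output;
  typedef struct {
    Output   *main_output;
    gboolean  is_active;
  } Monitor;

  static gboolean
  monitor_matches_position (Monitor *monitor, int x, int y)
  {
    /* An inactive monitor is not assigned to any logical monitor and its
     * main output may have no CRTC, so never match it. */
    if (!monitor->is_active)
      return FALSE;

    return monitor->main_output->crtc->x == x &&
           monitor->main_output->crtc->y == y;
  }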
Instead of looking at the GTK+ settings, check the logical monitor
state and determine the UI scaling factor given the maximum logical
monitor scale. This is only enabled when the monitor config manager
feature is enabled, as only then can a scale be explicitly configured.
https://bugzilla.gnome.org/show_bug.cgi?id=777732
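Conceptually the calculation is just the maximum of the logical monitors'
scales, along these lines (hypothetical types):

  #include <glib.h>

  typedef struct { int scale; } LogicalMonitor;

  /* Hypothetical sketch: the UI scaling factor is the largest scale among
   * the current logical monitors, defaulting to 1 when there are none. */
  static int
  calculate_ui_scaling_factor (GList *logical_monitors)
  {
    int max_scale = 1;
    GList *l;

    for (l = logical_monitors; l; l = l->next)
      {
        LogicalMonitor *logical_monitor = l->data;

        max_scale = MAX (max_scale, logical_monitor->scale);
      }

    return max_scale;
  }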
This adds a function to be used by gnome-shell to get the logical
monitor given a connector name. For now, use the same integer index
method to reference a logical monitor, but this should be revisited by
providing a better API later.
https://bugzilla.gnome.org/show_bug.cgi?id=777732
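A hypothetical sketch of such a lookup (none of these names are mutter's
actual API), matching a logical monitor by the connector of its main
output:

  #include <glib.h>

  typedef struct {
    int   number;          /* the integer index handed out to gnome-shell */
    char *main_connector;  /* connector of the main output                */
  } LogicalMonitor;

  static int
  lookup_logical_monitor_number (GList      *logical_monitors,
                                 const char *connector)
  {
    GList *l;

    for (l = logical_monitors; l; l = l->next)
      {
        LogicalMonitor *logical_monitor = l->data;

        if (g_strcmp0 (logical_monitor->main_connector, connector) == 0)
          return logical_monitor->number;
      }

    return -1;  /* no logical monitor uses this connector */
  }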
The connector returned is the one of the main output. In other words,
for tiled monitors, it is the connector of the (0, 0) tile, and for
non-tiled, it is simply the connector of the output.
https://bugzilla.gnome.org/show_bug.cgi?id=777732
The UI scaling depends on whether the framebuffers are scaled. Enable
the caller to determine what scale its UI should be drawn in, in
relation to the stage coordinate space, by calling this function. A new
signal "ui-scaling-factor-changed" is added in order to listen for
changes.
https://bugzilla.gnome.org/show_bug.cgi?id=777732
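A consumer would listen for it roughly like this (which object emits the
signal, and the callback signature, are assumptions here):

  #include <glib-object.h>

  /* `backend` stands in for whatever GObject actually emits the
   * "ui-scaling-factor-changed" signal; the callback arguments are an
   * assumption. */
  static void
  on_ui_scaling_factor_changed (GObject  *backend,
                                gpointer  user_data)
  {
    /* Re-query the UI scaling factor and redraw at the new scale. */
  }

  static void
  watch_ui_scaling_factor (GObject *backend)
  {
    g_signal_connect (backend, "ui-scaling-factor-changed",
                      G_CALLBACK (on_ui_scaling_factor_changed), NULL);
  }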
Window scaling is a clutter feature used to enable automatic scaling of
stage windows when running as an application in a windowing system.
Clutter in mutter does not support running as a stand-alone application
toolkit, so let's remove this unused feature.
https://bugzilla.gnome.org/show_bug.cgi?id=777732
When told to, MetaMonitorConfigStore will save the current
configuration state by replacing the monitors-experimental.xml file
(while making a backup).
https://bugzilla.gnome.org/show_bug.cgi?id=777732
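A minimal sketch of that save strategy with plain GLib calls (the actual
implementation may differ): move the existing file aside as a backup, then
write the new contents.

  #include <glib.h>
  #include <glib/gstdio.h>

  static gboolean
  save_config_with_backup (const char  *path,
                           const char  *contents,
                           GError     **error)
  {
    /* ".backup" is an assumed suffix for illustration. */
    g_autofree char *backup_path = g_strconcat (path, ".backup", NULL);

    if (g_file_test (path, G_FILE_TEST_EXISTS))
      g_rename (path, backup_path);

    return g_file_set_contents (path, contents, -1, error);
  }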
Test that configuration works as expected when the backend doesn't
support handling the transform and an intermediate offscreen
framebuffer is used.
https://bugzilla.gnome.org/show_bug.cgi?id=777732
In order to test deriving the logical state from the underlying
configuration, as is always done on X11, make the test backend derive
the state when stage views are disabled.
https://bugzilla.gnome.org/show_bug.cgi?id=777732
Derive the logical monitor position not by looking at the main output
(the (0, 0) tile), but at the one that is placed in the top-left corner.
This might be a non-main output under certain transformations.
https://bugzilla.gnome.org/show_bug.cgi?id=777732
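In practice that amounts to taking the smallest CRTC origin among the
monitor's outputs, roughly (made-up types, assuming at least one output):

  #include <glib.h>

  typedef struct { int x, y; } Output;

  static void
  derive_logical_monitor_position (GList *outputs, int *out_x, int *out_y)
  {
    int min_x = G_MAXINT;
    int min_y = G_MAXINT;
    GList *l;

    for (l = outputs; l; l = l->next)
      {
        Output *output = l->data;

        /* Pick the top-left-most output origin rather than the main
         * ((0, 0) tile) output's, since a transform can put a non-main
         * output in the top-left corner. */
        min_x = MIN (min_x, output->x);
        min_y = MIN (min_y, output->y);
      }

    *out_x = min_x;
    *out_y = min_y;
  }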
Only the first output of the first monitor of the primary logical
monitor should be made primary. This fixes an issue where the wrong
logical monitor ended up as primary when the logical state was derived.
https://bugzilla.gnome.org/show_bug.cgi?id=777732