For GObject properties, we follow the convention of all-lowercase,
dash-separated names. Those translate to underscores in getters/setters,
so exempt them from the newly added "camelcase" rule.
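For illustration (the class and property here are made up), a property
like 'icon-name' ends up as underscore identifiers on the JS side:

    const GObject = imports.gi.GObject;

    var ExampleIndicator = GObject.registerClass({
        Properties: {
            'icon-name': GObject.ParamSpec.string(
                'icon-name', 'icon-name', 'Icon name',
                GObject.ParamFlags.READWRITE, ''),
        },
    }, class ExampleIndicator extends GObject.Object {
        // the accessors use the underscore form, which the camelcase
        // rule would flag without the exemption
        get icon_name() {
            return this._iconName || '';
        }

        set icon_name(name) {
            this._iconName = name;
        }
    });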
https://gitlab.gnome.org/GNOME/gnome-shell/merge_requests/627
While we aren't using those destructured variables, they are still useful
to document the meaning of those elements. We don't want eslint to keep
warning about them though, so mark them accordingly.
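For example (the helper and the trailing-underscore marker are
illustrative; the marker just needs to match the varsIgnorePattern
configured for "no-unused-vars"):

    // _getItemGeometry() is a hypothetical helper returning
    // [x, y, width, height]; naming the first two elements documents
    // what the array contains even though only the size is used
    const [x_, y_, width, height] = this._getItemGeometry(item);
    log(`item is ${width}x${height}`);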
https://gitlab.gnome.org/GNOME/gnome-shell/merge_requests/627
Those unused arguments aren't bugs - unbeknownst to eslint, they all
correspond to valid signal parameters - but they don't contribute
anything to clarity, so just remove them anyway.
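For example (the handler is illustrative):

    // before: 'notify::hover' passes (object, pspec), neither used
    this._button.connect('notify::hover', (o, pspec) => this._syncHover());

    // after
    this._button.connect('notify::hover', () => this._syncHover());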
https://gitlab.gnome.org/GNOME/gnome-shell/merge_requests/627
Braces are optional for single-line arrow functions, but there's a
subtle difference:
Without braces, the expression is implicitly used as return value; with
braces, the function returns nothing unless there's an explicit return.
We currently reflect that in our style by only omitting braces when the
function is expected to have a return value, but that's not very obvious,
not an important differentiation to make, and not easy to express in an
automatic rule.
So just omit braces consistently as mandated by gjs' coding style.
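To illustrate the difference:

    const items = [{ visible: true }, { visible: false }];

    // without braces, the expression's value is returned implicitly
    items.filter(item => item.visible);        // -> [{ visible: true }]

    // with braces, nothing is returned without an explicit `return`
    items.filter(item => { item.visible; });   // -> []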
https://gitlab.gnome.org/GNOME/gnome-shell/merge_requests/608
Commit f285f2c6 changed Scripting.createTestWindow() to accept a parameter
object instead of a parameter list but forgot to remove the width and height
arguments. This breaks the "core" test as all windows are created with default
settings.
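Roughly (the exact arguments are assumed, surrounding control flow
omitted):

    // broken: the leftover positional arguments mean the function
    // sees 640 as its params object and falls back to defaults
    Scripting.createTestWindow(640, 480, { maximized: true });

    // fixed: everything goes into the single params object
    Scripting.createTestWindow({ width: 640, height: 480,
                                 maximized: true });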
Instead of always logging frame timestamps for every frame - which
was using >26 bytes of memory per frame, or 5MB per hour of continuous
redrawing - make frame timestamps something that defaults off and is
turned on using a new ShellGlobal::frame-timestamps property by
the perf scripts.
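A perf script can then opt back in with something like:

    // GObject property, so the dash maps to an underscore in JS;
    // defaults to false
    global.frame_timestamps = true;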
https://bugzilla.gnome.org/show_bug.cgi?id=732350
js2-mode is no longer developed and we recommend js-mode these days,
so switch the modelines to specify that, and make them consistent
across all files.
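I.e. every file now starts with a modeline along the lines of
(exact options assumed):

    /* -*- mode: js; js-indent-level: 4; indent-tabs-mode: nil -*- */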
https://bugzilla.gnome.org/show_bug.cgi?id=660358
Add metrics:
overviewFps5Windows
overviewFps10Windows
overviewFps5Maximized
overviewFps10Maximized
overviewFps5Alpha
overviewFps10Alpha
These give more data points for how performance varies with the
number of windows and with different window types (maximized, with
an alpha channel).
https://bugzilla.gnome.org/show_bug.cgi?id=644265
* Run gnome-shell-perf-helper during performance tests
* Use MUTTER_WM_CLASS_FILTER to omit all other windows
* Add new Scripting methods: createTestWindow,
waitTestWindows, destroyTestWindows
* Create a single 640x480 test window for testing overview
animation performance.
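In the script this looks roughly like the following sketch
(asynchronous control flow omitted):

    // put the desktop into a known state before measuring
    Scripting.createTestWindow(640, 480);   // single 640x480 window
    Scripting.waitTestWindows();            // wait for the helper to map it

    Main.overview.show();                   // ... the measured actions ...

    Scripting.destroyTestWindows();         // clean up afterwards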
https://bugzilla.gnome.org/show_bug.cgi?id=644265
Compute a frame rate for the period between:
- User sees first frame of overview animation
- User sees fully zoomed-out overview
And replace the current Frames count metrics with this. The
previous frame count metrics were actually over the period from
the start of the animation until the window labels finished
animating in; here we are careful to look at a more restricted
period.
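Sketch of the computation (handler and metric names are hypothetical;
log timestamps are taken to be microseconds):

    let overviewShowStart, overviewShowDone;
    let overviewFrames = 0;

    function script_overviewShowStart(time) {
        overviewShowStart = time;
    }

    function script_overviewShowDone(time) {
        overviewShowDone = time;
    }

    function clutter_paintDone(time) {
        if (overviewShowStart && !overviewShowDone)
            overviewFrames++;
    }

    function finish() {
        let fps = overviewFrames /
            ((overviewShowDone - overviewShowStart) / 1000000);
        // ... reported as the overviewFpsFirst metric
    }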
https://bugzilla.gnome.org/show_bug.cgi?id=619521
Switch from having separate METRICS and METRIC_DESCRIPTIONS objects
in a perf module to a single METRICS array. This is done so the
perf module can define the units for each metric.
In addition to improving the output in the web interface, the purpose
of having units is to give some clue about how to pick from multiple
values from different runs. In particular, with the assumption that
"noise" on the system will increase run times, for time values we want
to pick the smallest values, while for "rate" values, we want to pick
the largest value.
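For example (the entries and field names here are illustrative):

    let METRICS = [
        { name: 'overviewLatencyFirst',
          description: 'Time to first frame of overview animation',
          units: 'us' },
        { name: 'overviewFpsFirst',
          description: 'Frame rate of first overview animation',
          units: 'frames / s' },
    ];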
https://bugzilla.gnome.org/show_bug.cgi?id=618189
Add some basic statistics for allocated memory based on mallinfo(),
and use that to define two metrics:
usedAfterOverview: bytes used after the overview is shown once
leakedAfterOverview: additional bytes used when the overview is
shown a second time.
https://bugzilla.gnome.org/show_bug.cgi?id=618189
We want to be able to summarize the behavior of the shell's
performance in a series of "metrics", like the latency between
clicking on the Activities button and seeing a response.
This patch adds the ability to create a script under perf/
in a special format that automates a series of actions in the
shell, writing events to the performance log, then collects
statistics as the log is replayed and turns them into a set
of metrics.
The script is then executed by running gnome-shell
--perf=<script>.
The 'core' script which is added here will be the primary
performance measurement script that we use for shell performance
regression testing; right now it has a couple of placeholder
metrics.
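In outline, such a script provides the actions to run plus handlers
that are called back while the log is replayed (a rough sketch, not
the actual 'core' script; helper, handler and event names are
illustrative):

    const Main = imports.ui.main;
    const Scripting = imports.ui.scripting;

    function run() {
        // automate shell actions, marking interesting points with
        // script events that end up in the performance log
        Scripting.defineScriptEvent('overviewShowStart',
                                    'Starting to show the overview');
        Scripting.scriptEvent('overviewShowStart');
        Main.overview.show();
    }

    // called during replay for each matching event in the log
    function script_overviewShowStart(time) {
        // record the timestamp ...
    }

    function finish() {
        // ... and turn what was recorded into metric values
    }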
https://bugzilla.gnome.org/show_bug.cgi?id=618189