Always force-track the cursor position (so that the X11 backend can keep
it up to date), and if the cursor wasn't part of the sampled
framebuffer when reading pixels into CPU memory, draw it in an extra
pass using cairo after the fact. The cairo-based cursor painting only
happens on the X11 backend, as we otherwise inhibit the hardware cursor.
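In rough terms, the extra pass amounts to compositing the cursor sprite
onto the read-back pixels with cairo, something like the sketch below
(function and buffer names are illustrative, not Mutter's actual code,
and both buffers are assumed to be premultiplied ARGB32):

    #include <cairo.h>
    #include <stdint.h>

    /* Paint a cursor sprite on top of pixels already read back into
     * CPU memory. */
    static void
    paint_cursor_on_frame (uint8_t *frame_data,
                           int      frame_width,
                           int      frame_height,
                           int      frame_stride,
                           uint8_t *cursor_data,
                           int      cursor_width,
                           int      cursor_height,
                           int      cursor_stride,
                           int      cursor_x,
                           int      cursor_y)
    {
      cairo_surface_t *frame_surface;
      cairo_surface_t *cursor_surface;
      cairo_t *cr;

      frame_surface =
        cairo_image_surface_create_for_data (frame_data,
                                             CAIRO_FORMAT_ARGB32,
                                             frame_width, frame_height,
                                             frame_stride);
      cursor_surface =
        cairo_image_surface_create_for_data (cursor_data,
                                             CAIRO_FORMAT_ARGB32,
                                             cursor_width, cursor_height,
                                             cursor_stride);

      /* Composite the cursor at its on-screen position. */
      cr = cairo_create (frame_surface);
      cairo_set_source_surface (cr, cursor_surface, cursor_x, cursor_y);
      cairo_paint (cr);
      cairo_destroy (cr);

      cairo_surface_destroy (cursor_surface);
      cairo_surface_destroy (frame_surface);
    }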
https://gitlab.gnome.org/GNOME/mutter/-/merge_requests/1391
During animations or other situations where multiple frames are painted
in a row, we might skip recording frames if the maximum framerate is
reached.
Doing so means we might end up skipping the last frame in a series, so
that the last frame we sent is not actually the final one, making things
appear to get stuck sometimes.
Handle this by arming a timeout whenever we throttle; when the timeout
callback is triggered, make sure an up-to-date frame is eventually
sent.
This is handled differently depending on the source type. A monitor
source type reports 1x1 pixel damage on each view its monitor overlaps,
while a window source type simply records a frame from the surface
directly, but without recording a timestamp, so that timestamps always
refer to when damage actually happened.
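The follow-up timeout idea looks roughly like this, sketched with
hypothetical names (StreamSrc, record_frame) and an arbitrary 100 ms
delay rather than Mutter's actual types or timing:

    #include <glib.h>

    /* Hypothetical per-stream state. */
    typedef struct
    {
      guint follow_up_timeout_id;
    } StreamSrc;

    static void
    record_frame (StreamSrc *src)
    {
      /* The real code would read back or blit pixels into a PipeWire
       * buffer here. */
    }

    static gboolean
    follow_up_timeout_cb (gpointer user_data)
    {
      StreamSrc *src = user_data;

      /* Make sure an up-to-date frame is eventually sent even though
       * the previous one was dropped due to throttling. */
      src->follow_up_timeout_id = 0;
      record_frame (src);

      return G_SOURCE_REMOVE;
    }

    static void
    maybe_record_frame (StreamSrc *src,
                        gboolean   max_framerate_reached)
    {
      if (max_framerate_reached)
        {
          /* Throttled: arm a one-shot timeout so the last frame in a
           * burst is not lost. */
          if (src->follow_up_timeout_id == 0)
            src->follow_up_timeout_id =
              g_timeout_add (100, follow_up_timeout_cb, src);
          return;
        }

      /* Recording now; any pending follow-up is no longer needed. */
      if (src->follow_up_timeout_id != 0)
        {
          g_source_remove (src->follow_up_timeout_id);
          src->follow_up_timeout_id = 0;
        }

      record_frame (src);
    }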
https://gitlab.gnome.org/GNOME/mutter/-/merge_requests/1361
Now that we don't use the record function to return early depending on
implicit state (for example not recording pixels if only the cursor
moved), let it simply report an error when it fails, as we should no
longer ever return without pixels unless something actually failed.
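The shape of the change, sketched with made-up names (record_to_buffer,
read_pixels_into) rather than the actual functions:

    #include <gio/gio.h>
    #include <stdint.h>

    typedef struct _StreamSrc StreamSrc;

    /* Hypothetical pixel read-back standing in for the real recording
     * code; returns FALSE on failure. */
    static gboolean
    read_pixels_into (StreamSrc *src,
                      uint8_t   *data)
    {
      return TRUE;
    }

    /* A FALSE return now always means an actual failure, never "there
     * was nothing to record". */
    static gboolean
    record_to_buffer (StreamSrc  *src,
                      uint8_t    *data,
                      GError    **error)
    {
      if (!read_pixels_into (src, data))
        {
          g_set_error (error, G_IO_ERROR, G_IO_ERROR_FAILED,
                       "Failed to read back frame pixels");
          return FALSE;
        }

      return TRUE;
    }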
https://gitlab.gnome.org/GNOME/mutter/-/merge_requests/1361
Both do more or less the same thing but with different methods - one
puts pixels into a buffer using the CPU, the other puts pixels into a
buffer using the GPU.
However, they behave slightly differently, which they shouldn't.
Let's first address the misleading disconnect in naming; later we'll
make them behave more similarly.
https://gitlab.gnome.org/GNOME/mutter/-/merge_requests/1361
Implement PipeWire's add_buffer and remove_buffer: try to export
a DMA buffer first and, on failure, fall back to memfd.
When DMA buffers are successfully created and shared, blit the
framebuffer contents when drawing instead of downloading the pixels.
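A rough sketch of the fallback logic, assuming a hypothetical
try_export_dma_buf() helper in place of the Cogl-based DMA buffer
export, and glossing over size and stride negotiation:

    #define _GNU_SOURCE
    #include <pipewire/pipewire.h>
    #include <spa/buffer/buffer.h>
    #include <sys/mman.h>
    #include <unistd.h>

    /* Hypothetical helper: export a GPU buffer as a DMA-BUF file
     * descriptor, or return -1 if that is not possible. */
    static int
    try_export_dma_buf (size_t size)
    {
      return -1;
    }

    static void
    on_add_buffer (void             *user_data,
                   struct pw_buffer *buffer)
    {
      struct spa_data *d = &buffer->buffer->datas[0];
      size_t size = d->maxsize;
      int fd;

      fd = try_export_dma_buf (size);
      if (fd >= 0)
        {
          /* DMA-BUF path: frames are later produced by blitting the
           * framebuffer on the GPU instead of downloading pixels. */
          d->type = SPA_DATA_DmaBuf;
          d->fd = fd;
          d->data = NULL;
          return;
        }

      /* Fallback path: shared memory filled by CPU read-back. */
      fd = memfd_create ("screen-cast", MFD_CLOEXEC | MFD_ALLOW_SEALING);
      if (fd < 0 || ftruncate (fd, size) < 0)
        return;

      d->type = SPA_DATA_MemFd;
      d->fd = fd;
      d->data = mmap (NULL, size, PROT_READ | PROT_WRITE,
                      MAP_SHARED, fd, 0);
      if (d->data == MAP_FAILED)
        d->data = NULL;
    }

    static void
    on_remove_buffer (void             *user_data,
                      struct pw_buffer *buffer)
    {
      struct spa_data *d = &buffer->buffer->datas[0];

      if (d->type == SPA_DATA_MemFd && d->data)
        munmap (d->data, d->maxsize);
      if (d->fd >= 0)
        close (d->fd);
    }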
https://gitlab.gnome.org/GNOME/mutter/merge_requests/1086
Make the monitor implementation do things strictly related to its own
source type, leaving the Spa-related logic and cursor read-back in the
generic layer, later to be reused by the window source type
implementation.
https://gitlab.gnome.org/GNOME/mutter/merge_requests/413
The 'cursor-mode', which currently is limited to RecordMonitor(), allows
the user to do screen casts where the cursor is either hidden, embedded
in the framebuffer, or sent as PipeWire stream metadata.
The latter allows the user to get cursor updates sent, including the
cursor sprite, without requiring a stage paint each frame. Currently
this is done by using the cursor sprite texture and either reading
directly from it or, in case the texture is scaled, drawing it to an
offscreen framebuffer which is read from instead.
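Schematically, with made-up names in place of the actual Cogl texture
and offscreen framebuffer code, and with illustrative cursor mode names:

    #include <stdint.h>

    /* The three cursor modes offered by the screen cast API. */
    typedef enum
    {
      CURSOR_MODE_HIDDEN,    /* cursor is never part of the stream */
      CURSOR_MODE_EMBEDDED,  /* cursor is painted into each frame */
      CURSOR_MODE_METADATA,  /* cursor is sent as stream metadata */
    } CursorMode;

    /* Stand-ins for the Cogl texture and offscreen framebuffer code. */
    typedef struct _CursorSprite CursorSprite;

    static void
    read_sprite_texture (CursorSprite *sprite,
                         uint8_t      *data)
    {
      /* Read the pixels straight out of the sprite texture. */
    }

    static void
    draw_sprite_to_offscreen_and_read (CursorSprite *sprite,
                                       float         scale,
                                       uint8_t      *data)
    {
      /* Draw the sprite texture into an offscreen framebuffer at the
       * target scale, then read the pixels back from that. */
    }

    /* For the metadata mode, fetch the sprite pixels without requiring
     * a stage paint each frame. */
    static void
    read_cursor_sprite (CursorSprite *sprite,
                        float         scale,
                        uint8_t      *data)
    {
      if (scale == 1.0f)
        read_sprite_texture (sprite, data);
      else
        draw_sprite_to_offscreen_and_read (sprite, scale, data);
    }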
https://gitlab.gnome.org/GNOME/mutter/merge_requests/357
To be able to cast windows, which by definition can change size
dynamically, we need a way to specify video crop metadata that adjusts
to the window size whenever it changes.
Add VideoCrop support with a new optional hook `get_videocrop()` in the
`ScreenCastStreamSrcClass` which, if defined, lets the subclass specify
a rectangle for the video cropping area.
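A sketch of what such a hook can look like; the Meta-prefixed names and
the exact signature are assumptions based on the description above, not
copied from the actual header:

    #include <glib-object.h>

    typedef struct
    {
      int x, y, width, height;
    } MetaRectangle;

    typedef struct _MetaScreenCastStreamSrc MetaScreenCastStreamSrc;

    typedef struct
    {
      GObjectClass parent_class;

      /* ... other vfuncs ... */

      /* Optional: when implemented, the subclass reports the rectangle
       * of the negotiated video size that actually contains content, so
       * the generic layer can attach SPA video crop metadata to each
       * buffer. */
      gboolean (* get_videocrop) (MetaScreenCastStreamSrc *src,
                                  MetaRectangle           *crop_rect);
    } MetaScreenCastStreamSrcClass;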
https://gitlab.gnome.org/GNOME/mutter/merge_requests/306
This commit adds basic screen casting and remote desktop
functionality. This works by exposing two D-Bus API services:
org.gnome.Mutter.ScreenCast and org.gnome.Mutter.RemoteDesktop.
The remote desktop API is used to create remote desktop sessions. For
each session, a D-Bus object is created, and an application can manage
the session by sending messages to the session object. A remote desktop
session allows the user to emit input events using the D-Bus methods on the
session object. To get framebuffer content, the application should
create an associated screen cast session.
The screen cast API is used to create screen cast sessions. One can so
far either create stand-alone screen cast sessions, or a screen cast
session associated with a remote desktop session. A remote desktop
associated screen cast session is managed by the remote desktop session.
So far only remote desktop managed screen cast sessions are implemented.
Each screen cast session may have one or more streams. A screen cast
stream is a stream of buffers of some part of the compositor content.
So far API exists for creating streams of monitors and windows, but
only monitor streams are implemented.
When a screen cast session is started, one PipeWire stream is
created for each screen cast stream created for the session. When this
has happened, a PipeWireStreamAdded signal is emitted on the stream
object, passing a unique identifier. The application may use this
identifier to find the associated stream being advertised by the
PipeWire daemon.
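As a rough illustration of the client side: the object path and the
CreateSession() signature below are assumptions for the sake of the
example; only the service names, RecordMonitor() and PipeWireStreamAdded
appear in the text above.

    #include <gio/gio.h>

    int
    main (void)
    {
      GDBusConnection *connection;
      GVariant *result;
      const char *session_path;
      GError *error = NULL;

      connection = g_bus_get_sync (G_BUS_TYPE_SESSION, NULL, &error);
      if (!connection)
        g_error ("Failed to connect to session bus: %s", error->message);

      /* Ask the screen cast service for a new session object. */
      result = g_dbus_connection_call_sync (connection,
                                            "org.gnome.Mutter.ScreenCast",
                                            "/org/gnome/Mutter/ScreenCast",
                                            "org.gnome.Mutter.ScreenCast",
                                            "CreateSession",
                                            g_variant_new ("(a{sv})", NULL),
                                            G_VARIANT_TYPE ("(o)"),
                                            G_DBUS_CALL_FLAGS_NONE,
                                            -1, NULL, &error);
      if (!result)
        g_error ("CreateSession failed: %s", error->message);

      g_variant_get (result, "(&o)", &session_path);
      g_print ("Created screen cast session at %s\n", session_path);

      /* Next steps (not shown): call RecordMonitor() on the session
       * object to get a stream object, connect to its
       * PipeWireStreamAdded signal to learn the PipeWire stream
       * identifier, then start the session. */

      g_variant_unref (result);
      g_object_unref (connection);

      return 0;
    }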
The remote desktop and screen cast functionality must be explicitly
enabled at ./configure time by passing --enable-remote-desktop to
./configure. Doing this will build both screen cast and remote desktop
support.
To actually enable screen casting and remote desktop, the user must
enable the experimental feature. See
org.gnome.mutter.experimental-features.
https://bugzilla.gnome.org/show_bug.cgi?id=784199