cogl_read_pixels() no longer asserts that the format passed in is
RGBA_8888 but instead accepts any format. The appropriate GL enums for
the format are passed to glReadPixels so that OpenGL can perform a
conversion if necessary.
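
As a rough illustration of the approach (the helper name and mapping
below are a sketch, not the actual Cogl code), the requested
CoglPixelFormat is translated to a GL format/type pair before the
glReadPixels call:

    #include <cogl/cogl.h>   /* CoglPixelFormat, gboolean */
    #include <GL/gl.h>       /* GLenum, GL_RGBA, GL_RGB, GL_UNSIGNED_BYTE */

    /* Sketch only: pick the GL enums handed to glReadPixels for a
     * given Cogl pixel format. The real Cogl helper may differ. */
    static gboolean
    get_gl_format_and_type (CoglPixelFormat format,
                            GLenum *gl_format,
                            GLenum *gl_type)
    {
      switch (format)
        {
        case COGL_PIXEL_FORMAT_RGBA_8888:
        case COGL_PIXEL_FORMAT_RGBA_8888_PRE:
          *gl_format = GL_RGBA;
          *gl_type = GL_UNSIGNED_BYTE;
          return TRUE;
        case COGL_PIXEL_FORMAT_RGB_888:
          *gl_format = GL_RGB;
          *gl_type = GL_UNSIGNED_BYTE;
          return TRUE;
        default:
          /* Unsupported formats would be rejected or converted. */
          return FALSE;
        }
    }
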
It currently assumes glReadPixels will always give us premultiplied
data. This will usually be correct because the result of the default
blending operations for Cogl ends up with premultiplied data in the
framebuffer. However it is possible for the framebuffer to be in
whatever format depending on what CoglMaterial is used to render to
it. Eventually we may want to add a way for an application to inform
Cogl that the framebuffer is not premultiplied in case it is being
used for some special purpose.
If the requested format is not premultiplied then Cogl will convert
the data. The tests have been changed to read the data as premultiplied so
that they won't be affected by the conversion. Picking in Clutter has
been changed to use COGL_PIXEL_FORMAT_RGB_888 because it doesn't need
the alpha component. clutter_stage_read_pixels is left unchanged
because the application can't specify a format for it, so it seems to
make the most sense to store unpremultiplied values.
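
A minimal usage sketch of the two cases described above, assuming the
current cogl_read_pixels() signature (x, y, width, height, source,
format, pixels); the wrapper function is purely illustrative:

    #include <cogl/cogl.h>

    /* Hypothetical helper, just to show the call sites. */
    static void
    example_reads (int x, int y)
    {
      guint8 rgb[3], rgba[4];

      /* Picking only needs the color, so request RGB and skip the
       * alpha component entirely. */
      cogl_read_pixels (x, y, 1, 1,
                        COGL_READ_PIXELS_COLOR_BUFFER,
                        COGL_PIXEL_FORMAT_RGB_888,
                        rgb);

      /* Requesting a non-premultiplied format means Cogl will
       * unpremultiply (divide RGB by alpha) before returning data. */
      cogl_read_pixels (x, y, 1, 1,
                        COGL_READ_PIXELS_COLOR_BUFFER,
                        COGL_PIXEL_FORMAT_RGBA_8888,
                        rgba);
    }
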
http://bugzilla.openedhand.com/show_bug.cgi?id=1959