[Xfce-bugs] [Bug 922] Weird "banding" problem rendering smooth gradients in desktop backdrop and gqview
bugzilla-daemon at xfce.org
Tue Apr 26 02:30:34 CEST 2005
Do NOT reply to this email. To make further comments on this bug, use
the URL below:
http://bugzilla.xfce.org/show_bug.cgi?id=922
------- Additional Comments From bjt23 at cornell.edu 2005-04-26 00:30 UTC -------
Yep, as I expected. The 16-bit display is the problem. AFAIK, there are a
few possibilities here:
gdk_pixbuf_scale() sucks at lower bit depths.
gdk_pixbuf_composite() sucks at lower bit depths.
gdk_pixbuf_render_pixmap_and_mask() sucks at lower bit depths.
Now... The GdkPixbuf that's initially created for the backdrop uses 8 bits per
channel, and _scale() and _composite() shouldn't be able to change that, and
shouldn't care about the X visual being used (as they're all client-side).
So that leaves gdk_pixbuf_render_pixmap_and_mask(). Perhaps it's not dithering
properly?
OR, maybe I'm going about this the wrong way. Maybe it's dithering like it's
supposed to, but that's just a bad thing to make it do. Perhaps the initial
GdkPixbuf should be created with the bit depth of the X visual used for the
desktop window. Of course, this would totally break the current gradient code,
and make writing a replacement a PITA.
Anyone else on xfce-bugs with a clue, feel free to chime in ^_~.