Need help on 16 bit TrueColor

Post by Quyen D Nguyen » Thu, 10 Feb 2000 04:00:00


I have a problem dealing with color in 16-bit TrueColor mode. I hope
someone can give me some hints on how to solve it.

Several years ago, when I had an X terminal on my desk, I developed a
simple image display program to display a 2-D image (8 bits per sample).
To display this pseudo gray-scale image, I used a 256-entry colormap
with the XStoreColors function. Because at that time I only had access
to an 8-bit PseudoColor visual (a cheap X terminal), the XStoreColors
method worked fine.  A few months ago, I got a new SUN workstation with
a 24-bit display card. The original code did not work because the
colormap is not writable. I modified the code to use XAllocColor to look
up the pixel value of each required color and map the image data before
copying it to the pixmap. On the SUN workstation, the new code works
fine. It's nice that I no longer have the flashing-color problem; but
each time I update the color table to display the original image in a
different color scheme, I have to redraw the pixmap. My SUN workstation
is fast enough that this is not a big problem (however, I would love to
hear of any way to update the color table without redrawing the pixmap).

My real problem at this time is porting the new code to my Linux
machine. Because of its limited (or cheap) display card, I can only run
the X server at 16 bits. When I call XAllocColor, the returned color is
only an approximation. Moreover, since my 8-bit image is gray scale and
some returned colors do not have equal red, green and blue components,
my gray-scale image becomes a strange pseudo-color image.  The man page
for XAllocColor says that it returns the pixel value of the color
"closest" to the specified RGB elements supported by the hardware.  How
do I locate the correct color on my Linux machine?

In a different version, I avoided XAllocColor and instead computed the
pixel value of each required color from its red, green and blue
components (using the mask of each color channel). The display shows the
same image as when I use XAllocColor.
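For reference, here is a minimal sketch of the mask-based computation I
mean. The masks shown in the comments are only an example of a 5-6-5
layout; the real values should come from the red_mask, green_mask and
blue_mask fields of the XVisualInfo structure:

```c
#include <assert.h>

/* Scale an 8-bit component into an arbitrary channel mask and shift
   it into place.  The shift and width are derived from the mask
   itself, so this works for any TrueColor layout. */
static unsigned long put_component(unsigned long mask, unsigned int value8)
{
    unsigned int shift = 0, width = 0;
    unsigned long m = mask;

    while (!(m & 1)) { m >>= 1; shift++; }   /* trailing zeros  */
    while (m & 1)    { m >>= 1; width++; }   /* contiguous ones */

    /* scale 0..255 into 0..(2^width - 1), rounding to nearest */
    return (unsigned long)((value8 * ((1u << width) - 1) + 127) / 255)
           << shift;
}

/* Combine three 8-bit components into one pixel value, e.g. with
   masks 0xF800 / 0x07E0 / 0x001F for a 5-6-5 visual. */
unsigned long pixel_from_rgb(unsigned long rmask, unsigned long gmask,
                             unsigned long bmask,
                             unsigned int r8, unsigned int g8,
                             unsigned int b8)
{
    return put_component(rmask, r8) |
           put_component(gmask, g8) |
           put_component(bmask, b8);
}
```

With the 5-6-5 masks above, gray 255 maps to pixel 0xFFFF, gray 0 to
pixel 0, and pure red 255 to 0xF800.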

I notice that on the Linux machine the return value of DisplayCells is
only 64, while on the SUN workstation it is 256. Since my image is 8
bits per sample, I want to use a color table of 256 entries; each entry
will represent the pseudo color of an image pixel. Could my problem be
related to only 64 colors being available from the X server?  Is there a
way to start KDE with 256 colors on my Linux machine?
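Whatever DisplayCells reports, on a TrueColor visual I would still
expect to be able to precompute my own 256-entry table of pixel values,
one per 8-bit gray level, along these lines (a sketch assuming a 5-6-5
layout; the shift counts would really be derived from the visual's
color masks):

```c
#include <assert.h>

/* Precompute one 16-bit pixel value per 8-bit gray level, assuming
   a 5-6-5 TrueColor layout (red 5 bits, green 6 bits, blue 5 bits). */
void build_gray_lut(unsigned long lut[256])
{
    int g;
    for (g = 0; g < 256; g++) {
        unsigned long r5 = (unsigned long)g >> 3;  /* 8 -> 5 bits */
        unsigned long g6 = (unsigned long)g >> 2;  /* 8 -> 6 bits */
        unsigned long b5 = (unsigned long)g >> 3;  /* 8 -> 5 bits */
        lut[g] = (r5 << 11) | (g6 << 5) | b5;
    }
}
```

Mapping each image byte through such a table before writing it into the
XImage would avoid calling XAllocColor entirely.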

For a 24-bit TrueColor visual, it is clear that I can use three bytes to
represent the three basic colors, red, green and blue. For the 16-bit
TrueColor visual on my Linux machine, the red and green colors are
stored in 5-bit fields and the blue color in a 6-bit field (a strange
combination, but these values come from the color mask fields in the
visual structure).  Based on this, I should be able to compute the pixel
value for any color given its red, green and blue components under
16-bit TrueColor. But it did not work. Where did I go wrong?
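One thing I did check: with unequal bit widths per channel, the
quantization error alone should not explain the color shift. A quick
round-trip calculation (using a 5-6-5 split as an example, with the
standard bit-replication expansion back to 8 bits) shows the channels
stay within a few counts of each other over the whole gray ramp:

```c
#include <assert.h>
#include <stdlib.h>

/* Expand a 5- or 6-bit channel back to 8 bits by bit replication. */
static int expand5(int v) { return (v << 3) | (v >> 2); }
static int expand6(int v) { return (v << 2) | (v >> 4); }

/* Largest difference between any two expanded channels over all 256
   gray levels, for a 5-6-5 quantization of an 8-bit gray ramp. */
int max_gray_channel_diff(void)
{
    int g, worst = 0;
    for (g = 0; g < 256; g++) {
        int r8 = expand5(g >> 3);
        int g8 = expand6(g >> 2);
        int b8 = expand5(g >> 3);
        int d1 = abs(r8 - g8);
        int d2 = abs(r8 - b8);
        if (d1 > worst) worst = d1;
        if (d2 > worst) worst = d2;
    }
    return worst;
}
```

The worst case works out to only 4 counts out of 255, so this small
quantization error by itself should not produce a visibly pseudo-colored
image.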

If anyone has any suggestions, please let me know. Thank you in advance
for your time.

On my Linux machine, if I force the display to run at 8 bits, my
original code (using the 8-bit PseudoColor method) works fine.  I would
like to rewrite my application so it can run on TrueColor visuals at
either 24 bits or 16 bits. So far, the 16-bit TrueColor visual has given
me a headache.

Quyen D Nguyen

work phone:  818-354-9526
office: T 1025 D