11 Color

11.010 My texture map colors reverse blue and red, yellow and cyan, etc. What's happening?

Your texture image has the reverse byte ordering of what OpenGL is expecting. One way to handle this is to swap bytes within your code before passing the data to OpenGL.
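For example, a minimal in-place swap for an RGB image, assuming pixels, width, and height come from your own image loader, might look like:

void swapRedBlue (GLubyte *pixels, int width, int height)
{
   int i;

   /* Exchange the first (red) and third (blue) byte of each RGB pixel */
   for (i = 0; i < width * height; i++) {
      GLubyte temp = pixels[i*3];
      pixels[i*3] = pixels[i*3+2];
      pixels[i*3+2] = temp;
   }
}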

Under OpenGL 1.2, you may specify GL_BGR or GL_BGRA as the "format" parameter to glDrawPixels(), glGetTexImage(), glReadPixels(), glTexImage1D(), glTexImage2D(), and glTexImage3D(). In previous versions of OpenGL, this functionality might be available in the form of the EXT_bgra extension (using GL_BGR_EXT and GL_BGRA_EXT as the "format" parameter).
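For example, under OpenGL 1.2 a BGR-ordered image can be used directly as a texture; a sketch, assuming width, height, and pixels describe your image, is:

glTexImage2D (GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
              GL_BGR, GL_UNSIGNED_BYTE, pixels);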

11.020 How do I render a color index into an RGB window or vice versa?

There isn't a way to do this. However, you might consider opening an RGB window with a color index overlay plane, if it works in your application.

If you have an array of color indices that you want to use as a texture map, you might want to consider using GL_EXT_paletted_texture, which lets an application specify a color index texture map with a color palette.
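If the extension is available, a sketch of its use might look like the following; the palette and indices arrays, the texture dimensions, and the means of obtaining the glColorTableEXT entry point are assumed to come from your application:

/* palette: 256 RGB entries; indices: width*height array of 8-bit color indices */
glColorTableEXT (GL_TEXTURE_2D, GL_RGB8, 256,
                 GL_RGB, GL_UNSIGNED_BYTE, palette);
glTexImage2D (GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, width, height, 0,
              GL_COLOR_INDEX, GL_UNSIGNED_BYTE, indices);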

11.030 The colors are almost entirely missing when I render in Microsoft Windows. What's happening?

The most probable cause is that the Windows display is set to 256 colors. To increase the color depth, right-click on the desktop, select Properties, click the Settings tab, and set the Color Palette to a higher number of colors.

11.040 How do I specify an exact color for a primitive?

First, you'll need to know the bit depth of each component of the color buffer you're rendering to. For an RGB color buffer, you can obtain these values with the following code:

GLint redBits, greenBits, blueBits;

glGetIntegerv (GL_RED_BITS, &redBits);
glGetIntegerv (GL_GREEN_BITS, &greenBits);
glGetIntegerv (GL_BLUE_BITS, &blueBits);

If the depth of each component is at least as large as your required color precision, you can specify an exact color for your primitives. Place the color you want in the most significant bits of three unsigned integers and pass them to glColor3ui().

If your color buffer isn't deep enough to accurately represent the color you desire, you'll need a fallback strategy. Trimming off the least significant bits of each color component is an acceptable alternative. Again, use glColor3ui() (or glColor3us(), etc.) to specify the color with your values stored in the most significant bits of each parameter.
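For example, a sketch that specifies an exact 8-bit-per-component color (the component values here are placeholders) might look like:

GLubyte red = 128, green = 64, blue = 32;

/* Place each 8-bit component in the most significant bits of a 32-bit value */
glColor3ui ((GLuint)red << 24, (GLuint)green << 24, (GLuint)blue << 24);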

In either event, you'll need to ensure that any state that could affect the final color has been disabled. The following code will accomplish this:

glDisable (GL_BLEND);
glDisable (GL_DITHER);
glDisable (GL_FOG);
glDisable (GL_LIGHTING);
glDisable (GL_TEXTURE_1D);
glDisable (GL_TEXTURE_2D);
glDisable (GL_TEXTURE_3D);
glShadeModel (GL_FLAT);

11.050 How do I render each primitive in a unique color?

You need to know the depth of each component in your color buffer. The previous question contains the code to obtain these values. The depth tells you the number of unique color values you can render. For example, if you use the code from the previous question, which retrieves the color depth in redBits, greenBits, and blueBits, the number of unique colors available is 2^(redBits+greenBits+blueBits).

If this number is at least as large as the number of primitives you want to render, each primitive can be assigned a unique color. Use glColor3ui() (or glColor3us(), etc.) to specify each color, storing the desired value in the most significant bits of each parameter. You can code a loop to render each primitive in a unique color as follows:

/*
   Given: numPrims is the number of primitives to render.
   Given: void renderPrimitive(unsigned long) is a routine that renders the primitive specified by the index parameter.
   Given: GLuint makeMask(GLint) returns a bit mask for the number of bits specified.
 */

GLuint redMask = makeMask(redBits) << (greenBits + blueBits);
GLuint greenMask = makeMask(greenBits) << blueBits;
GLuint blueMask = makeMask(blueBits);
int redShift = 32 - (redBits+greenBits+blueBits);
int greenShift = 32 - (greenBits+blueBits);
int blueShift = 32 - blueBits;
unsigned long indx;

for (indx=0; indx<numPrims; indx++) {
   /* Place each field of indx in the most significant bits of its component */
   glColor3ui ((indx & redMask) << redShift,
               (indx & greenMask) << greenShift,
               (indx & blueMask) << blueShift);
   renderPrimitive (indx);
}
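
One possible makeMask() routine, sketched here under the assumption that each component depth is less than 32 bits, is:

GLuint makeMask (GLint bits)
{
   /* Returns a mask with the lowest 'bits' bits set, e.g., makeMask(3) == 0x7 */
   return (GLuint)((1u << bits) - 1);
}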

Also, make sure you disable any state that could alter the final color. See the question above for a code snippet to accomplish this.

If you're using this for picking instead of the usual Selection feature, any color subsequently read back from the color buffer can easily be converted back to the indx value of the primitive rendered in that color.
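
For example, a minimal read-back sketch, assuming 8-bit read-back components, the redBits/greenBits/blueBits values obtained earlier, and a pick point (x, y) in OpenGL window coordinates (origin at the lower left), might look like:

GLubyte pixel[3];
unsigned long pickedIndx;

/* Read the color under (x, y) and rebuild the index from the
   most significant bits of each component; dithering must be disabled */
glReadPixels (x, y, 1, 1, GL_RGB, GL_UNSIGNED_BYTE, pixel);
pickedIndx = ((unsigned long)(pixel[0] >> (8 - redBits)) << (greenBits + blueBits)) |
             ((unsigned long)(pixel[1] >> (8 - greenBits)) << blueBits) |
              (unsigned long)(pixel[2] >> (8 - blueBits));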