[cc65] Colormania (was: TGI summary)

From: Oliver Schmidt <ol.sc1web.de>
Date: 2009-10-27 12:55:10
Hi Daniel,

> I think some conventions could be enforced:
> * The default palette should be (approximately) the same in all drivers:
>  0 = black
>  1 = white
>  2 = ....
> * If the driver has no palette, it can emulate the above by replacing the
>  indexes in setcolor.
> This should make the program output look similar if the palette is not
> changed.

After thinking about this for a while I see some potential in the approach:

1. The TGI kernel presumes that color values run from 0 to
num-of-colors - 1. This poses a problem if a single platform has
several drivers with different color depths - especially because white
is always present and often used, but tends to be the color with the
highest value for each driver.

Such a platform can define its color macros starting with black = 0
and white = 1 for monochrome drivers, then assign the values 2 - 7 to
the remaining colors of 3-bit drivers, and finally assign the values
8 - 15 to the remaining colors of 4-bit drivers.

Then the macros can work for all drivers of that platform :-)

From the driver perspective this means that a non-palette
non-monochrome driver does a table lookup in SETCOLOR (and GETPIXEL).

A palette driver sets a default palette according to the values.
Additionally - and this seems to me the important aspect missing in
the OP above - the driver does a table lookup for every palette entry

An example: A 4-bit driver uses value $F for white. COLOR_WHITE is
defined as 1. Now we want to set an "inverse" monochrome palette (like
tgidemo does) using:

    unsigned char Palette[2] = { COLOR_WHITE, COLOR_BLACK };

Then SETPALETTE needs to convert e.g. Palette[0] from '1' to '$F';
this is where the ambivalence of the color macros being both indexes
and color values shows up.

2. Under 1.) I was explicitly talking about a single platform with
several drivers. But if the approach is extended to all TGI drivers,
then we can have a shared set of color macros - but what's the actual
benefit?

If a program sets the palette (again from tgidemo):

    unsigned char Palette[2] = { COLOR_WHITE, COLOR_ORANGE };

then with shared color macros it can use those two colors with
tgi_setcolor (COLOR_BLACK) for white and with tgi_setcolor
(COLOR_WHITE) for orange because it is guaranteed that the two macros
have the values 0 and 1. But is this really useful? I don't think so.
A program setting a palette should then just use tgi_setcolor (0) and
tgi_setcolor (1) to address the two palette entries it has set before.


- Platforms with issues due to conflicting TGI driver (and probably
conio) color values can decide to do a "runtime color mapping" as laid
out above.

- At least I personally don't see an additional benefit of aligning the
color values of all platforms (although it would be nice to have a
large set of aligned color _names_).

- I personally will likely implement the "runtime color mapping" in
the Apple2 lores driver to get rid of the LORES_XX macros.

And what about tgidemo?

- The original code just presumed palettes to be available.

- I modified the code to have it run on non-palette drivers, although
the code then just falls back to black and white (which is btw good
for non-palette monochrome drivers).

- If we wanted color support on non-palette drivers it would look
something like:

    unsigned char bg = COLOR_WHITE;
    unsigned char fg = COLOR_ORANGE;

    unsigned char pal[2] = { bg, fg };
    tgi_setpalette (pal);
    if (! tgi_geterror ()) {
        /* Palette support: address the entries by index from now on. */
        bg = 0;
        fg = 1;
    }

Best, Oliver