Re: [cc65] Re: TGI colors revisited

From: Karri Kaksonen <karri@sipo.fi>
Date: 2011-05-04 08:49:25
On 04.05.2011 00:16, Oliver Schmidt wrote:
>
> So the result is: If you don't want to redesign the roles of the TGI
> kernel and driver on tgi_init() then tgi_getdefpalette() comes for
> free.

I like the possibility of calling tgi_getdefpalette(). It allows me to write code like:

unsigned char mypalette[16];   /* filled with my own colors */

/* keep the default RGB value for blue, overriding my own entry */
mypalette[TGI_COLOR_BLUE] = tgi_getdefpalette ()[TGI_COLOR_BLUE];
tgi_setpalette (mypalette);

After this, TGI_COLOR_BLUE works as before, even if the rest of the colors
in my own palette are different.

>
>> In order to allow the palette manipulation that you have described
>> earlier, we must use the brush table as the TGI palette.  But, after the
>> palette is changed, there might not be a one-to-one match between the
>> values that were given to tgi_setcolor() and the values that are
>> returned when tgi_getpixel() is used on "old" pixels (ones that were
>> drawn before the palette was changed)!
> If I understand you right, then this approach wouldn't fit the
> semantics of a palette. A palette means - at least from my perspective
> - that setting a (different) palette immediately causes re-coloring of
> all already displayed pixels.

All indexed palettes work this way. This effect is frequently used for
"animation" in games. A good example is the Boing demo on the Amiga, where
the sphere "rotates" purely by manipulating the indexed palette.

> The usual approach would rather be to translate the values in the TGI
> palette into values for the 12-bit palette on calls to
> tgi_setpalette(). A call to tgi_getpalette() could either return a
> shadow copy of the last TGI palette given to tgi_setpalette() or
> alternatively translate values in the 12-bit palette back into values
> in the TGI palette. Or do I miss the point why this generally doesn't
> work on the Lynx?

This is a low-level driver. tgi_getpalette returns a pointer to the
hardware palette in the graphics chip. There is no stored "palette" in
the driver.

The only palette stored in the driver is the default palette.

Also, tgi_setpalette just writes to the chip's registers directly. It
does not store a copy of the palette anywhere.
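On the Lynx, "the chip's registers" means Mikey's palette, roughly like
this (the $FDA0/$FDB0 addresses and the nibble layout are from my reading
of the hardware docs, so take them as an assumption):

/* 16 GREEN registers at $FDA0 (low nibble = green) and 16 BLUERED
   registers at $FDB0 (high nibble = blue, low nibble = red). */
#define GREEN   ((unsigned char*) 0xFDA0)
#define BLUERED ((unsigned char*) 0xFDB0)

static void set_hw_color (unsigned char index,
                          unsigned char r, unsigned char g, unsigned char b)
{
    GREEN[index]   = g & 0x0F;
    BLUERED[index] = (unsigned char) ((b << 4) | (r & 0x0F));
}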

Some programmers even change the palette at the start of each scanline
to get more than 16 colors on screen at once.

I like the current TGI driver. What we could do is choose the best 16
colors and give them consistent names. The indexes could also be in some
kind of order of importance. Then every driver can pick as many colors
from the palette as it can support.

Here is my suggestion for a good game palette, in order of importance:
TGI_COLOR_BLACK 0
TGI_COLOR_WHITE 1

TGI_COLOR_RED 2
TGI_COLOR_GREEN 3

TGI_COLOR_BLUE 4
TGI_COLOR_GREY 5
TGI_COLOR_YELLOW 6
TGI_COLOR_NAVYBLUE 7

TGI_COLOR_ORANGE 8
TGI_COLOR_GREENPEA 9
TGI_COLOR_LIGHTGREY 10
TGI_COLOR_DARKGREY 11
TGI_COLOR_AMBER 12
TGI_COLOR_SKYBLUE 13
TGI_COLOR_FLESH 14
TGI_COLOR_BURGUNDY 15

I suppose the available color counts are 2, 4, 8, or 16, depending on
the driver.
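An application could then clamp its color choice to whatever the driver
reports; a small sketch (tgi_getcolorcount() and tgi_setcolor() are the
real TGI calls, but the fall-back-to-white policy and TGI_COLOR_ORANGE
are only my suggestion from the list above):

#include <tgi.h>

/* Use the requested color if the driver supports it, else fall back
   to TGI_COLOR_WHITE. */
static unsigned char pick_color (unsigned char color)
{
    return (color < tgi_getcolorcount ()) ? color : TGI_COLOR_WHITE;
}

static void draw_something (void)
{
    tgi_setcolor (pick_color (TGI_COLOR_ORANGE));
    tgi_bar (0, 0, 10, 10);
}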

I don't believe in the machine-generated palette idea. It would not
produce nice colors for such a limited palette. You would need at least
64 colors to make it work.
--
Karri
