Re: [cc65] hex encoded string bug?

From: <Maspethrose71@aol.com>
Date: 2010-11-25 13:04:38
Why not just define a hex escape sequence in a string literal as if it were a byte constant?
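
A minimal sketch of the distinction being asked about (the identifiers
are illustrative, not from the thread):

/* Should these two end up with identical bytes in memory? Per the
   reply quoted below, the string form is currently run through the
   target character map, while the byte constants are stored verbatim. */
static const unsigned char as_bytes[]  = { 0x68, 0x03 };
static const char          as_string[] = "\x68\x03";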


-------------------
Joseph Rose, a.k.a. Harry Potter
Working magic in the computer community...or at least striving to! :(
 
In a message dated 11/25/2010 5:53:26 A.M. Eastern Standard Time, thefox@aspekt.fi writes:
On Thu, Nov 25, 2010 at 12:32 PM, Stefan Wessels <swessels@email.com> wrote:
> I think there may be a bug in the way cc65 deals with hex encoded
> strings. The following program has output as shown when run on the C64
> and OS X. I can't see any reason for the first character coming out as
> 0x48 and the 5th as 0xc0.

Not a bug. cc65 applies a character map to string literals based on the
target system; in this case it maps ASCII 'h' (0x68) to PETSCII 'h'
(0x48).
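
A minimal sketch of the behavior being described (the original test
program isn't quoted in this thread, so the file name and the exact
string are assumptions); built with something like "cl65 -t c64 hexstr.c":

#include <stdio.h>

/* The escape is written as ASCII 0x68 ('h'). When targeting the C64,
   cc65 runs string literals through the target character map, so the
   byte actually stored is PETSCII 'h' (0x48). */
static const char s[] = "\x68";

int main(void)
{
    printf("%02x\n", (unsigned char)s[0]);  /* 48 on the C64, 68 on the host */
    return 0;
}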

Not really sure why you're using string literals anyway; you could use
something like this instead:

/* Raw level bytes; no character-map translation is applied to these. */
const unsigned char level_data[] = {
    0x68, 0x03, ... etc
};

/* Table of pointers to the level data blocks. */
const unsigned char *levelBuffer[1] = {
    level_data
};
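
The bytes then arrive exactly as written, since no string literal (and so
no character-map translation) is involved. A usage sketch (the function
name, destination, and length handling are made up):

#include <string.h>

void load_level(unsigned char *dest)
{
    /* Copy the raw level bytes; sizeof works here because level_data
       is a complete array definition in this translation unit. */
    memcpy(dest, levelBuffer[0], sizeof level_data);
}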

-thefox

----------------------------------------------------------------------
To unsubscribe from the list send mail to majordomo@musoftware.de with
the string "unsubscribe cc65" in the body(!) of the mail.
Received on Thu Nov 25 13:04:55 2010
