
Mixing Signed and Unsigned Integers I

Assigning an unsigned integer to a signed integer of the same width, or the reverse, simply copies the bits without change (on the two's-complement machines in common use).

The difference is how the same bits are then interpreted.

Example:

 unsigned char uc = 130; // bit pattern 0x82
 char c;                 // plain char is signed on most common platforms

 c = uc; // c holds the same bits, 0x82 (surprise? no),
         // but read as a signed char this value is -126

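A minimal complete program makes the reinterpretation visible. This is a sketch assuming a two's-complement platform where plain char is signed (true on virtually all current desktop systems):

 #include <stdio.h>

 int main(void)
 {
     unsigned char uc = 130; // bit pattern 0x82, value 130
     char c = uc;            // the same bits land in a signed type

     // both values are promoted to int for printf: uc keeps its value,
     // while c is sign-extended, so it prints as -126
     printf("uc = %d (0x%02X)\n", uc, (unsigned)uc);
     printf("c  = %d (0x%02X)\n", c, (unsigned)(unsigned char)c);
     return 0;
 }

On such a platform this prints uc = 130 (0x82) and c = -126 (0x82): identical bit patterns, different values.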
