posted 23 years ago
char c = 100; //ok in a declaration as long as the constant value is within char's range
int i = 100;
c = i; //needs cast...I think because i is a variable, not a constant expression, so the compiler can't verify the range
c = c + i; //needs cast: c + i is promoted to int
c += i; //ok, but I'm not sure why.
I know that c += 1 is fine because the automatic promotion of the literal to int doesn't happen with +=, but I'm confused about why c += i doesn't need a cast either.
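For reference, here's a minimal sketch of the three cases side by side (class and variable names are just for illustration). The last line compiles without a cast even though the addition is done in int:

```java
public class CompoundAssign {
    public static void main(String[] args) {
        char c = 100;        // ok: 100 is a constant expression within char's range
        int i = 100;

        // c = c + i;        // won't compile: c + i is promoted to int
        c = (char) (c + i); // explicit cast required; c is now 200

        c += i;             // compiles: behaves like c = (char) (c + i)

        System.out.println((int) c); // prints 300
    }
}
```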
Anyone?