char c = 100; // ok in a declaration, as long as the value is within range
int i = 100;
char c = i;   // needs a cast... I think because you're not using a literal here, so the compiler can't check
c = c + i;    // needs a cast
c += i;       // ok, but I'm not sure why

I know that c += 1 is fine because the automatic promotion of the literal to int doesn't happen with +=, but I'm confused about c += i not needing a cast. Anyone?
Just to add more:

char c = 100; // ok because 100 is a constant and is valid within char's range
int i = 100;
char c = i;   // invalid because i is a variable, whose value could go out of bounds of char's range

Similarly, c = c + i; is invalid because the result of an arithmetic operation is at least an int, which is wider than char.

c += i; is ok because for the op= operators, an implicit cast takes place.

That's it, in brief.
Herbert.
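To make the implicit cast concrete, here is a small compilable sketch (the class and method names are my own, just for illustration). The key fact is JLS §15.26.2: a compound assignment E1 op= E2 is equivalent to E1 = (T)((E1) op (E2)), where T is the type of E1, so the narrowing cast back to char happens for you, and can silently truncate:

```java
public class CharPromotion {

    // c += i compiles without a cast because JLS 15.26.2 defines
    // E1 op= E2 as E1 = (T)((E1) op (E2)) -- the cast back to char is implicit.
    static char compoundAdd(char c, int i) {
        c += i;               // same as: c = (char) (c + i)
        return c;
    }

    public static void main(String[] args) {
        char c = 100;         // ok: 100 is a constant within char's range (0..65535)
        int i = 100;

        // char c2 = i;       // does not compile: i is a variable, not a constant
        char c2 = (char) i;   // explicit cast required

        // c = c + i;         // does not compile: c + i is promoted to int
        c = (char) (c + i);   // explicit cast required

        System.out.println((int) compoundAdd((char) 100, 100));   // prints 200

        // The hidden cast can silently wrap around -- that is the trade-off:
        System.out.println((int) compoundAdd((char) 65535, 1));   // prints 0
    }
}
```

Note that the convenience cuts both ways: c += i always compiles, even when the truncation loses information, as the 65535 + 1 case shows.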