Please explain how assignment to a char takes place:
char cf = '\101'; works fine — what does it signify?
But char c = '\1001'; gives a compilation error.
If we can assign Unicode escapes, why does '\u00001' give an error even though it equals '\u0001'?
'\101' is an octal escape: 101 octal is 65 decimal, which is 'A'. An octal escape can be at most '\377' (255 decimal), so '\1001' is not read as one escape — the compiler takes the escape '\100' and then finds an extra '1', leaving two characters inside a single character literal, which is a compile error. If you instead take 1001 as an octal number (513 decimal) and cast it to char, you get the character U+0201; consoles whose encoding can't represent it print '?'. As for Unicode escapes, they must be exactly 4 hex digits: '\u00001' is parsed as '\u0000' followed by a stray '1', so again the literal contains two characters and fails to compile.
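A small sketch (class name is just for illustration) showing the escape rules above; the failing forms are left as comments since they would not compile:

```java
public class CharEscapeDemo {
    public static void main(String[] args) {
        // '\101' is an octal escape: 101 (octal) = 65 (decimal) = 'A'
        char cf = '\101';
        System.out.println((int) cf + " -> " + cf);   // 65 -> A

        // Octal escapes are limited to '\0' through '\377' (0-255).
        // char bad = '\1001';  // escape '\100' plus a stray '1' -> compile error

        // 1001 taken as an octal number is 513 decimal; that code point
        // (U+0201) is still reachable with a cast or a 4-digit Unicode escape:
        char c = (char) 513;
        System.out.println((int) c);                  // 513

        // Unicode escapes must be exactly four hex digits:
        char u = '\u0001';
        // A five-digit form (backslash-u00001) is read as the escape for
        // U+0000 followed by a stray '1' -> compile error.
        System.out.println((int) u);                  // 1
    }
}
```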