In both cases the numeric value of the character is used in the addition. (Edit: strictly speaking these are not ASCII values, since ASCII covers only 128 characters, but Unicode values; a Java char is a 16-bit code unit.)
The range of char is 0 to 65535. The compiler tries to validate the addition, checking whether the resulting integer will be in this range or not.
In the first line, the compiler can determine at compile time that the result of adding 'a' (which is 97 numerically) and 1 will fall in the range 0 to 65535, so it is happy.
But in the second line, the compiler cannot know at compile time what the value of the variable c will be. Therefore it cannot be sure that the addition will fall in the range 0 to 65535, so it complains.
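The original code snippet is not visible in this excerpt; a minimal sketch of the two cases described above might look like this (variable names are assumptions):

```java
public class CharAddition {
    public static void main(String[] args) {
        char c = 'a' + 1;        // compiles: 'a' + 1 is a constant expression (98), within 0..65535
        // char d = c + 1;       // compile error: c + 1 has type int, and c is not a constant
        char d = (char) (c + 1); // compiles: the explicit cast tells the compiler we accept truncation
        System.out.println(c);   // b
        System.out.println(d);   // c
    }
}
```

Note that declaring c as `final char c = 'a' + 1;` would also make `char d = c + 1;` compile, because c then becomes a compile-time constant and the compiler can again check the range.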
ashwin bhawsar wrote:
Then why does the compiler not complain for other data types?
An integer and a float have no "natural" limit that can be defined in the compiler the way a character's can. That is limited by the JVM and/or OS.
However, you might get some strange behaviour when you try to go past the limit of a primitive. The spec tells us that, rather than throwing a runtime exception, the operation MAX_VALUE + 1 wraps around to the other end of the scale, MIN_VALUE.
This little piece of code shows that:
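The code referenced here is not visible in this excerpt; a minimal sketch of the wrap-around behaviour might look like this:

```java
public class OverflowDemo {
    public static void main(String[] args) {
        int max = Integer.MAX_VALUE;                       // 2147483647
        System.out.println(max + 1);                       // wraps silently, no exception thrown
        System.out.println(max + 1 == Integer.MIN_VALUE);  // true
    }
}
```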
Edit: Actually, I need to correct myself. The primitives in Java have fixed sizes that are independent of the OS/JVM implementation. An int is a 32-bit signed value with a minimum value of -2,147,483,648 and a maximum value of 2,147,483,647. Always.
Sorry guys, but your replies didn't answer my question fully.
I know that you cannot directly assign out-of-range values to variables (e.g. byte b = 200; // illegal).
Some of you say that the compiler reports an error for the code involving chars because it cannot guarantee that the variable c will not be out of range.
But this does not answer the question of why it is legal for other data types, as the code below shows.
Someone told me that when we use character literals in an expression (like 2 + 'a' + 3), the literals are converted to their numeric (Unicode) values for the sake of the calculation, but when we use variables in expressions (like char c = 'a' + 1; char d = c + 1;) the character variables are not converted to their numeric values. But this still does not answer my question for other data types like int, for example:
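The original code snippet is not shown in this excerpt; the kind of int example presumably meant is something like this (variable names are assumptions). It also illustrates the actual distinction: int + int yields int, so no narrowing conversion is needed, whereas assigning an int expression to a byte, short, or char requires either a compile-time constant in range or an explicit cast:

```java
public class IntAddition {
    public static void main(String[] args) {
        int i = 5;
        int j = i + 1;      // legal: int + int is int, the result fits the target type exactly
        // byte b = 5;
        // byte c = b + 1;  // would NOT compile: b + 1 has type int, narrowing to byte needs a cast
        System.out.println(j); // 6
    }
}
```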
So if you guys have any answers, it would be great if you could post them along with URL links.
Piyush Joshi wrote: . . . but Unicode which is wider 16bit . . .
That is a bit out of date. Unicode code points now go up to U+10FFFF, which needs 21 bits. But that doesn't really concern us here.
I believe 'a' + 65439 is permitted, but you have to cast the expression to (char), since the constant 65536 is out of the char range; because of the overflow the result will be 0.
Try converting those figures to hexadecimal, which makes the arithmetic easier to visualise.
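A quick sketch checking that claim (the literal 65439 is from the post above):

```java
public class CharOverflow {
    public static void main(String[] args) {
        // char c = 'a' + 65439;        // does not compile: 97 + 65439 = 65536 = 0x10000, out of char range
        char c = (char) ('a' + 65439);  // the cast keeps only the low 16 bits of 0x10000, giving 0
        System.out.println((int) c);    // 0
    }
}
```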