Trying to add more to what Corey said ...
The size of an int is 32 bits and the size of a char is 16 bits. When you try to assign an int value to a char, the compiler checks whether the value being assigned can fit within the bit-size of the variable it is being assigned to; it can only do this check when the value is known at compile time. If it fits, the compiler goes ahead and allows the assignment. If not, it gives a "possible loss of precision" compile-time error. So when you declare an int literal value to be final, the compiler knows that this value is not going to change anymore, and the number of bits required for holding that value is constant.
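For example (just a sketch of the kind of assignments I mean, with made-up variable names):

class FitsOrNot {
    public static void main(String[] args) {
        char a = 65;          // fine: the constant 65 fits in 16 bits
        char b = 70000;       // compile error: 70000 needs more than 16 bits

        int n = 65;
        char c = n;           // compile error: n is not a constant, so the compiler cannot prove it fits

        final int m = 65;
        char d = m;           // fine: m is a final compile-time constant and 65 fits in 16 bits
    }
}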
Hence, this program is also not going to work.
Reason: by the time the compiler gets to the assignment in the second statement, it knows that the number of bits required for this assignment is more than 16.
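Something along these lines (my own made-up numbers, just to show the kind of failure being described):

class TooBigConstant {
    public static void main(String[] args) {
        final int big = 70000;   // final, so the compiler knows the exact value at compile time
        char c = big;            // compile error: 70000 needs more than 16 bits
    }
}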
It is interesting to note that the following does not work either.
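(I don't have the original snippet in front of me, but one case that fails in exactly this way is a final int whose value is assigned on a separate line, because then it is not treated as a compile-time constant:)

class BlankFinal {
    public static void main(String[] args) {
        final int i;
        i = 65;          // assigned after the declaration, so i is not a compile-time constant
        char c = i;      // compile error, even though 65 would easily fit in a char
    }
}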
I am wondering why only char, byte, and short can be assigned constant values of a different primitive type (an int constant to a char, for example), and why int does not get the same treatment.
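For instance (if I'm reading the rule right), even a long constant that obviously fits is not narrowed to an int automatically:

class NarrowingTargets {
    public static void main(String[] args) {
        byte b = 100;     // OK: the int constant 100 fits in a byte
        short s = 100;    // OK: fits in a short
        char c = 100;     // OK: fits in a char
        int i = 100L;     // compile error: no automatic narrowing from long to int, even for a constant that fits
    }
}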
Thanks,
-skd