A Java char is, as always, a 16-bit integral type. Some characters in a String may be represented using more than one char, though. The Javadoc for the Character class (here) explains some of the issues.
The sizes of primitives do not change from version to version; this is to ensure backwards compatibility, as well as cross-platform compatibility. As shown on the Primitive Data Types page of the Java Tutorial:
The char data type is a single 16-bit Unicode character. It has a minimum value of '\u0000' (or 0) and a maximum value of '\uffff' (or 65,535 inclusive).
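This fixed size is easy to check from code; here is a minimal sketch using the standard constants on Character (the class and field names are all from the core API, nothing assumed):

```java
public class CharSize {
    public static void main(String[] args) {
        // Character.SIZE is the number of bits used to represent a char value
        System.out.println(Character.SIZE);            // 16
        // MIN_VALUE is '\u0000', MAX_VALUE is '\uffff'; cast to int to print numerically
        System.out.println((int) Character.MIN_VALUE); // 0
        System.out.println((int) Character.MAX_VALUE); // 65535
    }
}
```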
As far as I know, the default size of a char is 16 bits only. But by doing some settings, we can add extra bits to the char; then the char will be 32 bits.
Originally posted by minal silimkar: As far as I know, the default size of a char is 16 bits only. But by doing some settings, we can add extra bits to the char; then the char will be 32 bits.
A char is always 16 bits. If you read the documentation that Ernest referenced in his reply, you will see that Unicode supplementary characters are sometimes represented as a pair of char values; such a Unicode code point uses 32 bits, but the char itself, as a primitive data type, is still 16 bits.
You might also be thinking of casting a char to an int. An int uses 32 bits. But once you do that, you no longer have a char; you have an int. So the char itself is not 32 bits.
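To illustrate that last point, here is a small sketch: converting a char to an int produces a new 32-bit int value, while the original char remains 16 bits throughout.

```java
public class CharCast {
    public static void main(String[] args) {
        char c = 'A';
        int i = c;                        // widening conversion: char -> int
        System.out.println(i);            // 65, now held in a 32-bit int
        System.out.println(Integer.SIZE); // 32 bits per int
        System.out.println(Character.SIZE); // 16 bits: c itself never changed size
    }
}
```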
Well, as everybody is saying, a char is 16 bits. But Java supports characters whose code points are above the range of a char, through supplementary characters. I think this would help in understanding supplementary characters: Sun Tutorial on Supplementary Characters
If you don't want to read the whole thing, it basically says that Unicode was originally a fixed 16-bit encoding, so every character fit in the range \u0000 to \uFFFF, with the block \uD800 to \uDFFF reserved and left unassigned. But when new characters were added to the encoding scheme, the total number of characters exceeded 65,536. So from Unicode 3.1 onward, the new characters were added as supplementary characters. Representing one takes more than 16 bits, so in UTF-16 it is encoded as a pair of 16-bit values (a surrogate pair). The first unit of the pair (the high surrogate) must be in the range \uD800 to \uDBFF, and the second (the low surrogate) must be in the range \uDC00 to \uDFFF.
So a possible combination of two 16-bit values representing a supplementary character could be \uD815 \uDE53 or \uDA5F \uDEF3, and so on.
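You can see the pairing in action with the surrogate helpers on Character. This sketch uses one concrete supplementary character, U+1D538 (MATHEMATICAL DOUBLE-STRUCK CAPITAL A), chosen here purely as an example:

```java
public class SurrogatePairDemo {
    public static void main(String[] args) {
        int codePoint = 0x1D538;                    // a supplementary character
        char[] pair = Character.toChars(codePoint); // its UTF-16 surrogate pair
        String s = new String(pair);

        System.out.println(s.length());                          // 2 chars...
        System.out.println(s.codePointCount(0, s.length()));     // ...but 1 code point
        System.out.println(Character.isHighSurrogate(pair[0]));  // true: \uD835
        System.out.println(Character.isLowSurrogate(pair[1]));   // true: \uDD38
    }
}
```

Note that String.length() counts char values, not characters, which is exactly why a single supplementary character reports a length of 2.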
Do you get it? Hope that helps.