What is the size of char in Java 5?

 
Minal Silimkar-Urankar
Ranch Hand
Posts: 136
What is the size of the char data type in Java 5? Is it a 32-bit supplementary character set or a 16-bit Unicode character set? Up to Java 1.4, the size of char was 16 bits.
 
Ernest Friedman-Hill
author and iconoclast
Posts: 24207
A Java char is, as always, a 16-bit integral type. Some characters in a String may be represented using more than one char, though. The Javadoc for the Character class explains some of the issues.
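A quick way to see this in practice, assuming a Java 5 or later runtime (the class name CharSizeDemo is just for illustration), is to put a supplementary character into a String and compare length(), which counts char values, with codePointCount(), which counts characters:

public class CharSizeDemo {
    public static void main(String[] args) {
        // U+1D11E (MUSICAL SYMBOL G CLEF) lies outside the Basic Multilingual Plane,
        // so the String stores it as a surrogate pair of two char values.
        String clef = new String(Character.toChars(0x1D11E));

        System.out.println(clef.length());                          // 2 -> two 16-bit chars
        System.out.println(clef.codePointCount(0, clef.length()));  // 1 -> one character
    }
}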
 
Ranch Hand
Posts: 624
The sizes of primitives do not change from version to version; this is to ensure backwards compatibility, as well as cross-platform compatibility. As shown on the Primitive Data Types page of the Java Tutorial:

The char data type is a single 16-bit Unicode character. It has a minimum value of '\u0000' (or 0) and a maximum value of '\uffff' (or 65,535 inclusive).


[ August 06, 2008: Message edited by: Mark Vedder ]
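For reference, here is a minimal check of those bounds (the class name CharRange is just for illustration; nothing here is specific to Java 5, since char has had this range in every version):

public class CharRange {
    public static void main(String[] args) {
        System.out.println((int) Character.MIN_VALUE); // 0
        System.out.println((int) Character.MAX_VALUE); // 65535
        char c = '\uffff';                             // the largest possible char value
        System.out.println((int) c);                   // 65535 -- char is an unsigned 16-bit type
    }
}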
 
Minal Silimkar-Urankar
Ranch Hand
Posts: 136
As far as I know, the default size of char is 16 bits only. But by doing some settings, we can add extra bits to the char, and then it will be 32 bits.
[ August 06, 2008: Message edited by: minal silimkar ]
 
Bartender
Posts: 2856

Originally posted by minal silimkar:
But by doing some settings, we can add extra bits to the char, and then it will be 32 bits.



What settings are you talking about? Will you please explain in detail?
 
Mark Vedder
Ranch Hand
Posts: 624

Originally posted by minal silimkar:
As far as I know, the default size of char is 16 bits only. But by doing some settings, we can add extra bits to the char, and then it will be 32 bits.

[ August 06, 2008: Message edited by: minal silimkar ]



A char is always 16 bits. If you read the documentation that Ernest referenced in his reply, you will see that Unicode supplementary characters are sometimes represented as a pair of char values; such a Unicode code point uses 32 bits, but again, the char itself, as a primitive data type, is 16 bits.

You might also be thinking of casting a char to an int. An int uses 32 bits. But once you do that, you no longer have a char; you have an int. So the char is not 32-bit.
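A small sketch of that last point (the class name CharToInt is just for illustration): the assignment produces a 32-bit int value, but the char type itself stays 16 bits wide.

public class CharToInt {
    public static void main(String[] args) {
        char c = 'A';
        int i = c;                           // widening conversion: 16-bit char -> 32-bit int
        System.out.println(i);               // 65, the code point of 'A'
        System.out.println(Character.SIZE);  // 16 -- a char is still 16 bits
        System.out.println(Integer.SIZE);    // 32
    }
}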
 
Sheriff
Posts: 9708
Well, as everybody is saying, a char is 16 bits. But Java supports characters that are above the range of char through supplementary characters.
I think this would help in understanding supplementary characters:
Sun Tutorial on Supplementary Characters

If you don't want to read it in whole, it basically says that Unicode was originally a 16-bit encoding, so every character could be represented by a single value in the range \u0000 to \uFFFF, and the range \uD800 to \uDFFF was reserved and left unassigned. But when new characters were added to the encoding scheme, the total number of characters exceeded 65,536. So in Unicode 3.1 the new characters were added as supplementary characters. This means that 32 bits are needed to store them: they are represented as a pair of 16-bit values called a surrogate pair. In this pair, the first 16-bit value must be in the range \uD800 to \uDBFF, and the second must be in the range \uDC00 to \uDFFF.

So to represent a supplementary character, a possible combination of two 16-bit values can be \uD815 \uDE53 or \uDA5F \uDCF3, and so on......

Do you get it??? :roll:
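To make that concrete, here is a minimal sketch, assuming a Java 5 or later runtime (the class name SurrogatePairDemo and the choice of code point U+10400 are just for illustration). It shows how a supplementary code point splits into a high and a low surrogate and how the pair can be recombined:

public class SurrogatePairDemo {
    public static void main(String[] args) {
        int codePoint = 0x10400;                    // a supplementary character (outside the BMP)
        char[] pair = Character.toChars(codePoint); // encodes it as a surrogate pair

        System.out.printf("high surrogate: \\u%04X%n", (int) pair[0]); // falls in \uD800..\uDBFF
        System.out.printf("low surrogate:  \\u%04X%n", (int) pair[1]); // falls in \uDC00..\uDFFF

        System.out.println(Character.isHighSurrogate(pair[0])); // true
        System.out.println(Character.isLowSurrogate(pair[1]));  // true

        // Recombining the pair gives back the original code point.
        System.out.println(Character.toCodePoint(pair[0], pair[1]) == codePoint); // true
    }
}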
[ August 06, 2008: Message edited by: AnkitJi Garg ]
 