
About Unicode

 
jaman tai
Ranch Hand
Posts: 37
What's wrong with the following?

char a = '\u000A';

And why is there no compile error for:

char a = '\u100A';

What is the rule for assigning a Unicode escape to a char?
 
Kris Krason
Greenhorn
Posts: 25
char a = '\u000A';

The Unicode value \u000A is the so-called line feed; you also can't use \u000D, because it's a carriage return.

Unicode translation is done before compilation, so with either of those values your code would look like:

char a = '
';

And that's incorrect in Java.
 
jaman tai
Ranch Hand
Posts: 37
Thank you, Krzysztof. I am wondering whether every code point from '\u0000' to '\uffff' can be assigned to a char, with only '\u000A' and '\u000D' excluded.
 
Kris Krason
Greenhorn
Posts: 25
Well,
I think that, for example, the Unicode escapes for ' or \ would be a problem here too.
It depends on where you put them: inside "" the ' isn't a problem, but \ still could be.
 
jaman tai
Ranch Hand
Posts: 37
Sorry, I don't know what you mean. We can't use " to assign a value to a char.
 
Kris Krason
Greenhorn
Posts: 25
OK, I mixed up char and String.

Unicode escapes can be used anywhere in the code. Just as you write the letter a, you could use its \u representation instead.

So, for example:

int a = 1;

is equivalent to:

\u0069\u006e\u0074 \u0061 = 1;

So, going back to your first question: it's not about which Unicode characters you can use in Java source (you can use any). A Unicode escape is just a different way of writing a letter, sign, etc.

Look at these:

char a = '\u0061'; // correct, because it is actually char a = 'a';
char a = '\u0027'; // NOT correct, because it is actually char a = ''';
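
To see the translation in action, here is a minimal sketch (the class name UnicodeDemo is my own choice). It compiles because the escapes are replaced with their characters before the parser ever sees the source:

```java
public class UnicodeDemo {
    public static void main(String[] args) {
        // \u0069\u006e\u0074 is rewritten to "int" and \u0061 to "a"
        // during the translation phase, so this declares int a = 1;
        \u0069\u006e\u0074 \u0061 = 1;
        System.out.println(a);          // prints 1

        // '\u0061' becomes the ordinary literal 'a'
        char c = '\u0061';
        System.out.println(c == 'a');   // prints true
    }
}
```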
 
srikanth reddy
Ranch Hand
Posts: 252
Hi Krason,
Like '\u000A' and '\u000D', are there any other Unicode escapes that give a compile-time error?

Thanks,

sri
 
Kris Krason
Greenhorn
Posts: 25
Hmm,

my previous post was all about this.

\u000A and \u000D don't always give a compile-time error!

They do only when you insert them where an "enter" shouldn't be, such as in a char or String literal.

Unicode is not magic; it's just another way to write some signs/letters/digits.

If you prefer straight answers: if, and only if, you have a char declaration, then the following are incorrect:

char a = '\u000A'; // a line feed between the quotes
char a = '\u000D'; // a carriage return between the quotes
char a = '\u0027'; // becomes ''' (three quotes)
char a = '\u005c'; // becomes '\' (an unfinished escape)

but:

char a = '\u005cn';      // becomes '\n'
char a = '\u005cr';      // becomes '\r'
char a = '\u005c\u005c'; // becomes '\\'
char a = '\u005c\u0027'; // becomes '\''

are CORRECT.
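
As a sketch of those correct forms (the class name EscapeDemo is my own choice): since \u005c is rewritten to a backslash before parsing, each literal reaches the parser as an ordinary escape sequence and holds the usual value:

```java
public class EscapeDemo {
    public static void main(String[] args) {
        char newline        = '\u005cn';       // parsed as '\n'
        char carriageReturn = '\u005cr';       // parsed as '\r'
        char backslash      = '\u005c\u005c';  // parsed as '\\'
        char quote          = '\u005c\u0027';  // parsed as '\''

        System.out.println((int) newline);        // prints 10
        System.out.println((int) carriageReturn); // prints 13
        System.out.println(backslash);            // prints \
        System.out.println(quote);                // prints '
    }
}
```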
[ October 13, 2005: Message edited by: Krzysztof Krason ]
 