I find this curious - to generate a random character, it seems like you have to first assign the output of the Random.nextInt() method to an int (or double or whatever) and THEN cast that to a char. I'm wondering why you can't just cast the output of the Random.nextInt() method directly to a char? For example:
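The original snippet is missing from the post; reconstructing it from the rest of the thread, the working two-step version was presumably along these lines (a sketch, assuming `java.util.Random` and the lowercase range discussed below):

```java
import java.util.Random;

public class RandomChar {
    public static void main(String[] args) {
        Random rand = new Random();

        // Two-step version: store the int result first, then cast it.
        int n = rand.nextInt(26) + 'a';   // 'a' promotes to int 97, so n is 97..122
        char c = (char) n;                // explicit narrowing cast to char

        System.out.println(c);            // a random lowercase letter
    }
}
```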
will work to give you lowercase characters at random, but the direct cast gives me a compiler error:
When I do that I'm getting a different error: Possible loss of precision
But that is easy to solve: just add parentheses.
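That is, wrap the whole sum in the cast. A minimal sketch of the fix (assuming `java.util.Random`):

```java
import java.util.Random;

public class ParenthesesFix {
    public static void main(String[] args) {
        Random rand = new Random();
        // Without parentheses the cast applies only to rand.nextInt(26):
        //   char c = (char) rand.nextInt(26) + 'a';   // possible loss of precision
        // With parentheses it applies to the whole sum:
        char c = (char) (rand.nextInt(26) + 'a');
        System.out.println(c);
    }
}
```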
"Any fool can write code that a computer can understand. Good programmers write code that humans can understand." --- Martin Fowler
Please correct my English.
Yeah, I had thought about adding parentheses but hadn't gotten around to it. That does make the compiler happy.
I'm still wondering why the compiler doesn't like
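Judging from the reply that follows, the rejected expression was presumably `(char)rand.nextInt(26) + 'a'` assigned to a char. A sketch of what actually happens:

```java
import java.util.Random;

public class CastTooEarly {
    public static void main(String[] args) {
        Random rand = new Random();
        // The cast binds only to rand.nextInt(26); adding 'a' promotes
        // both operands back to int, so the expression's type is int:
        int widened = (char) rand.nextInt(26) + 'a';  // legal, result is 97..122
        // char c = (char) rand.nextInt(26) + 'a';    // error: possible loss of precision
        System.out.println(widened);
    }
}
```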
I guess I assumed that the cast applied to the entire right side of the assignment; and I also thought that any int-sized or smaller expression results in an int. So if the result of rand.nextInt() is an int, and you add a char to it, you'll get an int, which you then can cast to a char.
The result of any mathematical operator is always at least an int; never byte, short or char. So even though (char)rand.nextInt(26) is a char, and 'a' is a char, the result of adding them is an int. You'll need to add first, then cast:
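Something like this (a sketch, assuming `java.util.Random`):

```java
import java.util.Random;

public class AddThenCast {
    public static void main(String[] args) {
        Random rand = new Random();
        // The sum is an int in range 97..122; one cast narrows it to char.
        char c = (char) (rand.nextInt(26) + 'a');
        System.out.println(c);
    }
}
```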
Side note: combining operators with assignment (e.g. +=) already has this cast in it. So the following is also allowed:
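For instance (a sketch of the compound-assignment form):

```java
import java.util.Random;

public class CompoundAssign {
    public static void main(String[] args) {
        Random rand = new Random();
        char c = 'a';
        // c += x compiles as c = (char) (c + x), so no explicit cast is needed.
        c += rand.nextInt(26);
        System.out.println(c);
    }
}
```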
The previous version is clearer, though; after all, what is the char value of 25, for instance? I don't know without looking it up, whereas 25 + 'a' is clearly 'z'.