Why are chars compatible with ints? And what about a method with a primitive return type being able to "return any value or variable that can be implicitly converted to the declared return type"? I'm having a hard time with this. Will someone please explain? P.S. The quote above comes from the Java 5 Certification book by Bates and Sierra, pg. 124.
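To make the quoted rule concrete, here is a minimal sketch (method and class names are my own, just for illustration) of a method declared to return `int` that returns a `char`, relying on the implicit char-to-int conversion:

```java
public class ReturnDemo {
    // Declared return type is int, but we can return a char:
    // the char value is implicitly widened to int on return.
    static int codeOf() {
        char c = 'x';
        return c;  // implicit char -> int conversion
    }

    public static void main(String[] args) {
        System.out.println(codeOf()); // 120, the code point of 'x'
    }
}
```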
As you may know, ints are 32-bit signed values, which means their values may range from -65536 to 65535.
Actually, the range of a 32-bit signed int is much larger: -2,147,483,648 to 2,147,483,647. The range you describe is that of a 17-bit signed datatype.
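For reference, a quick check of the actual 32-bit signed range using the constants the `Integer` class provides:

```java
public class IntRange {
    public static void main(String[] args) {
        // The actual range of a 32-bit signed int
        System.out.println(Integer.MIN_VALUE); // -2147483648
        System.out.println(Integer.MAX_VALUE); //  2147483647
        // A range of -65536..65535 would only need 17 bits
        // (one sign bit plus 16 value bits)
    }
}
```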
float (32 bit) -> int (32 bit) // not possible implicitly, because e.g. 23.56 -> 23 loses the .56
Actually, while what you describe is true, the implicit casting rule is about range, not precision. A float's value can be smaller than the minimum int value and larger than the maximum int value, so float -> int can overflow, whereas int -> float never does (even though it may lose precision).
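A minimal sketch of this distinction: int -> float compiles with no cast even though precision is lost, while float -> int requires an explicit cast because the value may be out of range:

```java
public class WideningDemo {
    public static void main(String[] args) {
        int big = 1_234_567_891;
        float f = big;          // implicit widening: allowed, though precision is lost
        System.out.println(f);  // 1.23456794E9 -- not exactly 1234567891

        float large = 3.0e9f;   // larger than Integer.MAX_VALUE
        int i = (int) large;    // explicit cast required: float -> int narrows
        System.out.println(i);  // result is clamped to Integer.MAX_VALUE
    }
}
```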
Originally posted by Dan Silva: Can a byte or short be implicitly cast into a char? Thanks.
You do know that it is very simple to just try it out...
To answer your question: no. A byte or short can't be implicitly cast to a char. The reverse is also disallowed: a char can't be implicitly cast to a byte or a short.
It has to do with range. A byte or short can hold negative numbers, which a char can't, and a char has values at its high end that are out of range for a byte or short. This means both directions need an explicit cast.
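A short sketch of the rules above (the commented-out lines are the ones that would fail to compile):

```java
public class CharCasts {
    public static void main(String[] args) {
        char c = 'A';
        int i = c;                    // OK: char -> int is implicit (0..65535 fits in int)
        System.out.println(i);        // 65

        byte b = -1;
        // char bad = b;              // compile error: byte can be negative, char cannot
        char c2 = (char) b;           // explicit cast; the bit pattern wraps to 65535
        System.out.println((int) c2); // 65535

        char big = '\uFFFF';          // 65535, above short's maximum of 32767
        // short s = big;             // compile error: out of short's range
        short s = (short) big;        // explicit cast; wraps to -1
        System.out.println(s);        // -1
    }
}
```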