There's a difference between implicit and explicit casting. An implicit cast happens automatically; an explicit cast is one you have to write yourself.
A Dog can be implicitly cast to an Animal, because a dog is always an animal. But an Animal cannot be implicitly cast to a Dog, because it might not be one.
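A minimal sketch of both directions (Animal and Dog here are hypothetical classes, with Dog extending Animal):

```java
class Animal {}
class Dog extends Animal {}

public class ImplicitCast {
    public static void main(String[] args) {
        Dog d = new Dog();
        Animal a = d;       // implicit cast: compiles, a Dog is always an Animal

        // Dog d2 = a;      // won't compile: an Animal might not be a Dog
        System.out.println(a == d);   // same object, two reference types
    }
}
```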
However, because an animal might be a dog, you can add the cast yourself. That tells the compiler "trust me, it's a dog", and you'll get an error at run time if it isn't.
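A sketch of the explicit downcast, using the same hypothetical Animal/Dog classes. Both casts compile; only the second one fails, and only when the program actually runs:

```java
class Animal {}
class Dog extends Animal {}

public class ExplicitCast {
    public static void main(String[] args) {
        Animal a = new Dog();
        Dog d = (Dog) a;          // compiles, and succeeds: a really refers to a Dog
        System.out.println("first cast ok");

        Animal b = new Animal();
        try {
            Dog e = (Dog) b;      // compiles, but b does not refer to a Dog
        } catch (ClassCastException ex) {
            System.out.println("second cast failed at run time");
        }
    }
}
```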
Where this fails is if the cast can't possibly work. Try casting an Animal to a Daffodil, and even an explicit cast won't compile.
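For example (Daffodil is a hypothetical class completely unrelated to Animal):

```java
class Animal {}
class Daffodil {}

public class ImpossibleCast {
    public static void main(String[] args) {
        Animal a = new Animal();
        // Daffodil f = (Daffodil) a;  // won't compile: the classes are unrelated

        // The compiler can prove that no Animal can ever be a Daffodil:
        System.out.println(Daffodil.class.isAssignableFrom(Animal.class));  // prints false
    }
}
```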
The general rule is:
- An implicit cast is allowed if it will always work
- An explicit cast is allowed if it will sometimes work
It is unfortunate that the same word "casting" is used both for casting reference types and for casting primitives. Casting a primitive changes the type (and possibly the value) of the data, but casting a reference never changes the type of the object it points to.
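A short sketch of that difference: the primitive cast produces a new value of a new type, while the reference cast leaves the object completely untouched:

```java
public class TwoKindsOfCast {
    public static void main(String[] args) {
        double pi = 3.14;
        int i = (int) pi;            // primitive cast: new int value 3, the data changed

        Object o = "hello";          // the String object never changes type
        String s = (String) o;       // reference cast: same object, new reference type
        System.out.println(i);       // prints 3
        System.out.println(s == o);  // prints true: the very same object
    }
}
```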
There are more details in the Java Language Specification, though it is not usually easy reading. There you'll find these called widening and narrowing conversions.
Something that I think always gets overlooked with casting objects...
you don't actually change the object. If you have a Dog object, it is, and always will be, a Dog object.
Casting is telling the compiler that the object referred to is something other than the reference type. For example:
Animal a = new Dog();
This works, because a Dog IS-A Animal. But now, suppose I do this:
Dog d = a;
The compiler will complain. It doesn't know that a actually refers to a Dog object; it only knows it refers to some kind of Animal. So you do this:
Dog d = (Dog) a;
Here, you are saying "I know for a fact that the thing a refers to really IS a Dog, so this is OK". The compiler believes you, since it is possible for a to refer to a Dog. You'll get a ClassCastException at run time if a is actually referring to a Chicken, but the compiler can't know that.
Regardless, the only object ever created here is a Dog. Casting it, or assigning it to a different reference variable, doesn't change that.
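Putting the walkthrough above into one runnable sketch (Chicken, like Dog, is a hypothetical subclass of Animal):

```java
class Animal {}
class Dog extends Animal {}
class Chicken extends Animal {}

public class OnlyEverADog {
    public static void main(String[] args) {
        Animal a = new Dog();         // the only object created here is a Dog
        Dog d = (Dog) a;              // fine: a really does refer to a Dog
        System.out.println(d == a);   // prints true: same object, different reference types

        Animal c = new Chicken();
        try {
            Dog oops = (Dog) c;       // compiles, but c refers to a Chicken
        } catch (ClassCastException ex) {
            System.out.println("not a Dog after all");
        }
    }
}
```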
I understand why Integers are explicitly cast as Doubles.
But why do we need to cast objects as different types?