From the sample code that I have seen here at the Ranch, I get the impression that only ints are used for integer values. To me this seems like a waste of system resources - like writing bloatware. I know that the hardware can handle it, but still... I only need to think of how long it takes the newest version of my text editor (UltraEdit) to load, compared to a few years ago when it was a smaller program. Anyway, I am curious: would declaring bytes or shorts in an assignment give cause for another nitpick?
“We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.”
-- Donald Knuth
If you are making an array with hundreds or thousands of small integer values, then you could consider using a smaller form of integer, but if you are talking about a few local variables, then your savings are going to be too small and too temporary to matter. Also, if I remember correctly, shorts and bytes have to be widened to ints to do arithmetic anyway.
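To put a rough number on the "large arrays" case: Java defines an int as 32 bits and a short as 16 bits, so the element data of a million-entry array differs by a couple of megabytes. A minimal sketch (the class name and element count are just for illustration):

```java
public class ArraySizes {
    public static void main(String[] args) {
        int n = 1_000_000;

        // Element storage only; object headers and alignment add a little more.
        long intArrayBytes = 4L * n;   // int is 32 bits = 4 bytes per element
        long shortArrayBytes = 2L * n; // short is 16 bits = 2 bytes per element

        System.out.println("int[" + n + "] element data:   " + intArrayBytes + " bytes");
        System.out.println("short[" + n + "] element data: " + shortArrayBytes + " bytes");
        // For a few local variables, the difference is a handful of bytes at best.
    }
}
```

Two megabytes saved across a million elements can matter; two bytes saved in a local variable never will.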
Bloatware, if that's a fair appellation for UltraEdit, comes from adding excessive features to an application to make it seem more appealing or encourage people to upgrade, not as a result of the code using four-byte integers when two or one byte might be able to do the job. You also open yourself up to overflow errors, which can be hard to diagnose. So, yes, using a short or byte in a Cattle Drive assignment, unless you have a better reason than "it saves two bytes", is going to get you nitpicked.
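The overflow point is easy to demonstrate: a byte tops out at 127, and incrementing past that wraps around silently with no warning or exception. A small sketch (class name is just for illustration):

```java
public class ByteOverflow {
    public static void main(String[] args) {
        byte counter = 127;          // Byte.MAX_VALUE
        counter++;                   // silently wraps around, no exception
        System.out.println(counter); // prints -128

        // The same counter as an int has room to spare up to 2147483647.
        int wideCounter = 127;
        wideCounter++;
        System.out.println(wideCounter); // prints 128
    }
}
```

A wrapped counter like this tends to surface far from where the overflow happened, which is what makes it hard to diagnose.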
That makes sense. Thanks for saving me from yet another smack on the head in the Cattle Drive.
And so I always remember, I shall loop a hundred times "I shall not be evil by prematurely optimizing."
Something interesting to consider here is data size.
I say nothing if the person gives me a reason for trying to use the smallest type possible. I do think, however, that ints are the easiest to deal with and the most common as far as primitive numbers are concerned. Also, when I modify code, I kind of think of most primitive numbers as ints... and I think a multitude of people do, too. It's typically good to do what's going to cause people to spend fewer cycles figuring you out.
Someone said to me this week that if you have a serious memory constraint you don't use Java. That said, if using a short instead of an int is going to make or break you, you have bigger problems. Not to say you should be wasteful.... it's important to be mindful of what we're doing.... but it's just as important to code in a way that's not going to leave someone scratching their head.
When you do things right, people won't be sure you've done anything at all.
All so true. I just came across this passage in the Oracle Java Tutorial under Primitive Data Types :
As with byte, the same guidelines apply: you can use a short to save memory in large arrays, in situations where the memory savings actually matters. [my underline]
Which goes to show (again) that reading the documentation does help...
Greg Charles wrote:Also, if I remember correctly, shorts and bytes have to be widened to ints to do arithmetic anyway.
Yes, there are a number of operations that automatically widen to int, by definition in the JLS.
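The widening rule shows up directly in the compiler: binary numeric promotion turns short (and byte) operands into ints, so the result of adding two shorts is an int and must be cast back down. A minimal sketch (class name is just for illustration):

```java
public class WideningDemo {
    public static void main(String[] args) {
        short a = 1;
        short b = 2;

        // short sum = a + b;        // does not compile: a + b is promoted to int
        short sum = (short) (a + b); // explicit narrowing cast required
        System.out.println(sum);     // prints 3

        int wideSum = a + b;         // the natural type of the result, no cast needed
        System.out.println(wideSum); // prints 3
    }
}
```

So using shorts for arithmetic doesn't avoid int-sized operations; it just adds casts.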
Additionally, JVM implementations may use at least 4 bytes for each individual integer primitive anyway, to get word alignment; and even if they don't, it will probably end up that way thanks to the compiler and/or hardware.
subject: Ok to use byte or short primitives for assignments?