I am executing a JPA createNativeQuery, which returns a set of customer numbers that match user-defined selection criteria. For some reason, createNativeQuery doesn't like it when you specify java.lang.Long as the class; it complains about not finding the descriptor. So the default return type, when this query is executed, is BigDecimal. Well, for reasons I won't go into, I need the list to be a set of Long customer numbers. So I create a for loop to iterate over the set of BigDecimals, as such:
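(The original snippet from the post isn't preserved here; a minimal stand-in for that loop might look like the following. The class and method names are my own, not the poster's.)

```java
import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.List;

public class CustomerNumberConversion {

    // Convert the BigDecimal results from the native query into Longs,
    // one element at a time.
    static List<Long> toLongs(List<BigDecimal> customerNumberList) {
        List<Long> customerNumbers = new ArrayList<Long>();
        for (BigDecimal customerNumber : customerNumberList) {
            customerNumbers.add(customerNumber.longValue());
        }
        return customerNumbers;
    }

    public static void main(String[] args) {
        List<BigDecimal> fromQuery =
                List.of(new BigDecimal(101), new BigDecimal(102));
        System.out.println(toLongs(fromQuery)); // prints [101, 102]
    }
}
```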
Now, here's the problem. In running this in debug mode, I can see that customerNumberList has 36 elements. But when I perform customerNumbers.add(customerNumber.longValue()), customerNumbers is not increased one at a time. It often adds several null elements and then fills them. So what I end up with, after the loop completes, is a customerNumbers list with 38 elements (the last two elements being null).
Is this normal? Or is it a bug?
And, yes, I know I can do:
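(The snippet the poster refers to isn't preserved; given the later remark that "if I set the size on initialization, it works fine," it was presumably pre-sizing the list, roughly as below. Names are mine.)

```java
import java.util.ArrayList;
import java.util.List;

public class PresizedList {

    // Pre-size the backing array so no intermediate growth/copying occurs.
    static List<Long> presized(int expectedSize) {
        // The int constructor argument sets the initial *capacity*, not
        // the size: the new list still reports size() == 0.
        return new ArrayList<Long>(expectedSize);
    }

    public static void main(String[] args) {
        List<Long> customerNumbers = presized(36);
        System.out.println(customerNumbers.size()); // prints 0
    }
}
```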
But, I'm just wondering if there is a better way, and possibly some insights into the logic behind the internals of why ArrayList works the way it does.
Question. When you say that an array list has 36 elements, do you mean that the size() method is returning a value of 36? Or do you mean that you are using the debugger to inspect the internal array in the list to see how big it is?
The API documentation wrote:Each ArrayList instance has a capacity. The capacity is the size of the array used to store the elements in the list. It is always at least as large as the list size. As elements are added to an ArrayList, its capacity grows automatically.
I can't claim that's an "insight", though, as it's from the public documentation for the class.
@Henry: Both. The size() method returns 38, the debugger shows the object as having an "elementCount" of 38.
@Paul: It seems to me that the API documentation is a bit ambiguous. I would interpret that as growing (by 1) as you add objects to the list. I just find it kind of odd that it adds a half dozen on a single add, not knowing whether it's going to need 6 more or 1 more. It seems an inherently faulty way to do it. Now, as I indicated, if I set the size on initialization, it works fine. But if you don't know how big the ArrayList needs to be up-front, that seems to leave you with the responsibility of trimming off the "fat", or of making sure that on subsequent for loops over the derived ArrayList you check for null entries and bypass them during processing.
It just seems like a very strange way to handle it. Is there really that much performance improvement by adding multiple elements on an add, rather than one at a time, as needed? I can't believe the performance outweighs the potential unintended consequences when developers either fail to trim the trailing null elements or code to bypass them.
R. Grimes wrote:It seems to me that the API documentation is a bit ambiguous.
It's deliberately vague.
I would interpret that as growing (by 1) as you add objects to the list.
It doesn't say any such thing. It doesn't say anything specific at all.
Is there really that much performance improvement by adding multiple elements on an add, rather than one at a time, as needed?
Well, yes. It does take a significant amount of time to create a new array and copy all of the existing data over into it from the old array. If you did that every single time you added one of the 25,000 entries you ended up with, there would be a heck of a lot of copying. Whereas if you did something which allocated bigger arrays and did less frequent copying, that would be more effective. Not to mention there would be a lot fewer arrays for the garbage collector to deal with.
I'm sure there's a mathematical optimum there and I'm sure that creating a new array for every add isn't that. I also expect that the engineers who designed the class didn't just pick some growth algorithm at random, either. I expect they did the math.
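To put a rough number on that trade-off, here is a small simulation (my own, not from the thread) counting how many element copies a grow-by-one strategy causes versus the classic grow-by-1.5x rule that older Sun JDKs used for ArrayList:

```java
public class GrowthCost {

    // Copies performed if the array grows by exactly one slot per add.
    static long copiesGrowByOne(int n) {
        long copies = 0;
        int capacity = 0;
        for (int size = 0; size < n; size++) {
            if (size == capacity) {   // array full: reallocate
                copies += size;       // copy all existing elements over
                capacity = size + 1;  // grow by exactly one slot
            }
        }
        return copies;
    }

    // Copies performed with the historical ArrayList growth rule.
    static long copiesGrowByHalf(int n) {
        long copies = 0;
        int capacity = 10;            // ArrayList's historical default capacity
        for (int size = 0; size < n; size++) {
            if (size == capacity) {
                copies += size;
                capacity = capacity * 3 / 2 + 1; // old Sun JDK growth rule
            }
        }
        return copies;
    }

    public static void main(String[] args) {
        int n = 25_000;
        // Grow-by-one: roughly n^2/2, i.e. hundreds of millions of copies.
        System.out.println(copiesGrowByOne(n));
        // Grow-by-1.5x: tens of thousands of copies.
        System.out.println(copiesGrowByHalf(n));
    }
}
```

For 25,000 adds, the grow-by-one strategy performs over 300 million element copies, while geometric growth performs only a few tens of thousands, which is the amortized-cost argument in a nutshell.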
R. Grimes wrote: I can't believe the performance outweighs the potential unintended consequences when developers either fail to trim the trailing null elements or code to bypass them.
And there are no unintended consequences. Most developers don't "trim the trailing null elements or code to bypass them", because they don't have to. The ArrayList knows the correct size of the list, and will not allow programs to access those null elements, even though there is an array element entry all ready for them.
The only "consequence", if you can call it that, is that the ArrayList uses more memory than it strictly needs, due to the larger backing array.
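A quick way to see that the spare capacity is invisible to callers (my own example, not from the thread): even with a large initial capacity, indexes at or beyond size() are simply not accessible.

```java
import java.util.ArrayList;
import java.util.List;

public class CapacityIsHidden {
    public static void main(String[] args) {
        // Capacity of 100 slots, but the logical size is still 2:
        List<Long> list = new ArrayList<Long>(100);
        list.add(1L);
        list.add(2L);
        System.out.println(list.size()); // prints 2

        try {
            list.get(2); // index == size(): the spare slots are off limits
        } catch (IndexOutOfBoundsException e) {
            System.out.println("spare capacity is not part of the list");
        }
    }
}
```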
R. Grimes wrote:@Henry: Both. The size() method returns 38, the debugger shows the object as having an "elementCount" of 38.
What Paul is talking about (and the API, and what Henry explains) is the 'capacity' change for the ArrayList, which is different from the size. The ArrayList's capacity is the number of internal array slots which can be used to store data. The size is the actual number of Objects in the ArrayList. Paul explained why the capacity changes in leaps rather than 1 by 1, and Henry explained that the actual capacity doesn't affect the end use of the contents of the ArrayList.
You seem to be saying that the actual size of the ArrayList, as returned by the size() method, is increasing beyond the expected 36 values. That should not happen. In the below code it does not:
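(The poster's actual test program isn't included here; a minimal stand-in that exercises the same behavior, with my own names, would be:)

```java
import java.util.ArrayList;
import java.util.List;

public class SizeTracksAdds {

    static int sizeAfterAdding(int count) {
        List<Long> values = new ArrayList<Long>(); // default capacity
        for (long i = 0; i < count; i++) {
            values.add(i);
            // size() always matches the number of add() calls so far,
            // regardless of how the internal array (capacity) grows.
        }
        return values.size();
    }

    public static void main(String[] args) {
        System.out.println(sizeAfterAdding(36)); // prints 36
    }
}
```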
When I ran this code and tracked the internals of the ArrayList via a debugger, the capacity of the ArrayList went from 10 to 16 to 25 to 38, but the size of the ArrayList tracked one for one with the number of values I was putting in. In the end the size() method returns 36, as it should. I think you should double check what you are viewing:
1) Are you sure the size() method returns 38 and not 36?
2) If size() returns 38, then are you sure you don't have 38 values in your BigDecimal list?
3) If size() returns 38 in your Long list, and only 36 values in your BigDecimal list, then it looks like you may have a bugged JRE.
Thanks, Henry, Paul, and Steve. Makes a lot more sense to me now. I appreciate your time in explaining this.