Posted to user@commons.apache.org by Marcos César de Oliveira <mc...@hmsolucoes.com.br> on 2005/04/01 17:39:22 UTC

AbstractHashedMap deserialization

I started using the MultiKeyMap in an application of mine, including
serialization of the maps.
Since then, the application's heap usage has grown considerably, and
after profiling it I found that the main offender is the
AbstractHashedMap class, as shown below:

 rank   self  accum     bytes  objs     bytes  objs trace name
    1 63.92% 63.92% 134480192     4 134480192     4 305914 org.apache.commons.collections.map.AbstractHashedMap$HashEntry[]
    2 31.93% 95.85%  67180256    74  67245808    75 305860 org.apache.commons.collections.map.AbstractHashedMap$HashEntry[]
    3  0.69% 96.53%   1447176 49316   1449480 49395 305471 char[]
    4  0.56% 97.10%   1183584 49316   1185480 49395 305470 java.lang.String
    5  0.50% 97.60%   1053032   279   2693600   656 302432 byte[]
    (continues...)

After some debugging, I found the cause to be the following:

In the method "AbstractHashMap#doReadObject", the method call "threshold =
calculateThreshold(data.length, loadFactor);" is after the "put(key,
value);".
The "put" method uses the threshold to check the capacity of the data array.
In this checking, the threshold is zero, leading to a resize of the data
array.
At each serialization/deserialization the size of the data array doubles.
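
Roughly, the deserialization logic looks like this (paraphrased from
memory rather than copied from the source, so the details may differ
slightly):

    protected void doReadObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        loadFactor = in.readFloat();
        int capacity = in.readInt();
        int size = in.readInt();
        init();
        data = new HashEntry[capacity];
        for (int i = 0; i < size; i++) {
            Object key = in.readObject();
            Object value = in.readObject();
            // threshold is still 0 here, so the capacity check inside
            // put() believes the map is full and doubles the data array
            put(key, value);
        }
        // the threshold is only computed after all entries are back in
        threshold = calculateThreshold(data.length, loadFactor);
    }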

The question is:
Am I missing something, or is this a bug in the deserialization of
AbstractHashedMap?
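
For reference, here is a small program that should reproduce the
growth. It is an untested sketch: it uses HashedMap (a direct subclass
of AbstractHashedMap) instead of MultiKeyMap, reads the protected
"data" array through reflection, and the class name ResizeRepro is
just for this example:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.lang.reflect.Field;

    import org.apache.commons.collections.map.AbstractHashedMap;
    import org.apache.commons.collections.map.HashedMap;

    public class ResizeRepro {

        public static void main(String[] args) throws Exception {
            HashedMap map = new HashedMap();
            map.put("key", "value");
            System.out.println("initial capacity: " + capacity(map));

            for (int round = 1; round <= 5; round++) {
                map = (HashedMap) roundTrip(map);
                System.out.println("after round trip " + round
                        + ": capacity = " + capacity(map));
            }
        }

        // Serialize the map to a byte array and read it straight back.
        private static Object roundTrip(Object obj) throws Exception {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            ObjectOutputStream oos = new ObjectOutputStream(bos);
            oos.writeObject(obj);
            oos.close();
            ObjectInputStream ois = new ObjectInputStream(
                    new ByteArrayInputStream(bos.toByteArray()));
            return ois.readObject();
        }

        // Read the length of the backing entry array via reflection.
        private static int capacity(HashedMap map) throws Exception {
            Field data = AbstractHashedMap.class.getDeclaredField("data");
            data.setAccessible(true);
            return ((Object[]) data.get(map)).length;
        }
    }

If my reading of the code is right, each round trip should print a
larger capacity.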

Thanks in advance. 

