Posted to user@directmemory.apache.org by Mark Greene <mg...@hubspot.com> on 2012/10/21 17:00:04 UTC

Memory Utilization

I stumbled upon the project recently and wanted to try it out. I'm playing
around with 0.2-SNAPSHOT, which I built from GitHub.

I noticed that when I added about 1 million objects to the cache, my heap size
increased by approximately the same amount as when I added them to a plain
HashMap. I was wondering what could be causing that? My cache config is as
follows:

CacheService<String, String> cacheService = new DirectMemory<String, String>()
        .setNumberOfBuffers(10)
        .setSize(100000000)
        .setInitialCapacity(100)
        .setConcurrencyLevel(4)
        .setMemoryManager(new MemoryManagerServiceImpl<String>())
        .newCacheService();
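
The load itself is nothing special - roughly a loop like the following (a
sketch, assuming put(key, value) as shown in the project examples rather than
my exact code):

// Insert roughly one million small String entries, then compare heap usage
// against a plain HashMap<String, String> holding the same data.
for (int i = 0; i < 1000000; i++) {
    cacheService.put("key-" + i, "value-" + i);
}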

Re: Memory Utilization

Posted by "Raffaele P. Guidi" <ra...@gmail.com>.
Well, this happens because the keys are probably almost the same size as the
payload (the cached object) itself. Keys are kept on the heap for fast
retrieval - the off-heap approach pays off when payloads are significantly
larger than their keys. Try with an integer key and a very large object (or
array) and you'll see a difference.
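
Something along these lines should make the difference visible (a quick
sketch, assuming the same builder you used and put(key, value) as in the
project examples):

// Integer keys are tiny compared to a multi-kilobyte payload, so almost all
// of the cached bytes should end up in the off-heap buffers.
CacheService<Integer, byte[]> bigPayloadCache = new DirectMemory<Integer, byte[]>()
        .setNumberOfBuffers(10)
        .setSize(1000000000)        // ~1 GB off-heap, enough for the payloads below
        .setInitialCapacity(100000)
        .setConcurrencyLevel(4)
        .setMemoryManager(new MemoryManagerServiceImpl<Integer>())
        .newCacheService();

byte[] payload = new byte[100 * 1024];   // 100 KB per entry
for (int i = 0; i < 5000; i++) {         // ~500 MB of payload in total
    bigPayloadCache.put(i, payload);
}
// Heap growth now comes mostly from the Integer keys and internal bookkeeping,
// while the payload bytes live in the off-heap buffers.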

Ciao,
    R