Posted to user@couchdb.apache.org by JRad <be...@gmail.com> on 2012/07/19 01:40:02 UTC

Estimating Server Memory

Howdy Couchees,
Consider a database of 30,000,000 docs, each around 1 KB, amounting to about
40 GB of data.
We index this data in 5 views that generate about 16 index rows per doc in
total; that is, our views together hold about 480,000,000 rows of (key, 1)
data, roughly 170 GB in size! We expect 30 users to be querying the views
simultaneously to search the data.
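For context, each view is roughly of this shape (a simplified sketch; the
field name doc.tags is made up, and a stub emit() is included only so the
sketch runs outside CouchDB, which normally provides emit itself):

```javascript
// Collected rows; inside CouchDB this bookkeeping is done by the server.
var rows = [];
function emit(key, value) { rows.push([key, value]); }

// Simplified sketch of one of our map functions: it emits one
// (key, 1) row per indexed field, so 5 views together come to
// about 16 rows per doc.
var mapTags = function (doc) {
  if (doc.tags) {
    doc.tags.forEach(function (tag) {
      emit(tag, 1); // value is always 1, only the key matters for search
    });
  }
};

mapTags({ tags: ["search", "couchdb", "memory"] });
// rows now holds three (key, 1) pairs
```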

How does CouchDB use system RAM for views? Will it consume all available
memory?

How would you estimate the required server RAM?
Should it be 256 GB, to ensure all view indexes stay warm?
Or could we get by with 128 GB or 64 GB and rely on CouchDB hitting the disk
(page faults) on view queries?
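To put rough numbers on it, here is my back-of-envelope arithmetic, using
the figures above and assuming "warm" simply means the hot fraction of the
view files fits in the OS page cache (the 25% hot fraction is a pure guess):

```javascript
// Back-of-envelope sizing from the numbers in this mail.
const docs = 30e6;                       // 30,000,000 documents
const rowsPerDoc = 16;                   // across all 5 views
const totalRows = docs * rowsPerDoc;     // 480,000,000 rows
const indexBytes = 170 * Math.pow(2, 30); // ~170 GB of view data on disk

// On-disk cost per (key, 1) row, including B-tree overhead.
const bytesPerRow = indexBytes / totalRows; // ~380 bytes/row

// If, say, only 25% of the index is actually hot (assumed, not measured),
// that hot set would need to fit in the page cache to avoid disk hits.
const hotFraction = 0.25;
const hotSetGB = (indexBytes * hotFraction) / Math.pow(2, 30); // 42.5 GB
```

By that reasoning 64 GB would only be comfortable if the hot set really is
a modest fraction of the index, whereas keeping everything warm points at
the 256 GB end.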
We have picked a quad-core Xeon at 2.4 GHz (12 MB cache) as a starting point.
Do you have any experience or advice on choosing the right server for this
case?