Posted to user@couchdb.apache.org by venkata subbarayudu <av...@gmail.com> on 2009/10/15 20:06:23 UTC

CouchDB eating up all available memory

Hi All,
             I am new to CouchDB. I have installed couchdb:0.10.0a781732 on a
16 GB RAM Linux (CentOS) machine and am using python-couchdb-0.6 to interact
with it. I am doing bulk inserts (batch size 1000) from Hadoop Map/Reduce
tasks (written in Python), inserting about 1 million records into several
databases. Each document has roughly 50 fields, of which 10 are integers and
the rest are text fields. The inserts use up all the available memory (~16G),
and the memory does not come down even after the insertion completes, so I am
not sure whether CouchDB is really the cause; the CouchDB process itself shows
only 2 GB (similar to http://issues.apache.org/jira/browse/COUCHDB-325). If
the memory growth is due to caches, are the caches created by the CouchDB
process? Is there a way to specify the maximum memory allocated to CouchDB?
Please give any suggestion on how to restrict memory for the CouchDB process.

Thanks in advance for your help,
Subbarayudu.
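[For concreteness, the batching described above can be sketched with
python-couchdb's bulk API. The helper names and the usage lines are
illustrative assumptions, not the poster's actual code.]

```python
# A minimal sketch of the batched bulk insert described above. Batching
# through CouchDB's _bulk_docs endpoint (python-couchdb exposes it as
# Database.update) keeps per-request overhead down; everything beyond the
# chunking helper is an assumption about the poster's setup.

def chunked(docs, size=1000):
    """Yield successive lists of up to `size` documents."""
    batch = []
    for doc in docs:
        batch.append(doc)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def bulk_insert(db, docs, size=1000):
    """Write docs to a couchdb.Database in _bulk_docs batches."""
    for batch in chunked(docs, size):
        db.update(batch)  # one HTTP request per batch

# Usage against a local server (not run here):
#   import couchdb
#   db = couchdb.Server('http://localhost:5984/')['mydb']
#   docs = ({'int%d' % k: n for k in range(10)} for n in range(1000000))
#   bulk_insert(db, docs)
```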

Re: CouchDB eating up all available memory

Posted by Paul Davis <pa...@gmail.com>.
Subbarayudu,

Well, if it's affected by dropping kernel caches then it's not actually
CouchDB doing the caching. This is just the kernel being smart about how
it allocates FS caches and so on. You shouldn't worry too much: the
allocation is soft, so if something else needs the RAM it can be
reclaimed, which is different from CouchDB actively trying to hold on to it.

HTH,
Paul Davis

On Thu, Oct 15, 2009 at 2:34 PM, venkata subbarayudu
<av...@gmail.com> wrote:
> Hi Paul Davis,
>       Thanks for your quick reply. By the time I checked the memory, all
> of the tasks had completed, and even if I shut down the CouchDB server I
> did not see a big difference in free memory unless I either restarted the
> system or dropped the caches via '/proc/sys/vm/drop_caches'. I am just
> wondering what makes the caches take that much memory (is this because of
> CouchDB?). It would also be helpful to know a way of specifying the
> maximum memory allocated to CouchDB, irrespective of whether the memory
> consumption is because of CouchDB or not.
>
>
> Subbarayudu.
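[Paul's point above, that the "missing" memory is reclaimable kernel page
cache rather than memory held by CouchDB, can be checked from /proc/meminfo.
A minimal sketch; the sample text stands in for the real file so it runs
anywhere, and the numbers are made up.]

```python
# Distinguish reclaimable kernel page cache from genuinely used memory,
# per the explanation above. Parses /proc/meminfo-style text; the sample
# below stands in for the real file so the sketch is self-contained.

def parse_meminfo(text):
    """Return a dict of field name -> value in kB."""
    info = {}
    for line in text.splitlines():
        name, _, rest = line.partition(':')
        if rest:
            info[name] = int(rest.split()[0])
    return info

def effectively_free_kb(info):
    # Page cache and buffers are dropped by the kernel on demand, so they
    # count as available even though tools report them as "used".
    return info['MemFree'] + info.get('Buffers', 0) + info.get('Cached', 0)

SAMPLE = """\
MemTotal:       16435200 kB
MemFree:          524288 kB
Buffers:          262144 kB
Cached:         13631488 kB
"""

info = parse_meminfo(SAMPLE)
print(effectively_free_kb(info))  # -> 14417920: most of the 16 GB is cache
```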

Re: CouchDB eating up all available memory

Posted by venkata subbarayudu <av...@gmail.com>.
Hi Paul Davis,
       Thanks for your quick reply. By the time I checked the memory, all of
the tasks had completed, and even if I shut down the CouchDB server I did
not see a big difference in free memory unless I either restarted the system
or dropped the caches via '/proc/sys/vm/drop_caches'. I am just wondering
what makes the caches take that much memory (is this because of CouchDB?).
It would also be helpful to know a way of specifying the maximum memory
allocated to CouchDB, irrespective of whether the memory consumption is
because of CouchDB or not.


Subbarayudu.

On Thu, Oct 15, 2009 at 11:45 PM, Paul Davis <pa...@gmail.com>wrote:

> Venkata,
>
> CouchDB doesn't do any sort of caching. Are any views building when
> you're checking the memory consumption? Other than view building and
> inserting, CouchDB shouldn't be retaining any allocated memory. It's
> always possible that there's a bug somewhere, though. Can you reduce
> the issue to a script that can trigger the behavior?
>
> Paul Davis
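[The '/proc/sys/vm/drop_caches' step mentioned above can be wrapped so it
fails gracefully without root. Note it only proves where the memory went;
the kernel reclaims cache on demand anyway. This helper is a sketch, not
anything from the thread.]

```python
# The cache-drop step mentioned above, as a guarded helper. Writing the
# sysctl needs root on a real Linux box; on failure this just reports,
# since the kernel would reclaim the cache on demand anyway.
import os

def drop_caches(path='/proc/sys/vm/drop_caches'):
    """Ask the kernel to free page cache, dentries and inodes.

    Returns True on success, False when unprivileged or not on Linux."""
    try:
        os.sync()                # flush dirty pages so clean cache can go
        with open(path, 'w') as f:
            f.write('3')         # 1=pagecache, 2=dentries+inodes, 3=both
        return True
    except OSError:
        return False

if __name__ == '__main__':
    print('caches dropped' if drop_caches() else 'need root to drop caches')
```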

Re: CouchDB eating up all available memory

Posted by Paul Davis <pa...@gmail.com>.
Venkata,

CouchDB doesn't do any sort of caching. Are any views building when
you're checking the memory consumption? Other than view building and
inserting, CouchDB shouldn't be retaining any allocated memory. It's
always possible that there's a bug somewhere, though. Can you reduce
the issue to a script that can trigger the behavior?

Paul Davis


On Thu, Oct 15, 2009 at 2:06 PM, venkata subbarayudu
<av...@gmail.com> wrote:
> Hi All,
>             I am new to CouchDB. I have installed couchdb:0.10.0a781732 on
> a 16 GB RAM Linux (CentOS) machine and am using python-couchdb-0.6 to
> interact with it. I am doing bulk inserts (batch size 1000) from Hadoop
> Map/Reduce tasks (written in Python), inserting about 1 million records
> into several databases. Each document has roughly 50 fields, of which 10
> are integers and the rest are text fields. The inserts use up all the
> available memory (~16G), and the memory does not come down even after the
> insertion completes, so I am not sure whether CouchDB is really the cause;
> the CouchDB process itself shows only 2 GB (similar to
> http://issues.apache.org/jira/browse/COUCHDB-325). If the memory growth is
> due to caches, are the caches created by the CouchDB process? Is there a
> way to specify the maximum memory allocated to CouchDB? Please give any
> suggestion on how to restrict memory for the CouchDB process.
>
> Thanks in advance for your help,
> Subbarayudu.
>
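[A starting point for the reduction Paul asks for: a self-contained script
that replays the reported workload (50-field documents, 1000-document
_bulk_docs batches) against a CouchDB URL while memory is watched. The URL,
database name, and field names are placeholders, not from the report.]

```python
# Minimal reproduction sketch: replay the reported workload against a
# CouchDB instance via the _bulk_docs HTTP endpoint, then watch free/top.
# Document shape follows the post (10 integer + 40 text fields); all names
# and the target URL are made up for illustration.
import json
try:
    from urllib.request import Request, urlopen   # Python 3
except ImportError:
    from urllib2 import Request, urlopen          # Python 2, as in 2009

def make_doc(i):
    """One document shaped like the report: 10 integer + 40 text fields."""
    doc = {'int%d' % k: i + k for k in range(10)}
    doc.update(('text%d' % k, 'value-%d-%d' % (i, k)) for k in range(40))
    return doc

def bulk_payload(start, size=1000):
    """JSON body for one _bulk_docs request."""
    return json.dumps({'docs': [make_doc(i) for i in range(start, start + size)]})

def insert(base_url, db, total=1000000, size=1000):
    for start in range(0, total, size):
        req = Request('%s/%s/_bulk_docs' % (base_url, db),
                      data=bulk_payload(start, size).encode('utf-8'),
                      headers={'Content-Type': 'application/json'})
        urlopen(req).read()

if __name__ == '__main__':
    insert('http://localhost:5984', 'memtest')   # watch memory while this runs
```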