Posted to user@couchdb.apache.org by Mike <mi...@wolman.co.uk> on 2015/02/24 22:17:21 UTC
couch as statsd backend
Hi Everyone,
In case it's of use to anyone else: I have added a backend to statsd to
emit stats to CouchDB:
https://github.com/sysadminmike/couch-statsd-backend
https://www.npmjs.com/package/couch-statsd-backend
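For context, statsd backends are enabled through its config file; a minimal sketch of what that might look like here (`port`, `flushInterval`, and `backends` are standard statsd options; any couch-specific settings such as the database URL are assumptions, so check the backend's README):

```javascript
// Sketch of a statsd config.js enabling the backend.
// Assumes the module is installed via `npm install couch-statsd-backend`.
// statsd evaluates this file as a bare object literal.
{
  port: 8125,                        // standard statsd UDP port
  flushInterval: 10000,              // flush every 10 seconds
  backends: ["couch-statsd-backend"] // load the couch backend module
}
```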
Mike.
Re: couch as statsd backend
Posted by Mike <mi...@wolman.co.uk>.
Hi Alexander,
I got it up to about 28k doc inserts every 10 secs, but my machine
couldn't generate any more than that (or statsd couldn't ingest them fast
enough; not sure, I may have been hitting a socket limit somewhere).
Flushing stats at Tue Feb 24 2015 14:23:25 GMT+0000 (GMT)
done: added 28005 docs
Flushing stats at Tue Feb 24 2015 14:23:35 GMT+0000 (GMT)
done: added 28005 docs
Flushing stats at Tue Feb 24 2015 14:23:45 GMT+0000 (GMT)
done: added 28005 docs
Flushing stats at Tue Feb 24 2015 14:23:56 GMT+0000 (GMT)
done: added 28006 docs
When I dropped the flush time to 3 secs:
Flushing stats at Tue Feb 24 2015 14:30:36 GMT+0000 (GMT)
done: added 27693 docs
Flushing stats at Tue Feb 24 2015 14:30:38 GMT+0000 (GMT)
done: added 27770 docs
Flushing stats at Tue Feb 24 2015 14:30:39 GMT+0000 (GMT)
done: added 27737 docs
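For comparison, the two runs above work out to roughly the following sustained insert rates (treating each flush as evenly spaced, which the 3-sec timestamps suggest is only approximate):

```javascript
// Rough sustained insert rates from the two test runs above.
const tenSecRate = 28005 / 10;   // docs per second at a 10s flush interval
const threeSecRate = 27693 / 3;  // docs per second at a 3s flush interval

console.log(tenSecRate);              // 2800.5
console.log(Math.round(threeSecRate)); // 9231
```

So the shorter flush interval was pushing roughly three times the write rate at CouchDB, which fits with it falling over after a few seconds.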
But couch didn't last too long and died after a few secs.
I didn't do any more than some quick load testing, as I am not really
planning on collecting more than a few thousand a minute.
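One flush per interval can land in CouchDB as a single request if the backend batches docs into `_bulk_docs`. A minimal sketch of that batching, assuming a hypothetical `buildBulkPayload` helper and doc shape for illustration (this is not the actual couch-statsd-backend code):

```javascript
// Sketch: batch one statsd flush into a CouchDB _bulk_docs payload.
// buildBulkPayload and the doc fields are assumptions for illustration.
function buildBulkPayload(timestamp, metrics) {
  const docs = Object.entries(metrics).map(([name, value]) => ({
    metric: name, // metric name as sent to statsd
    value: value, // flushed value for this interval
    ts: timestamp // flush timestamp (epoch seconds)
  }));
  return { docs };
}

// The whole payload would then go to CouchDB in one request:
//   POST /aatest/_bulk_docs  with  Content-Type: application/json
const payload = buildBulkPayload(1424787805, { "app.requests": 42, "app.errors": 1 });
console.log(payload.docs.length); // 2
```

Batching this way keeps the HTTP request count at one per flush even when a flush carries ~28k docs.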
I didn't think it took up too much space either: over 2 million metrics in
about 0.5 GB (after turning on snappy and running a compact):
{
  "db_name": "aatest",
  "doc_count": 2272446,
  "doc_del_count": 0,
  "update_seq": 2272446,
  "purge_seq": 0,
  "compact_running": false,
  "disk_size": 562339953,
  "data_size": 559869189,
  "instance_start_time": "1424789845662450",
  "disk_format_version": 6,
  "committed_update_seq": 2272446
}
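From the `disk_size` and `doc_count` fields above, the on-disk cost per metric doc works out to roughly:

```javascript
// Rough storage cost per metric doc, from the db info above.
const diskSize = 562339953; // bytes on disk, after snappy + compaction
const docCount = 2272446;   // docs in the database

const bytesPerDoc = diskSize / docCount;
console.log(Math.round(bytesPerDoc)); // 247
```

So each metric doc costs on the order of 250 bytes on disk in this setup.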
The test machine is about 5 years old: a Core2 Duo @ 3.00GHz with 4GB RAM,
with the lack of RAM and ZFS causing some slowness, plus a few couch
instances and other jails running. So I suspect a properly set up box
would push >40k docs on UFS rather than ZFS. I'm using normal hard disks
as well, so with SSD drives >50k or more.
Mike.
Alexander Shorin wrote:
> Hi Mike!
>
> That's nice! Would you like to add it to our wiki with a few notes?
> But the more interesting question is how it works under load. Did you
> stress it with hundreds or thousands of metrics per second? Very curious
> about it.
> --
> ,,,^..^,,,
>
>
> On Wed, Feb 25, 2015 at 12:17 AM, Mike <mi...@wolman.co.uk> wrote:
>
>> Hi Everyone,
>>
>> In case it's of use to anyone else: I have added a backend to statsd to
>> emit stats to CouchDB:
>>
>> https://github.com/sysadminmike/couch-statsd-backend
>>
>> https://www.npmjs.com/package/couch-statsd-backend
>>
>> Mike.
Re: couch as statsd backend
Posted by Alexander Shorin <kx...@gmail.com>.
Hi Mike!
That's nice! Would you like to add it to our wiki with a few notes?
But the more interesting question is how it works under load. Did you
stress it with hundreds or thousands of metrics per second? Very curious
about it.
--
,,,^..^,,,
On Wed, Feb 25, 2015 at 12:17 AM, Mike <mi...@wolman.co.uk> wrote:
> Hi Everyone,
>
> In case it's of use to anyone else: I have added a backend to statsd to
> emit stats to CouchDB:
>
> https://github.com/sysadminmike/couch-statsd-backend
>
> https://www.npmjs.com/package/couch-statsd-backend
>
> Mike.