Posted to user@couchdb.apache.org by Ayhan Kesenci <a....@googlemail.com> on 2015/01/17 14:03:33 UTC

Couchdb to csv file

Hello, I'm working on a Java program to convert a whole CouchDB database
into a CSV file.

I'd like to know the easiest way to do that. I've built up a
connection, and I know how to get a design document into a CSV, but I need
the whole database.

Sincerely

Ayhan Kesenci

Re: Couchdb to csv file

Posted by "bchesneau@gmail.com" <bc...@gmail.com>.
> On 17 Jan 2015, at 14:03, Ayhan Kesenci <a....@googlemail.com> wrote:
> 
> Hello, I'm working on a Java program to convert a whole CouchDB database
> into a CSV file.
> 
> I'd like to know the easiest way to do that. I've built up a
> connection, and I know how to get a design document into a CSV, but I need
> the whole database.
> 
> Sincerely
> 
> Ayhan Kesenci


Just use a list function and fetch the view as a CSV file.

- benoit.
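A minimal sketch of that approach: a design document whose list function
renders a view's rows as CSV. The design-doc name, view name, and `name`
field below are hypothetical, and list functions execute inside CouchDB,
so this is not standalone code:

```json
{
  "_id": "_design/export",
  "views": {
    "all": {
      "map": "function (doc) { emit(doc._id, null); }"
    }
  },
  "lists": {
    "csv": "function (head, req) { start({ headers: { 'Content-Type': 'text/csv' } }); var row; while ((row = getRow())) { send(row.id + ',' + (row.doc && row.doc.name || '') + '\\n'); } }"
  }
}
```

PUT this document to `/{db}/_design/export`, then
`GET /{db}/_design/export/_list/csv/all?include_docs=true` returns the
rows as CSV. A real export would also need to escape values containing
commas or quotes.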




Re: Couchdb to csv file

Posted by Sebastian Rothbucher <se...@googlemail.com>.
Hi,

Couch has an HTTP API, so you have to pull the right URL in Java (i.e. open
a java.net.URLConnection to the right URL). You can try the URL in the browser
beforehand to check whether all the right data is there. So, you can point your
browser to http://{server}:{port}/{db}/_all_docs?include_docs=true and
check if all the info is in it. Then you can use something like (new
java.net.URL("http://{server}:{port}/{db}/_all_docs?include_docs=true").openConnection())
to grab the URL connection and work from there. Using something like
http://www.json.org/javadoc/org/json/JSONObject.html is certainly useful
when moving forward
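To make that concrete, here is a minimal sketch of the CSV-writing side in
Java. The column names are made up; in a real run you would fill the Map by
parsing each `doc` out of the `_all_docs?include_docs=true` response with a
JSON library such as org.json:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CouchToCsv {

    // Quote a field per RFC 4180: when the value contains a comma,
    // quote, or newline, wrap it in quotes and double embedded quotes.
    static String csvField(String value) {
        if (value == null) return "";
        if (value.contains(",") || value.contains("\"") || value.contains("\n")) {
            return "\"" + value.replace("\"", "\"\"") + "\"";
        }
        return value;
    }

    // Render one document as one CSV line, in a fixed column order.
    static String csvRow(Map<String, String> doc, String[] columns) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < columns.length; i++) {
            if (i > 0) sb.append(',');
            sb.append(csvField(doc.get(columns[i])));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Stand-in for one parsed doc from the _all_docs response.
        Map<String, String> doc = new LinkedHashMap<>();
        doc.put("_id", "doc1");
        doc.put("name", "Kesenci, Ayhan");
        System.out.println(csvRow(doc, new String[] {"_id", "name"}));
        // prints: doc1,"Kesenci, Ayhan"
    }
}
```

Looping over every entry of the response's `rows` array and printing one
`csvRow` per document gives you the whole database as CSV.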

Good luck
     Sebastian

On Sat, Jan 17, 2015 at 2:11 PM, Ayhan Kesenci <a....@googlemail.com>
wrote:

> Where do I put this code? I don't know how to work with curl, if it is curl
>
> 2015-01-17 14:08 GMT+01:00 Aurélien Bénel <au...@utt.fr>:
>
> > > I need the whole database
> >
> > Have you tried with `/{db}/_all_docs?include_docs=true` ?
> >
> > Reference:
> > http://docs.couchdb.org/en/latest/api/database/bulk-api.html#db-all-docs
> >
> >
> > Regards,
> >
> > Aurélien
>

Re: Couchdb to csv file

Posted by Ayhan Kesenci <a....@googlemail.com>.
Where do I put this code? I don't know how to work with curl, if it is curl

2015-01-17 14:08 GMT+01:00 Aurélien Bénel <au...@utt.fr>:

> > I need the whole database
>
> Have you tried with `/{db}/_all_docs?include_docs=true` ?
>
> Reference:
> http://docs.couchdb.org/en/latest/api/database/bulk-api.html#db-all-docs
>
>
> Regards,
>
> Aurélien

Re: Couchdb to csv file

Posted by Aurélien Bénel <au...@utt.fr>.
> I need the whole database

Have you tried with `/{db}/_all_docs?include_docs=true` ?

Reference: http://docs.couchdb.org/en/latest/api/database/bulk-api.html#db-all-docs
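For reference, the JSON that URL returns has roughly this shape (the `rev`
values and doc fields here are illustrative):

```json
{
  "total_rows": 1,
  "offset": 0,
  "rows": [
    {
      "id": "doc1",
      "key": "doc1",
      "value": { "rev": "1-abc" },
      "doc": { "_id": "doc1", "_rev": "1-abc", "name": "example" }
    }
  ]
}
```

With `include_docs=true`, each entry in `rows` carries the full document
under `doc`, which is what you would walk to build the CSV lines.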


Regards,

Aurélien

Re: Limitation on number of items per DB

Posted by Kiril Stankov <ki...@open-net.biz>.
Thanks a lot!

As for the last point, yes, I meant whether the indexes get 
significantly slower when there are billions of documents.
------------------------------------------------------------------------
*With best regards,*
Kiril Stankov


            This Email disclaimer
            <http://open-net.biz/emailsignature.html> is integral part
            of this message.

On 1/20/2015 4:00 AM, Alexander Shorin wrote:
> On Tue, Jan 20, 2015 at 1:44 AM, Kiril Stankov <ki...@open-net.biz> wrote:
>> Is there any limit (practical or by design) on the number of docs per DB?
> There is no limit on the number of docs per DB.
>
>> Or maybe on the DB size?
> Database size is limited only by the underlying file system's maximum
> file size.
>
>> For example, should I switch to a new DB when the number of docs gets to a
>> few million? Or maybe earlier?
> It's entirely up to you and your application's needs (so-called archives).
>
>> Can I keep billions of docs per DB?
> Yes, there is no technical problem with that.
>
>> Are indexes more effective on a smaller number of docs?
> It depends on what you mean by "effective". Building a cold index over a
> billion documents will indeed be slower, much slower than doing the same
> over dozens. If you keep your indexes up to date and don't change existing
> ones very often, you may not even notice them updating. If you mean how
> effectively you can query indexes built against a billion documents, that
> is only limited by B+tree operation complexity, which is O(log n) in both
> the average and worst cases.
>
>
> --
> ,,,^..^,,,


Re: Limitation on number of items per DB

Posted by Alexander Shorin <kx...@gmail.com>.
On Tue, Jan 20, 2015 at 1:44 AM, Kiril Stankov <ki...@open-net.biz> wrote:
>
> Is there any limit (practical or by design) on the number of docs per DB?

There is no limit on the number of docs per DB.

> Or maybe on the DB size?

Database size is limited only by the underlying file system's maximum
file size.

> For example, should I switch to a new DB when the number of docs gets to a
> few million? Or maybe earlier?

It's entirely up to you and your application's needs (so-called archives).

> Can I keep billions of docs per DB?

Yes, there is no technical problem with that.

> Are indexes more effective on a smaller number of docs?

It depends on what you mean by "effective". Building a cold index over a
billion documents will indeed be slower, much slower than doing the same
over dozens. If you keep your indexes up to date and don't change existing
ones very often, you may not even notice them updating. If you mean how
effectively you can query indexes built against a billion documents, that
is only limited by B+tree operation complexity, which is O(log n) in both
the average and worst cases.
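To put that O(log n) in perspective, here is a rough sketch; the fanout of
100 keys per B+tree node is a made-up illustrative figure, not CouchDB's
actual value:

```java
public class TreeDepth {

    // Smallest number of tree levels needed to address n items
    // when every node fans out to the given number of children.
    static int depth(long n, int fanout) {
        int levels = 0;
        long capacity = 1;
        while (capacity < n) {
            capacity *= fanout;
            levels++;
        }
        return levels;
    }

    public static void main(String[] args) {
        System.out.println(depth(1_000_000L, 100));     // prints: 3
        System.out.println(depth(1_000_000_000L, 100)); // prints: 5
        // Growing from a million docs to a billion adds only about
        // two levels, i.e. about two extra page reads per lookup.
    }
}
```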


--
,,,^..^,,,

Limitation on number of items per DB

Posted by Kiril Stankov <ki...@open-net.biz>.
Hi all,

Is there any limit (practical or by design) on the number of docs per 
DB? Or maybe on the DB size?
For example, should I switch to a new DB when the number of docs gets 
to a few million? Or maybe earlier?
Can I keep billions of docs per DB?
Are indexes more effective on a smaller number of docs?

Thanks in advance!
------------------------------------------------------------------------
*With best regards,*
Kiril Stankov



On 1/17/2015 3:03 PM, Ayhan Kesenci wrote:
> Hello, I'm working on a Java program to convert a whole CouchDB database
> into a CSV file.
>
> I'd like to know the easiest way to do that. I've built up a
> connection, and I know how to get a design document into a CSV, but I need
> the whole database.
>
> Sincerely
>
> Ayhan Kesenci
>