Posted to dev@jackrabbit.apache.org by vijay makija <vi...@yahoo.com> on 2008/01/23 06:48:24 UTC

Maximum amount of data that can be handled efficiently

Hi,

What is maximum amount of data that can be handled
efficiently by Apache Jackrabbit?

Regards,
Vijay Makhija




RE: Maximum amount of data that can be handled efficiently

Posted by Ard Schrijvers <a....@hippo.nl>.
Hello Vijay,

Please continue this thread on the user list, because this is clearly not a
dev-list question.

Also, I hope you realize your question is rather vague: are you asking how
large a single binary data blob can be, or how many nodes Jackrabbit can
handle? Either way, it depends on your hardware, clustering, the data
modelling of the repository, the queries you run, which persistence manager
you use, the response times you expect, the number of queries per second,
and so on.

Perhaps you can provide us (on the user list) with some information about
the kind of data and model you have in mind, and we may be able to hand
you some pointers.

At the moment, all I can say is that if you model your content correctly
[1] and know which kinds of queries might be heavy [2], you should be able
to have 'a lot' of data handled efficiently. I know 'a lot' is pretty
vague, but it is the most sensible answer I can give based on the
information you have provided.
Regards Ard

[1] http://wiki.apache.org/jackrabbit/DavidsModel
[2] http://mail-archives.apache.org/mod_mbox/jackrabbit-users/200801.mbox/%3cF8E386B54CE3E6408F3A32ABB9A7908A7E69CC@hai02.hippointern.lan%3e
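As an aside on the content-modelling advice in [1]: David's Model warns against flat hierarchies with very many children under a single node, and a common workaround in Jackrabbit deployments is to bucket a flat collection into a shallow tree keyed on name prefixes. The sketch below is an illustration of that bucketing idea only, not Jackrabbit API; the class and path scheme are made up for the example.

```java
// Illustration only (not Jackrabbit API): bucket a flat list of node
// names into a shallow tree so no single node has thousands of children.
public class ContentPaths {

    // Maps a node name to a bucketed repository path, e.g.
    // "myarticle" becomes "/content/m/my/myarticle". The two prefix
    // levels keep each parent's child count small even for large sets.
    public static String bucketPath(String name) {
        String first = name.substring(0, 1);
        String second = name.substring(0, Math.min(2, name.length()));
        return "/content/" + first + "/" + second + "/" + name;
    }

    public static void main(String[] args) {
        System.out.println(bucketPath("myarticle")); // /content/m/my/myarticle
    }
}
```

With single-character buckets on the first two levels, a million alphanumeric names spread over roughly 36 x 36 parents, which keeps per-node child counts in the hundreds rather than the millions.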

> 
> Hi,
> 
> What is maximum amount of data that can be handled 
> efficiently by Apache Jackrabbit?
> 
> Regards,
> Vijay Makhija
> 
> 
