Posted to users@jackrabbit.apache.org by Charles Brooking <pu...@charlie.brooking.id.au> on 2008/11/09 17:00:37 UTC

WebDAV with large files

Hi all

I've installed Jackrabbit using jackrabbit-webapp-1.4.war and have been 
testing the 'repository/default/' WebDAV server. In particular, I 
decided to try uploading several large files.

What I found was that when transferring ten files, each of about 10MiB, 
memory consumption jumped by 100MiB. This suggests that transfer of 
large files or large numbers of files is impossible! Is there a way to 
save 'blob' files to disk in a streaming way?
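For illustration, the kind of constant-memory transfer I'm hoping for looks like this (a generic Java sketch of a fixed-buffer copy loop, not Jackrabbit's actual import code; the class and method names are my own):

```java
import java.io.BufferedOutputStream;
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamToDisk {
    // Copy an input stream to a file through a small fixed-size buffer,
    // so memory use stays constant regardless of how large the file is.
    public static long copy(InputStream in, File target) throws IOException {
        OutputStream out = new BufferedOutputStream(new FileOutputStream(target));
        try {
            byte[] buf = new byte[8192]; // only 8 KiB held in memory at a time
            long total = 0;
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
                total += n;
            }
            return total;
        } finally {
            out.close();
        }
    }
}
```

With a loop like this the server's heap usage should be independent of upload size, which is what I'd expect from a WebDAV PUT handler that writes blobs straight to disk.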

Thanks
Charlie

Re: WebDAV with large files

Posted by Charles Brooking <pu...@charlie.brooking.id.au>.
Julian Reschke wrote:
>> What I found was that when transferring ten files, each of about 10MiB,
>> memory consumption jumped by 100MiB. This suggests that transfer of
>> large files or large numbers of files is impossible! Is there a way to
>> save 'blob' files to disk in a streaming way?
>
> Well, it really doesn't suggest that (yet) -- there's no guarantee that
> garbage is collected until more memory is needed (at least that's my
> understanding).

Good point. They were very unscientific tests, but I thought it wouldn't
hurt to bounce them off the list. As it turns out, I have done some more
testing today and have been unable to reproduce the problem. (Maybe it was
too late at night and I was seeing things.)

I have just now transferred several files, each over 1GiB, and the Tomcat
java process stayed at around 60MiB resident memory (plus 370MiB virtual).
This was with CATALINA_OPTS='-Xmx128m'. The operating system used almost
all of the remaining memory as cache, but Tomcat itself was completely
steady.

I was careful before to disable all IO handlers, and I can't think of any
other reason memory would have blown out, but I'll keep an eye on things.
Hopefully this excellent performance will continue!

Later
Charlie


Re: WebDAV with large files

Posted by Julian Reschke <ju...@gmx.de>.
Charles Brooking wrote:
> Hi all
> 
> I've installed Jackrabbit using jackrabbit-webapp-1.4.war and have been 
> testing the 'repository/default/' WebDAV server. In particular, I 
> decided to try uploading several large files.
> 
> What I found was that when transferring ten files, each of about 10MiB, 
> memory consumption jumped by 100MiB. This suggests that transfer of 
> large files or large numbers of files is impossible! Is there a way to 
> save 'blob' files to disk in a streaming way?

Well, it really doesn't suggest that (yet) -- there's no guarantee that 
garbage is collected until more memory is needed (at least that's my 
understanding).
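A quick way to see this (a standalone sketch, nothing to do with Jackrabbit itself): heap statistics reported by the JVM only drop once a collection actually runs, so a freed 10 MiB buffer can still show up as "used" memory until then.

```java
public class GcDemo {
    // Heap currently in use, as reported by the JVM.
    static long usedHeap() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        byte[] blob = new byte[10 * 1024 * 1024]; // simulate a 10 MiB upload buffer
        long withBlob = usedHeap();               // includes the buffer
        blob = null;                              // now garbage, but not yet collected
        System.gc();                              // hint the JVM to collect
        long afterGc = usedHeap();                // typically much lower only now
        System.out.println("heap used with buffer live: " + withBlob + " bytes");
        System.out.println("heap used after GC hint:    " + afterGc + " bytes");
    }
}
```

So a jump in reported memory right after a transfer doesn't by itself prove the data is being held; it may simply not have been collected yet.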

So, did you *try* uploading larger files (such as in: larger than 
available memory)?

BR, Julian