Posted to fop-users@xmlgraphics.apache.org by Arran Price <ar...@datamail.co.nz> on 2002/09/11 02:02:36 UTC

memory or maximum input filesizes

Hi all,

I'm currently trying to do some bulk rendering and am getting
OutOfMemoryErrors at the same point (in multiple tests) in processing.
I'm using fop-0.20.4 and have tried, and experienced the same problem on, a
Mandrake Intel platform and a Solaris Sun platform, both having 512 MB RAM.
I can process an input file of 130 documents/pages, but fail at the 139th
doc/page when trying to use an input of 140.
The input file size for 130 documents is 2381 KB and for 140 it is 2564 KB;
my XSL is roughly 500 KB.
I get the same results for PostScript and PDF output.

My questions:
Is this a known bug (I couldn't find one in Bugzilla)?
Is there simply a limitation on input file size or similar?
If it is a limitation, am I correct in assuming that adding memory would
increase the amount I can process? (Or is there something else I need to
factor in as well?)
Is there a workaround that I'm not aware of?

It would appear that FOP tries to do all its rendering in memory and then
writes to disk at completion. I couldn't find an option to make it write
each page (or similar) to disk as it goes, so that the memory is freed up.

Any other advice gratefully appreciated.


thanks 

Arran

The information contained in this mail message is confidential and may also
be legally privileged. If you are not the intended recipient, please note
that any use, dissemination, further distribution, or reproduction of this
message in any form whatsoever is strictly prohibited. If you have received
this mail in error, please notify me by return e-mail, delete your copy of
the message, and accept my apologies for any inconvenience caused.


RE: memory or maximum input filesizes

Posted by Roland Neilands <rn...@pulsemining.com.au>.
Arran,

> I'm currently trying to do some bulk rendering and am getting
> OutOfMemoryErrors at the same point (in multiple tests) in processing.
> I'm using fop-0.20.4 and have tried, and experienced the same problem
> on, a Mandrake Intel platform and a Solaris Sun platform, both having
> 512 MB RAM.
> I can process an input file of 130 documents/pages, but fail at the
> 139th doc/page when trying to use an input of 140.
> The input file size for 130 documents is 2381 KB and for 140 it is
> 2564 KB; my XSL is roughly 500 KB.
> I get the same results for PostScript and PDF output.
>
> My questions:
> Is this a known bug (I couldn't find one in Bugzilla)?
> Is there simply a limitation on input file size or similar?
> If it is a limitation, am I correct in assuming that adding memory
> would increase the amount I can process? (Or is there something else
> I need to factor in as well?)
> Is there a workaround that I'm not aware of?
>
> It would appear that FOP tries to do all its rendering in memory and
> then writes to disk at completion. I couldn't find an option to make
> it write each page (or similar) to disk as it goes, so that the memory
> is freed up.

There are several mentions of this kind of thing throughout the list
archive; search for "out of memory" or "performance":
http://marc.theaimsgroup.com/?l=fop-user&r=1&w=2#

- Try the Java option -Xmx to allocate more memory to the JVM.
- Using "page x of y" or similar forward references forces the entire
document to be held in memory; avoid them if possible.
- Document structure: starting new page sequences more often may also help,
since FOP can release pages from memory at the end of each page sequence.
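For the first point, the heap ceiling is set on the java command line. A
minimal sketch, assuming a typical FOP 0.20.x command-line setup (the jar
names, classpath, and file names below are illustrative and will vary with
your install):

```shell
# Raise the maximum JVM heap to 256 MB (the default is often much lower).
# Classpath entries and input/output file names are assumptions, not exact.
java -Xmx256m \
     -cp build/fop.jar:lib/batik.jar:lib/xalan.jar:lib/xerces.jar \
     org.apache.fop.apps.Fop -xml input.xml -xsl style.xsl -pdf output.pdf
```

If you run FOP via the bundled fop.sh or fop.bat wrapper instead, you can
add the -Xmx option to the java invocation inside that script.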

Regards,
Roland