Posted to fop-users@xmlgraphics.apache.org by Mohit Sharma <Mo...@attbi.com> on 2003/05/22 03:34:06 UTC

Big/Huge XMLs

I have big/huge XMLs that I need to convert into PDFs using FOP. Benchmarking the latest FOP gives poor results, both memory-wise and processing-wise. It's just taking too much time. The XML cannot really be broken down into chunks, as it's all part of a single report. And I need to process a lot of reports overnight, and I don't have a cluster at my disposal to distribute the load.

Is there a way to speed up the processing time?


Best,
Mohit Sharma

Re: Big/Huge XMLs

Posted by Jeremias Maerki <de...@greenmail.ch>.
In addition to Matt's comments, there is:
http://xml.apache.org/fop/running.html#memory

Also keep in mind that in Java, very high memory consumption often comes with a
decrease in speed, so reducing memory consumption may improve speed as well.
Also avoid building a DOM for the input. If you currently do that, try
switching to SAX event generation.
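
A minimal sketch of the SAX-based approach, using only the standard JAXP API: an identity transform streams the XML as SAX events straight into a ContentHandler, so no in-memory tree is ever built. The report snippet and the row-counting handler here are stand-ins for illustration; in a real FOP pipeline you would supply your XSLT stylesheet to the TransformerFactory and pass FOP's own handler (obtained from the FOP instance) to the SAXResult instead.

```java
import java.io.StringReader;
import javax.xml.transform.Source;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXResult;
import javax.xml.transform.stream.StreamSource;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class StreamToSax {
    public static void main(String[] args) throws Exception {
        // Hypothetical report snippet; a real report would come from a file stream.
        String xml = "<report><row>1</row><row>2</row><row>3</row></report>";
        Source src = new StreamSource(new StringReader(xml));

        // Count "row" start tags as the events arrive; nothing is buffered.
        final int[] rows = {0};
        DefaultHandler handler = new DefaultHandler() {
            @Override
            public void startElement(String uri, String local, String qName,
                                     Attributes atts) {
                if ("row".equals(qName)) rows[0]++;
            }
        };

        // An identity transform pushes SAX events directly into the handler.
        // With FOP you would wrap FOP's handler in the SAXResult instead.
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.transform(src, new SAXResult(handler));

        System.out.println("rows=" + rows[0]);
    }
}
```

Because the source is a StreamSource and the result is a SAXResult, memory use stays flat regardless of document size; swapping in a DOMSource would force the whole tree into memory first.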

On 22.05.2003 03:34:06 Mohit Sharma wrote:
> I have big/huge XMLs, and I need to convert them into PDFs using FOP.
> Benchmarking the latest FOP gives poor results, both memory-wise and
> processing-wise. Its just taking too much time. The XML cannot really be
> broken down into chunks, as its all part of a report. And I need to
> process a lot of reports overnight, and I don't have a cluster at my
> disposal to distribute the load.
> 
> Is there a way to speed up the processing time ?


Jeremias Maerki


---------------------------------------------------------------------
To unsubscribe, e-mail: fop-user-unsubscribe@xml.apache.org
For additional commands, e-mail: fop-user-help@xml.apache.org