Posted to fop-users@xmlgraphics.apache.org by Stefan Hinz <st...@oracle.com> on 2012/10/24 20:49:22 UTC

FOP 1.1 out of memory

I like FOP 1.1 a lot: Unlike previous versions, it tells you which page 
it's processing, which can make debugging easier, and also it gives you 
that warm fuzzy feeling that you're somewhat in control of things. :-)

However, with really big books, I'm hitting a wall, like this:

Okt 24, 2012 8:21:16 PM org.apache.fop.events.LoggingEventListener 
processEvent
INFO: Rendered page #2630.
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit 
exceeded

That is, FOP stops at about 2600 pages with an out-of-memory error. On 
Stack Overflow (http://preview.tinyurl.com/94qute5), there's an 
indication of why this happens:

"This message means that for some reason the garbage collector is taking 
an excessive amount of time (by default 98% of all CPU time of the 
process) and recovers very little memory in each run (by default 2% of 
the heap).
This effectively means that your program stops doing any progress and is 
busy running only the garbage collection at all time."

Does this mean it's a FOP 1.1 bug, or would there be anything I could do 
to give it/Java more memory and prevent it from failing?

The error happens on a 4-core machine with 8 GB of RAM. At the time FOP 
stopped, there was still about 2 GB of RAM free.

-- 
Cheers,

Stefan Hinz <st...@oracle.com>, MySQL Documentation Manager

Phone: +49-30-82702940, Fax: +49-30-82702941, http://dev.mysql.com/doc

ORACLE Deutschland B.V. & Co. KG
Registered Office: Riesstr. 25, 80992 Muenchen, Germany
Commercial Register: Local Court Of Munich, HRA 95603
Managing Director: Jürgen Kunz

General Partner: ORACLE Deutschland Verwaltung B.V.
Hertogswetering 163/167, 3543 AS Utrecht, Niederlande
Register Of Chamber Of Commerce: Midden-Niederlande, No. 30143697
Managing Directors: Alexander van der Ven, Astrid Kepper, Val Maher

---------------------------------------------------------------------
To unsubscribe, e-mail: fop-users-unsubscribe@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-help@xmlgraphics.apache.org


Re: FOP 1.1 out of memory

Posted by mehdi houshmand <me...@gmail.com>.
Hi Stefan,

Thanks for your support; people have worked hard to give you that "warm
fuzzy feeling", so it's nice to know that work is appreciated.

As for your FO, you need to provide a little more information:

1) Tables in the FO tend to consume quite a lot of memory, especially when
they span multiple pages. Are you using them to lay out the documents?

2) Fonts take up more room than they really should; are you using custom
fonts? Are you embedding fonts?

Those are the usual culprits that come to mind (though others may wish to
add to them). If you can upload the FO, someone will take a look at it.
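
If it helps while you gather that information, here is a rough, purely
illustrative way to get a first impression of how table- and font-heavy
the FO is (a quick sketch only; "book.fo" is a placeholder file name and
the counts are approximate):

# Quick-and-dirty checks, run from a shell; "book.fo" is a placeholder.
grep -c '<fo:table[ >]' book.fo                            # rough count of fo:table elements
grep -o 'font-family="[^"]*"' book.fo | sort | uniq -c     # which font families are referenced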

Hope that helps,

Mehdi

On 24 October 2012 19:49, Stefan Hinz <st...@oracle.com> wrote:

> I like FOP 1.1 a lot: Unlike previous versions, it tells you which page
> it's processing, which can make debugging easier, and also it gives you
> that warm fuzzy feeling that you're somewhat in control of things. :-)
>
> However, with really big books, I'm hitting a wall, like this:
>
> Okt 24, 2012 8:21:16 PM org.apache.fop.events.LoggingEventListener
> processEvent
> INFO: Rendered page #2630.
> Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit
> exceeded
>
> That is, FOP stops at about 2600 pages with an out of memory error. On
> Stackoverflow (http://preview.tinyurl.com/94qute5),
> there's an indication why this happens:
>
> "This message means that for some reason the garbage collector is taking
> an excessive amount of time (by default 98% of all CPU time of the process)
> and recovers very little memory in each run (by default 2% of the heap).
> This effectively means that your program stops doing any progress and is
> busy running only the garbage collection at all time."
>
> Does this mean it's a FOP 1.1 bug, or would there be anything I could do
> to give it/Java more memory and prevent it from failing?
>
> The error happens on an 8 GB RAM 4-core machine. At the time FOP ended,
> there was like 2 GB RAM left.
>
> --
> Cheers,
>
> Stefan Hinz <st...@oracle.com>, MySQL Documentation Manager
>
> Phone: +49-30-82702940, Fax: +49-30-82702941, http://dev.mysql.com/doc
>
> ORACLE Deutschland B.V. & Co. KG
> Registered Office: Riesstr. 25, 80992 Muenchen, Germany
> Commercial Register: Local Court Of Munich, HRA 95603
> Managing Director: Jürgen Kunz
>
> General Partner: ORACLE Deutschland Verwaltung B.V.
> Hertogswetering 163/167, 3543 AS Utrecht, Niederlande
> Register Of Chamber Of Commerce: Midden-Niederlande, No. 30143697
> Managing Directors: Alexander van der Ven, Astrid Kepper, Val Maher
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: fop-users-unsubscribe@xmlgraphics.apache.org
> For additional commands, e-mail: fop-users-help@xmlgraphics.apache.org
>
>

Re: FOP 1.1 out of memory

Posted by Stefan Hinz <st...@oracle.com>.
Thanks for confirming, Glenn!

On 26.10.2012 10:59, Glenn Adams wrote:
>
> On Thu, Oct 25, 2012 at 6:11 PM, Stefan Hinz <stefan.hinz@oracle.com> wrote:
>
>     Thanks, Glenn, and thanks, Luis!
>
>     A combination of -nocs and -Xmx4096m made the build succeed. The
>     resulting PDF is 5742 pages, which might explain the issues I had.
>     My fop wrapper script looks like this:
>
>     #!/bin/sh
>     export FOP_OPTS=-Xmx4096m
>     exec /usr/bin/fop -nocs "$@"
>
>     BTW, is the -nocs option new to FOP 1.1? It doesn't seem to work
>     with FOP 1.0.
>
>
> yes, complex script (CS) support is new in 1.1
>


-- 
Cheers,

Stefan Hinz <st...@oracle.com>, MySQL Documentation Manager

Phone: +49-30-82702940, Fax: +49-30-82702941, http://dev.mysql.com/doc

ORACLE Deutschland B.V. & Co. KG
Registered Office: Riesstr. 25, 80992 Muenchen, Germany
Commercial Register: Local Court Of Munich, HRA 95603
Managing Director: Jürgen Kunz

General Partner: ORACLE Deutschland Verwaltung B.V.
Hertogswetering 163/167, 3543 AS Utrecht, Niederlande
Register Of Chamber Of Commerce: Midden-Niederlande, No. 30143697
Managing Directors: Alexander van der Ven, Astrid Kepper, Val Maher

---------------------------------------------------------------------
To unsubscribe, e-mail: fop-users-unsubscribe@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-help@xmlgraphics.apache.org


Re: FOP 1.1 out of memory

Posted by Glenn Adams <gl...@skynav.com>.
On Thu, Oct 25, 2012 at 6:11 PM, Stefan Hinz <st...@oracle.com> wrote:

>  Thanks, Glenn, and thanks, Luis!
>
> A combination of -nocs and -Xmx4096m made the build succeed. The resulting
> PDF is 5742 pages, which might explain the issues I had. My fop wrapper
> script looks like this:
>
> #!/bin/sh
> export FOP_OPTS=-Xmx4096m
> exec /usr/bin/fop -nocs "$@"
>
> BTW, is the -nocs option new to FOP 1.1? It doesn't seem to work with FOP
> 1.0.
>

yes, complex script (CS) support is new in 1.1

Re: FOP 1.1 out of memory

Posted by Glenn Adams <gl...@skynav.com>.
You might also try disabling complex script support (see [1]) if you don't
require the complex text path. Somewhat more memory may be consumed when CS
is enabled, which it is by default in 1.1.

[1]
http://xmlgraphics.apache.org/fop/1.1/complexscripts.html#Disabling+complex+scripts
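
For concreteness, a minimal sketch of the command-line route, assuming the
stock fop shell script and placeholder file names (see [1] for the full
details on disabling complex scripts):

#!/bin/sh
# Sketch only: render with complex script support turned off via -nocs.
# book.fo and book.pdf are placeholder names.
exec /usr/bin/fop -nocs -fo book.fo -pdf book.pdf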

On Thu, Oct 25, 2012 at 6:03 AM, Luis Bernardo <lm...@gmail.com> wrote:

>
> You can control how much memory the jvm uses by using the -Xmx flags. So I
> think you can try that first.
>
> The only situation where I know that FOP runs out of memory (also in a
> machine with 8 GB) is when you have a very long paragraph (and I mean a
> paragraph with 200K+ words). Then the line breaking algorithm has to hold
> the full paragraph in memory and decide on the optimal break points and
> that very likely will use all available memory. Do you know if you have
> anything like that in your book? Having thousands of pages by itself should
> not be an issue. It really depends on the content and on how the line
> breaks happen (if you insert line breaks FOP uses a lot less memory than if
> you don't).
>
>
> On 10/24/12 7:49 PM, Stefan Hinz wrote:
>
>> I like FOP 1.1 a lot: Unlike previous versions, it tells you which page
>> it's processing, which can make debugging easier, and also it gives you
>> that warm fuzzy feeling that you're somewhat in control of things. :-)
>>
>> However, with really big books, I'm hitting a wall, like this:
>>
>> Okt 24, 2012 8:21:16 PM org.apache.fop.events.LoggingEventListener
>> processEvent
>> INFO: Rendered page #2630.
>> Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit
>> exceeded
>>
>> That is, FOP stops at about 2600 pages with an out of memory error. On
>> Stackoverflow (http://preview.tinyurl.com/94qute5),
>> there's an indication why this happens:
>>
>> "This message means that for some reason the garbage collector is taking
>> an excessive amount of time (by default 98% of all CPU time of the process)
>> and recovers very little memory in each run (by default 2% of the heap).
>> This effectively means that your program stops doing any progress and is
>> busy running only the garbage collection at all time."
>>
>> Does this mean it's a FOP 1.1 bug, or would there be anything I could do
>> to give it/Java more memory and prevent it from failing?
>>
>> The error happens on an 8 GB RAM 4-core machine. At the time FOP ended,
>> there was like 2 GB RAM left.
>>
>>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: fop-users-unsubscribe@xmlgraphics.apache.org
> For additional commands, e-mail: fop-users-help@xmlgraphics.apache.org
>
>

Re: FOP 1.1 out of memory

Posted by Luis Bernardo <lm...@gmail.com>.
You can control how much memory the JVM uses with the -Xmx flag, so I 
think you should try that first.

The only situation where I know that FOP runs out of memory (also on a 
machine with 8 GB) is when you have a very long paragraph (and I mean a 
paragraph with 200K+ words). Then the line-breaking algorithm has to 
hold the full paragraph in memory while deciding on the optimal break 
points, and that will very likely use all available memory. Do you know 
if you have anything like that in your book? Having thousands of pages 
by itself should not be an issue; it really depends on the content and 
on how the line breaks happen (if you insert line breaks yourself, FOP 
uses a lot less memory than if you don't).
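
To make the first suggestion concrete, here is a minimal sketch of raising
the heap limit, assuming the stock fop shell script (which passes FOP_OPTS
to the JVM, as shown elsewhere in this thread) and placeholder file names:

#!/bin/sh
# Sketch only: give the JVM a 4 GB heap before rendering.
# The right value depends on the document and the machine.
export FOP_OPTS="-Xmx4096m"
exec /usr/bin/fop -fo book.fo -pdf book.pdf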

On 10/24/12 7:49 PM, Stefan Hinz wrote:
> I like FOP 1.1 a lot: Unlike previous versions, it tells you which 
> page it's processing, which can make debugging easier, and also it 
> gives you that warm fuzzy feeling that you're somewhat in control of 
> things. :-)
>
> However, with really big books, I'm hitting a wall, like this:
>
> Okt 24, 2012 8:21:16 PM org.apache.fop.events.LoggingEventListener 
> processEvent
> INFO: Rendered page #2630.
> Exception in thread "main" java.lang.OutOfMemoryError: GC overhead 
> limit exceeded
>
> That is, FOP stops at about 2600 pages with an out of memory error. On 
> Stackoverflow (http://preview.tinyurl.com/94qute5), there's an 
> indication why this happens:
>
> "This message means that for some reason the garbage collector is 
> taking an excessive amount of time (by default 98% of all CPU time of 
> the process) and recovers very little memory in each run (by default 
> 2% of the heap).
> This effectively means that your program stops doing any progress and 
> is busy running only the garbage collection at all time."
>
> Does this mean it's a FOP 1.1 bug, or would there be anything I could 
> do to give it/Java more memory and prevent it from failing?
>
> The error happens on an 8 GB RAM 4-core machine. At the time FOP 
> ended, there was like 2 GB RAM left.
>


---------------------------------------------------------------------
To unsubscribe, e-mail: fop-users-unsubscribe@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-help@xmlgraphics.apache.org