Posted to fop-dev@xmlgraphics.apache.org by "Amit (Jira)" <ji...@apache.org> on 2022/08/01 15:13:00 UTC

[jira] [Commented] (FOP-2860) BreakingAlgorithm causes high memory consumption

    [ https://issues.apache.org/jira/browse/FOP-2860?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17573805#comment-17573805 ] 

Amit commented on FOP-2860:
---------------------------

[~ssteiner] or anyone else: although this problem is rare and could probably be avoided by using smaller paragraphs, it is something that should be optimised. Processing large paragraphs of 10-40 MB sometimes takes 3-8 GB of memory, and the same happens with large tables, primarily due to the Knuth nodes. Is there some optimisation that could be applied to this page-breaking algorithm?
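
A rough illustration of the workaround mentioned above (a sketch, not FOP code): since the heap dump in the description is dominated by {{BreakingAlgorithm.KnuthNode}} instances, which the algorithm keeps for the feasible break points of a paragraph, splitting one huge paragraph into several smaller {{<fo:block>}} elements before handing the FO to FOP bounds how many characters each breaking run has to consider. The class name and chunk size below are invented for illustration:

{code:java}
import java.util.ArrayList;
import java.util.List;

/** Hypothetical pre-processing step, not part of FOP. */
public class BlockSplitter {

    // Characters per fo:block; an arbitrary illustrative value.
    private static final int CHUNK_SIZE = 100_000;

    /**
     * Splits one huge paragraph into several smaller fo:block elements.
     * The input is assumed to be already XML-escaped text.
     */
    public static List<String> splitIntoBlocks(String text) {
        List<String> blocks = new ArrayList<>();
        int start = 0;
        while (start < text.length()) {
            int end = Math.min(start + CHUNK_SIZE, text.length());
            // Prefer to break at a space so words are not cut in half;
            // if a chunk contains no space at all, it is split hard.
            if (end < text.length()) {
                int lastSpace = text.lastIndexOf(' ', end);
                if (lastSpace > start) {
                    end = lastSpace;
                }
            }
            blocks.add("<fo:block>" + text.substring(start, end).trim() + "</fo:block>");
            start = end;
        }
        return blocks;
    }
}
{code}

Note this only works around the problem: each {{<fo:block>}} starts on a new line, so line breaking near the chunk boundaries will differ from a single paragraph, and the algorithm itself still scales badly with block size.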

> BreakingAlgorithm causes high memory consumption
> ------------------------------------------------
>
>                 Key: FOP-2860
>                 URL: https://issues.apache.org/jira/browse/FOP-2860
>             Project: FOP
>          Issue Type: Bug
>    Affects Versions: 2.3
>            Reporter: Raman Katsora
>            Priority: Critical
>         Attachments: image-2019-04-16-10-07-53-502.png, test-1500000.fo, test-250000.fo, test-300000.fo
>
>
> When a single element (e.g. {{<fo:block>}}) contains a sufficiently large amount of text, the fo-to-pdf transformation causes very high memory consumption.
> For instance, transforming a document with an {{<fo:block>}} containing 1.5 million characters (~1.5 MB, [^test-1500000.fo]) requires about 3 GB of RAM.
> The heap dump shows 27.5 million {{org.apache.fop.layoutmgr.BreakingAlgorithm.KnuthNode}} instances (~2.6 GB).
> We start observing this issue at about 300 thousand characters in a single element ([^test-300000.fo]), but the high memory consumption is not observed when processing 250 thousand characters ([^test-250000.fo]).
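
For anyone wanting to reproduce this without the attachments, here is a minimal sketch that generates a comparable test file: a single {{<fo:block>}} containing roughly the given number of characters. The page-master dimensions are placeholders and are not taken from the attached documents:

{code:java}
import java.io.IOException;
import java.io.Writer;
import java.nio.file.Files;
import java.nio.file.Paths;

/** Generates an FO file with one huge fo:block, similar in size to test-1500000.fo. */
public class GenerateTestFo {
    public static void main(String[] args) throws IOException {
        int chars = 1_500_000; // target amount of text in the single block
        try (Writer w = Files.newBufferedWriter(Paths.get("test-" + chars + ".fo"))) {
            w.write("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n");
            w.write("<fo:root xmlns:fo=\"http://www.w3.org/1999/XSL/Format\">\n");
            w.write("  <fo:layout-master-set>\n");
            w.write("    <fo:simple-page-master master-name=\"page\" "
                    + "page-height=\"29.7cm\" page-width=\"21cm\" margin=\"2cm\">\n");
            w.write("      <fo:region-body/>\n");
            w.write("    </fo:simple-page-master>\n");
            w.write("  </fo:layout-master-set>\n");
            w.write("  <fo:page-sequence master-reference=\"page\">\n");
            w.write("    <fo:flow flow-name=\"xsl-region-body\">\n");
            w.write("      <fo:block>");
            // One huge paragraph: repeat a short word until the target size is reached.
            for (int written = 0; written < chars; written += 5) {
                w.write("word ");
            }
            w.write("</fo:block>\n");
            w.write("    </fo:flow>\n");
            w.write("  </fo:page-sequence>\n");
            w.write("</fo:root>\n");
        }
    }
}
{code}

Transforming the generated file to PDF with a constrained heap should show the pattern described above: unproblematic around 250 thousand characters, very high memory use well before 1.5 million.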



--
This message was sent by Atlassian Jira
(v8.20.10#820010)