Posted to fop-dev@xmlgraphics.apache.org by Eric Douglas <ed...@blockhouse.com> on 2011/05/26 16:03:22 UTC

Fop Memory Leak

I could use some help tracking this down.  I created a Print Preview
program but it just runs out of memory after a dozen or two pages.

I started with a program which generates XML data, transforms it into FO
data, then transforms that.
There are 2 transforms in it, one to create a PDF and one for Print
Preview.
When I tried to create a large FO and generate a PDF of 1000+ pages it
crashed, out of memory.
I changed it to break up the output.  It now stores an array of FO files
of no more than 10 pages each, using XSL-FO's initial-page-number to
record a starting point for each document fragment.
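
One way to feed that starting point into the stylesheet is an XSLT
parameter that the stylesheet copies onto
fo:page-sequence/@initial-page-number, roughly like this (a sketch only;
the parameter and variable names are placeholders, not the actual
program):

    // Hypothetical sketch: pass each chunk's starting page number to the
    // stylesheet, which is assumed to copy it onto
    // fo:page-sequence/@initial-page-number.
    // (Transformer, StreamSource, StreamResult are from javax.xml.transform.*)
    Transformer t = TransformerFactory.newInstance()
            .newTransformer(new StreamSource(xslFile));
    t.setParameter("start-page", Integer.valueOf(firstPageOfChunk));
    t.transform(new StreamSource(xmlChunkFile), new StreamResult(foChunkFile));
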
To create a PDF I call the transformer for each FO and capture the PDF
from the FOP output stream as a byte array.  Then I load those into PDF
objects using the PDFBox project, create a new PDF, and copy in the
pages to merge them.
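
Roughly, the per-chunk render and merge look like this (a simplified
sketch; the class, method and variable names are placeholders, and the
merge is shown with PDFBox's PDFMergerUtility, whose package is
org.apache.pdfbox.util in PDFBox 1.x, rather than the page-copy code
described above):

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.File;
    import java.io.OutputStream;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.sax.SAXResult;
    import javax.xml.transform.stream.StreamSource;
    import org.apache.fop.apps.Fop;
    import org.apache.fop.apps.FopFactory;
    import org.apache.fop.apps.MimeConstants;
    import org.apache.pdfbox.util.PDFMergerUtility;

    public class PdfChunkMerger {
        private static final FopFactory FOP_FACTORY = FopFactory.newInstance();
        private static final TransformerFactory TF = TransformerFactory.newInstance();

        /** Renders one FO chunk (10 pages or fewer) to PDF bytes. */
        static byte[] renderPdfChunk(File foChunk) throws Exception {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            Fop fop = FOP_FACTORY.newFop(MimeConstants.MIME_PDF, out);
            // Identity transform: the FO is already generated, just feed it to FOP.
            TF.newTransformer().transform(new StreamSource(foChunk),
                    new SAXResult(fop.getDefaultHandler()));
            return out.toByteArray();
        }

        /** Merges the per-chunk PDFs into one document. */
        static void merge(Iterable<byte[]> pdfChunks, OutputStream mergedOut) throws Exception {
            PDFMergerUtility merger = new PDFMergerUtility();
            for (byte[] pdf : pdfChunks) {
                merger.addSource(new ByteArrayInputStream(pdf));
            }
            merger.setDestinationStream(mergedOut);
            merger.mergeDocuments();
        }
    }
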
This works great for PDF.  I was able to generate a report this way of
over 1800 pages.
Now I try it for print preview, generating 10 pages at a time, calling
the render process again when a new size or a page from a different
block is requested.

The memory use never drops between these calls, but in 2 places I see it
jump: whenever Transformer.transform is called with a PNGRenderer, and
whenever I create a new image for a page, even though I'm only
generating 10 pages and on each render I replace the previous array of
images.
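
For reference, the preview render is wired up roughly like this (a
simplified sketch assuming FOP 1.0's renderer-override API; the class,
method and variable names are placeholders):

    import java.io.File;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.sax.SAXResult;
    import javax.xml.transform.stream.StreamSource;
    import org.apache.fop.apps.FOUserAgent;
    import org.apache.fop.apps.Fop;
    import org.apache.fop.apps.FopFactory;
    import org.apache.fop.render.bitmap.PNGRenderer;

    public class PreviewChunkRenderer {
        private static final FopFactory FOP_FACTORY = FopFactory.newInstance();
        private static final TransformerFactory TF = TransformerFactory.newInstance();

        /** Renders one 10-page FO chunk and returns the renderer holding its pages. */
        static PNGRenderer renderPreviewChunk(File foChunk) throws Exception {
            FOUserAgent userAgent = FOP_FACTORY.newFOUserAgent();
            PNGRenderer renderer = new PNGRenderer();
            renderer.setUserAgent(userAgent);
            userAgent.setRendererOverride(renderer);  // keep a handle to the renderer
            Fop fop = FOP_FACTORY.newFop(userAgent);
            TF.newTransformer().transform(new StreamSource(foChunk),
                    new SAXResult(fop.getDefaultHandler()));
            return renderer;  // page images then come from renderer.getPageImage(pageNum)
        }
    }
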
If I do something as simple as this, the memory sometimes jumps up on
the create:
myBufferedImage = myImageArray.get(pageNum);
myBufferedImage.flush();
myBufferedImage = new BufferedImage(sameWidthAsOldImage, sameHeightAsOldImage,
        BufferedImage.TYPE_INT_RGB);  // an image type is required as the third argument
It does this if I call PNGRenderer.getPageImage(pageNum) (inherited from
Java2DRenderer), so I tried creating the BufferedImage (or VolatileImage,
same issue) myself and using the Java2DRenderer.print method to draw the
page onto it.  Memory use keeps increasing on the image create and on
the transform.

The transform is all done in one method, with only the FopFactory and
TransformerFactory surviving after the method (they are static final on
the class), so a second call shouldn't use any more memory than the
first.
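
That print-based attempt looks roughly like this (a sketch; renderer,
pageFormat, widthPx, heightPx and pageNum are placeholders, and
Java2DRenderer implements java.awt.print.Printable):

    // Sketch: draw page pageNum onto a caller-owned image through the
    // renderer's Printable interface (print may throw PrinterException).
    BufferedImage img = new BufferedImage(widthPx, heightPx, BufferedImage.TYPE_INT_RGB);
    Graphics2D g = img.createGraphics();
    int status = renderer.print(g, pageFormat, pageNum);  // Printable.PAGE_EXISTS on success
    g.dispose();  // release the Graphics2D; only the image itself should stay referenced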

RE: Fop Memory Leak

Posted by Eric Douglas <ed...@blockhouse.com>.
That would defeat the purpose.  It appears stopRenderer() gets called
inside the transform method, and after the transform I still need the
viewport values to generate page images.

It turns out the transform only increases memory use temporarily and
releases it when done.  It looked like my heap kept growing on
subsequent calls only because I was holding additional memory between
calls, so each call started closer to the limit.
The additional memory was in the class I use to call the class doing
the rendering.  It keeps a copy of each page image in an array used to
display them in a GUI window, and I wasn't clearing that array, so it
was holding the old references.
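
The fix amounts to something like this before each re-render
(myImageArray is the same name as in the earlier snippet;
newlyRenderedPages is a placeholder):

    // Drop the previous page images before storing the new render, so the old
    // BufferedImages become unreachable and can be garbage-collected.
    myImageArray.clear();
    myImageArray.addAll(newlyRenderedPages);
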
My remaining memory problem is in the proprietary API I'm using (an
extension of Oracle's Web Start) where I do the rendering on a server
and then serialize the result to a client for display.  That render
still hits a memory limit, and I'm not sure why.

RE: Fop Memory Leak

Posted by "Marquart, Joshua D" <jo...@firstdata.com>.
Eric-

On the PNGRenderer code, the stopRenderer() method does not perform the
cleanup that TIFFRenderer (which also extends Java2DRenderer) does.

At the end of TIFFRenderer's stopRenderer() method,

        clearViewportList();

is called.

Can you try extending PNGRenderer with a custom renderer that overrides
the stopRenderer() method with

public void stopRenderer() throws IOException {
    super.stopRenderer();
    clearViewportList();
}

and see if that helps?

I don't see it in trunk, and this is 100% guesswork on my part, and it
could completely screw up the PNGs created, but it might work.  :)

-Josh
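
A self-contained version of that suggestion might look like this (a
sketch, assuming FOP 1.0's org.apache.fop.render.bitmap.PNGRenderer and
that clearViewportList(), inherited from Java2DRenderer, is accessible
to subclasses; the class name is made up):

    import java.io.IOException;
    import org.apache.fop.render.bitmap.PNGRenderer;

    /** PNGRenderer variant that clears the cached page viewports when rendering stops. */
    public class ClearingPNGRenderer extends PNGRenderer {
        @Override
        public void stopRenderer() throws IOException {
            super.stopRenderer();
            clearViewportList();  // same cleanup TIFFRenderer performs
        }
    }

    // It would be plugged in through the renderer override, e.g.
    // userAgent.setRendererOverride(new ClearingPNGRenderer());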
