Posted to j-users@xalan.apache.org by Henry Zongaro <zo...@ca.ibm.com> on 2005/05/02 20:11:09 UTC
Fw: Help using the Xalan mailing lists
[Forwarding for Higino Silva, who is having difficulty posting to
xalan-j-users.]
Hi, Higino.
I don't see anything wrong with your program. I do not believe that
this is a known problem.
Are you able to reproduce the failure if you do not use the
incremental processing feature of Xalan-j? Are you able to reproduce it
using the org.apache.xalan.xslt.Process command? 200000 lines doesn't
sound like it should consume 1.34 GB of memory, but it's hard to say for
certain without knowing how many elements, attributes and character data
are packed into those 200000 lines.
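[Editor's note: the suggestion above, to rule out incremental processing, can be tried by setting the Xalan feature attribute to FALSE before creating the Transformer. A minimal sketch follows; the attribute URI is Xalan-specific, so non-Xalan TrAX factories (for example the JDK's built-in XSLTC) will reject it, which the sketch tolerates.]

```java
import javax.xml.transform.TransformerFactory;

public class DisableIncremental {
    // Try to turn off Xalan's incremental processing.  Returns true if the
    // factory accepted the attribute, false if this is not a Xalan factory
    // (TrAX factories throw IllegalArgumentException for unknown attributes).
    static boolean disableIncremental(TransformerFactory tFactory) {
        try {
            tFactory.setAttribute(
                "http://xml.apache.org/xalan/features/incremental",
                Boolean.FALSE);
            return true;
        } catch (IllegalArgumentException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        boolean ok = disableIncremental(TransformerFactory.newInstance());
        System.out.println(ok
            ? "incremental processing disabled"
            : "factory does not support the Xalan incremental attribute");
    }
}
```

[The other suggested test, running the transformation from the command line with `java org.apache.xalan.xslt.Process -IN input.xml -XSL style.xsl -OUT output.xml`, uses the flags documented for the Xalan command-line utility; the file names here are placeholders.]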
Thanks,
Henry
------------------------------------------------------------------
Henry Zongaro Xalan development
IBM SWS Toronto Lab T/L 969-6044; Phone +1 905 413-6044
mailto:zongaro@ca.ibm.com
----- Forwarded by Henry Zongaro/Toronto/IBM on 2005-05-02 01:39 PM -----
Higino Silva, Engº <hs...@tmn.pt>
2005-05-02 11:59 AM
To
Henry Zongaro/Toronto/IBM@IBMCA
cc
Subject
Help using the Xalan mailing lists
Hi,
I'm using Xalan to convert big XML files. The following is a simplified
version of the code I use:
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import org.xml.sax.InputSource;
import org.xml.sax.XMLReader;

        // Send output to stdout if no output file was given.
        StreamResult strResult;
        if (conv.outFileName == null) {
            strResult = new StreamResult(System.out);
        } else {
            strResult = new StreamResult(new File(conv.outFileName));
            strResult.setSystemId(conv.outFileName);
        }

        // Use the TransformerFactory to instantiate a Transformer that will
        // work with the stylesheet you specify.  This method call also
        // processes the stylesheet into a compiled Templates object.
        TransformerFactory tFactory = TransformerFactory.newInstance();
        tFactory.setAttribute(
            "http://xml.apache.org/xalan/features/incremental",
            Boolean.TRUE);
        Transformer transformer =
            tFactory.newTransformer(new StreamSource(xslFileName));

        BufferedReader br =
            new BufferedReader(new FileReader(conv.inputFileName));
        InputSource inputSource = new InputSource(br);
        XMLReader gsmReader = new UBEEFileReader();

        // Use the parser as a SAX source for input.
        SAXSource source = new SAXSource(gsmReader, inputSource);
        transformer.transform(source, strResult);
When I apply this code to transform an input file of, say, 200,000 lines,
the process starts out consuming around 64 MB of memory and keeps growing
to 1.34 GB until it crashes with an out-of-memory error.
Can someone explain what I'm doing wrong, or how I can prevent this memory
creep?
Best regards
Higino Silva