Posted to j-users@xalan.apache.org by Lilantha Darshana <Li...@virtusa.com> on 2003/07/15 05:48:09 UTC

FW: I'm getting java.lang.OutOfMemoryError when my XML (data) file is very large (260MB)

> Hi All,
> 
> I have a style sheet (attached - CDRGlobalDownload.zip) which uses XSLT
> extensions written in JavaScript. When I try to transform an XML data file
> of about 260 MB, the parser fails with java.lang.OutOfMemoryError. With
> smaller XML data files of 445KB and 21MB it works without any errors.
> When I tried the "org.apache.xalan.xslt.Process" class it gave me the
> following error:
> 
> (Location of error unknown)XSLT Error (java.lang.OutOfMemoryError): null
> 
> I modified the SAX2SAX example (attached - XSLTransform.java) to do my
> transformation, but it still gives me the same error.
> I have about 1GB of physical memory and 2GB of virtual memory.
> 
> Could someone help me figure out this issue?
> 
> regards
> Lilantha
> 
> 
>  <<XSLTransform.java>>  <<CDRGlobalDownload.zip>> 


Re: FW: I'm getting java.lang.OutOfMemoryError when my XML (data) file is very large (260MB)

Posted by Simon Kitching <si...@ecnetwork.co.nz>.
Hi Lilantha,

The general rule of thumb is that when loaded into memory, a DOM
representation takes around 5 times the size of the input document.

So the DOM generated from your 260MB file will be about 1.3 GB.
That's a big chunk of your available memory used up straight away.

I presume that you have run the JVM with the "-Xmx" option, to set the
max memory the JVM can allocate? If not, consult your JVM documentation
for how to use this option.
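For example, running the command-line processor with a larger heap looks
something like this (the file names are placeholders, and 900m is just a
starting point to tune against your 1GB of RAM):

  java -Xmx900m org.apache.xalan.xslt.Process -IN bigfile.xml -XSL stylesheet.xsl -OUT result.txt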

If you have used "-Xmx" and are still getting "out of memory", then the
machine probably just doesn't have enough RAM for a document of that size,
and you will not be able to transform that file this way. You could try
increasing the amount of virtual memory available (though the constant
swapping will have horrible effects on the time required to process your
file).

Is this just a one-off, or are you intending to transform files of that
size on a regular basis? Frankly, if you are intending to do this
transform often then I think that XSLT is the wrong solution. XSLT is a
fine thing, but just isn't going to work on input data like that.

And I hate to think how long that transform will take... have you tried
to extrapolate from your smaller successful transforms to guess how long
a 260MB file will take to process? You might find that it isn't worth
trying to get it working if it is going to take a week to finish :-)

Note that using "SAX2SAX" doesn't bypass the need to build an in-memory
representation of the input: XSLT needs random access to the whole
document, so the processor builds its complete internal tree regardless
of how the input is fed to it.
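
Just to make that concrete, here is a minimal TrAX sketch (the file names
are placeholders, not yours): even when the input is supplied as a
SAXSource, the whole internal tree is built before the first byte of
output appears.

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import org.xml.sax.InputSource;

public class BigFileTransform {
    public static void main(String[] args) throws Exception {
        TransformerFactory factory = TransformerFactory.newInstance();
        Transformer transformer =
            factory.newTransformer(new StreamSource("stylesheet.xsl"));
        // Supplying the input as a SAXSource only changes how the bytes are
        // delivered; the processor still builds its complete internal tree
        // before any output is produced.
        transformer.transform(new SAXSource(new InputSource("bigfile.xml")),
                              new StreamResult("result.txt"));
    }
}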

You might want to look at STX as an alternative to XSLT: it has fewer
features, but because it processes the input as a stream it is far more
memory-efficient, especially for large documents.
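
If you do go down the STX road, one nice thing is that STX engines (Joost,
for instance) plug into the same TrAX API, so the calling code barely
changes; you mostly just point JAXP at a different factory. This is only a
sketch, the file names are placeholders, and the factory class name is an
assumption on my part, so check the Joost documentation:

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class BigFileStxTransform {
    public static void main(String[] args) throws Exception {
        // Assumed factory class name; verify against the Joost docs.
        System.setProperty("javax.xml.transform.TransformerFactory",
                           "net.sf.joost.trax.TransformerFactoryImpl");
        TransformerFactory factory = TransformerFactory.newInstance();
        Transformer transformer =
            factory.newTransformer(new StreamSource("stylesheet.stx"));
        transformer.transform(new StreamSource("bigfile.xml"),
                              new StreamResult("result.txt"));
    }
}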

Regards,

Simon

