Posted to xmlbeans-dev@xml.apache.org by Matthias Kubik <KU...@de.ibm.com> on 2003/12/10 13:25:35 UTC

high volume processing

Hi all,
I'm new to this list, as I could not find anything in the list archive 
that would indicate memory problems.
Now, here's what happened:

I was trying the easypo sample as described on the web site. After some 
script fixing (Linux) I finally got the sample to work.
As I have a requirement to process XML files that are 100MB+ in size, I 
had some expectations...that were not fulfilled.
It seems that even a 30MB file runs into an out-of-memory error. I know 
I could work around that temporarily by giving the JVM more memory, but 
that is not a solution. To me it looks as if the whole DOM tree (if any) 
and the object hierarchy are kept in memory. I'd love to see something 
more "intelligent" there.
My question now is: will that be addressed in V2, or is it even a design 
goal? (I didn't find anything in the project management pages, though.)
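
For comparison, what I mean by "intelligent": a plain streaming pass (SAX 
here, nothing XmlBeans-specific) touches one event at a time and keeps 
memory roughly constant regardless of file size. A minimal sketch; the 
element name "line-item" just echoes the easypo sample and is only an 
assumption:

```java
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class StreamingCount {
    // Counts occurrences of one element without building a tree:
    // the parser discards each event after the callback returns,
    // so heap use does not grow with document size.
    static int countElements(InputStream in, String name) throws Exception {
        final int[] count = {0};
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(in, new DefaultHandler() {
            @Override
            public void startElement(String uri, String local,
                                     String qName, Attributes atts) {
                if (qName.equals(name)) count[0]++;
            }
        });
        return count[0];
    }

    public static void main(String[] args) throws Exception {
        String xml = "<purchase-order><line-item/><line-item/>"
                   + "<line-item/></purchase-order>";
        int n = countElements(
                new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)),
                "line-item");
        System.out.println(n); // 3
    }
}
```

Something along these lines (a cursor or event view over the document, 
instead of materializing every object up front) is what I was hoping for.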

Thanks
 - matthias