Posted to general@xerces.apache.org by Christian Lizell <Ch...@athega.se> on 2000/02/11 10:45:49 UTC

Caching problem? (was: Re: cloneNode() eats memory?)

Hi!

I have tried every possible way that I can think of
in order to reclaim memory from a created node.

Here is a very simple test case illustrating the problem:

<CODE>
import org.w3c.dom.*;
import org.apache.xerces.dom.*;

public class CloneTest3 {
    public static void main(String[] args) {
	// Make Nodes
	for (int i=0; i<=10000; i++) {
	    new DocumentImpl().createTextNode("");
	    if ((i%1000) == 0) {
		System.gc(); System.runFinalization();
		Runtime rt = Runtime.getRuntime();
		System.out.println("Memory usage after " + i + " empty text nodes: " + (rt.totalMemory()-rt.freeMemory()));
	    }
	}
    }
}
</CODE>

When the program runs, the memory for the created text nodes is never
reclaimed:

<CODE>
Memory usage after 0 empty text nodes: 2018096
Memory usage after 1000 empty text nodes: 2267040
Memory usage after 2000 empty text nodes: 2515040
Memory usage after 3000 empty text nodes: 2763040
Memory usage after 4000 empty text nodes: 3011040
Memory usage after 5000 empty text nodes: 3259040
Memory usage after 6000 empty text nodes: 3507040
Memory usage after 7000 empty text nodes: 3755040
Memory usage after 8000 empty text nodes: 4003040
Memory usage after 9000 empty text nodes: 4251040
Memory usage after 10000 empty text nodes: 4499040
</CODE>


Isn't this weird?
How can I reclaim the memory?
Are the nodes statically cached in some class?
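One way to answer that last question is a control run: a hypothetical sketch (the class name and the stand-in allocation below are mine, not from the posting) that keeps the same loop shape but allocates a plain object in place of the Xerces text node. If the "used" figure grows here too, the growth is a JVM/GC effect rather than a static cache inside Xerces.

```java
// Hypothetical control experiment: same loop shape as CloneTest3, but
// with a plain StringBuilder instead of DocumentImpl().createTextNode("").
public class PlainAllocTest {
    static long used(Runtime rt) {
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        for (int i = 0; i <= 10000; i++) {
            new StringBuilder();            // stand-in allocation
            if ((i % 1000) == 0) {
                System.gc(); System.runFinalization();
                System.out.println("Memory usage after " + i
                        + " plain objects: " + used(rt));
            }
        }
    }
}
```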


Any help on this is very much appreciated. I am entering panic mode here. :)

Thanks,
/Christian Lizell

Re: Caching problem? (was: Re: cloneNode() eats memory?)

Posted by Christian Lizell <Ch...@athega.se>.
Hi again!

After testing the sample program from my original posting,
"cloneNode() eats memory?", against the old xml4j classes, I found
that they do not eat memory the way the xerces-J classes do.


This is the sample program:

<CODE>
import org.w3c.dom.*;
import com.ibm.xml.dom.*;
import com.ibm.xml.parsers.DOMParser;

public class CloneTest {
    public static void main(String[] args) {
	// Parse the document
	DOMParser parser = new DOMParser();
	try {
	    parser.parse("clonetest.xml");
	}
	catch (Exception e) {
	    e.printStackTrace();
	    System.exit(0);
	}

	// Get some_node
	Node someNode = null;
	try {
	    someNode = parser.getDocument().getElementsByTagName("some_node").item(0);
	} catch (Exception e) {
	    e.printStackTrace();
	    System.exit(0);
	}

	// Make clones
	for (int i=0; i<=10000; i++) {
	    Node clonedNode = ((NodeImpl)someNode).cloneNode(true);
	    clonedNode = null;
	    if ((i%1000) == 0) {
		System.gc(); System.runFinalization();
		Runtime rt = Runtime.getRuntime();
		System.out.println("Memory usage after " + i + " clones: " + (rt.totalMemory()-rt.freeMemory()));
	    }
	}
    }
}
</CODE>


This is the output using xerces-J:

<CODE>
Memory usage after 0 clones: 2206416
Memory usage after 1000 clones: 2831232
Memory usage after 2000 clones: 3455232
Memory usage after 3000 clones: 4079232
Memory usage after 4000 clones: 4703232
Memory usage after 5000 clones: 5327232
Memory usage after 6000 clones: 5951232
Memory usage after 7000 clones: 6575232
Memory usage after 8000 clones: 7199232
Memory usage after 9000 clones: 7823232
Memory usage after 10000 clones: 8447232
</CODE>


This is the output using xml4j:

<CODE>
Memory usage after 0 clones: 2362576
Memory usage after 1000 clones: 2363400
Memory usage after 2000 clones: 2363400
Memory usage after 3000 clones: 2363400
Memory usage after 4000 clones: 2363400
Memory usage after 5000 clones: 2363400
Memory usage after 6000 clones: 2363400
Memory usage after 7000 clones: 2363400
Memory usage after 8000 clones: 2363400
Memory usage after 9000 clones: 2363400
Memory usage after 10000 clones: 2363400
</CODE>



This points to some sort of memory leak in the xerces-J classes.
Can anyone help me track it down?

Regards,
/Christian Lizell

Re: Caching problem? (was: Re: cloneNode() eats memory?)

Posted by Ch...@athega.se.
Hi!

Thank you for your kind reply. I'm aware of how the GC works; the problem
was that on my system I actually got an OutOfMemoryError when I increased the
iterations to 100,000. That did not happen with the old xml4j. However, the
problem was my JVM. Thanks for pointing that out!

If anyone is running Linux and Sun's JDK 1.2pre-v2 and is experiencing memory leaks
with Xerces-J, the problem is in the JDK. After upgrading to JDK 1.2 RC2 everything
works like a charm!

Thank you all for your help.

/Christian Lizell




RE: Caching problem? (was: Re: cloneNode() eats memory?)

Posted by "George T. Joseph" <gt...@peakin.com>.
Christian,

I ran your sample and found that used memory does increase, but that's not
necessarily a problem or a leak.  System.gc() is only a hint to the JVM that GC
could be run.  If the JVM detects that there's still plenty of free space on the
heap, it may not bother.  In your example, add the display of rt.totalMemory()
as well as (rt.totalMemory()-rt.freeMemory()) and you'll probably see that by
the time your program ends, only 50% of the heap is actually used (depending on
the JVM used).  The program ends before GC was needed.
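A minimal sketch of that measurement (the class name below is mine): printing the heap's total size next to the used figure lets you tell a heap that simply hasn't filled up yet from a program that genuinely leaks.

```java
// Sketch of the suggested diagnostic: report used bytes together with
// totalMemory(). "Used" rising while the heap is still far from full is
// normal; only used memory pinned at the heap limit suggests a real leak.
public class HeapProbe {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        for (int i = 0; i <= 10000; i++) {
            new Object();                       // stand-in allocation
            if ((i % 1000) == 0) {
                long total = rt.totalMemory();
                long used = total - rt.freeMemory();
                System.out.println("after " + i + ": used=" + used
                        + " total=" + total
                        + " (" + (100 * used / total) + "% of heap)");
            }
        }
    }
}
```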

If you want to see the GC in action, change the iterations from 10,000 to
100,000 and you should see that used memory rises and falls as the GC is run.
You may also see that the program runs at a slower rate for the same reason.

What operating system and Java VM are you running?  Each combination of the two
will probably result in different memory utilization profiles.  For instance,
the HotSpot performance engine used with Sun's JVMs seems to use more memory and
run GC less while the Classic version of the same JVM seems to use less memory
and run GC more often.  Different JVMs may also have differently sized initial
heaps and different heap size limits.

george

-----Original Message-----
From: chrille@shield.athega.se [mailto:chrille@shield.athega.se]On
Behalf Of Christian Lizell
Sent: Friday, February 11, 2000 4:46 AM
To: xerces-dev@xml.apache.org
Subject: Caching problem? (was: Re: cloneNode() eats memory?)

