Posted to issues@continuum.apache.org by "Brett Porter (JIRA)" <ji...@codehaus.org> on 2014/06/04 07:44:10 UTC
[jira] (CONTINUUM-2119) Cleaning up large working directories fails with OutOfMemoryError
[ https://jira.codehaus.org/browse/CONTINUUM-2119?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Brett Porter updated CONTINUUM-2119:
------------------------------------
Fix Version/s: (was: 1.4.2)
1.5.0
> Cleaning up large working directories fails with OutOfMemoryError
> ------------------------------------------------------------------
>
> Key: CONTINUUM-2119
> URL: https://jira.codehaus.org/browse/CONTINUUM-2119
> Project: Continuum
> Issue Type: Bug
> Components: Core system
> Affects Versions: 1.2.3
> Reporter: Frank Förstemann
> Assignee: Brent N Atkinson
> Fix For: 1.5.0
>
> Attachments: CONTINUUM-2119-continuum-core.patch
>
>
> The action clean-working-directory runs out of heap space while cleaning up huge working directories (30 GB / 500,000 files & directories in our case):
> 2009-03-08 06:00:14,582 [pool-1-thread-1] INFO buildController - Initializing build
> 2009-03-08 06:00:15,286 [pool-1-thread-1] INFO buildController - Starting build of ivu_plan_nightly_build
> 2009-03-08 06:00:15,301 [pool-1-thread-1] INFO buildController - Purging exiting working copy
> 2009-03-08 06:00:15,301 [pool-1-thread-1] INFO buildController - Performing action clean-working-directory
> 2009-03-08 06:31:39,209 [Thread-3] ERROR taskQueueExecutor#build-project - Error executing task
> edu.emory.mathcs.backport.java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
> at edu.emory.mathcs.backport.java.util.concurrent.FutureTask.getResult(FutureTask.java:301)
> at edu.emory.mathcs.backport.java.util.concurrent.FutureTask.get(FutureTask.java:120)
> at org.codehaus.plexus.taskqueue.execution.ThreadedTaskQueueExecutor$ExecutorRunnable.waitForTask(ThreadedTaskQueueExecutor.java:159)
> at org.codehaus.plexus.taskqueue.execution.ThreadedTaskQueueExecutor$ExecutorRunnable.run(ThreadedTaskQueueExecutor.java:127)
> Caused by: java.lang.OutOfMemoryError: Java heap space
> The issue seems to be caused by the implementation in CleanWorkingDirectoryAction: the FileSetManager used to delete the working directory first scans all files and directories to build an object representation of the tree, and only then deletes the tree by iterating over those objects. Since no filter condition is required here, using org.codehaus.plexus.util.FileUtils to delete the tree directly would avoid this kind of problem.
> I'll attach a patch based on 1.2.3.
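The approach described above can be sketched as follows. This is a minimal, hypothetical illustration using only the JDK (not the actual Continuum patch, which uses org.codehaus.plexus.util.FileUtils): files are deleted as the tree is walked, so memory use is bounded by directory depth rather than by the total number of entries, avoiding the up-front scan that builds a 500,000-element object tree.

```java
import java.io.File;
import java.io.IOException;

// Hypothetical helper illustrating the streaming-delete approach:
// each entry is removed as it is visited, so no in-memory
// representation of the whole tree is ever built.
public class WorkingDirectoryCleaner {

    public static void deleteRecursively(File path) throws IOException {
        // listFiles() returns null for regular files (and on I/O error),
        // so the loop is skipped and the entry is deleted directly.
        File[] children = path.listFiles();
        if (children != null) {
            for (File child : children) {
                deleteRecursively(child);
            }
        }
        if (!path.delete()) {
            throw new IOException("Unable to delete " + path);
        }
    }
}
```

With this shape, heap usage stays proportional to the deepest directory nesting level, whereas scanning first (as FileSetManager does) allocates one object per file before any deletion starts.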
--
This message was sent by Atlassian JIRA
(v6.1.6#6162)