Posted to common-dev@hadoop.apache.org by Apache Hudson Server <hu...@hudson.zones.apache.org> on 2009/03/02 20:07:50 UTC

Build failed in Hudson: Hadoop-trunk #770

See http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/770/changes

Changes:

[yhemanth] HADOOP-4638. Fixes job recovery to not crash the job tracker for problems with a single job file. Contributed by Amar Kamat.

[yhemanth] HADOOP-4744. Workaround for jetty6 returning -1 when getLocalPort is invoked on the connector, by retrying a few times. Contributed by Jothi Padmanabhan.
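
The HADOOP-4744 entry above amounts to a small retry loop around jetty6's Connector.getLocalPort(), which can transiently return -1 right after the connector is started. The following is a minimal sketch of that retry pattern only, assuming a jetty6 org.mortbay.jetty.Connector on the classpath and illustrative constants (MAX_RETRIES, RETRY_SLEEP_MS); it is not the actual Hadoop patch.

    import org.mortbay.jetty.Connector;

    public final class LocalPortRetry {
      private static final int MAX_RETRIES = 10;       // illustrative retry budget
      private static final long RETRY_SLEEP_MS = 200L; // illustrative back-off

      /** Return the connector's local port, retrying while jetty6 reports -1. */
      static int waitForLocalPort(Connector connector) throws InterruptedException {
        int port = connector.getLocalPort();
        for (int i = 0; port < 0 && i < MAX_RETRIES; i++) {
          Thread.sleep(RETRY_SLEEP_MS);   // give the connector time to finish binding
          port = connector.getLocalPort();
        }
        if (port < 0) {
          throw new IllegalStateException("connector never reported a local port");
        }
        return port;
      }
    }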

------------------------------------------
[...truncated 299791 lines...]
    [junit] 2009-03-02 19:15:02,711 INFO  mapred.IndexCache (IndexCache.java:<init>(46)) - IndexCache created with max memory = 10485760
    [junit] 2009-03-02 19:15:02,822 INFO  net.NetworkTopology (NetworkTopology.java:add(328)) - Adding a new node: /default-rack/host1.foo.com
    [junit] 2009-03-02 19:15:02,827 INFO  net.NetworkTopology (NetworkTopology.java:add(328)) - Adding a new node: /default-rack/host0.foo.com
    [junit] rootdir = /test/testDistCh
    [junit] root=/test/testDistCh, returnvalue=0
    [junit] results:
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-02 19:15 /test/testDistCh/f1
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-02 19:15 /test/testDistCh/f2
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-02 19:15 /test/testDistCh/f3
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-02 19:15 /test/testDistCh/sub0
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-02 19:15 /test/testDistCh/sub0/f4
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-02 19:15 /test/testDistCh/sub1
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-02 19:15 /test/testDistCh/sub2
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-02 19:15 /test/testDistCh/sub2/f5
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-02 19:15 /test/testDistCh/sub2/f6
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-02 19:15 /test/testDistCh/sub3
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-02 19:15 /test/testDistCh/sub3/f7
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-02 19:15 /test/testDistCh/sub3/f8
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-02 19:15 /test/testDistCh/sub4
    [junit] 
    [junit] args=[/test/testDistCh/sub0:sub0:sub0:,
    [junit]    /test/testDistCh/sub2:sub2::620]
    [junit] newstatus=[sub0:sub0:rwxr-xr-x,
    [junit]    hudson:supergroup:rwxr-xr-x,
    [junit]    sub2:supergroup:rw--w----,
    [junit]    hudson:supergroup:rwxr-xr-x,
    [junit]    hudson:supergroup:rwxr-xr-x]
    [junit] 2009-03-02 19:15:03,276 INFO  tools.DistTool (DistCh.java:run(376)) - ops=[/test/testDistCh/sub0:sub0:sub0:null, /test/testDistCh/sub2:sub2:null:rw--w----]
    [junit] 2009-03-02 19:15:03,277 INFO  tools.DistTool (DistCh.java:run(377)) - isIgnoreFailures=false
    [junit] 2009-03-02 19:15:03,295 INFO  tools.DistTool (DistCh.java:setup(427)) - distch.job.dir=hdfs://localhost:38656/user/hudson/build/test/mapred/system/distch_m477q4
    [junit] 2009-03-02 19:15:03,298 INFO  tools.DistTool (DistCh.java:setup(433)) - log=hdfs://localhost:38656/user/hudson/build/test/mapred/system/distch_m477q4/_logs
    [junit] 2009-03-02 19:15:03,377 INFO  tools.DistTool (DistCh.java:setup(476)) - distch.op.count=5
    [junit] 2009-03-02 19:15:03,384 WARN  mapred.JobClient (JobClient.java:configureCommandLineOptions(539)) - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 2009-03-02 19:15:03,389 WARN  mapred.JobClient (JobClient.java:configureCommandLineOptions(661)) - No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
    [junit] 2009-03-02 19:15:03,402 INFO  tools.DistTool (DistCh.java:getSplits(261)) - numSplits=1, splits.size()=1
    [junit] 2009-03-02 19:15:03,522 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - build/test/mapred/local/jobTracker/job_200903021914_0001.xml:a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] 2009-03-02 19:15:03,550 INFO  mapred.JobClient (JobClient.java:runJob(1268)) - Running job: job_200903021914_0001
    [junit] 2009-03-02 19:15:03,789 INFO  mapred.JobInProgress (JobInProgress.java:initTasks(426)) - Input size for job job_200903021914_0001 = 563
    [junit] 2009-03-02 19:15:03,790 INFO  mapred.JobInProgress (JobInProgress.java:initTasks(428)) - Split info for job:job_200903021914_0001 with 1 splits:
    [junit] 2009-03-02 19:15:04,554 INFO  mapred.JobClient (JobClient.java:runJob(1291)) -  map 0% reduce 0%
    [junit] 2009-03-02 19:15:05,846 INFO  mapred.JobTracker (JobTracker.java:createTaskEntry(1692)) - Adding task 'attempt_200903021914_0001_m_000002_0' to tip task_200903021914_0001_m_000002, for tracker 'tracker_host1.foo.com:localhost/127.0.0.1:44884'
    [junit] 2009-03-02 19:15:05,921 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/1_0/taskTracker/jobcache/job_200903021914_0001/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] 2009-03-02 19:15:05,978 INFO  mapred.JvmManager (JvmManager.java:<init>(323)) - In JvmRunner constructed JVM ID: jvm_200903021914_0001_m_-680180399
    [junit] 2009-03-02 19:15:05,978 INFO  mapred.JvmManager (JvmManager.java:spawnNewJvm(294)) - JVM Runner jvm_200903021914_0001_m_-680180399 spawned.
    [junit] 2009-03-02 19:15:06,934 INFO  mapred.JvmManager (JvmManager.java:runChild(347)) - JVM : jvm_200903021914_0001_m_-680180399 exited. Number of tasks it ran: 1
    [junit] 2009-03-02 19:15:08,855 INFO  mapred.JobInProgress (JobInProgress.java:completedTask(1985)) - Task 'attempt_200903021914_0001_m_000002_0' has completed task_200903021914_0001_m_000002 successfully.
    [junit] 2009-03-02 19:15:08,860 INFO  mapred.JobInProgress (JobInProgress.java:findNewMapTask(1820)) - Choosing a non-local task task_200903021914_0001_m_000000
    [junit] 2009-03-02 19:15:08,860 INFO  mapred.JobTracker (JobTracker.java:createTaskEntry(1692)) - Adding task 'attempt_200903021914_0001_m_000000_0' to tip task_200903021914_0001_m_000000, for tracker 'tracker_host1.foo.com:localhost/127.0.0.1:44884'
    [junit] 2009-03-02 19:15:08,865 INFO  mapred.TaskRunner (MapTaskRunner.java:close(43)) - attempt_200903021914_0001_m_000002_0 done; removing files.
    [junit] 2009-03-02 19:15:08,866 INFO  mapred.IndexCache (IndexCache.java:removeMap(140)) - Map ID attempt_200903021914_0001_m_000002_0 not found in cache
    [junit] 2009-03-02 19:15:08,912 INFO  mapred.JvmManager (JvmManager.java:<init>(323)) - In JvmRunner constructed JVM ID: jvm_200903021914_0001_m_204408398
    [junit] 2009-03-02 19:15:08,912 INFO  mapred.JvmManager (JvmManager.java:spawnNewJvm(294)) - JVM Runner jvm_200903021914_0001_m_204408398 spawned.
    [junit] 2009-03-02 19:15:09,574 INFO  mapred.JobClient (JobClient.java:runJob(1340)) - Task Id : attempt_200903021914_0001_m_000002_0, Status : SUCCEEDED
    [junit] attempt_200903021914_0001_m_000002_0: 2009-03-02 19:15:06,578 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/1_0/taskTracker/jobcache/job_200903021914_0001/attempt_200903021914_0001_m_000002_0/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] attempt_200903021914_0001_m_000002_0: 2009-03-02 19:15:06,658 INFO  jvm.JvmMetrics (JvmMetrics.java:init(71)) - Initializing JVM Metrics with processName=MAP, sessionId=
    [junit] attempt_200903021914_0001_m_000002_0: 2009-03-02 19:15:06,747 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/1_0/taskTracker/jobcache/job_200903021914_0001/attempt_200903021914_0001_m_000002_0/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] attempt_200903021914_0001_m_000002_0: 2009-03-02 19:15:06,798 INFO  mapred.TaskRunner (Task.java:done(644)) - Task:attempt_200903021914_0001_m_000002_0 is done. And is in the process of commiting
    [junit] attempt_200903021914_0001_m_000002_0: 2009-03-02 19:15:06,804 INFO  mapred.TaskRunner (Task.java:sendDone(715)) - Task 'attempt_200903021914_0001_m_000002_0' done.
    [junit] 2009-03-02 19:15:12,057 INFO  mapred.JvmManager (JvmManager.java:runChild(347)) - JVM : jvm_200903021914_0001_m_204408398 exited. Number of tasks it ran: 1
    [junit] 2009-03-02 19:15:14,870 INFO  mapred.JobInProgress (JobInProgress.java:completedTask(1985)) - Task 'attempt_200903021914_0001_m_000000_0' has completed task_200903021914_0001_m_000000 successfully.
    [junit] 2009-03-02 19:15:14,874 INFO  mapred.JobTracker (JobTracker.java:createTaskEntry(1692)) - Adding task 'attempt_200903021914_0001_m_000001_0' to tip task_200903021914_0001_m_000001, for tracker 'tracker_host1.foo.com:localhost/127.0.0.1:44884'
    [junit] 2009-03-02 19:15:14,908 INFO  mapred.JvmManager (JvmManager.java:<init>(323)) - In JvmRunner constructed JVM ID: jvm_200903021914_0001_m_-1347310764
    [junit] 2009-03-02 19:15:14,908 INFO  mapred.JvmManager (JvmManager.java:spawnNewJvm(294)) - JVM Runner jvm_200903021914_0001_m_-1347310764 spawned.
    [junit] 2009-03-02 19:15:15,651 INFO  mapred.JobClient (JobClient.java:runJob(1291)) -  map 100% reduce 0%
    [junit] 2009-03-02 19:15:15,653 INFO  mapred.JobClient (JobClient.java:runJob(1340)) - Task Id : attempt_200903021914_0001_m_000000_0, Status : SUCCEEDED
    [junit] attempt_200903021914_0001_m_000000_0: 2009-03-02 19:15:09,478 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/1_0/taskTracker/jobcache/job_200903021914_0001/attempt_200903021914_0001_m_000000_0/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] attempt_200903021914_0001_m_000000_0: 2009-03-02 19:15:09,558 INFO  jvm.JvmMetrics (JvmMetrics.java:init(71)) - Initializing JVM Metrics with processName=MAP, sessionId=
    [junit] attempt_200903021914_0001_m_000000_0: 2009-03-02 19:15:09,677 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/1_0/taskTracker/jobcache/job_200903021914_0001/attempt_200903021914_0001_m_000000_0/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] attempt_200903021914_0001_m_000000_0: 2009-03-02 19:15:09,774 INFO  mapred.MapTask (MapTask.java:runOldMapper(343)) - numReduceTasks: 0
    [junit] attempt_200903021914_0001_m_000000_0: 2009-03-02 19:15:09,907 INFO  mapred.TaskRunner (Task.java:done(644)) - Task:attempt_200903021914_0001_m_000000_0 is done. And is in the process of commiting
    [junit] attempt_200903021914_0001_m_000000_0: 2009-03-02 19:15:11,916 INFO  mapred.TaskRunner (Task.java:commit(744)) - Task attempt_200903021914_0001_m_000000_0 is allowed to commit now
    [junit] attempt_200903021914_0001_m_000000_0: 2009-03-02 19:15:11,930 INFO  mapred.FileOutputCommitter (FileOutputCommitter.java:commitTask(92)) - Saved output of task 'attempt_200903021914_0001_m_000000_0' to hdfs://localhost:38656/user/hudson/build/test/mapred/system/distch_m477q4/_logs
    [junit] attempt_200903021914_0001_m_000000_0: 2009-03-02 19:15:11,936 INFO  mapred.TaskRunner (Task.java:sendDone(715)) - Task 'attempt_200903021914_0001_m_000000_0' done.
    [junit] 2009-03-02 19:15:15,857 INFO  mapred.JvmManager (JvmManager.java:runChild(347)) - JVM : jvm_200903021914_0001_m_-1347310764 exited. Number of tasks it ran: 1
    [junit] 2009-03-02 19:15:17,879 INFO  mapred.JobInProgress (JobInProgress.java:completedTask(1985)) - Task 'attempt_200903021914_0001_m_000001_0' has completed task_200903021914_0001_m_000001 successfully.
    [junit] 2009-03-02 19:15:17,880 INFO  mapred.JobInProgress (JobInProgress.java:jobComplete(2099)) - Job job_200903021914_0001 has completed successfully.
    [junit] 2009-03-02 19:15:17,992 INFO  mapred.JobTracker (JobTracker.java:removeMarkedTasks(1813)) - Removed completed task 'attempt_200903021914_0001_m_000000_0' from 'tracker_host1.foo.com:localhost/127.0.0.1:44884'
    [junit] 2009-03-02 19:15:17,992 INFO  mapred.JobTracker (JobTracker.java:removeMarkedTasks(1813)) - Removed completed task 'attempt_200903021914_0001_m_000001_0' from 'tracker_host1.foo.com:localhost/127.0.0.1:44884'
    [junit] 2009-03-02 19:15:17,993 INFO  mapred.JobTracker (JobTracker.java:removeMarkedTasks(1813)) - Removed completed task 'attempt_200903021914_0001_m_000002_0' from 'tracker_host1.foo.com:localhost/127.0.0.1:44884'
    [junit] 2009-03-02 19:15:18,005 INFO  mapred.TaskRunner (MapTaskRunner.java:close(43)) - attempt_200903021914_0001_m_000001_0 done; removing files.
    [junit] 2009-03-02 19:15:18,006 INFO  mapred.IndexCache (IndexCache.java:removeMap(140)) - Map ID attempt_200903021914_0001_m_000001_0 not found in cache
    [junit] 2009-03-02 19:15:18,006 INFO  mapred.TaskRunner (MapTaskRunner.java:close(43)) - attempt_200903021914_0001_m_000000_0 done; removing files.
    [junit] 2009-03-02 19:15:18,007 INFO  mapred.IndexCache (IndexCache.java:removeMap(140)) - Map ID attempt_200903021914_0001_m_000000_0 not found in cache
    [junit] 2009-03-02 19:15:18,670 INFO  mapred.JobClient (JobClient.java:runJob(1358)) - Job complete: job_200903021914_0001
    [junit] 2009-03-02 19:15:18,672 INFO  mapred.JobClient (Counters.java:log(514)) - Counters: 7
    [junit] 2009-03-02 19:15:18,672 INFO  mapred.JobClient (Counters.java:log(516)) -   Job Counters 
    [junit] 2009-03-02 19:15:18,673 INFO  mapred.JobClient (Counters.java:log(518)) -     Launched map tasks=1
    [junit] 2009-03-02 19:15:18,673 INFO  mapred.JobClient (Counters.java:log(516)) -   org.apache.hadoop.tools.DistCh$Counter
    [junit] 2009-03-02 19:15:18,673 INFO  mapred.JobClient (Counters.java:log(518)) -     SUCCEED=5
    [junit] 2009-03-02 19:15:18,674 INFO  mapred.JobClient (Counters.java:log(516)) -   FileSystemCounters
    [junit] 2009-03-02 19:15:18,674 INFO  mapred.JobClient (Counters.java:log(518)) -     HDFS_BYTES_READ=563
    [junit] 2009-03-02 19:15:18,674 INFO  mapred.JobClient (Counters.java:log(516)) -   Map-Reduce Framework
    [junit] 2009-03-02 19:15:18,674 INFO  mapred.JobClient (Counters.java:log(518)) -     Map input records=5
    [junit] 2009-03-02 19:15:18,675 INFO  mapred.JobClient (Counters.java:log(518)) -     Spilled Records=0
    [junit] 2009-03-02 19:15:18,675 INFO  mapred.JobClient (Counters.java:log(518)) -     Map input bytes=466
    [junit] 2009-03-02 19:15:18,675 INFO  mapred.JobClient (Counters.java:log(518)) -     Map output records=0
    [junit] root=/test/testDistCh, returnvalue=0
    [junit] results:
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-02 19:15 /test/testDistCh/f1
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-02 19:15 /test/testDistCh/f2
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-02 19:15 /test/testDistCh/f3
    [junit] drwxr-xr-x   - sub0   sub0                0 2009-03-02 19:15 /test/testDistCh/sub0
    [junit] -rw-r--r--   2 sub0 sub0         43 2009-03-02 19:15 /test/testDistCh/sub0/f4
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-02 19:15 /test/testDistCh/sub1
    [junit] drw--w----   - sub2   supergroup          0 2009-03-02 19:15 /test/testDistCh/sub2
    [junit] -rw--w----   2 sub2 supergroup         43 2009-03-02 19:15 /test/testDistCh/sub2/f5
    [junit] -rw--w----   2 sub2 supergroup         43 2009-03-02 19:15 /test/testDistCh/sub2/f6
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-02 19:15 /test/testDistCh/sub3
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-02 19:15 /test/testDistCh/sub3/f7
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-02 19:15 /test/testDistCh/sub3/f8
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-02 19:15 /test/testDistCh/sub4
    [junit] 
    [junit] Shutting down the Mini HDFS Cluster
    [junit] Shutting down DataNode 1
    [junit] 2009-03-02 19:15:18,700 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 55372
    [junit] 2009-03-02 19:15:18,700 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 0 on 55372: exiting
    [junit] 2009-03-02 19:15:18,700 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 2 on 55372: exiting
    [junit] 2009-03-02 19:15:18,700 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 1 on 55372: exiting
    [junit] 2009-03-02 19:15:18,701 INFO  ipc.Server (Server.java:run(352)) - Stopping IPC Server listener on 55372
    [junit] 2009-03-02 19:15:18,701 INFO  ipc.Server (Server.java:run(536)) - Stopping IPC Server Responder
    [junit] 2009-03-02 19:15:19,333 INFO  datanode.DataBlockScanner (DataBlockScanner.java:run(603)) - Exiting DataBlockScanner thread.
    [junit] 2009-03-02 19:15:19,702 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 55372
    [junit] Shutting down DataNode 0
    [junit] 2009-03-02 19:15:19,803 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 44107
    [junit] 2009-03-02 19:15:19,804 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 0 on 44107: exiting
    [junit] 2009-03-02 19:15:19,804 INFO  ipc.Server (Server.java:run(352)) - Stopping IPC Server listener on 44107
    [junit] 2009-03-02 19:15:19,804 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 1 on 44107: exiting
    [junit] 2009-03-02 19:15:19,804 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 2 on 44107: exiting
    [junit] 2009-03-02 19:15:19,804 INFO  ipc.Server (Server.java:run(536)) - Stopping IPC Server Responder
    [junit] 2009-03-02 19:15:20,159 INFO  datanode.DataBlockScanner (DataBlockScanner.java:run(603)) - Exiting DataBlockScanner thread.
    [junit] 2009-03-02 19:15:20,804 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 44107
    [junit] 2009-03-02 19:15:20,907 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 38656
    [junit] 2009-03-02 19:15:20,907 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 3 on 38656: exiting
    [junit] 2009-03-02 19:15:20,907 INFO  namenode.DecommissionManager (DecommissionManager.java:run(67)) - Interrupted Monitor
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.namenode.DecommissionManager$Monitor.run(DecommissionManager.java:65)
    [junit] 	at java.lang.Thread.run(Thread.java:619)
    [junit] 2009-03-02 19:15:20,908 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 6 on 38656: exiting
    [junit] 2009-03-02 19:15:20,908 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 7 on 38656: exiting
    [junit] 2009-03-02 19:15:20,908 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 0 on 38656: exiting
    [junit] 2009-03-02 19:15:20,908 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 5 on 38656: exiting
    [junit] 2009-03-02 19:15:20,907 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 4 on 38656: exiting
    [junit] 2009-03-02 19:15:20,908 INFO  ipc.Server (Server.java:run(536)) - Stopping IPC Server Responder
    [junit] 2009-03-02 19:15:20,907 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 2 on 38656: exiting
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 31.075 sec
    [junit] 2009-03-02 19:15:20,907 INFO  ipc.Server (Server.java:run(352)) - Stopping IPC Server listener on 38656
    [junit] 2009-03-02 19:15:20,908 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 8 on 38656: exiting
    [junit] 2009-03-02 19:15:20,908 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 1 on 38656: exiting
    [junit] 2009-03-02 19:15:20,908 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 9 on 38656: exiting
    [junit] Running org.apache.hadoop.util.TestCyclicIteration
    [junit] 
    [junit] 
    [junit] integers=[]
    [junit] map={}
    [junit] start=-1, iteration=[]
    [junit] 
    [junit] 
    [junit] integers=[0]
    [junit] map={0=0}
    [junit] start=-1, iteration=[0]
    [junit] start=0, iteration=[0]
    [junit] start=1, iteration=[0]
    [junit] 
    [junit] 
    [junit] integers=[0, 2]
    [junit] map={0=0, 2=2}
    [junit] start=-1, iteration=[0, 2]
    [junit] start=0, iteration=[2, 0]
    [junit] start=1, iteration=[2, 0]
    [junit] start=2, iteration=[0, 2]
    [junit] start=3, iteration=[0, 2]
    [junit] 
    [junit] 
    [junit] integers=[0, 2, 4]
    [junit] map={0=0, 2=2, 4=4}
    [junit] start=-1, iteration=[0, 2, 4]
    [junit] start=0, iteration=[2, 4, 0]
    [junit] start=1, iteration=[2, 4, 0]
    [junit] start=2, iteration=[4, 0, 2]
    [junit] start=3, iteration=[4, 0, 2]
    [junit] start=4, iteration=[0, 2, 4]
    [junit] start=5, iteration=[0, 2, 4]
    [junit] 
    [junit] 
    [junit] integers=[0, 2, 4, 6]
    [junit] map={0=0, 2=2, 4=4, 6=6}
    [junit] start=-1, iteration=[0, 2, 4, 6]
    [junit] start=0, iteration=[2, 4, 6, 0]
    [junit] start=1, iteration=[2, 4, 6, 0]
    [junit] start=2, iteration=[4, 6, 0, 2]
    [junit] start=3, iteration=[4, 6, 0, 2]
    [junit] start=4, iteration=[6, 0, 2, 4]
    [junit] start=5, iteration=[6, 0, 2, 4]
    [junit] start=6, iteration=[0, 2, 4, 6]
    [junit] start=7, iteration=[0, 2, 4, 6]
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.087 sec
    [junit] Running org.apache.hadoop.util.TestGenericsUtil
    [junit] 2009-03-02 19:15:22,032 WARN  conf.Configuration (Configuration.java:<clinit>(175)) - DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
    [junit] 2009-03-02 19:15:22,044 WARN  util.GenericOptionsParser (GenericOptionsParser.java:parseGeneralOptions(377)) - options parsing failed: Missing argument for option:jt
    [junit] usage: general options are:
    [junit]  -archives <paths>             comma separated archives to be unarchived
    [junit]                                on the compute machines.
    [junit]  -conf <configuration file>    specify an application configuration file
    [junit]  -D <property=value>           use value for given property
    [junit]  -files <paths>                comma separated files to be copied to the
    [junit]                                map reduce cluster
    [junit]  -fs <local|namenode:port>     specify a namenode
    [junit]  -jt <local|jobtracker:port>   specify a job tracker
    [junit]  -libjars <paths>              comma separated jar files to include in the
    [junit]                                classpath.
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 0.184 sec
    [junit] Running org.apache.hadoop.util.TestIndexedSort
    [junit] sortRandom seed: 8350212373966102532(org.apache.hadoop.util.QuickSort)
    [junit] testSorted seed: -5145397827608362279(org.apache.hadoop.util.QuickSort)
    [junit] testAllEqual setting min/max at 115/292(org.apache.hadoop.util.QuickSort)
    [junit] sortWritable seed: 349868019564119539(org.apache.hadoop.util.QuickSort)
    [junit] QuickSort degen cmp/swp: 23252/3713(org.apache.hadoop.util.QuickSort)
    [junit] sortRandom seed: 413046522922751435(org.apache.hadoop.util.HeapSort)
    [junit] testSorted seed: 3342023026909933194(org.apache.hadoop.util.HeapSort)
    [junit] testAllEqual setting min/max at 228/469(org.apache.hadoop.util.HeapSort)
    [junit] sortWritable seed: 3530470734220596879(org.apache.hadoop.util.HeapSort)
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.896 sec
    [junit] Running org.apache.hadoop.util.TestProcfsBasedProcessTree
    [junit] 2009-03-02 19:15:23,635 INFO  util.ProcessTree (ProcessTree.java:isSetsidSupported(54)) - setsid exited with exit code 0
    [junit] 2009-03-02 19:15:24,140 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:testProcessTree(141)) - Root process pid: 21389
    [junit] 2009-03-02 19:15:24,189 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:testProcessTree(146)) - ProcessTree: [ 21391 21389 21392 ]
    [junit] 2009-03-02 19:15:30,713 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:testProcessTree(159)) - ProcessTree: [ 21403 21401 21391 21405 21389 21395 21393 21399 21397 ]
    [junit] 2009-03-02 19:15:30,764 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:run(64)) - Shell Command exit with a non-zero exit code. This is expected as we are killing the subprocesses of the task intentionally. org.apache.hadoop.util.Shell$ExitCodeException: 
    [junit] 2009-03-02 19:15:30,764 INFO  util.ProcessTree (ProcessTree.java:destroyProcessGroup(160)) - Killing all processes in the process group 21389 with SIGTERM. Exit code 0
    [junit] 2009-03-02 19:15:30,765 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:run(70)) - Exit code: 143
    [junit] 2009-03-02 19:15:30,851 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:testProcessTree(173)) - RogueTaskThread successfully joined.
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.27 sec
    [junit] Running org.apache.hadoop.util.TestReflectionUtils
    [junit] 2009-03-02 19:15:31,690 WARN  conf.Configuration (Configuration.java:<clinit>(175)) - DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
    [junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 0.563 sec
    [junit] Running org.apache.hadoop.util.TestShell
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.181 sec
    [junit] Running org.apache.hadoop.util.TestStringUtils
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.084 sec

BUILD FAILED
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build.xml :769: Tests failed!

Total time: 184 minutes 29 seconds
Recording fingerprints
Publishing Javadoc
Recording test results
Publishing Clover coverage report...


Hudson build is back to normal: Hadoop-trunk #778

Posted by Apache Hudson Server <hu...@hudson.zones.apache.org>.
See http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/778/changes



Build failed in Hudson: Hadoop-trunk #777

Posted by Apache Hudson Server <hu...@hudson.zones.apache.org>.
See http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/777/changes

Changes:

[ddas] HADOOP-5440. Fixes a problem to do with removing a taskId from the list of taskIds that the TaskTracker's TaskMemoryManager manages. Contributed by Amareshwari Sriramadasu.

[yhemanth] HADOOP-5327. Fixed job tracker to remove files from system directory on ACL check failures and also check ACLs on restart. Contributed by Amar Kamat.

[zshao] HADOOP-5379. CBZip2InputStream to throw IOException on data crc error. (Rodrigo Schmidt via zshao)

[cdouglas] HADOOP-5386. Modify hdfsproxy unit test to start on a random port,
implement clover instrumentation. Contributed by Zhiyong Zhang

[hairong] HADOOP-5358. Provide scripting functionality to the synthetic load generator. Contributed by Jakob Homan.

[hairong] HADOOP-5412. Simulated DataNode should not write to a block that's being written by another thread. Contributed by Hairong Kuang.

[dhruba] HADOOP-3998. Fix dfsclient exception when JVM is shutdown. (dhruba)

[cdouglas] HADOOP-5455. Document rpc metrics context to the extent dfs, mapred, and
jvm contexts are documented. Contributed by Philip Zeyliger

[dhruba] HADOOP-5333. libhdfs supports appending to files. (dhruba)

[cdouglas] Revert HADOOP-5307

[dhruba] HADOOP-5332. Appending to files is not allowed (by default) unless
dfs.support.append is set to true. (dhruba)

[enis] HADOOP-5307. Fix null value handling in StringUtils#arrayToString() and #getStrings(). Contributed by Enis Soztutar.
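
The HADOOP-5332 entry above makes file appends conditional on the dfs.support.append configuration flag. The following is a minimal client-side sketch of enabling that flag, assuming an HDFS FileSystem reachable through the default Configuration and an illustrative, pre-existing path (neither taken from the original build); it is not code from the patch itself.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class AppendExample {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Appends are rejected by default unless this is set to true (per HADOOP-5332).
        conf.setBoolean("dfs.support.append", true);
        FileSystem fs = FileSystem.get(conf);
        // Illustrative path; append() requires the file to already exist.
        fs.append(new Path("/test/appendable-file")).close();
      }
    }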

------------------------------------------
[...truncated 301294 lines...]
    [junit] root=/test/testDistCh, returnvalue=0
    [junit] results:
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-12 16:54 /test/testDistCh/f1
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-12 16:54 /test/testDistCh/f2
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-12 16:54 /test/testDistCh/f3
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-12 16:54 /test/testDistCh/sub0
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-12 16:54 /test/testDistCh/sub1
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-12 16:54 /test/testDistCh/sub1/f4
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-12 16:54 /test/testDistCh/sub1/f5
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-12 16:54 /test/testDistCh/sub2
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-12 16:54 /test/testDistCh/sub2/f6
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-12 16:54 /test/testDistCh/sub3
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-12 16:54 /test/testDistCh/sub3/f7
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-12 16:54 /test/testDistCh/sub4
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-12 16:54 /test/testDistCh/sub4/f8
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-12 16:54 /test/testDistCh/sub4/f9
    [junit] 
    [junit] args=[/test/testDistCh/sub4:::212,
    [junit]    /test/testDistCh/sub0:sub0:sub0:,
    [junit]    /test/testDistCh/sub2:sub2:sub2:516]
    [junit] newstatus=[sub0:sub0:rwxr-xr-x,
    [junit]    hudson:supergroup:rwxr-xr-x,
    [junit]    sub2:sub2:r-x--xrw-,
    [junit]    hudson:supergroup:rwxr-xr-x,
    [junit]    hudson:supergroup:-w---x-w-]
    [junit] 2009-03-12 16:54:54,339 INFO  tools.DistTool (DistCh.java:run(376)) - ops=[/test/testDistCh/sub4:null:null:-w---x-w-, /test/testDistCh/sub0:sub0:sub0:null, /test/testDistCh/sub2:sub2:sub2:r-x--xrw-]
    [junit] 2009-03-12 16:54:54,339 INFO  tools.DistTool (DistCh.java:run(377)) - isIgnoreFailures=false
    [junit] 2009-03-12 16:54:54,362 INFO  tools.DistTool (DistCh.java:setup(427)) - distch.job.dir=hdfs://localhost:45402/user/hudson/build/test/mapred/system/distch_xyfcga
    [junit] 2009-03-12 16:54:54,365 INFO  tools.DistTool (DistCh.java:setup(433)) - log=hdfs://localhost:45402/user/hudson/build/test/mapred/system/distch_xyfcga/_logs
    [junit] 2009-03-12 16:54:54,477 INFO  tools.DistTool (DistCh.java:setup(476)) - distch.op.count=6
    [junit] 2009-03-12 16:54:54,483 WARN  mapred.JobClient (JobClient.java:configureCommandLineOptions(539)) - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 2009-03-12 16:54:54,490 WARN  mapred.JobClient (JobClient.java:configureCommandLineOptions(661)) - No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
    [junit] 2009-03-12 16:54:54,503 INFO  tools.DistTool (DistCh.java:getSplits(261)) - numSplits=1, splits.size()=1
    [junit] 2009-03-12 16:54:54,652 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - build/test/mapred/local/jobTracker/job_200903121654_0001.xml:a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] 2009-03-12 16:54:54,661 INFO  mapred.JobClient (JobClient.java:runJob(1268)) - Running job: job_200903121654_0001
    [junit] 2009-03-12 16:54:54,924 INFO  mapred.JobInProgress (JobInProgress.java:initTasks(426)) - Input size for job job_200903121654_0001 = 617
    [junit] 2009-03-12 16:54:54,924 INFO  mapred.JobInProgress (JobInProgress.java:initTasks(428)) - Split info for job:job_200903121654_0001 with 1 splits:
    [junit] 2009-03-12 16:54:55,668 INFO  mapred.JobClient (JobClient.java:runJob(1291)) -  map 0% reduce 0%
    [junit] 2009-03-12 16:54:56,316 INFO  datanode.DataBlockScanner (DataBlockScanner.java:verifyBlock(434)) - Verification succeeded for blk_-9023609694079356566_1002
    [junit] 2009-03-12 16:54:56,891 INFO  mapred.JobTracker (JobTracker.java:createTaskEntry(1720)) - Adding task 'attempt_200903121654_0001_m_000002_0' to tip task_200903121654_0001_m_000002, for tracker 'tracker_host1.foo.com:localhost/127.0.0.1:57505'
    [junit] 2009-03-12 16:54:56,972 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/1_0/taskTracker/jobcache/job_200903121654_0001/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] 2009-03-12 16:54:57,027 INFO  mapred.JvmManager (JvmManager.java:<init>(323)) - In JvmRunner constructed JVM ID: jvm_200903121654_0001_m_1921699811
    [junit] 2009-03-12 16:54:57,028 INFO  mapred.JvmManager (JvmManager.java:spawnNewJvm(294)) - JVM Runner jvm_200903121654_0001_m_1921699811 spawned.
    [junit] 2009-03-12 16:54:57,980 INFO  mapred.JvmManager (JvmManager.java:runChild(347)) - JVM : jvm_200903121654_0001_m_1921699811 exited. Number of tasks it ran: 1
    [junit] 2009-03-12 16:54:59,900 INFO  mapred.JobInProgress (JobInProgress.java:completedTask(1985)) - Task 'attempt_200903121654_0001_m_000002_0' has completed task_200903121654_0001_m_000002 successfully.
    [junit] 2009-03-12 16:54:59,905 INFO  mapred.JobInProgress (JobInProgress.java:findNewMapTask(1820)) - Choosing a non-local task task_200903121654_0001_m_000000
    [junit] 2009-03-12 16:54:59,905 INFO  mapred.JobTracker (JobTracker.java:createTaskEntry(1720)) - Adding task 'attempt_200903121654_0001_m_000000_0' to tip task_200903121654_0001_m_000000, for tracker 'tracker_host1.foo.com:localhost/127.0.0.1:57505'
    [junit] 2009-03-12 16:54:59,910 INFO  mapred.TaskRunner (MapTaskRunner.java:close(43)) - attempt_200903121654_0001_m_000002_0 done; removing files.
    [junit] 2009-03-12 16:54:59,911 INFO  mapred.IndexCache (IndexCache.java:removeMap(140)) - Map ID attempt_200903121654_0001_m_000002_0 not found in cache
    [junit] 2009-03-12 16:54:59,960 INFO  mapred.JvmManager (JvmManager.java:<init>(323)) - In JvmRunner constructed JVM ID: jvm_200903121654_0001_m_296650207
    [junit] 2009-03-12 16:54:59,960 INFO  mapred.JvmManager (JvmManager.java:spawnNewJvm(294)) - JVM Runner jvm_200903121654_0001_m_296650207 spawned.
    [junit] 2009-03-12 16:55:00,687 INFO  mapred.JobClient (JobClient.java:runJob(1340)) - Task Id : attempt_200903121654_0001_m_000002_0, Status : SUCCEEDED
    [junit] attempt_200903121654_0001_m_000002_0: 2009-03-12 16:54:57,613 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/1_0/taskTracker/jobcache/job_200903121654_0001/attempt_200903121654_0001_m_000002_0/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] attempt_200903121654_0001_m_000002_0: 2009-03-12 16:54:57,700 INFO  jvm.JvmMetrics (JvmMetrics.java:init(71)) - Initializing JVM Metrics with processName=MAP, sessionId=
    [junit] attempt_200903121654_0001_m_000002_0: 2009-03-12 16:54:57,789 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/1_0/taskTracker/jobcache/job_200903121654_0001/attempt_200903121654_0001_m_000002_0/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] attempt_200903121654_0001_m_000002_0: 2009-03-12 16:54:57,839 INFO  mapred.TaskRunner (Task.java:done(644)) - Task:attempt_200903121654_0001_m_000002_0 is done. And is in the process of commiting
    [junit] attempt_200903121654_0001_m_000002_0: 2009-03-12 16:54:57,846 INFO  mapred.TaskRunner (Task.java:sendDone(715)) - Task 'attempt_200903121654_0001_m_000002_0' done.
    [junit] 2009-03-12 16:55:04,026 INFO  mapred.JvmManager (JvmManager.java:runChild(347)) - JVM : jvm_200903121654_0001_m_296650207 exited. Number of tasks it ran: 1
    [junit] 2009-03-12 16:55:05,915 INFO  mapred.JobInProgress (JobInProgress.java:completedTask(1985)) - Task 'attempt_200903121654_0001_m_000000_0' has completed task_200903121654_0001_m_000000 successfully.
    [junit] 2009-03-12 16:55:05,918 INFO  mapred.JobTracker (JobTracker.java:createTaskEntry(1720)) - Adding task 'attempt_200903121654_0001_m_000001_0' to tip task_200903121654_0001_m_000001, for tracker 'tracker_host1.foo.com:localhost/127.0.0.1:57505'
    [junit] 2009-03-12 16:55:05,948 INFO  mapred.JvmManager (JvmManager.java:<init>(323)) - In JvmRunner constructed JVM ID: jvm_200903121654_0001_m_-1352093016
    [junit] 2009-03-12 16:55:05,949 INFO  mapred.JvmManager (JvmManager.java:spawnNewJvm(294)) - JVM Runner jvm_200903121654_0001_m_-1352093016 spawned.
    [junit] 2009-03-12 16:55:06,763 INFO  mapred.JobClient (JobClient.java:runJob(1291)) -  map 100% reduce 0%
    [junit] 2009-03-12 16:55:06,765 INFO  mapred.JobClient (JobClient.java:runJob(1340)) - Task Id : attempt_200903121654_0001_m_000000_0, Status : SUCCEEDED
    [junit] attempt_200903121654_0001_m_000000_0: 2009-03-12 16:55:00,546 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/1_0/taskTracker/jobcache/job_200903121654_0001/attempt_200903121654_0001_m_000000_0/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] attempt_200903121654_0001_m_000000_0: 2009-03-12 16:55:00,627 INFO  jvm.JvmMetrics (JvmMetrics.java:init(71)) - Initializing JVM Metrics with processName=MAP, sessionId=
    [junit] attempt_200903121654_0001_m_000000_0: 2009-03-12 16:55:00,718 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/1_0/taskTracker/jobcache/job_200903121654_0001/attempt_200903121654_0001_m_000000_0/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] attempt_200903121654_0001_m_000000_0: 2009-03-12 16:55:00,820 INFO  mapred.MapTask (MapTask.java:runOldMapper(343)) - numReduceTasks: 0
    [junit] attempt_200903121654_0001_m_000000_0: 2009-03-12 16:55:00,864 INFO  mapred.TaskRunner (Task.java:done(644)) - Task:attempt_200903121654_0001_m_000000_0 is done. And is in the process of commiting
    [junit] attempt_200903121654_0001_m_000000_0: 2009-03-12 16:55:03,873 INFO  mapred.TaskRunner (Task.java:commit(744)) - Task attempt_200903121654_0001_m_000000_0 is allowed to commit now
    [junit] attempt_200903121654_0001_m_000000_0: 2009-03-12 16:55:03,885 INFO  mapred.FileOutputCommitter (FileOutputCommitter.java:commitTask(92)) - Saved output of task 'attempt_200903121654_0001_m_000000_0' to hdfs://localhost:45402/user/hudson/build/test/mapred/system/distch_xyfcga/_logs
    [junit] attempt_200903121654_0001_m_000000_0: 2009-03-12 16:55:03,891 INFO  mapred.TaskRunner (Task.java:sendDone(715)) - Task 'attempt_200903121654_0001_m_000000_0' done.
    [junit] 2009-03-12 16:55:06,899 INFO  mapred.JvmManager (JvmManager.java:runChild(347)) - JVM : jvm_200903121654_0001_m_-1352093016 exited. Number of tasks it ran: 1
    [junit] 2009-03-12 16:55:08,921 INFO  mapred.JobInProgress (JobInProgress.java:completedTask(1985)) - Task 'attempt_200903121654_0001_m_000001_0' has completed task_200903121654_0001_m_000001 successfully.
    [junit] 2009-03-12 16:55:08,923 INFO  mapred.JobInProgress (JobInProgress.java:jobComplete(2099)) - Job job_200903121654_0001 has completed successfully.
    [junit] 2009-03-12 16:55:09,050 INFO  mapred.JobTracker (JobTracker.java:removeMarkedTasks(1841)) - Removed completed task 'attempt_200903121654_0001_m_000000_0' from 'tracker_host1.foo.com:localhost/127.0.0.1:57505'
    [junit] 2009-03-12 16:55:09,050 INFO  mapred.JobTracker (JobTracker.java:removeMarkedTasks(1841)) - Removed completed task 'attempt_200903121654_0001_m_000001_0' from 'tracker_host1.foo.com:localhost/127.0.0.1:57505'
    [junit] 2009-03-12 16:55:09,050 INFO  mapred.JobTracker (JobTracker.java:removeMarkedTasks(1841)) - Removed completed task 'attempt_200903121654_0001_m_000002_0' from 'tracker_host1.foo.com:localhost/127.0.0.1:57505'
    [junit] 2009-03-12 16:55:09,051 INFO  mapred.TaskRunner (MapTaskRunner.java:close(43)) - attempt_200903121654_0001_m_000000_0 done; removing files.
    [junit] 2009-03-12 16:55:09,052 INFO  mapred.IndexCache (IndexCache.java:removeMap(140)) - Map ID attempt_200903121654_0001_m_000000_0 not found in cache
    [junit] 2009-03-12 16:55:09,052 INFO  mapred.TaskRunner (MapTaskRunner.java:close(43)) - attempt_200903121654_0001_m_000001_0 done; removing files.
    [junit] 2009-03-12 16:55:09,053 INFO  mapred.IndexCache (IndexCache.java:removeMap(140)) - Map ID attempt_200903121654_0001_m_000001_0 not found in cache
    [junit] 2009-03-12 16:55:09,782 INFO  mapred.JobClient (JobClient.java:runJob(1358)) - Job complete: job_200903121654_0001
    [junit] 2009-03-12 16:55:09,785 INFO  mapred.JobClient (Counters.java:log(514)) - Counters: 7
    [junit] 2009-03-12 16:55:09,785 INFO  mapred.JobClient (Counters.java:log(516)) -   Job Counters 
    [junit] 2009-03-12 16:55:09,785 INFO  mapred.JobClient (Counters.java:log(518)) -     Launched map tasks=1
    [junit] 2009-03-12 16:55:09,786 INFO  mapred.JobClient (Counters.java:log(516)) -   org.apache.hadoop.tools.DistCh$Counter
    [junit] 2009-03-12 16:55:09,786 INFO  mapred.JobClient (Counters.java:log(518)) -     SUCCEED=6
    [junit] 2009-03-12 16:55:09,786 INFO  mapred.JobClient (Counters.java:log(516)) -   FileSystemCounters
    [junit] 2009-03-12 16:55:09,787 INFO  mapred.JobClient (Counters.java:log(518)) -     HDFS_BYTES_READ=617
    [junit] 2009-03-12 16:55:09,787 INFO  mapred.JobClient (Counters.java:log(516)) -   Map-Reduce Framework
    [junit] 2009-03-12 16:55:09,787 INFO  mapred.JobClient (Counters.java:log(518)) -     Map input records=6
    [junit] 2009-03-12 16:55:09,788 INFO  mapred.JobClient (Counters.java:log(518)) -     Spilled Records=0
    [junit] 2009-03-12 16:55:09,788 INFO  mapred.JobClient (Counters.java:log(518)) -     Map input bytes=520
    [junit] 2009-03-12 16:55:09,788 INFO  mapred.JobClient (Counters.java:log(518)) -     Map output records=0
    [junit] root=/test/testDistCh, returnvalue=0
    [junit] results:
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-12 16:54 /test/testDistCh/f1
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-12 16:54 /test/testDistCh/f2
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-12 16:54 /test/testDistCh/f3
    [junit] drwxr-xr-x   - sub0   sub0                0 2009-03-12 16:54 /test/testDistCh/sub0
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-12 16:54 /test/testDistCh/sub1
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-12 16:54 /test/testDistCh/sub1/f4
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-12 16:54 /test/testDistCh/sub1/f5
    [junit] dr-x--xrw-   - sub2   sub2                0 2009-03-12 16:54 /test/testDistCh/sub2
    [junit] -r-----rw-   2 sub2 sub2         43 2009-03-12 16:54 /test/testDistCh/sub2/f6
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-12 16:54 /test/testDistCh/sub3
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-12 16:54 /test/testDistCh/sub3/f7
    [junit] d-w---x-w-   - hudson supergroup          0 2009-03-12 16:54 /test/testDistCh/sub4
    [junit] --w-----w-   2 hudson supergroup         43 2009-03-12 16:54 /test/testDistCh/sub4/f8
    [junit] --w-----w-   2 hudson supergroup         43 2009-03-12 16:54 /test/testDistCh/sub4/f9
    [junit] 
    [junit] Shutting down the Mini HDFS Cluster
    [junit] Shutting down DataNode 1
    [junit] 2009-03-12 16:55:09,913 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 44036
    [junit] 2009-03-12 16:55:09,914 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 0 on 44036: exiting
    [junit] 2009-03-12 16:55:09,914 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 1 on 44036: exiting
    [junit] 2009-03-12 16:55:09,914 INFO  ipc.Server (Server.java:run(536)) - Stopping IPC Server Responder
    [junit] 2009-03-12 16:55:09,914 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 2 on 44036: exiting
    [junit] 2009-03-12 16:55:09,914 INFO  ipc.Server (Server.java:run(352)) - Stopping IPC Server listener on 44036
    [junit] 2009-03-12 16:55:10,470 INFO  datanode.DataBlockScanner (DataBlockScanner.java:run(603)) - Exiting DataBlockScanner thread.
    [junit] 2009-03-12 16:55:10,914 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 44036
    [junit] Shutting down DataNode 0
    [junit] 2009-03-12 16:55:11,016 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 33706
    [junit] 2009-03-12 16:55:11,016 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 2 on 33706: exiting
    [junit] 2009-03-12 16:55:11,017 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 0 on 33706: exiting
    [junit] 2009-03-12 16:55:11,016 INFO  ipc.Server (Server.java:run(536)) - Stopping IPC Server Responder
    [junit] 2009-03-12 16:55:11,016 INFO  ipc.Server (Server.java:run(352)) - Stopping IPC Server listener on 33706
    [junit] 2009-03-12 16:55:11,017 INFO  datanode.DataBlockScanner (DataBlockScanner.java:run(603)) - Exiting DataBlockScanner thread.
    [junit] 2009-03-12 16:55:11,017 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 1 on 33706: exiting
    [junit] 2009-03-12 16:55:11,018 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 33706
    [junit] 2009-03-12 16:55:11,121 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 45402
    [junit] 2009-03-12 16:55:11,121 INFO  namenode.DecommissionManager (DecommissionManager.java:run(67)) - Interrupted Monitor
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.namenode.DecommissionManager$Monitor.run(DecommissionManager.java:65)
    [junit] 	at java.lang.Thread.run(Thread.java:619)
    [junit] 2009-03-12 16:55:11,122 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 6 on 45402: exiting
    [junit] 2009-03-12 16:55:11,122 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 8 on 45402: exiting
    [junit] 2009-03-12 16:55:11,122 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 7 on 45402: exiting
    [junit] 2009-03-12 16:55:11,122 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 4 on 45402: exiting
    [junit] 2009-03-12 16:55:11,121 INFO  ipc.Server (Server.java:run(536)) - Stopping IPC Server Responder
    [junit] 2009-03-12 16:55:11,121 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 3 on 45402: exiting
    [junit] 2009-03-12 16:55:11,121 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 5 on 45402: exiting
    [junit] 2009-03-12 16:55:11,121 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 1 on 45402: exiting
    [junit] 2009-03-12 16:55:11,121 INFO  ipc.Server (Server.java:run(352)) - Stopping IPC Server listener on 45402
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 30.138 sec
    [junit] 2009-03-12 16:55:11,121 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 0 on 45402: exiting
    [junit] 2009-03-12 16:55:11,122 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 2 on 45402: exiting
    [junit] 2009-03-12 16:55:11,122 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 9 on 45402: exiting
    [junit] Running org.apache.hadoop.util.TestCyclicIteration
    [junit] 
    [junit] 
    [junit] integers=[]
    [junit] map={}
    [junit] start=-1, iteration=[]
    [junit] 
    [junit] 
    [junit] integers=[0]
    [junit] map={0=0}
    [junit] start=-1, iteration=[0]
    [junit] start=0, iteration=[0]
    [junit] start=1, iteration=[0]
    [junit] 
    [junit] 
    [junit] integers=[0, 2]
    [junit] map={0=0, 2=2}
    [junit] start=-1, iteration=[0, 2]
    [junit] start=0, iteration=[2, 0]
    [junit] start=1, iteration=[2, 0]
    [junit] start=2, iteration=[0, 2]
    [junit] start=3, iteration=[0, 2]
    [junit] 
    [junit] 
    [junit] integers=[0, 2, 4]
    [junit] map={0=0, 2=2, 4=4}
    [junit] start=-1, iteration=[0, 2, 4]
    [junit] start=0, iteration=[2, 4, 0]
    [junit] start=1, iteration=[2, 4, 0]
    [junit] start=2, iteration=[4, 0, 2]
    [junit] start=3, iteration=[4, 0, 2]
    [junit] start=4, iteration=[0, 2, 4]
    [junit] start=5, iteration=[0, 2, 4]
    [junit] 
    [junit] 
    [junit] integers=[0, 2, 4, 6]
    [junit] map={0=0, 2=2, 4=4, 6=6}
    [junit] start=-1, iteration=[0, 2, 4, 6]
    [junit] start=0, iteration=[2, 4, 6, 0]
    [junit] start=1, iteration=[2, 4, 6, 0]
    [junit] start=2, iteration=[4, 6, 0, 2]
    [junit] start=3, iteration=[4, 6, 0, 2]
    [junit] start=4, iteration=[6, 0, 2, 4]
    [junit] start=5, iteration=[6, 0, 2, 4]
    [junit] start=6, iteration=[0, 2, 4, 6]
    [junit] start=7, iteration=[0, 2, 4, 6]
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.082 sec
    [junit] Running org.apache.hadoop.util.TestGenericsUtil
    [junit] 2009-03-12 16:55:12,283 WARN  conf.Configuration (Configuration.java:<clinit>(175)) - DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
    [junit] 2009-03-12 16:55:12,296 WARN  util.GenericOptionsParser (GenericOptionsParser.java:parseGeneralOptions(377)) - options parsing failed: Missing argument for option:jt
    [junit] usage: general options are:
    [junit]  -archives <paths>             comma separated archives to be unarchived
    [junit]                                on the compute machines.
    [junit]  -conf <configuration file>    specify an application configuration file
    [junit]  -D <property=value>           use value for given property
    [junit]  -files <paths>                comma separated files to be copied to the
    [junit]                                map reduce cluster
    [junit]  -fs <local|namenode:port>     specify a namenode
    [junit]  -jt <local|jobtracker:port>   specify a job tracker
    [junit]  -libjars <paths>              comma separated jar files to include in the
    [junit]                                classpath.
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 0.186 sec
    [junit] Running org.apache.hadoop.util.TestIndexedSort
    [junit] sortRandom seed: -1798328634733982203(org.apache.hadoop.util.QuickSort)
    [junit] testSorted seed: 4098882688518272941(org.apache.hadoop.util.QuickSort)
    [junit] testAllEqual setting min/max at 138/140(org.apache.hadoop.util.QuickSort)
    [junit] sortWritable seed: -1452106224397442699(org.apache.hadoop.util.QuickSort)
    [junit] QuickSort degen cmp/swp: 23252/3713(org.apache.hadoop.util.QuickSort)
    [junit] sortRandom seed: -1760193741196207872(org.apache.hadoop.util.HeapSort)
    [junit] testSorted seed: -7644926114297956705(org.apache.hadoop.util.HeapSort)
    [junit] testAllEqual setting min/max at 1/180(org.apache.hadoop.util.HeapSort)
    [junit] sortWritable seed: 5084688075017256500(org.apache.hadoop.util.HeapSort)
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.952 sec
    [junit] Running org.apache.hadoop.util.TestProcfsBasedProcessTree
    [junit] 2009-03-12 16:55:13,952 INFO  util.ProcessTree (ProcessTree.java:isSetsidSupported(54)) - setsid exited with exit code 0
    [junit] 2009-03-12 16:55:14,457 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:testProcessTree(141)) - Root process pid: 11904
    [junit] 2009-03-12 16:55:14,546 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:testProcessTree(146)) - ProcessTree: [ 11907 11906 11904 ]
    [junit] 2009-03-12 16:55:21,075 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:testProcessTree(159)) - ProcessTree: [ 11927 11910 11924 11908 11906 11904 11914 11929 11912 ]
    [junit] 2009-03-12 16:55:21,133 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:run(64)) - Shell Command exit with a non-zero exit code. This is expected as we are killing the subprocesses of the task intentionally. org.apache.hadoop.util.Shell$ExitCodeException: 
    [junit] 2009-03-12 16:55:21,134 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:run(70)) - Exit code: 143
    [junit] 2009-03-12 16:55:21,136 INFO  util.ProcessTree (ProcessTree.java:destroyProcessGroup(160)) - Killing all processes in the process group 11904 with SIGTERM. Exit code 0
    [junit] 2009-03-12 16:55:21,183 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:testProcessTree(173)) - RogueTaskThread successfully joined.
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.296 sec
    [junit] Running org.apache.hadoop.util.TestReflectionUtils
    [junit] 2009-03-12 16:55:22,023 WARN  conf.Configuration (Configuration.java:<clinit>(175)) - DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
    [junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 0.576 sec
    [junit] Running org.apache.hadoop.util.TestShell
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.186 sec
    [junit] Running org.apache.hadoop.util.TestStringUtils
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.08 sec

BUILD FAILED
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build.xml :769: Tests failed!

Total time: 179 minutes 1 second
Publishing Javadoc
Recording test results
Recording fingerprints
Publishing Clover coverage report...


Build failed in Hudson: Hadoop-trunk #776

Posted by Apache Hudson Server <hu...@hudson.zones.apache.org>.
See http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/776/changes

Changes:

[cdouglas] HADOOP-5458. Remove leftover Chukwa entries from build, etc.

[szetszwo] HADOOP-5456. Fix javadoc links to ClientProtocol#restoreFailedStorage(..).  (Boris Shkolnik via szetszwo)

[szetszwo] HADOOP-5258. Add a new DFSAdmin command to print a tree of the rack and datanode topology as seen by the namenode.  (Jakob Homan via szetszwo)

[szetszwo] HADOOP-5416. Correct the shell command "fs -test" forrest doc description.  (Ravi Phulari via szetszwo)

[omalley] Moving codebase

------------------------------------------
[...truncated 296821 lines...]
    [javac] Compiling 35 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/streaming/classes 
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

check-contrib:

init:
     [echo] contrib: thriftfs
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/classes 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/test 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/examples 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/test/logs 

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#thriftfs;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 17ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#thriftfs [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	2 artifacts copied, 0 already retrieved (419kB/4ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: thriftfs
    [javac] Compiling 1 source file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/classes 

init:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/classes 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/bin 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/bin 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/conf 

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#vaidya;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 18ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#vaidya [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	2 artifacts copied, 0 already retrieved (419kB/3ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: vaidya
    [javac] Compiling 14 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/classes 
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/vaidya/src/java/org/apache/hadoop/vaidya/statistics/job/JobStatistics.java  uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

compile-ant-tasks:
    [javac] Compiling 5 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/ant 
   [clover] Clover Version 2.3.2, built on July 15 2008 (build-732)
   [clover] Loaded from: /home/hudson/tools/clover/latest/lib/clover.jar
   [clover] Clover: Open Source License registered to Apache Software Foundation.
   [clover] Updating existing database at 'http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/clover/db/hadoop_coverage.db'. 
   [clover] Processing files at 1.6 source level.
   [clover] Clover all over. Instrumented 0 files (0 packages).

compile:

test-contrib:

test:
Trying to override old definition of task macro_tar

check-contrib:

init:
     [echo] contrib: hdfsproxy

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-resolve-common:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#hdfsproxy;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-logging#commons-logging;1.1 in maven2
[ivy:resolve] 	found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve] 	found junit#junit;3.8.2 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-api;1.4.3 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-log4j12;1.4.3 in maven2
[ivy:resolve] 	found xmlenc#xmlenc;0.52 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.14 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.14 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api-2.5;6.1.14 in maven2
[ivy:resolve] 	found org.eclipse.jdt#core;3.1.1 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.core.framework.uberjar.javaEE.14;1.8.0 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.integration.ant;1.8.0 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.integration.shared.api;1.8.0 in maven2
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.1 in maven2
[ivy:resolve] 	found commons-io#commons-io;1.4 in maven2
[ivy:resolve] 	found commons-lang#commons-lang;2.3 in maven2
[ivy:resolve] 	found commons-codec#commons-codec;1.3 in maven2
[ivy:resolve] 	found aspectj#aspectjrt;1.5.3 in maven2
[ivy:resolve] 	found org.codehaus.cargo#cargo-core-uberjar;0.9 in maven2
[ivy:resolve] 	found org.codehaus.cargo#cargo-ant;0.9 in maven2
[ivy:resolve] 	found javax.servlet#jsp-api;2.0 in maven2
[ivy:resolve] 	found javax.servlet#servlet-api;2.5 in maven2
[ivy:resolve] 	found javax.servlet#jstl;1.1.2 in maven2
[ivy:resolve] 	found taglibs#standard;1.1.2 in maven2
[ivy:resolve] 	found junitperf#junitperf;1.8 in maven2
[ivy:resolve] :: resolution report :: resolve 240ms :: artifacts dl 10ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   26  |   0   |   0   |   0   ||   26  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#hdfsproxy [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 26 already retrieved (0kB/8ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: hdfsproxy

compile-examples:

compile-test:
     [echo] contrib: hdfsproxy
    [javac] Compiling 4 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/test 

jar:
     [echo] 
     [echo]             Building the .jar files.
     [echo]         
      [jar] Building jar: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.jar 

local-package:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/logs 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
     [copy] Copying 12 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/lib 
     [copy] Copying 13 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/bin 
     [copy] Copying 10 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/conf 
     [copy] Copying 2 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
     [copy] Copying 8 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/src 

war:
     [echo] 
     [echo] 			Building the .war file
     [echo] 		
      [war] Building war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.war 

load-tasks:

cactifywar:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target 
[cactifywar] Analyzing war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.war 
[cactifywar] Building war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/test.war 

test:
     [echo] Please take a deep breath while Cargo gets the Tomcat for running the servlet tests...
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/temp 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/logs 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/reports 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
   [cactus] -----------------------------------------------------------------
   [cactus] Running tests against Tomcat 5.x @ http://localhost:8087
   [cactus] -----------------------------------------------------------------
   [cactus] Deploying [http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/test.war]  to [http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps]... 
   [cactus] Tomcat 5.x starting...
   [cactus] Tomcat 5.x started on port [8087]

BUILD FAILED
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build.xml:773: The following error occurred while executing this line:
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/build.xml:48: The following error occurred while executing this line:
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/hdfsproxy/build.xml:150: Failed to start the container after more than [180000] ms. Trying to connect to the [http://localhost:8087/test/ServletRedirector?Cactus_Service=RUN_TEST] test URL yielded a [-1] error code. Please run in debug mode for more details about the error.

Total time: 176 minutes 51 seconds
Publishing Javadoc
Recording test results
Recording fingerprints
Publishing Clover coverage report...
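
The #776 failure above, and the #775, #774 and #773 reports below, all end with the same hdfsproxy symptom: Cargo gives up after 180000 ms because the Cactus ServletRedirector never answers on port 8087, and [-1] is presumably what the startup check falls back to when it cannot read any HTTP status from that URL. A stand-alone probe of the same endpoint, assuming Tomcat should be listening on localhost:8087, could look like the sketch below; it is only an illustration of the check, not part of the Hadoop or Cactus sources.

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Hypothetical probe of the URL the hdfsproxy test target polls while waiting for Tomcat.
    public class RedirectorProbe {
      public static void main(String[] args) throws Exception {
        URL url = new URL("http://localhost:8087/test/ServletRedirector?Cactus_Service=RUN_TEST");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setConnectTimeout(5000);  // fail fast instead of waiting out the 180000 ms limit
        conn.setReadTimeout(5000);
        int code;
        try {
          // getResponseCode() returns -1 if the reply is not a valid HTTP response.
          code = conn.getResponseCode();
        } catch (IOException e) {
          // Connection refused or reset: nothing is listening on 8087 at all.
          code = -1;
        } finally {
          conn.disconnect();
        }
        System.out.println("ServletRedirector answered with code " + code);
      }
    }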


Build failed in Hudson: Hadoop-trunk #775

Posted by Apache Hudson Server <hu...@hudson.zones.apache.org>.
See http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/775/changes

Changes:

[ddas] HADOOP-5392. Fixes a problem to do with JT crashing during recovery when the job files are garbled. Contributed by Amar Kamat.

[cdouglas] HADOOP-5432. Disable ssl during unit tests in hdfsproxy, as it is unused and causes failures.

[szetszwo] HADOOP-5298. Change TestServletFilter so that it allows a web page to be filtered more than once for a single access.  (szetszwo)

[szetszwo] HADOOP-4695. Change TestGlobalFilter so that it allows a web page to be filtered more than once for a single access.  (Kan Zhang via szetszwo)

[eyang] HADOOP-5387. Collect number of disk read/write operation for system metrics.

[eyang] HADOOP-5397. Remove adaptor reference from adaptorPositions hashMap, when adaptor is removed through AgentController.

[eyang] HADOOP-5411.  Converted from String to StringBuilder for Chart class.

------------------------------------------
[...truncated 299606 lines...]
    [javac] Compiling 35 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/streaming/classes 
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

check-contrib:

init:
     [echo] contrib: thriftfs
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/classes 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/test 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/examples 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/test/logs 

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#thriftfs;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 17ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#thriftfs [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	2 artifacts copied, 0 already retrieved (419kB/3ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: thriftfs
    [javac] Compiling 1 source file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/classes 

init:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/classes 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/bin 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/bin 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/conf 

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#vaidya;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 17ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#vaidya [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	2 artifacts copied, 0 already retrieved (419kB/4ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: vaidya
    [javac] Compiling 14 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/classes 
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/vaidya/src/java/org/apache/hadoop/vaidya/statistics/job/JobStatistics.java  uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

compile-ant-tasks:
    [javac] Compiling 5 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/ant 
   [clover] Clover Version 2.3.2, built on July 15 2008 (build-732)
   [clover] Loaded from: /home/hudson/tools/clover/latest/lib/clover.jar
   [clover] Clover: Open Source License registered to Apache Software Foundation.
   [clover] Updating existing database at 'http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/clover/db/hadoop_coverage.db'. 
   [clover] Processing files at 1.6 source level.
   [clover] Clover all over. Instrumented 0 files (0 packages).

compile:

test-contrib:

test:
Trying to override old definition of task macro_tar

check-contrib:

init:
     [echo] contrib: hdfsproxy

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-resolve-common:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#hdfsproxy;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-logging#commons-logging;1.1 in maven2
[ivy:resolve] 	found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve] 	found junit#junit;3.8.2 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-api;1.4.3 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-log4j12;1.4.3 in maven2
[ivy:resolve] 	found xmlenc#xmlenc;0.52 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.14 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.14 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api-2.5;6.1.14 in maven2
[ivy:resolve] 	found org.eclipse.jdt#core;3.1.1 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.core.framework.uberjar.javaEE.14;1.8.0 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.integration.ant;1.8.0 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.integration.shared.api;1.8.0 in maven2
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.1 in maven2
[ivy:resolve] 	found commons-io#commons-io;1.4 in maven2
[ivy:resolve] 	found commons-lang#commons-lang;2.3 in maven2
[ivy:resolve] 	found commons-codec#commons-codec;1.3 in maven2
[ivy:resolve] 	found aspectj#aspectjrt;1.5.3 in maven2
[ivy:resolve] 	found org.codehaus.cargo#cargo-core-uberjar;0.9 in maven2
[ivy:resolve] 	found org.codehaus.cargo#cargo-ant;0.9 in maven2
[ivy:resolve] 	found javax.servlet#jsp-api;2.0 in maven2
[ivy:resolve] 	found javax.servlet#servlet-api;2.5 in maven2
[ivy:resolve] 	found javax.servlet#jstl;1.1.2 in maven2
[ivy:resolve] 	found taglibs#standard;1.1.2 in maven2
[ivy:resolve] 	found junitperf#junitperf;1.8 in maven2
[ivy:resolve] :: resolution report :: resolve 221ms :: artifacts dl 10ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   26  |   0   |   0   |   0   ||   26  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#hdfsproxy [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 26 already retrieved (0kB/8ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: hdfsproxy

compile-examples:

compile-test:
     [echo] contrib: hdfsproxy
    [javac] Compiling 4 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/test 

jar:
     [echo] 
     [echo]             Building the .jar files.
     [echo]         
      [jar] Building jar: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.jar 

local-package:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/logs 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
     [copy] Copying 12 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/lib 
     [copy] Copying 13 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/bin 
     [copy] Copying 10 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/conf 
     [copy] Copying 2 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
     [copy] Copying 8 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/src 

war:
     [echo] 
     [echo] 			Building the .war file
     [echo] 		
      [war] Building war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.war 

load-tasks:

cactifywar:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target 
[cactifywar] Analyzing war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.war 
[cactifywar] Building war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/test.war 

test:
     [echo] Please take a deep breath while Cargo gets the Tomcat for running the servlet tests...
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/temp 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/logs 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/reports 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
   [cactus] -----------------------------------------------------------------
   [cactus] Running tests against Tomcat 5.x @ http://localhost:8087
   [cactus] -----------------------------------------------------------------
   [cactus] Deploying [http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/test.war]  to [http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps]... 
   [cactus] Tomcat 5.x starting...
   [cactus] Tomcat 5.x started on port [8087]

BUILD FAILED
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build.xml:773: The following error occurred while executing this line:
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/build.xml:48: The following error occurred while executing this line:
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/hdfsproxy/build.xml:150: Failed to start the container after more than [180000] ms. Trying to connect to the [http://localhost:8087/test/ServletRedirector?Cactus_Service=RUN_TEST] test URL yielded a [-1] error code. Please run in debug mode for more details about the error.

Total time: 178 minutes 52 seconds
Publishing Javadoc
Recording test results
Recording fingerprints
Publishing Clover coverage report...


Build failed in Hudson: Hadoop-trunk #774

Posted by Apache Hudson Server <hu...@hudson.zones.apache.org>.
See http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/774/

------------------------------------------
[...truncated 300212 lines...]
    [javac] Compiling 35 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/streaming/classes 
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

check-contrib:

init:
     [echo] contrib: thriftfs
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/classes 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/test 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/examples 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/test/logs 

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#thriftfs;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 17ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#thriftfs [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	2 artifacts copied, 0 already retrieved (419kB/3ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: thriftfs
    [javac] Compiling 1 source file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/classes 

init:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/classes 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/bin 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/bin 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/conf 

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#vaidya;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 18ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#vaidya [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	2 artifacts copied, 0 already retrieved (419kB/3ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: vaidya
    [javac] Compiling 14 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/classes 
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/vaidya/src/java/org/apache/hadoop/vaidya/statistics/job/JobStatistics.java  uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

compile-ant-tasks:
    [javac] Compiling 5 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/ant 
   [clover] Clover Version 2.3.2, built on July 15 2008 (build-732)
   [clover] Loaded from: /home/hudson/tools/clover/latest/lib/clover.jar
   [clover] Clover: Open Source License registered to Apache Software Foundation.
   [clover] Updating existing database at 'http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/clover/db/hadoop_coverage.db'. 
   [clover] Processing files at 1.6 source level.
   [clover] Clover all over. Instrumented 0 files (0 packages).

compile:

test-contrib:

test:
Trying to override old definition of task macro_tar

check-contrib:

init:
     [echo] contrib: hdfsproxy

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-resolve-common:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#hdfsproxy;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-logging#commons-logging;1.1 in maven2
[ivy:resolve] 	found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve] 	found junit#junit;3.8.2 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-api;1.4.3 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-log4j12;1.4.3 in maven2
[ivy:resolve] 	found xmlenc#xmlenc;0.52 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.14 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.14 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api-2.5;6.1.14 in maven2
[ivy:resolve] 	found org.eclipse.jdt#core;3.1.1 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.core.framework.uberjar.javaEE.14;1.8.0 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.integration.ant;1.8.0 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.integration.shared.api;1.8.0 in maven2
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.1 in maven2
[ivy:resolve] 	found commons-io#commons-io;1.4 in maven2
[ivy:resolve] 	found commons-lang#commons-lang;2.3 in maven2
[ivy:resolve] 	found commons-codec#commons-codec;1.3 in maven2
[ivy:resolve] 	found aspectj#aspectjrt;1.5.3 in maven2
[ivy:resolve] 	found org.codehaus.cargo#cargo-core-uberjar;0.9 in maven2
[ivy:resolve] 	found org.codehaus.cargo#cargo-ant;0.9 in maven2
[ivy:resolve] 	found javax.servlet#jsp-api;2.0 in maven2
[ivy:resolve] 	found javax.servlet#servlet-api;2.5 in maven2
[ivy:resolve] 	found javax.servlet#jstl;1.1.2 in maven2
[ivy:resolve] 	found taglibs#standard;1.1.2 in maven2
[ivy:resolve] 	found junitperf#junitperf;1.8 in maven2
[ivy:resolve] :: resolution report :: resolve 200ms :: artifacts dl 11ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   26  |   0   |   0   |   0   ||   26  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#hdfsproxy [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 26 already retrieved (0kB/7ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: hdfsproxy

compile-examples:

compile-test:
     [echo] contrib: hdfsproxy
    [javac] Compiling 4 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/test 

jar:
     [echo] 
     [echo]             Building the .jar files.
     [echo]         
      [jar] Building jar: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.jar 

local-package:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/logs 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
     [copy] Copying 12 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/lib 
     [copy] Copying 13 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/bin 
     [copy] Copying 10 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/conf 
     [copy] Copying 2 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
     [copy] Copying 8 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/src 

war:
     [echo] 
     [echo] 			Building the .war file
     [echo] 		
      [war] Building war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.war 

load-tasks:

cactifywar:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target 
[cactifywar] Analyzing war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.war 
[cactifywar] Building war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/test.war 

test:
     [echo] Please take a deep breath while Cargo gets the Tomcat for running the servlet tests...
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/temp 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/logs 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/reports 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
   [cactus] -----------------------------------------------------------------
   [cactus] Running tests against Tomcat 5.x @ http://localhost:8087
   [cactus] -----------------------------------------------------------------
   [cactus] Deploying [http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/test.war]  to [http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps]... 
   [cactus] Tomcat 5.x starting...
   [cactus] Tomcat 5.x started on port [8087]

BUILD FAILED
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build.xml:773: The following error occurred while executing this line:
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/build.xml:48: The following error occurred while executing this line:
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/hdfsproxy/build.xml:150: Failed to start the container after more than [180000] ms. Trying to connect to the [http://localhost:8087/test/ServletRedirector?Cactus_Service=RUN_TEST] test URL yielded a [-1] error code. Please run in debug mode for more details about the error.

Total time: 179 minutes 1 second
Recording fingerprints
Publishing Javadoc
Recording test results
Publishing Clover coverage report...


Build failed in Hudson: Hadoop-trunk #773

Posted by Apache Hudson Server <hu...@hudson.zones.apache.org>.
See http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/773/changes

Changes:

[ddas] HADOOP-5285. Moved the message for 5285 commit in CHANGES.txt under 0.19.2.

[eyang] HADOOP-5031. Changed DFS throughput metrics calculation.

[yhemanth] HADOOP-5376. Fixes the code handling lost tasktrackers to set the task state to KILLED_UNCLEAN only for relevant type of tasks. Contributed by Amareshwari Sriramadasu.

[yhemanth] HADOOP-5338. Fix jobtracker restart to clear task completion events cached by tasktrackers to avoid missing events. Contributed by Amar Kamat.

[yhemanth] HADOOP-5341. Make hadoop-daemon scripts backwards compatible with the changes in HADOOP-4868. Contributed by Sharad Agarwal.

[eyang] HADOOP-5360.  * Changed RPM packaging to build daemon tools scripts.
              * Changed environment variable from CHUKWA_HOME/var/run to CHUKWA_PID_DIR.

[eyang] HADOOP-5055. Updated CHANGES.txt for HADOOP-5055.

[eyang] HADOOP-5055. Changed alert.conf location from $CHUKWA_HOME/conf/alert.conf to $CHUKWA_CONF_DIR/alert.conf.

[szetszwo] HADOOP-5347. Create a job output directory for the bbp examples.  (szetszwo)

[eyang] HADOOP-5409. Updated CHANGES.txt for release notes.

[eyang] HADOOP-5409.  Updated CHANGES.txt for HADOOP-5409.

[eyang] HADOOP-5409.  Commit this to resolve trunk test build issue.

[asrabkin] HADOOP-5228.  Chukwa tests shouldn't write to /tmp.

[asrabkin] HADOOP-5401. Standardize name of agent control port option.

[eyang] HADOOP-5409.  Remove opt directory.

[johan] HADOOP-5317. Provide documentation for LazyOutput Feature. (Jothi Padmanabhan via johan)

[hairong] HADOOP-5145. Balancer sometimes runs out of memory after running days or weeks. Contributed by Hairong Kuang.

[szetszwo] HADOOP-5384. Fix a problem that DataNodeCluster creates blocks with generationStamp == 1.  (szetszwo)

[asrabkin] patch for HADOOP-5057
Improves test coverage for chukwa agent startup, in particular checkpoint restore corner cases.
Patch by asrabkin.

[rangadi] HADOOP-5383. Avoid building an unused string in NameNode's
verifyReplication(). (Raghu Angadi)

------------------------------------------
[...truncated 302240 lines...]
    [javac] Compiling 35 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/streaming/classes 
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

check-contrib:

init:
     [echo] contrib: thriftfs
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/classes 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/test 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/examples 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/test/logs 

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#thriftfs;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 17ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#thriftfs [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	2 artifacts copied, 0 already retrieved (419kB/4ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: thriftfs
    [javac] Compiling 1 source file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/classes 

init:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/classes 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/bin 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/bin 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/conf 

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#vaidya;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 17ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#vaidya [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	2 artifacts copied, 0 already retrieved (419kB/3ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: vaidya
    [javac] Compiling 14 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/classes 
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/vaidya/src/java/org/apache/hadoop/vaidya/statistics/job/JobStatistics.java  uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

compile-ant-tasks:
    [javac] Compiling 5 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/ant 
   [clover] Clover Version 2.3.2, built on July 15 2008 (build-732)
   [clover] Loaded from: /home/hudson/tools/clover/latest/lib/clover.jar
   [clover] Clover: Open Source License registered to Apache Software Foundation.
   [clover] Updating existing database at 'http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/clover/db/hadoop_coverage.db'. 
   [clover] Processing files at 1.6 source level.
   [clover] Clover all over. Instrumented 0 files (0 packages).

compile:

test-contrib:

test:
Trying to override old definition of task macro_tar

check-contrib:

init:
     [echo] contrib: hdfsproxy

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-resolve-common:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#hdfsproxy;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-logging#commons-logging;1.1 in maven2
[ivy:resolve] 	found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve] 	found junit#junit;3.8.2 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-api;1.4.3 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-log4j12;1.4.3 in maven2
[ivy:resolve] 	found xmlenc#xmlenc;0.52 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.14 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.14 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api-2.5;6.1.14 in maven2
[ivy:resolve] 	found org.eclipse.jdt#core;3.1.1 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.core.framework.uberjar.javaEE.14;1.8.0 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.integration.ant;1.8.0 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.integration.shared.api;1.8.0 in maven2
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.1 in maven2
[ivy:resolve] 	found commons-io#commons-io;1.4 in maven2
[ivy:resolve] 	found commons-lang#commons-lang;2.3 in maven2
[ivy:resolve] 	found commons-codec#commons-codec;1.3 in maven2
[ivy:resolve] 	found aspectj#aspectjrt;1.5.3 in maven2
[ivy:resolve] 	found org.codehaus.cargo#cargo-core-uberjar;0.9 in maven2
[ivy:resolve] 	found org.codehaus.cargo#cargo-ant;0.9 in maven2
[ivy:resolve] 	found javax.servlet#jsp-api;2.0 in maven2
[ivy:resolve] 	found javax.servlet#servlet-api;2.5 in maven2
[ivy:resolve] 	found javax.servlet#jstl;1.1.2 in maven2
[ivy:resolve] 	found taglibs#standard;1.1.2 in maven2
[ivy:resolve] 	found junitperf#junitperf;1.8 in maven2
[ivy:resolve] :: resolution report :: resolve 230ms :: artifacts dl 15ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   26  |   0   |   0   |   0   ||   26  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#hdfsproxy [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 26 already retrieved (0kB/8ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: hdfsproxy

compile-examples:

compile-test:
     [echo] contrib: hdfsproxy
    [javac] Compiling 4 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/test 

jar:
     [echo] 
     [echo]             Building the .jar files.
     [echo]         
      [jar] Building jar: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.jar 

local-package:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/logs 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
     [copy] Copying 12 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/lib 
     [copy] Copying 13 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/bin 
     [copy] Copying 10 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/conf 
     [copy] Copying 2 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
     [copy] Copying 8 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/src 

war:
     [echo] 
     [echo] 			Building the .war file
     [echo] 		
      [war] Building war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.war 

load-tasks:

cactifywar:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target 
[cactifywar] Analyzing war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.war 
[cactifywar] Building war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/test.war 

test:
     [echo] Please take a deep breath while Cargo gets the Tomcat for running the servlet tests...
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/temp 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/logs 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/reports 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
   [cactus] -----------------------------------------------------------------
   [cactus] Running tests against Tomcat 5.x @ http://localhost:8087
   [cactus] -----------------------------------------------------------------
   [cactus] Deploying [http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/test.war]  to [http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps]... 
   [cactus] Tomcat 5.x starting...
   [cactus] Tomcat 5.x started on port [8087]

BUILD FAILED
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build.xml:773: The following error occurred while executing this line:
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/build.xml:48: The following error occurred while executing this line:
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/hdfsproxy/build.xml:150: Failed to start the container after more than [180000] ms. Trying to connect to the [http://localhost:8087/test/ServletRedirector?Cactus_Service=RUN_TEST] test URL yielded a [-1] error code. Please run in debug mode for more details about the error.

Total time: 180 minutes 44 seconds
Recording fingerprints
Publishing Javadoc
Recording test results
Publishing Clover coverage report...
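
The hdfsproxy failure above (and the identical one in the #772 report further down) follows the same pattern: Cargo reports Tomcat 5.x started on port 8087, but the Cactus poll against the ServletRedirector test URL never yields a valid HTTP status, so the build gives up after the 180000 ms startup timeout with a [-1] error code. When reproducing this locally, one way to see what the redirector actually returns is to hit the same URL by hand. The probe below is a hypothetical standalone sketch (plain JDK HttpURLConnection, with the URL and port copied from the log); it is not part of the Hadoop build, and the only assumption made is that a -1 from getResponseCode() means no parsable HTTP status line came back, which is how HttpURLConnection reports that condition.

// Hypothetical standalone probe -- not part of the Hadoop build.
// Hits the same Cactus test URL the build polls while waiting for Tomcat
// and prints whatever status (or failure) comes back.
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class ServletRedirectorProbe {
    public static void main(String[] args) throws Exception {
        // URL copied verbatim from the BUILD FAILED message above.
        URL url = new URL(
            "http://localhost:8087/test/ServletRedirector?Cactus_Service=RUN_TEST");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setConnectTimeout(5000);   // fail fast instead of waiting 180000 ms
        conn.setReadTimeout(5000);
        try {
            // getResponseCode() is -1 when no valid HTTP status line is read,
            // which matches the [-1] error code reported by Cactus above.
            System.out.println("HTTP status: " + conn.getResponseCode());
        } catch (IOException e) {
            System.out.println("Connection failed: " + e);
        } finally {
            conn.disconnect();
        }
    }
}

Against a healthy container this should print some HTTP status; a connection failure or a -1 status points at Tomcat not actually listening on 8087 or the cactified test.war failing to deploy.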


Build failed in Hudson: Hadoop-trunk #772

Posted by Apache Hudson Server <hu...@hudson.zones.apache.org>.
See http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/772/changes

Changes:

[cdouglas] HADOOP-5066. Building binary tarball should not build docs/javadocs, copy
src, or run jdiff. Contributed by Giridharan Kesavan.

[cdouglas] HADOOP-5274. Fix gridmix2 dependency on wordcount example.

[rangadi] HADOOP-4103. NameNode keeps a count of missing blocks. It warns on
WebUI if there are such blocks. '-report' and '-metaSave' have extra
info to track such blocks. (Raghu Angadi)

[asrabkin] HADOOP-5373
Chukwa collectors now track lifetime-received chunks.  (patch by asrabkin)

[asrabkin] HADOOP-5370
SeqFileWriter won't write empty sink files.  (patch by asrabkin)

[asrabkin] Fixes HADOOP-5087
(This time for real.)  My patch, eric's +1.

------------------------------------------
[...truncated 300782 lines...]
    [javac] Compiling 35 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/streaming/classes 
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

check-contrib:

init:
     [echo] contrib: thriftfs
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/classes 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/test 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/examples 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/test/logs 

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#thriftfs;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 17ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#thriftfs [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	2 artifacts copied, 0 already retrieved (419kB/4ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: thriftfs
    [javac] Compiling 1 source file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/thriftfs/classes 

init:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/classes 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/bin 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/bin 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/conf 

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#vaidya;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 18ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#vaidya [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	2 artifacts copied, 0 already retrieved (419kB/3ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: vaidya
    [javac] Compiling 14 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/vaidya/classes 
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/vaidya/src/java/org/apache/hadoop/vaidya/statistics/job/JobStatistics.java  uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

compile-ant-tasks:
    [javac] Compiling 5 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/ant 
   [clover] Clover Version 2.3.2, built on July 15 2008 (build-732)
   [clover] Loaded from: /home/hudson/tools/clover/latest/lib/clover.jar
   [clover] Clover: Open Source License registered to Apache Software Foundation.
   [clover] Updating existing database at 'http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/clover/db/hadoop_coverage.db'. 
   [clover] Processing files at 1.6 source level.
   [clover] Clover all over. Instrumented 0 files (0 packages).

compile:

test-contrib:

test:
Trying to override old definition of task macro_tar

check-contrib:

init:
     [echo] contrib: hdfsproxy

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar 
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-resolve-common:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#hdfsproxy;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-logging#commons-logging;1.1 in maven2
[ivy:resolve] 	found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve] 	found junit#junit;3.8.2 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-api;1.4.3 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-log4j12;1.4.3 in maven2
[ivy:resolve] 	found xmlenc#xmlenc;0.52 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.14 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.14 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api-2.5;6.1.14 in maven2
[ivy:resolve] 	found org.eclipse.jdt#core;3.1.1 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.core.framework.uberjar.javaEE.14;1.8.0 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.integration.ant;1.8.0 in maven2
[ivy:resolve] 	found org.apache.cactus#cactus.integration.shared.api;1.8.0 in maven2
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.1 in maven2
[ivy:resolve] 	found commons-io#commons-io;1.4 in maven2
[ivy:resolve] 	found commons-lang#commons-lang;2.3 in maven2
[ivy:resolve] 	found commons-codec#commons-codec;1.3 in maven2
[ivy:resolve] 	found aspectj#aspectjrt;1.5.3 in maven2
[ivy:resolve] 	found org.codehaus.cargo#cargo-core-uberjar;0.9 in maven2
[ivy:resolve] 	found org.codehaus.cargo#cargo-ant;0.9 in maven2
[ivy:resolve] 	found javax.servlet#jsp-api;2.0 in maven2
[ivy:resolve] 	found javax.servlet#servlet-api;2.5 in maven2
[ivy:resolve] 	found javax.servlet#jstl;1.1.2 in maven2
[ivy:resolve] 	found taglibs#standard;1.1.2 in maven2
[ivy:resolve] 	found junitperf#junitperf;1.8 in maven2
[ivy:resolve] :: resolution report :: resolve 224ms :: artifacts dl 10ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   26  |   0   |   0   |   0   ||   26  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#hdfsproxy [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 26 already retrieved (0kB/7ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/ivy/ivysettings.xml 

compile:
     [echo] contrib: hdfsproxy

compile-examples:

compile-test:
     [echo] contrib: hdfsproxy
    [javac] Compiling 4 source files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/test 

jar:
     [echo] 
     [echo]             Building the .jar files.
     [echo]         
      [jar] Building jar: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.jar 

local-package:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/logs 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
     [copy] Copying 12 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/lib 
     [copy] Copying 13 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/bin 
     [copy] Copying 10 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/conf 
     [copy] Copying 2 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0 
     [copy] Copying 8 files to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0/src 

war:
     [echo] 
     [echo] 			Building the .war file
     [echo] 		
      [war] Building war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.war 

load-tasks:

cactifywar:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target 
[cactifywar] Analyzing war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/hdfsproxy-1.0.war 
[cactifywar] Building war: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/test.war 

test:
     [echo] Please take a deep breath while Cargo gets the Tomcat for running the servlet tests...
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/temp 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/logs 
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/reports 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf 
   [cactus] -----------------------------------------------------------------
   [cactus] Running tests against Tomcat 5.x @ http://localhost:8087
   [cactus] -----------------------------------------------------------------
   [cactus] Deploying [http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/test.war]  to [http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps]... 
   [cactus] Tomcat 5.x starting...
   [cactus] Tomcat 5.x started on port [8087]

BUILD FAILED
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build.xml:773: The following error occurred while executing this line:
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/build.xml:48: The following error occurred while executing this line:
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/src/contrib/hdfsproxy/build.xml:150: Failed to start the container after more than [180000] ms. Trying to connect to the [http://localhost:8087/test/ServletRedirector?Cactus_Service=RUN_TEST] test URL yielded a [-1] error code. Please run in debug mode for more details about the error.

Total time: 193 minutes 32 seconds
Recording fingerprints
Publishing Javadoc
Recording test results
Publishing Clover coverage report...


Build failed in Hudson: Hadoop-trunk #771

Posted by Apache Hudson Server <hu...@hudson.zones.apache.org>.
See http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/771/

------------------------------------------
[...truncated 299228 lines...]
    [junit] 2009-03-03 16:39:59,863 INFO  ipc.Server (Server.java:run(934)) - IPC Server handler 1 on 56527: starting
    [junit] 2009-03-03 16:39:59,862 INFO  ipc.Server (Server.java:run(313)) - IPC Server listener on 56527: starting
    [junit] 2009-03-03 16:39:59,863 INFO  ipc.Server (Server.java:run(934)) - IPC Server handler 3 on 56527: starting
    [junit] 2009-03-03 16:39:59,906 INFO  mapred.IndexCache (IndexCache.java:<init>(46)) - IndexCache created with max memory = 10485760
    [junit] 2009-03-03 16:39:59,941 INFO  net.NetworkTopology (NetworkTopology.java:add(328)) - Adding a new node: /default-rack/host0.foo.com
    [junit] 2009-03-03 16:39:59,946 INFO  net.NetworkTopology (NetworkTopology.java:add(328)) - Adding a new node: /default-rack/host1.foo.com
    [junit] rootdir = /test/testDistCh
    [junit] root=/test/testDistCh, returnvalue=0
    [junit] results:
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-03 16:40 /test/testDistCh/f1
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-03 16:40 /test/testDistCh/f2
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-03 16:40 /test/testDistCh/f3
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-03 16:40 /test/testDistCh/sub0
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-03 16:40 /test/testDistCh/sub1
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-03 16:40 /test/testDistCh/sub2
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-03 16:40 /test/testDistCh/sub2/f4
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-03 16:40 /test/testDistCh/sub3
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-03 16:40 /test/testDistCh/sub3/f5
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-03 16:40 /test/testDistCh/sub4
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-03 16:40 /test/testDistCh/sub4/f6
    [junit] 
    [junit] args=[/test/testDistCh/sub3:::734,
    [junit]    /test/testDistCh/sub1:::230,
    [junit]    /test/testDistCh/sub2:sub2:sub2:675]
    [junit] newstatus=[hudson:supergroup:rwxr-xr-x,
    [junit]    hudson:supergroup:-w--wx---,
    [junit]    sub2:sub2:rw-rwxr-x,
    [junit]    hudson:supergroup:rwx-wxr--,
    [junit]    hudson:supergroup:rwxr-xr-x]
    [junit] 2009-03-03 16:40:00,306 INFO  tools.DistTool (DistCh.java:run(376)) - ops=[/test/testDistCh/sub3:null:null:rwx-wxr--, /test/testDistCh/sub1:null:null:-w--wx---, /test/testDistCh/sub2:sub2:sub2:rw-rwxr-x]
    [junit] 2009-03-03 16:40:00,306 INFO  tools.DistTool (DistCh.java:run(377)) - isIgnoreFailures=false
    [junit] 2009-03-03 16:40:00,325 INFO  tools.DistTool (DistCh.java:setup(427)) - distch.job.dir=hdfs://localhost:55350/user/hudson/build/test/mapred/system/distch_moj773
    [junit] 2009-03-03 16:40:00,328 INFO  tools.DistTool (DistCh.java:setup(433)) - log=hdfs://localhost:55350/user/hudson/build/test/mapred/system/distch_moj773/_logs
    [junit] 2009-03-03 16:40:00,407 INFO  tools.DistTool (DistCh.java:setup(476)) - distch.op.count=5
    [junit] 2009-03-03 16:40:00,414 WARN  mapred.JobClient (JobClient.java:configureCommandLineOptions(539)) - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 2009-03-03 16:40:00,418 WARN  mapred.JobClient (JobClient.java:configureCommandLineOptions(661)) - No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
    [junit] 2009-03-03 16:40:00,431 INFO  tools.DistTool (DistCh.java:getSplits(261)) - numSplits=1, splits.size()=1
    [junit] 2009-03-03 16:40:00,630 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - build/test/mapred/local/jobTracker/job_200903031639_0001.xml:a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] 2009-03-03 16:40:00,641 INFO  mapred.JobClient (JobClient.java:runJob(1268)) - Running job: job_200903031639_0001
    [junit] 2009-03-03 16:40:00,901 INFO  mapred.JobInProgress (JobInProgress.java:initTasks(426)) - Input size for job job_200903031639_0001 = 502
    [junit] 2009-03-03 16:40:00,901 INFO  mapred.JobInProgress (JobInProgress.java:initTasks(428)) - Split info for job:job_200903031639_0001 with 1 splits:
    [junit] 2009-03-03 16:40:01,648 INFO  mapred.JobClient (JobClient.java:runJob(1291)) -  map 0% reduce 0%
    [junit] 2009-03-03 16:40:02,980 INFO  mapred.JobTracker (JobTracker.java:createTaskEntry(1692)) - Adding task 'attempt_200903031639_0001_m_000002_0' to tip task_200903031639_0001_m_000002, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:41686'
    [junit] 2009-03-03 16:40:03,064 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/0_0/taskTracker/jobcache/job_200903031639_0001/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] 2009-03-03 16:40:03,140 INFO  mapred.JvmManager (JvmManager.java:<init>(323)) - In JvmRunner constructed JVM ID: jvm_200903031639_0001_m_1414536143
    [junit] 2009-03-03 16:40:03,140 INFO  mapred.JvmManager (JvmManager.java:spawnNewJvm(294)) - JVM Runner jvm_200903031639_0001_m_1414536143 spawned.
    [junit] 2009-03-03 16:40:04,039 INFO  mapred.JvmManager (JvmManager.java:runChild(347)) - JVM : jvm_200903031639_0001_m_1414536143 exited. Number of tasks it ran: 1
    [junit] 2009-03-03 16:40:05,988 INFO  mapred.JobInProgress (JobInProgress.java:completedTask(1985)) - Task 'attempt_200903031639_0001_m_000002_0' has completed task_200903031639_0001_m_000002 successfully.
    [junit] 2009-03-03 16:40:05,993 INFO  mapred.JobInProgress (JobInProgress.java:findNewMapTask(1820)) - Choosing a non-local task task_200903031639_0001_m_000000
    [junit] 2009-03-03 16:40:05,993 INFO  mapred.JobTracker (JobTracker.java:createTaskEntry(1692)) - Adding task 'attempt_200903031639_0001_m_000000_0' to tip task_200903031639_0001_m_000000, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:41686'
    [junit] 2009-03-03 16:40:05,998 INFO  mapred.TaskRunner (MapTaskRunner.java:close(43)) - attempt_200903031639_0001_m_000002_0 done; removing files.
    [junit] 2009-03-03 16:40:05,999 INFO  mapred.IndexCache (IndexCache.java:removeMap(140)) - Map ID attempt_200903031639_0001_m_000002_0 not found in cache
    [junit] 2009-03-03 16:40:06,049 INFO  mapred.JvmManager (JvmManager.java:<init>(323)) - In JvmRunner constructed JVM ID: jvm_200903031639_0001_m_1101868962
    [junit] 2009-03-03 16:40:06,050 INFO  mapred.JvmManager (JvmManager.java:spawnNewJvm(294)) - JVM Runner jvm_200903031639_0001_m_1101868962 spawned.
    [junit] 2009-03-03 16:40:06,670 INFO  mapred.JobClient (JobClient.java:runJob(1340)) - Task Id : attempt_200903031639_0001_m_000002_0, Status : SUCCEEDED
    [junit] attempt_200903031639_0001_m_000002_0: 2009-03-03 16:40:03,733 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/0_0/taskTracker/jobcache/job_200903031639_0001/attempt_200903031639_0001_m_000002_0/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] attempt_200903031639_0001_m_000002_0: 2009-03-03 16:40:03,817 INFO  jvm.JvmMetrics (JvmMetrics.java:init(71)) - Initializing JVM Metrics with processName=MAP, sessionId=
    [junit] attempt_200903031639_0001_m_000002_0: 2009-03-03 16:40:03,906 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/0_0/taskTracker/jobcache/job_200903031639_0001/attempt_200903031639_0001_m_000002_0/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] attempt_200903031639_0001_m_000002_0: 2009-03-03 16:40:03,956 INFO  mapred.TaskRunner (Task.java:done(644)) - Task:attempt_200903031639_0001_m_000002_0 is done. And is in the process of commiting
    [junit] attempt_200903031639_0001_m_000002_0: 2009-03-03 16:40:03,964 INFO  mapred.TaskRunner (Task.java:sendDone(715)) - Task 'attempt_200903031639_0001_m_000002_0' done.
    [junit] 2009-03-03 16:40:10,096 INFO  mapred.JvmManager (JvmManager.java:runChild(347)) - JVM : jvm_200903031639_0001_m_1101868962 exited. Number of tasks it ran: 1
    [junit] 2009-03-03 16:40:12,002 INFO  mapred.JobInProgress (JobInProgress.java:completedTask(1985)) - Task 'attempt_200903031639_0001_m_000000_0' has completed task_200903031639_0001_m_000000 successfully.
    [junit] 2009-03-03 16:40:12,006 INFO  mapred.JobTracker (JobTracker.java:createTaskEntry(1692)) - Adding task 'attempt_200903031639_0001_m_000001_0' to tip task_200903031639_0001_m_000001, for tracker 'tracker_host0.foo.com:localhost/127.0.0.1:41686'
    [junit] 2009-03-03 16:40:12,033 INFO  mapred.JvmManager (JvmManager.java:<init>(323)) - In JvmRunner constructed JVM ID: jvm_200903031639_0001_m_-1876455404
    [junit] 2009-03-03 16:40:12,033 INFO  mapred.JvmManager (JvmManager.java:spawnNewJvm(294)) - JVM Runner jvm_200903031639_0001_m_-1876455404 spawned.
    [junit] 2009-03-03 16:40:12,748 INFO  mapred.JobClient (JobClient.java:runJob(1291)) -  map 100% reduce 0%
    [junit] 2009-03-03 16:40:12,749 INFO  mapred.JobClient (JobClient.java:runJob(1340)) - Task Id : attempt_200903031639_0001_m_000000_0, Status : SUCCEEDED
    [junit] attempt_200903031639_0001_m_000000_0: 2009-03-03 16:40:06,628 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/0_0/taskTracker/jobcache/job_200903031639_0001/attempt_200903031639_0001_m_000000_0/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] attempt_200903031639_0001_m_000000_0: 2009-03-03 16:40:06,708 INFO  jvm.JvmMetrics (JvmMetrics.java:init(71)) - Initializing JVM Metrics with processName=MAP, sessionId=
    [junit] attempt_200903031639_0001_m_000000_0: 2009-03-03 16:40:06,797 WARN  conf.Configuration (Configuration.java:loadResource(1153)) - http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build/test/mapred/local/0_0/taskTracker/jobcache/job_200903031639_0001/attempt_200903031639_0001_m_000000_0/job.xml :a attempt to override final parameter: hadoop.tmp.dir;  Ignoring.
    [junit] attempt_200903031639_0001_m_000000_0: 2009-03-03 16:40:06,898 INFO  mapred.MapTask (MapTask.java:runOldMapper(343)) - numReduceTasks: 0
    [junit] attempt_200903031639_0001_m_000000_0: 2009-03-03 16:40:06,943 INFO  mapred.TaskRunner (Task.java:done(644)) - Task:attempt_200903031639_0001_m_000000_0 is done. And is in the process of commiting
    [junit] attempt_200903031639_0001_m_000000_0: 2009-03-03 16:40:09,952 INFO  mapred.TaskRunner (Task.java:commit(744)) - Task attempt_200903031639_0001_m_000000_0 is allowed to commit now
    [junit] attempt_200903031639_0001_m_000000_0: 2009-03-03 16:40:09,964 INFO  mapred.FileOutputCommitter (FileOutputCommitter.java:commitTask(92)) - Saved output of task 'attempt_200903031639_0001_m_000000_0' to hdfs://localhost:55350/user/hudson/build/test/mapred/system/distch_moj773/_logs
    [junit] attempt_200903031639_0001_m_000000_0: 2009-03-03 16:40:09,970 INFO  mapred.TaskRunner (Task.java:sendDone(715)) - Task 'attempt_200903031639_0001_m_000000_0' done.
    [junit] 2009-03-03 16:40:13,024 INFO  mapred.JvmManager (JvmManager.java:runChild(347)) - JVM : jvm_200903031639_0001_m_-1876455404 exited. Number of tasks it ran: 1
    [junit] 2009-03-03 16:40:15,009 INFO  mapred.JobInProgress (JobInProgress.java:completedTask(1985)) - Task 'attempt_200903031639_0001_m_000001_0' has completed task_200903031639_0001_m_000001 successfully.
    [junit] 2009-03-03 16:40:15,010 INFO  mapred.JobInProgress (JobInProgress.java:jobComplete(2099)) - Job job_200903031639_0001 has completed successfully.
    [junit] 2009-03-03 16:40:15,125 INFO  mapred.JobTracker (JobTracker.java:removeMarkedTasks(1813)) - Removed completed task 'attempt_200903031639_0001_m_000000_0' from 'tracker_host0.foo.com:localhost/127.0.0.1:41686'
    [junit] 2009-03-03 16:40:15,125 INFO  mapred.JobTracker (JobTracker.java:removeMarkedTasks(1813)) - Removed completed task 'attempt_200903031639_0001_m_000001_0' from 'tracker_host0.foo.com:localhost/127.0.0.1:41686'
    [junit] 2009-03-03 16:40:15,126 INFO  mapred.JobTracker (JobTracker.java:removeMarkedTasks(1813)) - Removed completed task 'attempt_200903031639_0001_m_000002_0' from 'tracker_host0.foo.com:localhost/127.0.0.1:41686'
    [junit] 2009-03-03 16:40:15,126 INFO  mapred.TaskRunner (MapTaskRunner.java:close(43)) - attempt_200903031639_0001_m_000000_0 done; removing files.
    [junit] 2009-03-03 16:40:15,127 INFO  mapred.IndexCache (IndexCache.java:removeMap(140)) - Map ID attempt_200903031639_0001_m_000000_0 not found in cache
    [junit] 2009-03-03 16:40:15,128 INFO  mapred.TaskRunner (MapTaskRunner.java:close(43)) - attempt_200903031639_0001_m_000001_0 done; removing files.
    [junit] 2009-03-03 16:40:15,128 INFO  mapred.IndexCache (IndexCache.java:removeMap(140)) - Map ID attempt_200903031639_0001_m_000001_0 not found in cache
    [junit] 2009-03-03 16:40:15,766 INFO  mapred.JobClient (JobClient.java:runJob(1358)) - Job complete: job_200903031639_0001
    [junit] 2009-03-03 16:40:15,769 INFO  mapred.JobClient (Counters.java:log(514)) - Counters: 7
    [junit] 2009-03-03 16:40:15,769 INFO  mapred.JobClient (Counters.java:log(516)) -   Job Counters 
    [junit] 2009-03-03 16:40:15,769 INFO  mapred.JobClient (Counters.java:log(518)) -     Launched map tasks=1
    [junit] 2009-03-03 16:40:15,769 INFO  mapred.JobClient (Counters.java:log(516)) -   org.apache.hadoop.tools.DistCh$Counter
    [junit] 2009-03-03 16:40:15,770 INFO  mapred.JobClient (Counters.java:log(518)) -     SUCCEED=5
    [junit] 2009-03-03 16:40:15,770 INFO  mapred.JobClient (Counters.java:log(516)) -   FileSystemCounters
    [junit] 2009-03-03 16:40:15,770 INFO  mapred.JobClient (Counters.java:log(518)) -     HDFS_BYTES_READ=502
    [junit] 2009-03-03 16:40:15,771 INFO  mapred.JobClient (Counters.java:log(516)) -   Map-Reduce Framework
    [junit] 2009-03-03 16:40:15,771 INFO  mapred.JobClient (Counters.java:log(518)) -     Map input records=5
    [junit] 2009-03-03 16:40:15,771 INFO  mapred.JobClient (Counters.java:log(518)) -     Spilled Records=0
    [junit] 2009-03-03 16:40:15,772 INFO  mapred.JobClient (Counters.java:log(518)) -     Map input bytes=405
    [junit] 2009-03-03 16:40:15,772 INFO  mapred.JobClient (Counters.java:log(518)) -     Map output records=0
    [junit] root=/test/testDistCh, returnvalue=0
    [junit] results:
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-03 16:40 /test/testDistCh/f1
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-03 16:40 /test/testDistCh/f2
    [junit] -rw-r--r--   2 hudson supergroup         38 2009-03-03 16:40 /test/testDistCh/f3
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-03 16:40 /test/testDistCh/sub0
    [junit] d-w--wx---   - hudson supergroup          0 2009-03-03 16:40 /test/testDistCh/sub1
    [junit] drw-rwxr-x   - sub2   sub2                0 2009-03-03 16:40 /test/testDistCh/sub2
    [junit] -rw-rw-r--   2 sub2 sub2         43 2009-03-03 16:40 /test/testDistCh/sub2/f4
    [junit] drwx-wxr--   - hudson supergroup          0 2009-03-03 16:40 /test/testDistCh/sub3
    [junit] -rw--w-r--   2 hudson supergroup         43 2009-03-03 16:40 /test/testDistCh/sub3/f5
    [junit] drwxr-xr-x   - hudson supergroup          0 2009-03-03 16:40 /test/testDistCh/sub4
    [junit] -rw-r--r--   2 hudson supergroup         43 2009-03-03 16:40 /test/testDistCh/sub4/f6
    [junit] 
    [junit] Shutting down the Mini HDFS Cluster
    [junit] Shutting down DataNode 1
    [junit] 2009-03-03 16:40:15,901 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 56658
    [junit] 2009-03-03 16:40:15,902 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 2 on 56658: exiting
    [junit] 2009-03-03 16:40:15,902 INFO  ipc.Server (Server.java:run(536)) - Stopping IPC Server Responder
    [junit] 2009-03-03 16:40:15,902 INFO  ipc.Server (Server.java:run(352)) - Stopping IPC Server listener on 56658
    [junit] 2009-03-03 16:40:15,902 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 0 on 56658: exiting
    [junit] 2009-03-03 16:40:15,903 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 1 on 56658: exiting
    [junit] 2009-03-03 16:40:16,479 INFO  datanode.DataBlockScanner (DataBlockScanner.java:run(603)) - Exiting DataBlockScanner thread.
    [junit] 2009-03-03 16:40:16,903 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 56658
    [junit] Shutting down DataNode 0
    [junit] 2009-03-03 16:40:17,004 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 42091
    [junit] 2009-03-03 16:40:17,004 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 2 on 42091: exiting
    [junit] 2009-03-03 16:40:17,004 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 1 on 42091: exiting
    [junit] 2009-03-03 16:40:17,004 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 0 on 42091: exiting
    [junit] 2009-03-03 16:40:17,004 INFO  ipc.Server (Server.java:run(536)) - Stopping IPC Server Responder
    [junit] 2009-03-03 16:40:17,004 INFO  ipc.Server (Server.java:run(352)) - Stopping IPC Server listener on 42091
    [junit] 2009-03-03 16:40:17,357 INFO  datanode.DataBlockScanner (DataBlockScanner.java:run(603)) - Exiting DataBlockScanner thread.
    [junit] 2009-03-03 16:40:18,004 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 42091
    [junit] 2009-03-03 16:40:18,107 INFO  ipc.Server (Server.java:stop(1098)) - Stopping server on 55350
    [junit] 2009-03-03 16:40:18,107 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 0 on 55350: exiting
    [junit] 2009-03-03 16:40:18,107 INFO  namenode.DecommissionManager (DecommissionManager.java:run(67)) - Interrupted Monitor
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.namenode.DecommissionManager$Monitor.run(DecommissionManager.java:65)
    [junit] 	at java.lang.Thread.run(Thread.java:619)
    [junit] 2009-03-03 16:40:18,108 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 9 on 55350: exiting
    [junit] 2009-03-03 16:40:18,108 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 8 on 55350: exiting
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 31.119 sec
    [junit] 2009-03-03 16:40:18,108 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 7 on 55350: exiting
    [junit] 2009-03-03 16:40:18,108 INFO  ipc.Server (Server.java:run(352)) - Stopping IPC Server listener on 55350
    [junit] 2009-03-03 16:40:18,108 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 6 on 55350: exiting
    [junit] 2009-03-03 16:40:18,108 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 5 on 55350: exiting
    [junit] 2009-03-03 16:40:18,107 INFO  ipc.Server (Server.java:run(536)) - Stopping IPC Server Responder
    [junit] 2009-03-03 16:40:18,107 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 3 on 55350: exiting
    [junit] 2009-03-03 16:40:18,107 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 4 on 55350: exiting
    [junit] 2009-03-03 16:40:18,107 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 1 on 55350: exiting
    [junit] 2009-03-03 16:40:18,107 INFO  ipc.Server (Server.java:run(992)) - IPC Server handler 2 on 55350: exiting
    [junit] Running org.apache.hadoop.util.TestCyclicIteration
    [junit] 
    [junit] 
    [junit] integers=[]
    [junit] map={}
    [junit] start=-1, iteration=[]
    [junit] 
    [junit] 
    [junit] integers=[0]
    [junit] map={0=0}
    [junit] start=-1, iteration=[0]
    [junit] start=0, iteration=[0]
    [junit] start=1, iteration=[0]
    [junit] 
    [junit] 
    [junit] integers=[0, 2]
    [junit] map={0=0, 2=2}
    [junit] start=-1, iteration=[0, 2]
    [junit] start=0, iteration=[2, 0]
    [junit] start=1, iteration=[2, 0]
    [junit] start=2, iteration=[0, 2]
    [junit] start=3, iteration=[0, 2]
    [junit] 
    [junit] 
    [junit] integers=[0, 2, 4]
    [junit] map={0=0, 2=2, 4=4}
    [junit] start=-1, iteration=[0, 2, 4]
    [junit] start=0, iteration=[2, 4, 0]
    [junit] start=1, iteration=[2, 4, 0]
    [junit] start=2, iteration=[4, 0, 2]
    [junit] start=3, iteration=[4, 0, 2]
    [junit] start=4, iteration=[0, 2, 4]
    [junit] start=5, iteration=[0, 2, 4]
    [junit] 
    [junit] 
    [junit] integers=[0, 2, 4, 6]
    [junit] map={0=0, 2=2, 4=4, 6=6}
    [junit] start=-1, iteration=[0, 2, 4, 6]
    [junit] start=0, iteration=[2, 4, 6, 0]
    [junit] start=1, iteration=[2, 4, 6, 0]
    [junit] start=2, iteration=[4, 6, 0, 2]
    [junit] start=3, iteration=[4, 6, 0, 2]
    [junit] start=4, iteration=[6, 0, 2, 4]
    [junit] start=5, iteration=[6, 0, 2, 4]
    [junit] start=6, iteration=[0, 2, 4, 6]
    [junit] start=7, iteration=[0, 2, 4, 6]
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.087 sec
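
TestCyclicIteration above prints, for each sorted key set and each start value, the order in which the keys come back: the iteration begins at the first key strictly greater than start, wraps around, and ends with the keys less than or equal to start. The snippet below is an illustrative reconstruction of that printed pattern using TreeMap's NavigableMap views; it is not the org.apache.hadoop.util.CyclicIteration source, just a sketch that reproduces the "start=..., iteration=[...]" lines for the [0, 2, 4] case.

// Illustrative reconstruction of the iteration order printed by
// TestCyclicIteration above; not the actual CyclicIteration implementation.
import java.util.ArrayList;
import java.util.List;
import java.util.TreeMap;

public class CyclicOrderSketch {
    /** Keys strictly greater than start first (in order), then wrap around to the rest. */
    static List<Integer> cyclicKeys(TreeMap<Integer, Integer> map, int start) {
        List<Integer> order = new ArrayList<Integer>();
        order.addAll(map.tailMap(start, false).keySet()); // keys > start
        order.addAll(map.headMap(start, true).keySet());  // wrap around: keys <= start
        return order;
    }

    public static void main(String[] args) {
        TreeMap<Integer, Integer> map = new TreeMap<Integer, Integer>();
        for (int k : new int[] {0, 2, 4}) {
            map.put(k, k);
        }
        for (int start = -1; start <= 5; start++) {
            // Matches the "start=..., iteration=[...]" lines in the log above,
            // e.g. start=2 -> [4, 0, 2].
            System.out.println("start=" + start + ", iteration=" + cyclicKeys(map, start));
        }
    }
}

For example, start=3 falls between keys 2 and 4, so the iteration comes out as [4, 0, 2], matching the corresponding line in the log.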
    [junit] Running org.apache.hadoop.util.TestGenericsUtil
    [junit] 2009-03-03 16:40:19,221 WARN  conf.Configuration (Configuration.java:<clinit>(175)) - DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
    [junit] 2009-03-03 16:40:19,234 WARN  util.GenericOptionsParser (GenericOptionsParser.java:parseGeneralOptions(377)) - options parsing failed: Missing argument for option:jt
    [junit] usage: general options are:
    [junit]  -archives <paths>             comma separated archives to be unarchived
    [junit]                                on the compute machines.
    [junit]  -conf <configuration file>    specify an application configuration file
    [junit]  -D <property=value>           use value for given property
    [junit]  -files <paths>                comma separated files to be copied to the
    [junit]                                map reduce cluster
    [junit]  -fs <local|namenode:port>     specify a namenode
    [junit]  -jt <local|jobtracker:port>   specify a job tracker
    [junit]  -libjars <paths>              comma separated jar files to include in the
    [junit]                                classpath.
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 0.203 sec
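
The usage text above is printed by GenericOptionsParser after the deliberately missing -jt argument, and it is the same parser behind the earlier JobClient warning "Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same." A job driver picks up those generic options (-conf, -D, -fs, -jt, -files, -libjars, -archives) by implementing Tool and launching through ToolRunner. The class below is a hypothetical minimal driver for illustration, not code from this build.

// Hypothetical minimal driver showing the Tool/ToolRunner pattern that the
// JobClient warning in this log recommends; not part of the Hadoop source tree.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class ExampleDriver extends Configured implements Tool {
    public int run(String[] args) throws Exception {
        // By the time run() is called, ToolRunner has already fed the generic
        // options through GenericOptionsParser into getConf(); only
        // job-specific arguments remain in args.
        Configuration conf = getConf();
        System.out.println("fs.default.name = " + conf.get("fs.default.name"));
        return 0;
    }

    public static void main(String[] args) throws Exception {
        int exitCode = ToolRunner.run(new Configuration(), new ExampleDriver(), args);
        System.exit(exitCode);
    }
}

Launched with something like "bin/hadoop jar myjob.jar ExampleDriver -jt local", the -jt option is consumed by ToolRunner before run() ever sees the remaining arguments, which is exactly what the warning in this log is asking applications to do.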
    [junit] Running org.apache.hadoop.util.TestIndexedSort
    [junit] sortRandom seed: -3782364560261034336(org.apache.hadoop.util.QuickSort)
    [junit] testSorted seed: -4182938044046789909(org.apache.hadoop.util.QuickSort)
    [junit] testAllEqual setting min/max at 20/180(org.apache.hadoop.util.QuickSort)
    [junit] sortWritable seed: -5850729374941370260(org.apache.hadoop.util.QuickSort)
    [junit] QuickSort degen cmp/swp: 23252/3713(org.apache.hadoop.util.QuickSort)
    [junit] sortRandom seed: -8791890648896773456(org.apache.hadoop.util.HeapSort)
    [junit] testSorted seed: 1977073533388409920(org.apache.hadoop.util.HeapSort)
    [junit] testAllEqual setting min/max at 20/34(org.apache.hadoop.util.HeapSort)
    [junit] sortWritable seed: -3435724119905890317(org.apache.hadoop.util.HeapSort)
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.874 sec
    [junit] Running org.apache.hadoop.util.TestProcfsBasedProcessTree
    [junit] 2009-03-03 16:40:20,776 INFO  util.ProcessTree (ProcessTree.java:isSetsidSupported(54)) - setsid exited with exit code 0
    [junit] 2009-03-03 16:40:21,281 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:testProcessTree(141)) - Root process pid: 12919
    [junit] 2009-03-03 16:40:21,329 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:testProcessTree(146)) - ProcessTree: [ 12919 12921 12922 ]
    [junit] 2009-03-03 16:40:27,853 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:testProcessTree(159)) - ProcessTree: [ 12919 12935 12921 12933 12923 12931 12925 12929 12927 ]
    [junit] 2009-03-03 16:40:27,865 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:run(64)) - Shell Command exit with a non-zero exit code. This is expected as we are killing the subprocesses of the task intentionally. org.apache.hadoop.util.Shell$ExitCodeException: 
    [junit] 2009-03-03 16:40:27,865 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:run(70)) - Exit code: 143
    [junit] 2009-03-03 16:40:27,866 INFO  util.ProcessTree (ProcessTree.java:destroyProcessGroup(160)) - Killing all processes in the process group 12919 with SIGTERM. Exit code 0
    [junit] 2009-03-03 16:40:27,945 INFO  util.TestProcfsBasedProcessTree (TestProcfsBasedProcessTree.java:testProcessTree(173)) - RogueTaskThread successfully joined.
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.225 sec
    [junit] Running org.apache.hadoop.util.TestReflectionUtils
    [junit] 2009-03-03 16:40:28,842 WARN  conf.Configuration (Configuration.java:<clinit>(175)) - DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
    [junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 0.623 sec
    [junit] Running org.apache.hadoop.util.TestShell
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.179 sec
    [junit] Running org.apache.hadoop.util.TestStringUtils
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.085 sec

BUILD FAILED
http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/ws/trunk/build.xml:769: Tests failed!

Total time: 186 minutes 40 seconds
Recording fingerprints
Publishing Javadoc
Recording test results
Publishing Clover coverage report...