Posted to common-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2011/07/30 07:20:45 UTC

Build failed in Jenkins: Hadoop-0.20.204-Build #20

See <https://builds.apache.org/job/Hadoop-0.20.204-Build/20/>

------------------------------------------
[...truncated 12274 lines...]
     [exec] checking for pthread_create in -lpthread... yes
     [exec] checking for HMAC_Init in -lssl... yes
     [exec] checking for g++... g++
     [exec] checking whether we are using the GNU C++ compiler... yes
     [exec] checking whether g++ accepts -g... yes
     [exec] checking dependency style of g++... gcc3
     [exec] checking for a BSD-compatible install... /usr/bin/install -c
     [exec] checking build system type... x86_64-unknown-linux-gnu
     [exec] checking host system type... x86_64-unknown-linux-gnu
     [exec] checking for a sed that does not truncate output... /bin/sed
     [exec] checking for ld used by gcc... /usr/bin/ld
     [exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
     [exec] checking for /usr/bin/ld option to reload object files... -r
     [exec] checking for BSD-compatible nm... /usr/bin/nm -B
     [exec] checking whether ln -s works... yes
     [exec] checking how to recognise dependent libraries... pass_all
     [exec] checking dlfcn.h usability... yes
     [exec] checking dlfcn.h presence... yes
     [exec] checking for dlfcn.h... yes
     [exec] checking how to run the C++ preprocessor... g++ -E
     [exec] checking for g77... no
     [exec] checking for xlf... no
     [exec] checking for f77... no
     [exec] checking for frt... no
     [exec] checking for pgf77... no
     [exec] checking for cf77... no
     [exec] checking for fort77... no
     [exec] checking for fl32... no
     [exec] checking for af77... no
     [exec] checking for xlf90... no
     [exec] checking for f90... no
     [exec] checking for pgf90... no
     [exec] checking for pghpf... no
     [exec] checking for epcf90... no
     [exec] checking for gfortran... no
     [exec] checking for g95... no
     [exec] checking for xlf95... no
     [exec] checking for f95... no
     [exec] checking for fort... no
     [exec] checking for ifort... no
     [exec] checking for ifc... no
     [exec] checking for efc... no
     [exec] checking for pgf95... no
     [exec] checking for lf95... no
     [exec] checking for ftn... no
     [exec] checking whether we are using the GNU Fortran 77 compiler... no
     [exec] checking whether  accepts -g... no
     [exec] checking the maximum length of command line arguments... 32768
     [exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
     [exec] checking for objdir... .libs
     [exec] checking for ar... ar
     [exec] checking for ranlib... ranlib
     [exec] checking for strip... strip
     [exec] checking if gcc static flag  works... yes
     [exec] checking if gcc supports -fno-rtti -fno-exceptions... no
     [exec] checking for gcc option to produce PIC... -fPIC
     [exec] checking if gcc PIC flag -fPIC works... yes
     [exec] checking if gcc supports -c -o file.o... yes
     [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking whether -lc should be explicitly linked in... no
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] checking if libtool supports shared libraries... yes
     [exec] checking whether to build shared libraries... yes
     [exec] checking whether to build static libraries... yes
     [exec] configure: creating libtool
     [exec] appending configuration tag "CXX" to libtool
     [exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
     [exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking for g++ option to produce PIC... -fPIC
     [exec] checking if g++ PIC flag -fPIC works... yes
     [exec] checking if g++ supports -c -o file.o... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] appending configuration tag "F77" to libtool
     [exec] checking for unistd.h... (cached) yes
     [exec] checking for stdbool.h that conforms to C99... yes
     [exec] checking for _Bool... no
     [exec] checking for an ANSI C-conforming const... yes
     [exec] checking for off_t... yes
     [exec] checking for size_t... yes
     [exec] checking whether strerror_r is declared... yes
     [exec] checking for strerror_r... yes
     [exec] checking whether strerror_r returns char *... yes
     [exec] checking for mkdir... yes
     [exec] checking for uname... yes
     [exec] checking for shutdown in -lsocket... no
     [exec] checking for xdr_float in -lnsl... yes
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating impl/config.h
     [exec] config.status: impl/config.h is unchanged
     [exec] config.status: executing depfiles commands

compile-c++-examples-pipes:
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-amd64-64/examples/pipes>'
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin>" || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin>"
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-simple' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-simple>'
     [exec] /usr/bin/install -c wordcount-simple <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-simple>
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-part' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-part>'
     [exec] /usr/bin/install -c wordcount-part <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-part>
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-nopipe' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-nopipe>'
     [exec] /usr/bin/install -c wordcount-nopipe <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-nopipe>
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'pipes-sort' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/pipes-sort>'
     [exec] /usr/bin/install -c pipes-sort <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/pipes-sort>
     [exec] make[1]: Nothing to be done for `install-data-am'.
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-amd64-64/examples/pipes>'

compile-c++-examples:

compile-examples:

generate-test-records:

compile-core-test:
    [javac] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/classes>
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] Compiling 1 source file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/classes>
    [javac] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testjar>
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
   [delete] Deleting: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testjar/testjob.jar>
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testjar/testjob.jar>
    [javac] Compiling 1 source file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testshell>
    [javac] Note: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/test/testshell/ExternalMapReduce.java> uses or overrides a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
   [delete] Deleting: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testshell/testshell.jar>
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testshell/testshell.jar>
   [delete] Deleting directory <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
   [delete] Deleting directory <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/debug>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/debug>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/debug>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>

test-contrib:

test:
Trying to override old definition of task macro_tar

check-contrib:

init:
     [echo] contrib: hdfsproxy

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-resolve-common:
[ivy:resolve] :: loading settings :: file = <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivysettings.xml>
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#hdfsproxy;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.0.1 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in default
[ivy:resolve] 	found commons-cli#commons-cli;1.2 in default
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve] 	found junit#junit;4.5 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-api;1.4.3 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-log4j12;1.4.3 in maven2
[ivy:resolve] 	found xmlenc#xmlenc;0.52 in default
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api;2.5-20081211 in maven2
[ivy:resolve] 	found org.eclipse.jdt#core;3.1.1 in default
[ivy:resolve] 	found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-core-asl;1.0.1 in maven2
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in maven2
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 152ms :: artifacts dl 7ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   25  |   0   |   0   |   3   ||   22  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#hdfsproxy [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 22 already retrieved (0kB/5ms)
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivysettings.xml>

compile:
     [echo] contrib: hdfsproxy

compile-examples:

compile-test:
     [echo] contrib: hdfsproxy
    [javac] Compiling 5 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/hdfsproxy/test>

test-junit:
     [copy] Copying 11 files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/hdfsproxy/src/test/resources/proxy-config>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/hdfsproxy/src/test/resources>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/hdfsproxy/src/test/resources>
    [junit] Running org.apache.hadoop.hdfsproxy.TestHdfsProxy
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 6.742 sec
    [junit] Test org.apache.hadoop.hdfsproxy.TestHdfsProxy FAILED

BUILD FAILED
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:1114: The following error occurred while executing this line:
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:1103: The following error occurred while executing this line:
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/build.xml>:51: The following error occurred while executing this line:
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/hdfsproxy/build.xml>:278: Tests failed!

Total time: 250 minutes 25 seconds
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 
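
A note on the junit summary above: "Tests run: 1, Failures: 0, Errors: 1" means TestHdfsProxy terminated with an unexpected exception rather than a failed assertion, since JUnit counts assertion failures under "Failures" and all other exceptions under "Errors". A minimal JUnit 4 sketch of the distinction (a hypothetical test class for illustration, not the actual TestHdfsProxy source; junit#junit;4.5 is on the classpath per the ivy report above):

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Hypothetical illustration only, not the real TestHdfsProxy code.
    public class FailureVsErrorExample {

        @Test
        public void reportedAsFailure() {
            // A failed assertion is counted under "Failures".
            assertEquals(1, 2);
        }

        @Test
        public void reportedAsError() throws Exception {
            // Any other exception escaping the test method is counted
            // under "Errors" -- the column TestHdfsProxy tripped above.
            throw new java.io.IOException("unexpected condition");
        }
    }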


Jenkins build is back to normal : Hadoop-0.20.204-Build #24

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/24/>



Build failed in Jenkins: Hadoop-0.20.204-Build #23

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/23/>

------------------------------------------
Started by user gkesavan
Building remotely on ubuntu2
Cleaning workspace <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/>
hudson.util.IOException2: remote file operation failed: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/> at hudson.remoting.Channel@1b8093e6:ubuntu2
	at hudson.FilePath.act(FilePath.java:754)
	at hudson.FilePath.act(FilePath.java:740)
	at hudson.scm.SubversionSCM.checkout(SubversionSCM.java:684)
	at hudson.scm.SubversionSCM.checkout(SubversionSCM.java:633)
	at hudson.model.AbstractProject.checkout(AbstractProject.java:1181)
	at hudson.model.AbstractBuild$AbstractRunner.checkout(AbstractBuild.java:536)
	at hudson.model.AbstractBuild$AbstractRunner.run(AbstractBuild.java:424)
	at hudson.model.Run.run(Run.java:1374)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
	at hudson.model.ResourceController.execute(ResourceController.java:88)
	at hudson.model.Executor.run(Executor.java:145)
Caused by: java.io.IOException: Unable to delete <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/test/org/apache/hadoop> - files in dir: [<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/test/org/apache/hadoop/mapred>]
	at hudson.Util.deleteFile(Util.java:262)
	at hudson.Util.deleteRecursive(Util.java:305)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.scm.subversion.CheckoutUpdater$1.perform(CheckoutUpdater.java:67)
	at hudson.scm.subversion.WorkspaceUpdater$UpdateTask.delegateTo(WorkspaceUpdater.java:135)
	at hudson.scm.SubversionSCM$CheckOutTask.perform(SubversionSCM.java:726)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:707)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:691)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1979)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:270)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
	at java.lang.Thread.run(Thread.java:636)
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 
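
The cleanup failure above is the classic recursive-delete abort: the walk goes depth-first and throws the moment File.delete() returns false, typically because a file is still held open by a leftover process or is owned by another user on the shared slave. A simplified sketch of that pattern under those assumptions (plain java.io, not the actual hudson.Util source):

    import java.io.File;
    import java.io.IOException;

    // Simplified sketch of a depth-first recursive delete; not the
    // actual hudson.Util implementation, just the same failure shape.
    public class RecursiveDelete {

        static void deleteRecursive(File f) throws IOException {
            File[] children = f.listFiles();
            if (children != null) {            // null for plain files
                for (File child : children) {
                    deleteRecursive(child);
                }
            }
            if (!f.delete()) {
                // File.delete() returns false on e.g. an open handle or
                // a permission problem; that becomes "Unable to delete".
                throw new IOException("Unable to delete " + f);
            }
        }
    }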


Build failed in Jenkins: Hadoop-0.20.204-Build #22

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/22/>

------------------------------------------
Started by user gkesavan
Building remotely on ubuntu2
Cleaning workspace <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/>
hudson.util.IOException2: remote file operation failed: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/> at hudson.remoting.Channel@1b8093e6:ubuntu2
	at hudson.FilePath.act(FilePath.java:754)
	at hudson.FilePath.act(FilePath.java:740)
	at hudson.scm.SubversionSCM.checkout(SubversionSCM.java:684)
	at hudson.scm.SubversionSCM.checkout(SubversionSCM.java:633)
	at hudson.model.AbstractProject.checkout(AbstractProject.java:1181)
	at hudson.model.AbstractBuild$AbstractRunner.checkout(AbstractBuild.java:536)
	at hudson.model.AbstractBuild$AbstractRunner.run(AbstractBuild.java:424)
	at hudson.model.Run.run(Run.java:1374)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
	at hudson.model.ResourceController.execute(ResourceController.java:88)
	at hudson.model.Executor.run(Executor.java:145)
Caused by: java.io.IOException: Unable to delete <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib> - files in dir: [<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/hsqldb-1.8.0.10.LICENSE.txt>, <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/kfs-0.2.LICENSE.txt>, <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/kfs-0.2.2.jar>, <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/hsqldb-1.8.0.10.jar>]
	at hudson.Util.deleteFile(Util.java:262)
	at hudson.Util.deleteRecursive(Util.java:305)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.scm.subversion.CheckoutUpdater$1.perform(CheckoutUpdater.java:67)
	at hudson.scm.subversion.WorkspaceUpdater$UpdateTask.delegateTo(WorkspaceUpdater.java:135)
	at hudson.scm.SubversionSCM$CheckOutTask.perform(SubversionSCM.java:726)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:707)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:691)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1979)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:270)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
	at java.lang.Thread.run(Thread.java:636)
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 


Build failed in Jenkins: Hadoop-0.20.204-Build #21

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/21/>

------------------------------------------
Started by user gkesavan
Building remotely on ubuntu2
Cleaning workspace <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/>
SCM check out aborted
Archiving artifacts
ERROR: Failed to archive artifacts: trunk/build/*.tar.gz
hudson.util.IOException2: hudson.util.IOException2: Failed to extract <https://builds.apache.org/job/Hadoop-0.20.204-Build/21/artifact/trunk/build/*.tar.gz>
	at hudson.FilePath.readFromTar(FilePath.java:1647)
	at hudson.FilePath.copyRecursiveTo(FilePath.java:1565)
	at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:117)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
	at hudson.model.AbstractBuild$AbstractRunner.perform(AbstractBuild.java:662)
	at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:638)
	at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:616)
	at hudson.model.Build$RunnerImpl.post2(Build.java:161)
	at hudson.model.AbstractBuild$AbstractRunner.post(AbstractBuild.java:585)
	at hudson.model.Run.run(Run.java:1398)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
	at hudson.model.ResourceController.execute(ResourceController.java:88)
	at hudson.model.Executor.run(Executor.java:145)
Caused by: java.io.IOException
	at hudson.remoting.FastPipedInputStream.read(FastPipedInputStream.java:175)
	at hudson.util.HeadBufferingStream.read(HeadBufferingStream.java:61)
	at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:221)
	at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:141)
	at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:92)
	at org.apache.tools.tar.TarBuffer.readBlock(TarBuffer.java:257)
	at org.apache.tools.tar.TarBuffer.readRecord(TarBuffer.java:223)
	at hudson.org.apache.tools.tar.TarInputStream.read(TarInputStream.java:345)
	at java.io.FilterInputStream.read(FilterInputStream.java:90)
	at org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:1025)
	at org.apache.commons.io.IOUtils.copy(IOUtils.java:999)
	at hudson.util.IOUtils.copy(IOUtils.java:38)
	at hudson.FilePath.readFromTar(FilePath.java:1639)
	... 12 more

	at hudson.FilePath.copyRecursiveTo(FilePath.java:1572)
	at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:117)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
	at hudson.model.AbstractBuild$AbstractRunner.perform(AbstractBuild.java:662)
	at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:638)
	at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:616)
	at hudson.model.Build$RunnerImpl.post2(Build.java:161)
	at hudson.model.AbstractBuild$AbstractRunner.post(AbstractBuild.java:585)
	at hudson.model.Run.run(Run.java:1398)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
	at hudson.model.ResourceController.execute(ResourceController.java:88)
	at hudson.model.Executor.run(Executor.java:145)
Caused by: java.util.concurrent.ExecutionException: java.io.IOException: Pipe is already closed
	at hudson.remoting.Channel$2.adapt(Channel.java:694)
	at hudson.remoting.Channel$2.adapt(Channel.java:689)
	at hudson.remoting.FutureAdapter.get(FutureAdapter.java:59)
	at hudson.FilePath.copyRecursiveTo(FilePath.java:1568)
	... 11 more
Caused by: java.io.IOException: Pipe is already closed
	at hudson.remoting.PipeWindow.checkDeath(PipeWindow.java:83)
	at hudson.remoting.PipeWindow$Real.get(PipeWindow.java:165)
	at hudson.remoting.ProxyOutputStream._write(ProxyOutputStream.java:118)
	at hudson.remoting.ProxyOutputStream.write(ProxyOutputStream.java:103)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
	at java.util.zip.DeflaterOutputStream.deflate(DeflaterOutputStream.java:178)
	at java.util.zip.DeflaterOutputStream.write(DeflaterOutputStream.java:135)
	at java.util.zip.GZIPOutputStream.write(GZIPOutputStream.java:89)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
	at org.apache.tools.tar.TarBuffer.writeBlock(TarBuffer.java:410)
	at org.apache.tools.tar.TarBuffer.writeRecord(TarBuffer.java:351)
	at hudson.org.apache.tools.tar.TarOutputStream.writeEOFRecord(TarOutputStream.java:356)
	at hudson.org.apache.tools.tar.TarOutputStream.finish(TarOutputStream.java:137)
	at hudson.org.apache.tools.tar.TarOutputStream.close(TarOutputStream.java:149)
	at hudson.util.io.TarArchiver.close(TarArchiver.java:119)
	at hudson.FilePath.writeToTar(FilePath.java:1619)
	at hudson.FilePath.access$900(FilePath.java:164)
	at hudson.FilePath$33.invoke(FilePath.java:1558)
	at hudson.FilePath$33.invoke(FilePath.java:1555)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1979)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:270)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
	at java.lang.Thread.run(Thread.java:636)
Caused by: java.io.IOException: Pipe is already closed
	at hudson.remoting.FastPipedOutputStream.write(FastPipedOutputStream.java:147)
	at hudson.remoting.FastPipedOutputStream.write(FastPipedOutputStream.java:131)
	at hudson.remoting.ProxyOutputStream$Chunk$1.run(ProxyOutputStream.java:185)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:662)
Caused by: hudson.remoting.FastPipedInputStream$ClosedBy: The pipe was closed at...
	at hudson.remoting.FastPipedInputStream.close(FastPipedInputStream.java:112)
	at java.io.FilterInputStream.close(FilterInputStream.java:155)
	at java.util.zip.InflaterInputStream.close(InflaterInputStream.java:210)
	at java.util.zip.GZIPInputStream.close(GZIPInputStream.java:113)
	at org.apache.tools.tar.TarBuffer.close(TarBuffer.java:456)
	at hudson.org.apache.tools.tar.TarInputStream.close(TarInputStream.java:110)
	at hudson.FilePath.readFromTar(FilePath.java:1649)
	at hudson.FilePath.copyRecursiveTo(FilePath.java:1565)
	at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:117)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
	at hudson.model.AbstractBuild$AbstractRunner.perform(AbstractBuild.java:662)
	at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:638)
	at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:616)
	at hudson.model.Build$RunnerImpl.post2(Build.java:161)
	at hudson.model.AbstractBuild$AbstractRunner.post(AbstractBuild.java:585)
	at hudson.model.Run.run(Run.java:1398)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
	at hudson.model.ResourceController.execute(ResourceController.java:88)
	at hudson.model.Executor.run(Executor.java:145)
Recording test results
Publishing Javadoc
Recording fingerprints
Description set:
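
The nested causes above bottom out in the remoting pipe that streams the artifact tarball from the slave to the master: the reading side closed early (the tar extract failed), so every subsequent write on the slave fails with "Pipe is already closed". The same behavior is reproducible with plain JDK pipes; a standalone sketch using java.io classes rather than hudson.remoting's FastPiped streams:

    import java.io.IOException;
    import java.io.PipedInputStream;
    import java.io.PipedOutputStream;

    // Standalone demo of the failure mode: once the reading end of a
    // pipe is closed, further writes throw. Plain JDK classes, not the
    // hudson.remoting FastPiped* implementation in the trace above.
    public class PipeClosedDemo {
        public static void main(String[] args) throws Exception {
            PipedOutputStream writer = new PipedOutputStream();
            PipedInputStream reader = new PipedInputStream(writer);

            reader.close();        // reader goes away first, like the
                                   // aborted tar extract on the master
            try {
                writer.write(42);  // the slave is still streaming
            } catch (IOException e) {
                // Prints "Pipe closed" (the JDK wording; hudson.remoting
                // says "Pipe is already closed").
                System.out.println("writer sees: " + e.getMessage());
            }
        }
    }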