Posted to common-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2011/07/19 12:36:46 UTC

Build failed in Jenkins: Hadoop-0.20.204-Build #5

See <https://builds.apache.org/job/Hadoop-0.20.204-Build/5/changes>

Changes:

[omalley] HADOOP-7045. TestDU fails on systems with local file systems with 
extended attributes. (eli)
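
For context on the change above: on local file systems that store extended attributes, du charges extra blocks beyond the bytes a test actually writes, so an exact-size assertion is flaky. The fix amounts to checking the reported usage against a slack window instead of strict equality. Below is a minimal sketch of that style of check; the getDiskUsage helper and the 4 KB slack value are illustrative assumptions, not the actual TestDU code.

    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.util.Scanner;

    public class DuSlackCheck {
        // Assumed slack: one extra 4 KB block for xattr overhead (illustrative).
        static final long SLACK = 4 * 1024;

        public static void main(String[] args) throws Exception {
            File f = new File("du-test.dat");
            byte[] data = new byte[32 * 1024];
            try (FileOutputStream out = new FileOutputStream(f)) {
                out.write(data);
            }
            long written = data.length;
            long used = getDiskUsage(f); // hypothetical helper, shells out to du
            // The tolerant assertion: at least what was written, but within a
            // slack window rather than exactly equal, so xattr blocks pass.
            if (used < written || used > written + SLACK) {
                throw new AssertionError("du reported " + used + " bytes, expected in ["
                        + written + ", " + (written + SLACK) + "]");
            }
        }

        static long getDiskUsage(File f) throws IOException, InterruptedException {
            Process p = new ProcessBuilder("du", "-sk", f.getPath()).start();
            try (Scanner s = new Scanner(p.getInputStream())) {
                long kb = s.nextLong(); // first field of `du -sk` is size in KB
                p.waitFor();
                return kb * 1024;
            }
        }
    }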

------------------------------------------
[...truncated 8899 lines...]
     [exec] checking for a sed that does not truncate output... /bin/sed
     [exec] checking for ld used by gcc... /usr/bin/ld
     [exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
     [exec] checking for /usr/bin/ld option to reload object files... -r
     [exec] checking for BSD-compatible nm... /usr/bin/nm -B
     [exec] checking whether ln -s works... yes
     [exec] checking how to recognise dependent libraries... pass_all
     [exec] checking dlfcn.h usability... yes
     [exec] checking dlfcn.h presence... yes
     [exec] checking for dlfcn.h... yes
     [exec] checking how to run the C++ preprocessor... g++ -E
     [exec] checking for g77... no
     [exec] checking for xlf... no
     [exec] checking for f77... no
     [exec] checking for frt... no
     [exec] checking for pgf77... no
     [exec] checking for cf77... no
     [exec] checking for fort77... no
     [exec] checking for fl32... no
     [exec] checking for af77... no
     [exec] checking for xlf90... no
     [exec] checking for f90... no
     [exec] checking for pgf90... no
     [exec] checking for pghpf... no
     [exec] checking for epcf90... no
     [exec] checking for gfortran... no
     [exec] checking for g95... no
     [exec] checking for xlf95... no
     [exec] checking for f95... no
     [exec] checking for fort... no
     [exec] checking for ifort... no
     [exec] checking for ifc... no
     [exec] checking for efc... no
     [exec] checking for pgf95... no
     [exec] checking for lf95... no
     [exec] checking for ftn... no
     [exec] checking whether we are using the GNU Fortran 77 compiler... no
     [exec] checking whether  accepts -g... no
     [exec] checking the maximum length of command line arguments... 32768
     [exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
     [exec] checking for objdir... .libs
     [exec] checking for ar... ar
     [exec] checking for ranlib... ranlib
     [exec] checking for strip... strip
     [exec] checking if gcc static flag  works... yes
     [exec] checking if gcc supports -fno-rtti -fno-exceptions... no
     [exec] checking for gcc option to produce PIC... -fPIC
     [exec] checking if gcc PIC flag -fPIC works... yes
     [exec] checking if gcc supports -c -o file.o... yes
     [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking whether -lc should be explicitly linked in... no
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] checking if libtool supports shared libraries... yes
     [exec] checking whether to build shared libraries... yes
     [exec] checking whether to build static libraries... yes
     [exec] configure: creating libtool
     [exec] appending configuration tag "CXX" to libtool
     [exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
     [exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking for g++ option to produce PIC... -fPIC
     [exec] checking if g++ PIC flag -fPIC works... yes
     [exec] checking if g++ supports -c -o file.o... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] appending configuration tag "F77" to libtool
     [exec] checking for unistd.h... (cached) yes
     [exec] checking for stdbool.h that conforms to C99... yes
     [exec] checking for _Bool... no
     [exec] checking for an ANSI C-conforming const... yes
     [exec] checking for off_t... yes
     [exec] checking for size_t... yes
     [exec] checking whether strerror_r is declared... yes
     [exec] checking for strerror_r... yes
     [exec] checking whether strerror_r returns char *... yes
     [exec] checking for mkdir... yes
     [exec] checking for uname... yes
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating impl/config.h
     [exec] config.status: impl/config.h is unchanged
     [exec] config.status: executing depfiles commands

compile-c++-utils:
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/utils'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib">
     [exec]  /usr/bin/install -c -m 644 'libhadooputils.a' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooputils.a'>
     [exec]  ranlib '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooputils.a'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop">
     [exec]  /usr/bin/install -c -m 644 '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/utils/api/hadoop/StringUtils.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/StringUtils.hh'>
     [exec]  /usr/bin/install -c -m 644 '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/utils/api/hadoop/SerialUtils.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/SerialUtils.hh'>
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/utils'>

compile-c++-pipes:
     [exec] depbase=`echo impl/HadoopPipes.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] 	if g++ -DHAVE_CONFIG_H -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes> -I./impl    -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/api> -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include> -g -O2 -MT impl/HadoopPipes.o -MD -MP -MF "$depbase.Tpo" -c -o impl/HadoopPipes.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc;> \
     [exec] 	then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc>: In member function 'void HadoopPipes::TextUpwardProtocol::writeBuffer(const std::string&)':
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc>:129: warning: format not a string literal and no format arguments
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc>: In member function 'std::string HadoopPipes::BinaryProtocol::createDigest(std::string&, std::string&)':
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc>:439: warning: value computed is not used
     [exec] rm -f libhadooppipes.a
     [exec] ar cru libhadooppipes.a impl/HadoopPipes.o 
     [exec] ranlib libhadooppipes.a
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/pipes'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib">
     [exec]  /usr/bin/install -c -m 644 'libhadooppipes.a' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooppipes.a'>
     [exec]  ranlib '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooppipes.a'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop">
     [exec]  /usr/bin/install -c -m 644 '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/api/hadoop/Pipes.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/Pipes.hh'>
     [exec]  /usr/bin/install -c -m 644 '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/api/hadoop/TemplateFactory.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/TemplateFactory.hh'>
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/pipes'>

compile-c++:

compile-core:

test-c++-libhdfs:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/hdfs/name>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs>     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_test.o -MD -MP -MF ".deps/hdfs_test.Tpo" -c -o hdfs_test.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c;> \
     [exec] 	then mv -f ".deps/hdfs_test.Tpo" ".deps/hdfs_test.Po"; else rm -f ".deps/hdfs_test.Tpo"; exit 1; fi
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>: In function `main':
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:87: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:90: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:130: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:133: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:188: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:189: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:190: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:198: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:199: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:220: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:221: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:272: warning: implicit declaration of function `sleep'
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_test  hdfs_test.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_test hdfs_test.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs>     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_read.o -MD -MP -MF ".deps/hdfs_read.Tpo" -c -o hdfs_read.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c;> \
     [exec] 	then mv -f ".deps/hdfs_read.Tpo" ".deps/hdfs_read.Po"; else rm -f ".deps/hdfs_read.Tpo"; exit 1; fi
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c>: In function `main':
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c>:35: warning: unused variable `fileTotalSize'
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_read  hdfs_read.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_read hdfs_read.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs>     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_write.o -MD -MP -MF ".deps/hdfs_write.Tpo" -c -o hdfs_write.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_write.c;> \
     [exec] 	then mv -f ".deps/hdfs_write.Tpo" ".deps/hdfs_write.Po"; else rm -f ".deps/hdfs_write.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_write  hdfs_write.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_write hdfs_write.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/test-libhdfs.sh>	
     [exec] ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
     [exec] LIB_JVM_DIR = /homes/hudson/tools/java/latest1.6/jre/lib/i386/server
     [exec] ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop>: line 53: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] 11/07/19 10:35:14 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
     [exec] 11/07/19 10:35:14 INFO namenode.NameNode: STARTUP_MSG: 
     [exec] /************************************************************
     [exec] STARTUP_MSG: Starting NameNode
     [exec] STARTUP_MSG:   host = h4.grid.sp2.yahoo.net/127.0.1.1
     [exec] STARTUP_MSG:   args = [-format]
     [exec] STARTUP_MSG:   version = 0.20.204
     [exec] STARTUP_MSG:   build = http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-security-204 -r 1148069; compiled by 'hudson' on Tue Jul 19 09:18:59 UTC 2011
     [exec] ************************************************************/
     [exec] 11/07/19 10:35:14 INFO util.GSet: VM type       = 32-bit
     [exec] 11/07/19 10:35:14 INFO util.GSet: 2% max memory = 17.77875 MB
     [exec] 11/07/19 10:35:14 INFO util.GSet: capacity      = 2^22 = 4194304 entries
     [exec] 11/07/19 10:35:14 INFO util.GSet: recommended=4194304, actual=4194304
     [exec] 11/07/19 10:35:15 INFO namenode.FSNamesystem: fsOwner=hudson
     [exec] 11/07/19 10:35:15 INFO namenode.FSNamesystem: supergroup=supergroup
     [exec] 11/07/19 10:35:15 INFO namenode.FSNamesystem: isPermissionEnabled=true
     [exec] 11/07/19 10:35:15 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
     [exec] 11/07/19 10:35:15 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
     [exec] 11/07/19 10:35:15 INFO namenode.NameNode: Caching file names occuring more than 10 times 
     [exec] 11/07/19 10:35:15 INFO common.Storage: Image file of size 112 saved in 0 seconds.
     [exec] 11/07/19 10:35:15 INFO common.Storage: Storage directory build/test/libhdfs/dfs/name has been successfully formatted.
     [exec] 11/07/19 10:35:15 INFO namenode.NameNode: SHUTDOWN_MSG: 
     [exec] /************************************************************
     [exec] SHUTDOWN_MSG: Shutting down NameNode at h4.grid.sp2.yahoo.net/127.0.1.1
     [exec] ************************************************************/
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] starting namenode, logging to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs/hadoop-hudson-namenode-h4.grid.sp2.yahoo.net.out>
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] nice: starting datanode, logging to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs/hadoop-hudson-datanode-h4.grid.sp2.yahoo.net.out>
     [exec] nice: /bin/hadoop: No such file or directory
     [exec] CLASSPATH=<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/conf>:/homes/hudson/tools/java/latest1.6/lib/tools.jar:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/classes>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/classes>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/hsqldb-1.8.0.10.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/kfs-0.2.2.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/*.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/jsp-2.0/*.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/aspectjrt-1.6.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/aspectjtools-1.6.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-beanutils-1.7.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-beanutils-core-1.8.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-cli-1.2.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-codec-1.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-collections-3.2.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-configuration-1.6.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-daemon-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-digester-1.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-el-1.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-httpclient-3.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-lang-2.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-logging-1.1.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-logging-api-1.0.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-math-2.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-net-1.4.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/core-3.1.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jackson-core-asl-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jackson-mapper-asl-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jasper-compiler-5.5.12.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jasper-runtime-5.5.12.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jdeb-0.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jets3t-0.6.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jetty-6.1.26.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jetty-util-6.1.26.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jsch-0.1.42.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/junit-4.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/log4j-1.2.15.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/mockito-all-1.8.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/oro-2.0.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/servlet-api-2.5-20081211.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/slf4j-api-1.4.3.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/slf4j-log4j12-1.4.3.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/xmlenc-0.52.jar> LD_PRELOAD=<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so>:/homes/hudson/tools/java/latest1.6/jre/lib/i386/server/libjvm.so <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/libhdfs/hdfs_test>
     [exec] 11/07/19 10:36:01 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
     [exec] 11/07/19 10:36:01 WARN fs.FileSystem: "localhost:23000" is a deprecated filesystem name. Use "hdfs://localhost:23000/" instead.
     [exec] 11/07/19 10:36:02 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 0 time(s).
     [exec] 11/07/19 10:36:03 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 1 time(s).
     [exec] 11/07/19 10:36:04 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 2 time(s).
     [exec] 11/07/19 10:36:05 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 3 time(s).
     [exec] 11/07/19 10:36:06 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 4 time(s).
     [exec] 11/07/19 10:36:07 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 5 time(s).
     [exec] 11/07/19 10:36:08 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 6 time(s).
     [exec] 11/07/19 10:36:09 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 7 time(s).
     [exec] 11/07/19 10:36:10 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 8 time(s).
     [exec] 11/07/19 10:36:11 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 9 time(s).
     [exec] Exception in thread "main" java.net.ConnectException: Call to localhost/127.0.0.1:23000 failed on connection exception: java.net.ConnectException: Connection refused
     [exec] 	at org.apache.hadoop.ipc.Client.wrapException(Client.java:1057)
     [exec] 	at org.apache.hadoop.ipc.Client.call(Client.java:1033)
     [exec] 	at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:224)
     [exec] 	at $Proxy1.getProtocolVersion(Unknown Source)
     [exec] 	at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
     [exec] 	at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
     [exec] 	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:208)
     [exec] 	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:175)
     [exec] 	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
     [exec] 	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1310)
     [exec] 	at org.apache.hadoop.fs.FileSystem.access$100(FileSystem.java:65)
     [exec] 	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1328)
     [exec] 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
     [exec] 	at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:103)
     [exec] 	at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:101)
     [exec] 	at java.security.AccessController.doPrivileged(Native Method)
     [exec] 	at javax.security.auth.Subject.doAs(Subject.java:396)
     [exec] 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
     [exec] 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:101)
     [exec] Caused by: java.net.ConnectException: Connection refused
     [exec] 	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
     [exec] 	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
     [exec] 	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
     [exec] 	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:406)
     [exec] 	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:414)
     [exec] 	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:527)
     [exec] 	at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:187)
     [exec] 	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1164)
     [exec] 	at org.apache.hadoop.ipc.Client.call(Client.java:1010)
     [exec] 	... 17 more
     [exec] Call to org.apache.hadoop.fs.Filesystem::get(URI, Configuration) failed!
     [exec] Oops! Failed to connect to hdfs!
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] no datanode to stop
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] no namenode to stop
     [exec] make: *** [test] Error 255
     [exec] exiting with 255

BUILD FAILED
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:1857: exec returned: 2

Total time: 243 minutes 41 seconds
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 
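
The failure in build #5 above is a chain: the daemon scripts cannot source ../libexec/hadoop-config.sh, the datanode launch degenerates to "nice: /bin/hadoop: No such file or directory", nothing ends up listening on localhost:23000, and the libhdfs test's IPC client exhausts ten one-second connect retries before wrapping the low-level ConnectException and aborting. The retry shape visible in the log is the usual bounded-retry pattern; here is a sketch of it against plain java.net sockets, with the host, port, timeout, and retry count taken from the log but the code itself only an illustration, not the actual org.apache.hadoop.ipc.Client internals.

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.net.Socket;

    public class ConnectWithRetries {
        public static void main(String[] args) throws IOException {
            InetSocketAddress addr = new InetSocketAddress("localhost", 23000);
            int maxRetries = 10; // matches the ten "Retrying connect" lines above
            for (int tried = 0; ; tried++) {
                try (Socket s = new Socket()) {
                    s.connect(addr, 20 * 1000); // connect timeout in ms
                    System.out.println("connected to " + addr);
                    return;
                } catch (IOException e) {
                    if (tried >= maxRetries) {
                        // Mirror the log: wrap the low-level failure in a
                        // message naming the target address, then give up.
                        throw new IOException("Call to " + addr
                                + " failed on connection exception: " + e, e);
                    }
                    System.out.println("Retrying connect to server: " + addr
                            + ". Already tried " + tried + " time(s).");
                    try {
                        Thread.sleep(1000); // one-second pause between attempts
                    } catch (InterruptedException ie) {
                        Thread.currentThread().interrupt();
                        throw new IOException("interrupted while retrying", ie);
                    }
                }
            }
        }
    }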


Jenkins build is back to normal : Hadoop-0.20.204-Build #24

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/24/>



Build failed in Jenkins: Hadoop-0.20.204-Build #23

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/23/>

------------------------------------------
Started by user gkesavan
Building remotely on ubuntu2
Cleaning workspace <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/>
hudson.util.IOException2: remote file operation failed: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/> at hudson.remoting.Channel@1b8093e6:ubuntu2
	at hudson.FilePath.act(FilePath.java:754)
	at hudson.FilePath.act(FilePath.java:740)
	at hudson.scm.SubversionSCM.checkout(SubversionSCM.java:684)
	at hudson.scm.SubversionSCM.checkout(SubversionSCM.java:633)
	at hudson.model.AbstractProject.checkout(AbstractProject.java:1181)
	at hudson.model.AbstractBuild$AbstractRunner.checkout(AbstractBuild.java:536)
	at hudson.model.AbstractBuild$AbstractRunner.run(AbstractBuild.java:424)
	at hudson.model.Run.run(Run.java:1374)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
	at hudson.model.ResourceController.execute(ResourceController.java:88)
	at hudson.model.Executor.run(Executor.java:145)
Caused by: java.io.IOException: Unable to delete <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/test/org/apache/hadoop> - files in dir: [<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/test/org/apache/hadoop/mapred]>
	at hudson.Util.deleteFile(Util.java:262)
	at hudson.Util.deleteRecursive(Util.java:305)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.scm.subversion.CheckoutUpdater$1.perform(CheckoutUpdater.java:67)
	at hudson.scm.subversion.WorkspaceUpdater$UpdateTask.delegateTo(WorkspaceUpdater.java:135)
	at hudson.scm.SubversionSCM$CheckOutTask.perform(SubversionSCM.java:726)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:707)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:691)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1979)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:270)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
	at java.lang.Thread.run(Thread.java:636)
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 
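
Builds #23 and #22 fail the same way, before anything compiles: the Subversion checkout first wipes the workspace, hudson.Util.deleteRecursive hits a file it cannot remove (typically one held open by a leftover process, or a permissions problem on the slave), and the whole checkout aborts. The alternating deleteRecursive/deleteContentsRecursive frames in the trace come from a mutually recursive walk roughly like the sketch below; this is a simplification for illustration, not the actual Hudson source.

    import java.io.File;
    import java.io.IOException;
    import java.util.Arrays;

    public class DeleteRecursive {
        public static void main(String[] args) throws IOException {
            deleteRecursive(new File(args[0])); // path to remove, recursively
        }

        static void deleteRecursive(File dir) throws IOException {
            deleteContentsRecursive(dir); // empty it first...
            deleteFile(dir);              // ...then remove the dir itself
        }

        static void deleteContentsRecursive(File dir) throws IOException {
            File[] children = dir.listFiles();
            if (children == null) return; // not a directory, or I/O error
            for (File child : children) {
                if (child.isDirectory()) {
                    deleteRecursive(child); // mutual recursion, as in the trace
                } else {
                    deleteFile(child);
                }
            }
        }

        static void deleteFile(File f) throws IOException {
            if (!f.delete()) {
                // One stubborn file poisons the whole cleanup: report what is
                // still present, mirroring the "Unable to delete ... files in
                // dir: [...]" message in the log.
                throw new IOException("Unable to delete " + f
                        + " - files in dir: " + Arrays.toString(f.listFiles()));
            }
        }
    }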


Build failed in Jenkins: Hadoop-0.20.204-Build #22

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/22/>

------------------------------------------
Started by user gkesavan
Building remotely on ubuntu2
Cleaning workspace <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/>
hudson.util.IOException2: remote file operation failed: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/> at hudson.remoting.Channel@1b8093e6:ubuntu2
	at hudson.FilePath.act(FilePath.java:754)
	at hudson.FilePath.act(FilePath.java:740)
	at hudson.scm.SubversionSCM.checkout(SubversionSCM.java:684)
	at hudson.scm.SubversionSCM.checkout(SubversionSCM.java:633)
	at hudson.model.AbstractProject.checkout(AbstractProject.java:1181)
	at hudson.model.AbstractBuild$AbstractRunner.checkout(AbstractBuild.java:536)
	at hudson.model.AbstractBuild$AbstractRunner.run(AbstractBuild.java:424)
	at hudson.model.Run.run(Run.java:1374)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
	at hudson.model.ResourceController.execute(ResourceController.java:88)
	at hudson.model.Executor.run(Executor.java:145)
Caused by: java.io.IOException: Unable to delete <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib> - files in dir: [<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/hsqldb-1.8.0.10.LICENSE.txt,> <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/kfs-0.2.LICENSE.txt,> <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/kfs-0.2.2.jar,> <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/hsqldb-1.8.0.10.jar]>
	at hudson.Util.deleteFile(Util.java:262)
	at hudson.Util.deleteRecursive(Util.java:305)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.scm.subversion.CheckoutUpdater$1.perform(CheckoutUpdater.java:67)
	at hudson.scm.subversion.WorkspaceUpdater$UpdateTask.delegateTo(WorkspaceUpdater.java:135)
	at hudson.scm.SubversionSCM$CheckOutTask.perform(SubversionSCM.java:726)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:707)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:691)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1979)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:270)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
	at java.lang.Thread.run(Thread.java:636)
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 


Build failed in Jenkins: Hadoop-0.20.204-Build #21

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/21/>

------------------------------------------
Started by user gkesavan
Building remotely on ubuntu2
Cleaning workspace <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/>
SCM check out aborted
Archiving artifacts
ERROR: Failed to archive artifacts: trunk/build/*.tar.gz
hudson.util.IOException2: hudson.util.IOException2: Failed to extract <https://builds.apache.org/job/Hadoop-0.20.204-Build/21/artifact/trunk/build/*.tar.gz>
	at hudson.FilePath.readFromTar(FilePath.java:1647)
	at hudson.FilePath.copyRecursiveTo(FilePath.java:1565)
	at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:117)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
	at hudson.model.AbstractBuild$AbstractRunner.perform(AbstractBuild.java:662)
	at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:638)
	at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:616)
	at hudson.model.Build$RunnerImpl.post2(Build.java:161)
	at hudson.model.AbstractBuild$AbstractRunner.post(AbstractBuild.java:585)
	at hudson.model.Run.run(Run.java:1398)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
	at hudson.model.ResourceController.execute(ResourceController.java:88)
	at hudson.model.Executor.run(Executor.java:145)
Caused by: java.io.IOException
	at hudson.remoting.FastPipedInputStream.read(FastPipedInputStream.java:175)
	at hudson.util.HeadBufferingStream.read(HeadBufferingStream.java:61)
	at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:221)
	at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:141)
	at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:92)
	at org.apache.tools.tar.TarBuffer.readBlock(TarBuffer.java:257)
	at org.apache.tools.tar.TarBuffer.readRecord(TarBuffer.java:223)
	at hudson.org.apache.tools.tar.TarInputStream.read(TarInputStream.java:345)
	at java.io.FilterInputStream.read(FilterInputStream.java:90)
	at org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:1025)
	at org.apache.commons.io.IOUtils.copy(IOUtils.java:999)
	at hudson.util.IOUtils.copy(IOUtils.java:38)
	at hudson.FilePath.readFromTar(FilePath.java:1639)
	... 12 more

	at hudson.FilePath.copyRecursiveTo(FilePath.java:1572)
	at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:117)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
	at hudson.model.AbstractBuild$AbstractRunner.perform(AbstractBuild.java:662)
	at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:638)
	at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:616)
	at hudson.model.Build$RunnerImpl.post2(Build.java:161)
	at hudson.model.AbstractBuild$AbstractRunner.post(AbstractBuild.java:585)
	at hudson.model.Run.run(Run.java:1398)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
	at hudson.model.ResourceController.execute(ResourceController.java:88)
	at hudson.model.Executor.run(Executor.java:145)
Caused by: java.util.concurrent.ExecutionException: java.io.IOException: Pipe is already closed
	at hudson.remoting.Channel$2.adapt(Channel.java:694)
	at hudson.remoting.Channel$2.adapt(Channel.java:689)
	at hudson.remoting.FutureAdapter.get(FutureAdapter.java:59)
	at hudson.FilePath.copyRecursiveTo(FilePath.java:1568)
	... 11 more
Caused by: java.io.IOException: Pipe is already closed
	at hudson.remoting.PipeWindow.checkDeath(PipeWindow.java:83)
	at hudson.remoting.PipeWindow$Real.get(PipeWindow.java:165)
	at hudson.remoting.ProxyOutputStream._write(ProxyOutputStream.java:118)
	at hudson.remoting.ProxyOutputStream.write(ProxyOutputStream.java:103)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
	at java.util.zip.DeflaterOutputStream.deflate(DeflaterOutputStream.java:178)
	at java.util.zip.DeflaterOutputStream.write(DeflaterOutputStream.java:135)
	at java.util.zip.GZIPOutputStream.write(GZIPOutputStream.java:89)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:122)
	at org.apache.tools.tar.TarBuffer.writeBlock(TarBuffer.java:410)
	at org.apache.tools.tar.TarBuffer.writeRecord(TarBuffer.java:351)
	at hudson.org.apache.tools.tar.TarOutputStream.writeEOFRecord(TarOutputStream.java:356)
	at hudson.org.apache.tools.tar.TarOutputStream.finish(TarOutputStream.java:137)
	at hudson.org.apache.tools.tar.TarOutputStream.close(TarOutputStream.java:149)
	at hudson.util.io.TarArchiver.close(TarArchiver.java:119)
	at hudson.FilePath.writeToTar(FilePath.java:1619)
	at hudson.FilePath.access$900(FilePath.java:164)
	at hudson.FilePath$33.invoke(FilePath.java:1558)
	at hudson.FilePath$33.invoke(FilePath.java:1555)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1979)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:270)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
	at java.lang.Thread.run(Thread.java:636)
Caused by: java.io.IOException: Pipe is already closed
	at hudson.remoting.FastPipedOutputStream.write(FastPipedOutputStream.java:147)
	at hudson.remoting.FastPipedOutputStream.write(FastPipedOutputStream.java:131)
	at hudson.remoting.ProxyOutputStream$Chunk$1.run(ProxyOutputStream.java:185)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:662)
Caused by: hudson.remoting.FastPipedInputStream$ClosedBy: The pipe was closed at...
	at hudson.remoting.FastPipedInputStream.close(FastPipedInputStream.java:112)
	at java.io.FilterInputStream.close(FilterInputStream.java:155)
	at java.util.zip.InflaterInputStream.close(InflaterInputStream.java:210)
	at java.util.zip.GZIPInputStream.close(GZIPInputStream.java:113)
	at org.apache.tools.tar.TarBuffer.close(TarBuffer.java:456)
	at hudson.org.apache.tools.tar.TarInputStream.close(TarInputStream.java:110)
	at hudson.FilePath.readFromTar(FilePath.java:1649)
	at hudson.FilePath.copyRecursiveTo(FilePath.java:1565)
	at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:117)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
	at hudson.model.AbstractBuild$AbstractRunner.perform(AbstractBuild.java:662)
	at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:638)
	at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:616)
	at hudson.model.Build$RunnerImpl.post2(Build.java:161)
	at hudson.model.AbstractBuild$AbstractRunner.post(AbstractBuild.java:585)
	at hudson.model.Run.run(Run.java:1398)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
	at hudson.model.ResourceController.execute(ResourceController.java:88)
	at hudson.model.Executor.run(Executor.java:145)
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 
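
Build #21 dies later, while archiving: artifacts are streamed from the slave to the master as a tar over a remoting pipe, the reading side fails ("Failed to extract ... trunk/build/*.tar.gz") and closes, and the writer then gets "Pipe is already closed" when it flushes the remaining tar records. The same write-after-reader-closed behavior can be reproduced with the plain java.io piped streams, shown below as an analog of hudson.remoting's FastPiped streams rather than the same classes.

    import java.io.IOException;
    import java.io.PipedInputStream;
    import java.io.PipedOutputStream;

    public class PipeClosedDemo {
        public static void main(String[] args) throws IOException {
            PipedInputStream in = new PipedInputStream();
            PipedOutputStream out = new PipedOutputStream(in);

            out.write(1); // fine while the reader side is open
            in.close();   // reader goes away, like the failed tar extraction

            try {
                out.write(2); // the writer only notices on its next write
            } catch (IOException e) {
                // java.io reports "Pipe closed"; hudson.remoting reports
                // "Pipe is already closed" plus the closer's stack trace.
                System.out.println("caught: " + e.getMessage());
            }
        }
    }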


Build failed in Jenkins: Hadoop-0.20.204-Build #20

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/20/>

------------------------------------------
[...truncated 12274 lines...]
     [exec] checking for pthread_create in -lpthread... yes
     [exec] checking for HMAC_Init in -lssl... yes
     [exec] checking for g++... g++
     [exec] checking whether we are using the GNU C++ compiler... yes
     [exec] checking whether g++ accepts -g... yes
     [exec] checking dependency style of g++... gcc3
     [exec] checking for a BSD-compatible install... /usr/bin/install -c
     [exec] checking build system type... x86_64-unknown-linux-gnu
     [exec] checking host system type... x86_64-unknown-linux-gnu
     [exec] checking for a sed that does not truncate output... /bin/sed
     [exec] checking for ld used by gcc... /usr/bin/ld
     [exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
     [exec] checking for /usr/bin/ld option to reload object files... -r
     [exec] checking for BSD-compatible nm... /usr/bin/nm -B
     [exec] checking whether ln -s works... yes
     [exec] checking how to recognise dependent libraries... pass_all
     [exec] checking dlfcn.h usability... yes
     [exec] checking dlfcn.h presence... yes
     [exec] checking for dlfcn.h... yes
     [exec] checking how to run the C++ preprocessor... g++ -E
     [exec] checking for g77... no
     [exec] checking for xlf... no
     [exec] checking for f77... no
     [exec] checking for frt... no
     [exec] checking for pgf77... no
     [exec] checking for cf77... no
     [exec] checking for fort77... no
     [exec] checking for fl32... no
     [exec] checking for af77... no
     [exec] checking for xlf90... no
     [exec] checking for f90... no
     [exec] checking for pgf90... no
     [exec] checking for pghpf... no
     [exec] checking for epcf90... no
     [exec] checking for gfortran... no
     [exec] checking for g95... no
     [exec] checking for xlf95... no
     [exec] checking for f95... no
     [exec] checking for fort... no
     [exec] checking for ifort... no
     [exec] checking for ifc... no
     [exec] checking for efc... no
     [exec] checking for pgf95... no
     [exec] checking for lf95... no
     [exec] checking for ftn... no
     [exec] checking whether we are using the GNU Fortran 77 compiler... no
     [exec] checking whether  accepts -g... no
     [exec] checking the maximum length of command line arguments... 32768
     [exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
     [exec] checking for objdir... .libs
     [exec] checking for ar... ar
     [exec] checking for ranlib... ranlib
     [exec] checking for strip... strip
     [exec] checking if gcc static flag  works... yes
     [exec] checking if gcc supports -fno-rtti -fno-exceptions... no
     [exec] checking for gcc option to produce PIC... -fPIC
     [exec] checking if gcc PIC flag -fPIC works... yes
     [exec] checking if gcc supports -c -o file.o... yes
     [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking whether -lc should be explicitly linked in... no
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] checking if libtool supports shared libraries... yes
     [exec] checking whether to build shared libraries... yes
     [exec] checking whether to build static libraries... yes
     [exec] configure: creating libtool
     [exec] appending configuration tag "CXX" to libtool
     [exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
     [exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking for g++ option to produce PIC... -fPIC
     [exec] checking if g++ PIC flag -fPIC works... yes
     [exec] checking if g++ supports -c -o file.o... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] appending configuration tag "F77" to libtool
     [exec] checking for unistd.h... (cached) yes
     [exec] checking for stdbool.h that conforms to C99... yes
     [exec] checking for _Bool... no
     [exec] checking for an ANSI C-conforming const... yes
     [exec] checking for off_t... yes
     [exec] checking for size_t... yes
     [exec] checking whether strerror_r is declared... yes
     [exec] checking for strerror_r... yes
     [exec] checking whether strerror_r returns char *... yes
     [exec] checking for mkdir... yes
     [exec] checking for uname... yes
     [exec] checking for shutdown in -lsocket... no
     [exec] checking for xdr_float in -lnsl... yes
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating impl/config.h
     [exec] config.status: impl/config.h is unchanged
     [exec] config.status: executing depfiles commands

compile-c++-examples-pipes:
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-amd64-64/examples/pipes'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin">
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-simple' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-simple'>
     [exec] /usr/bin/install -c wordcount-simple <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-simple>
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-part' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-part'>
     [exec] /usr/bin/install -c wordcount-part <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-part>
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-nopipe' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-nopipe'>
     [exec] /usr/bin/install -c wordcount-nopipe <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-nopipe>
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'pipes-sort' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/pipes-sort'>
     [exec] /usr/bin/install -c pipes-sort <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/pipes-sort>
     [exec] make[1]: Nothing to be done for `install-data-am'.
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-amd64-64/examples/pipes'>

compile-c++-examples:

compile-examples:

generate-test-records:

compile-core-test:
    [javac] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/classes>
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] Compiling 1 source file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/classes>
    [javac] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testjar>
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
   [delete] Deleting: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testjar/testjob.jar>
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testjar/testjob.jar>
    [javac] Compiling 1 source file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testshell>
    [javac] Note: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/test/testshell/ExternalMapReduce.java> uses or overrides a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
   [delete] Deleting: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testshell/testshell.jar>
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testshell/testshell.jar>
   [delete] Deleting directory <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
   [delete] Deleting directory <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/debug>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/debug>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/debug>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>

test-contrib:

test:
Trying to override old definition of task macro_tar
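
The "Trying to override old definition of task macro_tar" line is an Ant notice, not a failure: it typically appears when the top-level macrodef definitions are read a second time as the contrib test builds re-import them. It can be ignored here; the run continues into check-contrib below.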

check-contrib:

init:
     [echo] contrib: hdfsproxy

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-resolve-common:
[ivy:resolve] :: loading settings :: file = <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivysettings.xml>
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#hdfsproxy;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.0.1 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in default
[ivy:resolve] 	found commons-cli#commons-cli;1.2 in default
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve] 	found junit#junit;4.5 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-api;1.4.3 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-log4j12;1.4.3 in maven2
[ivy:resolve] 	found xmlenc#xmlenc;0.52 in default
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api;2.5-20081211 in maven2
[ivy:resolve] 	found org.eclipse.jdt#core;3.1.1 in default
[ivy:resolve] 	found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-core-asl;1.0.1 in maven2
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in maven2
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 152ms :: artifacts dl 7ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   25  |   0   |   0   |   3   ||   22  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#hdfsproxy [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	0 artifacts copied, 22 already retrieved (0kB/5ms)
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivysettings.xml>

compile:
     [echo] contrib: hdfsproxy

compile-examples:

compile-test:
     [echo] contrib: hdfsproxy
    [javac] Compiling 5 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/hdfsproxy/test>

test-junit:
     [copy] Copying 11 files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/hdfsproxy/src/test/resources/proxy-config>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/hdfsproxy/src/test/resources>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/hdfsproxy/src/test/resources>
    [junit] Running org.apache.hadoop.hdfsproxy.TestHdfsProxy
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 6.742 sec
    [junit] Test org.apache.hadoop.hdfsproxy.TestHdfsProxy FAILED

BUILD FAILED
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:1114: The following error occurred while executing this line:
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:1103: The following error occurred while executing this line:
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/build.xml>:51: The following error occurred while executing this line:
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/hdfsproxy/build.xml>:278: Tests failed!
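
The entire 250-minute run is sunk by the single hdfsproxy error above (Tests run: 1, Errors: 1). To iterate on that one test instead of repeating the full build, one option is to drive the contrib build directly through the same test-junit target this log invokes. A sketch only, not the project's documented procedure, assuming a checkout of this branch with the core jars already built:

     # Rerun just the hdfsproxy contrib tests. test-junit is the target
     # shown in the log above; the report location is assumed from the
     # compile-test output path (build/contrib/hdfsproxy/test).
     cd src/contrib/hdfsproxy
     ant clean test-junit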

Total time: 250 minutes 25 seconds
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 


Build failed in Jenkins: Hadoop-0.20.204-Build #19

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/19/>

------------------------------------------
[...truncated 7179 lines...]
     [exec] aclocal.m4:3644: _LT_AC_LANG_F77_CONFIG is expanded from...
     [exec] aclocal.m4:3643: AC_LIBTOOL_LANG_F77_CONFIG is expanded from...
     [exec] configure.ac:36: warning: AC_CACHE_VAL(lt_prog_compiler_pic_works_GCJ, ...): suspicious cache-id, must contain _cv_ to be cached
     [exec] aclocal.m4:3744: _LT_AC_LANG_GCJ_CONFIG is expanded from...
     [exec] aclocal.m4:3743: AC_LIBTOOL_LANG_GCJ_CONFIG is expanded from...
     [exec] /bin/bash ./config.status --recheck
     [exec] running CONFIG_SHELL=/bin/bash /bin/bash <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes/configure> --prefix=<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64> --with-hadoop-utils=<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64> --with-hadoop-pipes=<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64> --no-create --no-recursion
     [exec] checking for a BSD-compatible install... /usr/bin/install -c
     [exec] checking whether build environment is sane... yes
     [exec] checking for gawk... no
     [exec] checking for mawk... mawk
     [exec] checking whether make sets $(MAKE)... yes
     [exec] checking for style of include used by make... GNU
     [exec] checking for gcc... gcc
     [exec] checking whether the C compiler works... yes
     [exec] checking for C compiler default output file name... a.out
     [exec] checking for suffix of executables... 
     [exec] checking whether we are cross compiling... no
     [exec] checking for suffix of object files... o
     [exec] checking whether we are using the GNU C compiler... yes
     [exec] checking whether gcc accepts -g... yes
     [exec] checking for gcc option to accept ISO C89... none needed
     [exec] checking dependency style of gcc... gcc3
     [exec] checking how to run the C preprocessor... gcc -E
     [exec] checking for grep that handles long lines and -e... /bin/grep
     [exec] checking for egrep... /bin/grep -E
     [exec] checking for ANSI C header files... yes
     [exec] checking for sys/types.h... yes
     [exec] checking for sys/stat.h... yes
     [exec] checking for stdlib.h... yes
     [exec] checking for string.h... yes
     [exec] checking for memory.h... yes
     [exec] checking for strings.h... yes
     [exec] checking for inttypes.h... yes
     [exec] checking for stdint.h... yes
     [exec] checking for unistd.h... yes
     [exec] checking minix/config.h usability... no
     [exec] checking minix/config.h presence... no
     [exec] checking for minix/config.h... no
     [exec] checking whether it is safe to define __EXTENSIONS__... yes
     [exec] checking for special C compiler options needed for large files... no
     [exec] checking for _FILE_OFFSET_BITS value needed for large files... no
     [exec] checking pthread.h usability... yes
     [exec] checking pthread.h presence... yes
     [exec] checking for pthread.h... yes
     [exec] checking for pthread_create in -lpthread... yes
     [exec] checking for HMAC_Init in -lssl... yes
     [exec] checking for g++... g++
     [exec] checking whether we are using the GNU C++ compiler... yes
     [exec] checking whether g++ accepts -g... yes
     [exec] checking dependency style of g++... gcc3
     [exec] checking build system type... x86_64-unknown-linux-gnu
     [exec] checking host system type... x86_64-unknown-linux-gnu
     [exec] checking for a sed that does not truncate output... /bin/sed
     [exec] checking for ld used by gcc... /usr/bin/ld
     [exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
     [exec] checking for /usr/bin/ld option to reload object files... -r
     [exec] checking for BSD-compatible nm... /usr/bin/nm -B
     [exec] checking whether ln -s works... yes
     [exec] checking how to recognise dependent libraries... pass_all
     [exec] checking dlfcn.h usability... yes
     [exec] checking dlfcn.h presence... yes
     [exec] checking for dlfcn.h... yes
     [exec] checking how to run the C++ preprocessor... g++ -E
     [exec] checking for g77... no
     [exec] checking for xlf... no
     [exec] checking for f77... no
     [exec] checking for frt... no
     [exec] checking for pgf77... no
     [exec] checking for cf77... no
     [exec] checking for fort77... no
     [exec] checking for fl32... no
     [exec] checking for af77... no
     [exec] checking for xlf90... no
     [exec] checking for f90... no
     [exec] checking for pgf90... no
     [exec] checking for pghpf... no
     [exec] checking for epcf90... no
     [exec] checking for gfortran... no
     [exec] checking for g95... no
     [exec] checking for xlf95... no
     [exec] checking for f95... no
     [exec] checking for fort... no
     [exec] checking for ifort... no
     [exec] checking for ifc... no
     [exec] checking for efc... no
     [exec] checking for pgf95... no
     [exec] checking for lf95... no
     [exec] checking for ftn... no
     [exec] checking whether we are using the GNU Fortran 77 compiler... no
     [exec] checking whether  accepts -g... no
     [exec] checking the maximum length of command line arguments... 32768
     [exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
     [exec] checking for objdir... .libs
     [exec] checking for ar... ar
     [exec] checking for ranlib... ranlib
     [exec] checking for strip... strip
     [exec] checking if gcc static flag  works... yes
     [exec] checking if gcc supports -fno-rtti -fno-exceptions... no
     [exec] checking for gcc option to produce PIC... -fPIC
     [exec] checking if gcc PIC flag -fPIC works... yes
     [exec] checking if gcc supports -c -o file.o... yes
     [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking whether -lc should be explicitly linked in... no
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] checking if libtool supports shared libraries... yes
     [exec] checking whether to build shared libraries... yes
     [exec] checking whether to build static libraries... yes
     [exec] configure: creating libtool
     [exec] appending configuration tag "CXX" to libtool
     [exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
     [exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking for g++ option to produce PIC... -fPIC
     [exec] checking if g++ PIC flag -fPIC works... yes
     [exec] checking if g++ supports -c -o file.o... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] appending configuration tag "F77" to libtool
     [exec] checking for unistd.h... (cached) yes
     [exec] checking for stdbool.h that conforms to C99... yes
     [exec] checking for _Bool... no
     [exec] checking for an ANSI C-conforming const... yes
     [exec] checking for off_t... yes
     [exec] checking for size_t... yes
     [exec] checking whether strerror_r is declared... yes
     [exec] checking for strerror_r... yes
     [exec] checking whether strerror_r returns char *... yes
     [exec] checking for mkdir... yes
     [exec] checking for uname... yes
     [exec] checking for shutdown in -lsocket... no
     [exec] checking for xdr_float in -lnsl... yes
     [exec] configure: creating ./config.status
     [exec]  /bin/bash ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating impl/config.h
     [exec] config.status: impl/config.h is unchanged
     [exec] config.status: executing depfiles commands
     [exec] depbase=`echo impl/wordcount-simple.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] 	if g++ -DHAVE_CONFIG_H -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes> -I./impl    -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -MT impl/wordcount-simple.o -MD -MP -MF "$depbase.Tpo" -c -o impl/wordcount-simple.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes/impl/wordcount-simple.cc>; \
     [exec] 	then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CXX g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2   -o wordcount-simple  impl/wordcount-simple.o -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread 
     [exec] mkdir .libs
     [exec] g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -o wordcount-simple impl/wordcount-simple.o  -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] depbase=`echo impl/wordcount-part.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] 	if g++ -DHAVE_CONFIG_H -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes> -I./impl    -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -MT impl/wordcount-part.o -MD -MP -MF "$depbase.Tpo" -c -o impl/wordcount-part.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes/impl/wordcount-part.cc>; \
     [exec] 	then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CXX g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2   -o wordcount-part  impl/wordcount-part.o -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread 
     [exec] g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -o wordcount-part impl/wordcount-part.o  -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] depbase=`echo impl/wordcount-nopipe.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] 	if g++ -DHAVE_CONFIG_H -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes> -I./impl    -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -MT impl/wordcount-nopipe.o -MD -MP -MF "$depbase.Tpo" -c -o impl/wordcount-nopipe.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes/impl/wordcount-nopipe.cc>; \
     [exec] 	then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CXX g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2   -o wordcount-nopipe  impl/wordcount-nopipe.o -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread 
     [exec] g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -o wordcount-nopipe impl/wordcount-nopipe.o  -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] depbase=`echo impl/sort.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] 	if g++ -DHAVE_CONFIG_H -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes> -I./impl    -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -MT impl/sort.o -MD -MP -MF "$depbase.Tpo" -c -o impl/sort.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes/impl/sort.cc>; \
     [exec] 	then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CXX g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2   -o pipes-sort  impl/sort.o -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread 
     [exec] g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -o pipes-sort impl/sort.o  -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-amd64-64/examples/pipes>'
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin>" || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin>"
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-simple' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-simple>'
     [exec] /usr/bin/install -c wordcount-simple <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-simple>
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-part' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-part>'
     [exec] /usr/bin/install -c wordcount-part <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-part>
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-nopipe' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-nopipe>'
     [exec] /usr/bin/install -c wordcount-nopipe <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-nopipe>
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'pipes-sort' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/pipes-sort>'
     [exec] /usr/bin/install -c pipes-sort <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/pipes-sort>
     [exec] make[1]: Nothing to be done for `install-data-am'.
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-amd64-64/examples/pipes>'

compile-c++-examples:

compile-examples:
    [javac] Compiling 24 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/examples>
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.

examples:
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/hadoop-examples-0.20.204.jar>

generate-test-records:

compile-core-test:
    [javac] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/classes>
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] Compiling 496 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/classes>
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testjar>
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testjar/testjob.jar>
    [javac] Compiling 1 source file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testshell>
    [javac] Note: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/test/testshell/ExternalMapReduce.java> uses or overrides a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testshell/testshell.jar>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/debug>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/debug>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>

jar-test:
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/hadoop-test-0.20.204.jar>

ant-tasks:
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ant/org/apache/hadoop/ant>
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/hadoop-ant-0.20.204.jar>

compile-librecordio:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/librecordio>
     [exec] g++ -g3 -O0 -Wall -c -I/include -o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/librecordio/recordio.o> recordio.cc
     [exec] In file included from recordio.cc:22:
     [exec] xmlarchive.hh:22:41: error: xercesc/parsers/SAXParser.hpp: No such file or directory
     [exec] xmlarchive.hh:23:42: error: xercesc/util/PlatformUtils.hpp: No such file or directory
     [exec] xmlarchive.hh:24:43: error: xercesc/util/BinInputStream.hpp: No such file or directory
     [exec] xmlarchive.hh:25:39: error: xercesc/sax/HandlerBase.hpp: No such file or directory
     [exec] xmlarchive.hh:26:39: error: xercesc/sax/InputSource.hpp: No such file or directory
     [exec] In file included from recordio.cc:22:
     [exec] xmlarchive.hh:31: error: expected constructor, destructor, or type conversion before 'namespace'
     [exec] make: *** [<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/librecordio/recordio.o>] Error 1

BUILD FAILED
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:1878: exec returned: 2
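
This one is a missing build dependency rather than a code regression: librecordio's XML archive header pulls in the Xerces-C headers, and the compile line above (g++ ... -I/include) shows the include root expanding to an empty prefix, i.e. no Xerces-C location reached the make run started at build.xml:1878. A sketch of a fix, assuming the build takes the Xerces-C root via a xercescroot property / XERCESCROOT environment variable and that the slave is Debian/Ubuntu-based (both the property name and the package name are assumptions to verify):

     # Install the Xerces-C development headers, then point the
     # librecordio build at them so -I${XERCESCROOT}/include no longer
     # collapses to -I/include.
     sudo apt-get install libxerces-c2-dev
     ant -Dlibrecordio=true -Dxercescroot=/usr compile-librecordio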

Total time: 4 minutes 1 second
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 


Build failed in Jenkins: Hadoop-0.20.204-Build #18

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/18/>

------------------------------------------
[...truncated 7218 lines...]
  [javadoc] Constructing Javadoc information...
  [javadoc] JDiff: doclet started ...
  [javadoc] Error: file '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/jdiff/hadoop_0.20.9.xml>' does not exist for the old API
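
The JDiff error above is separate from the failure that ends this build: the doclet wants an old-API baseline (lib/jdiff/hadoop_0.20.9.xml) to diff the current javadoc against and cannot find it, but the run proceeds to the next target regardless. A quick check on the slave, as a sketch:

     # Confirm which JDiff baselines are actually present in the workspace.
     ls lib/jdiff/
     test -f lib/jdiff/hadoop_0.20.9.xml || echo 'old-API baseline missing'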

create-c++-examples-pipes-makefile:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-amd64-64/examples/pipes>
     [exec] checking for a BSD-compatible install... /usr/bin/install -c
     [exec] checking whether build environment is sane... yes
     [exec] checking for gawk... no
     [exec] checking for mawk... mawk
     [exec] checking whether make sets $(MAKE)... yes
     [exec] checking for style of include used by make... GNU
     [exec] checking for gcc... gcc
     [exec] checking for C compiler default output file name... a.out
     [exec] checking whether the C compiler works... yes
     [exec] checking whether we are cross compiling... no
     [exec] checking for suffix of executables... 
     [exec] checking for suffix of object files... o
     [exec] checking whether we are using the GNU C compiler... yes
     [exec] checking whether gcc accepts -g... yes
     [exec] checking for gcc option to accept ISO C89... none needed
     [exec] checking dependency style of gcc... gcc3
     [exec] checking how to run the C preprocessor... gcc -E
     [exec] checking for grep that handles long lines and -e... /bin/grep
     [exec] checking for egrep... /bin/grep -E
     [exec] checking for ANSI C header files... yes
     [exec] checking for sys/types.h... yes
     [exec] checking for sys/stat.h... yes
     [exec] checking for stdlib.h... yes
     [exec] checking for string.h... yes
     [exec] checking for memory.h... yes
     [exec] checking for strings.h... yes
     [exec] checking for inttypes.h... yes
     [exec] checking for stdint.h... yes
     [exec] checking for unistd.h... yes
     [exec] checking minix/config.h usability... no
     [exec] checking minix/config.h presence... no
     [exec] checking for minix/config.h... no
     [exec] checking whether it is safe to define __EXTENSIONS__... yes
     [exec] checking for special C compiler options needed for large files... no
     [exec] checking for _FILE_OFFSET_BITS value needed for large files... no
     [exec] checking pthread.h usability... yes
     [exec] checking pthread.h presence... yes
     [exec] checking for pthread.h... yes
     [exec] checking for pthread_create in -lpthread... yes
     [exec] checking for HMAC_Init in -lssl... yes
     [exec] checking for g++... g++
     [exec] checking whether we are using the GNU C++ compiler... yes
     [exec] checking whether g++ accepts -g... yes
     [exec] checking dependency style of g++... gcc3
     [exec] checking for a BSD-compatible install... /usr/bin/install -c
     [exec] checking build system type... x86_64-unknown-linux-gnu
     [exec] checking host system type... x86_64-unknown-linux-gnu
     [exec] checking for a sed that does not truncate output... /bin/sed
     [exec] checking for ld used by gcc... /usr/bin/ld
     [exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
     [exec] checking for /usr/bin/ld option to reload object files... -r
     [exec] checking for BSD-compatible nm... /usr/bin/nm -B
     [exec] checking whether ln -s works... yes
     [exec] checking how to recognise dependent libraries... pass_all
     [exec] checking dlfcn.h usability... yes
     [exec] checking dlfcn.h presence... yes
     [exec] checking for dlfcn.h... yes
     [exec] checking how to run the C++ preprocessor... g++ -E
     [exec] checking for g77... no
     [exec] checking for xlf... no
     [exec] checking for f77... no
     [exec] checking for frt... no
     [exec] checking for pgf77... no
     [exec] checking for cf77... no
     [exec] checking for fort77... no
     [exec] checking for fl32... no
     [exec] checking for af77... no
     [exec] checking for xlf90... no
     [exec] checking for f90... no
     [exec] checking for pgf90... no
     [exec] checking for pghpf... no
     [exec] checking for epcf90... no
     [exec] checking for gfortran... no
     [exec] checking for g95... no
     [exec] checking for xlf95... no
     [exec] checking for f95... no
     [exec] checking for fort... no
     [exec] checking for ifort... no
     [exec] checking for ifc... no
     [exec] checking for efc... no
     [exec] checking for pgf95... no
     [exec] checking for lf95... no
     [exec] checking for ftn... no
     [exec] checking whether we are using the GNU Fortran 77 compiler... no
     [exec] checking whether  accepts -g... no
     [exec] checking the maximum length of command line arguments... 32768
     [exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
     [exec] checking for objdir... .libs
     [exec] checking for ar... ar
     [exec] checking for ranlib... ranlib
     [exec] checking for strip... strip
     [exec] checking if gcc static flag  works... yes
     [exec] checking if gcc supports -fno-rtti -fno-exceptions... no
     [exec] checking for gcc option to produce PIC... -fPIC
     [exec] checking if gcc PIC flag -fPIC works... yes
     [exec] checking if gcc supports -c -o file.o... yes
     [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking whether -lc should be explicitly linked in... no
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] checking if libtool supports shared libraries... yes
     [exec] checking whether to build shared libraries... yes
     [exec] checking whether to build static libraries... yes
     [exec] configure: creating libtool
     [exec] appending configuration tag "CXX" to libtool
     [exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
     [exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking for g++ option to produce PIC... -fPIC
     [exec] checking if g++ PIC flag -fPIC works... yes
     [exec] checking if g++ supports -c -o file.o... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] appending configuration tag "F77" to libtool
     [exec] checking for unistd.h... (cached) yes
     [exec] checking for stdbool.h that conforms to C99... yes
     [exec] checking for _Bool... no
     [exec] checking for an ANSI C-conforming const... yes
     [exec] checking for off_t... yes
     [exec] checking for size_t... yes
     [exec] checking whether strerror_r is declared... yes
     [exec] checking for strerror_r... yes
     [exec] checking whether strerror_r returns char *... yes
     [exec] checking for mkdir... yes
     [exec] checking for uname... yes
     [exec] checking for shutdown in -lsocket... no
     [exec] checking for xdr_float in -lnsl... yes
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating impl/config.h
     [exec] config.status: executing depfiles commands

compile-c++-examples-pipes:
     [exec] depbase=`echo impl/wordcount-simple.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] 	if g++ -DHAVE_CONFIG_H -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes> -I./impl    -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -MT impl/wordcount-simple.o -MD -MP -MF "$depbase.Tpo" -c -o impl/wordcount-simple.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes/impl/wordcount-simple.cc>; \
     [exec] 	then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CXX g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2   -o wordcount-simple  impl/wordcount-simple.o -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread 
     [exec] mkdir .libs
     [exec] g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -o wordcount-simple impl/wordcount-simple.o  -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] depbase=`echo impl/wordcount-part.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] 	if g++ -DHAVE_CONFIG_H -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes> -I./impl    -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -MT impl/wordcount-part.o -MD -MP -MF "$depbase.Tpo" -c -o impl/wordcount-part.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes/impl/wordcount-part.cc>; \
     [exec] 	then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CXX g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2   -o wordcount-part  impl/wordcount-part.o -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread 
     [exec] g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -o wordcount-part impl/wordcount-part.o  -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] depbase=`echo impl/wordcount-nopipe.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] 	if g++ -DHAVE_CONFIG_H -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes> -I./impl    -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -MT impl/wordcount-nopipe.o -MD -MP -MF "$depbase.Tpo" -c -o impl/wordcount-nopipe.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes/impl/wordcount-nopipe.cc>; \
     [exec] 	then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CXX g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2   -o wordcount-nopipe  impl/wordcount-nopipe.o -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread 
     [exec] g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -o wordcount-nopipe impl/wordcount-nopipe.o  -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] depbase=`echo impl/sort.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] 	if g++ -DHAVE_CONFIG_H -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes> -I./impl    -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -MT impl/sort.o -MD -MP -MF "$depbase.Tpo" -c -o impl/sort.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/examples/pipes/impl/sort.cc>; \
     [exec] 	then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CXX g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2   -o pipes-sort  impl/sort.o -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread 
     [exec] g++ -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/include> -g -O2 -o pipes-sort impl/sort.o  -L<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-amd64-64/lib> -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-amd64-64/examples/pipes>'
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin>" || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin>"
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-simple' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-simple>'
     [exec] /usr/bin/install -c wordcount-simple <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-simple>
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-part' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-part>'
     [exec] /usr/bin/install -c wordcount-part <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-part>
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-nopipe' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-nopipe>'
     [exec] /usr/bin/install -c wordcount-nopipe <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/wordcount-nopipe>
     [exec]   /bin/bash ./libtool --mode=install /usr/bin/install -c 'pipes-sort' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/pipes-sort>'
     [exec] /usr/bin/install -c pipes-sort <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-examples/Linux-amd64-64/bin/pipes-sort>
     [exec] make[1]: Nothing to be done for `install-data-am'.
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-amd64-64/examples/pipes>'

compile-c++-examples:

compile-examples:
    [javac] Compiling 24 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/examples>
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.

examples:
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/hadoop-examples-0.20.204.jar>

generate-test-records:

compile-core-test:
    [javac] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/classes>
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] Compiling 496 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/classes>
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testjar>
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testjar/testjob.jar>
    [javac] Compiling 1 source file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testshell>
    [javac] Note: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/test/testshell/ExternalMapReduce.java> uses or overrides a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/testshell/testshell.jar>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/debug>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/debug>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/cache>

jar-test:
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/hadoop-test-0.20.204.jar>

ant-tasks:
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ant/org/apache/hadoop/ant>
      [jar] Building jar: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/hadoop-ant-0.20.204.jar>

compile-librecordio:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/librecordio>
     [exec] g++ -g3 -O0 -Wall -c -I/include -o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/librecordio/recordio.o> recordio.cc
     [exec] In file included from recordio.cc:22:
     [exec] xmlarchive.hh:22:41: error: xercesc/parsers/SAXParser.hpp: No such file or directory
     [exec] xmlarchive.hh:23:42: error: xercesc/util/PlatformUtils.hpp: No such file or directory
     [exec] xmlarchive.hh:24:43: error: xercesc/util/BinInputStream.hpp: No such file or directory
     [exec] xmlarchive.hh:25:39: error: xercesc/sax/HandlerBase.hpp: No such file or directory
     [exec] xmlarchive.hh:26:39: error: xercesc/sax/InputSource.hpp: No such file or directory
     [exec] In file included from recordio.cc:22:
     [exec] xmlarchive.hh:31: error: expected constructor, destructor, or type conversion before 'namespace'
     [exec] make: *** [<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/librecordio/recordio.o>] Error 1

BUILD FAILED
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:1878: exec returned: 2
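
Same root cause as build #19 above: the Xerces-C headers are missing on the slave, so the librecordio sketch given there applies to this run unchanged.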

Total time: 4 minutes 2 seconds
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 


Build failed in Jenkins: Hadoop-0.20.204-Build #17

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/17/>

------------------------------------------
[...truncated 4991 lines...]
     [exec]      during execution
     [exec]    - add LIBDIR to the `LD_RUN_PATH' environment variable
     [exec]      during linking
     [exec]    - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
     [exec]    - have your system administrator add LIBDIR to `/etc/ld.so.conf'
     [exec] 
     [exec] See any operating system documentation about shared libraries for
     [exec] more information, such as the ld(1) and ld.so(8) manual pages.
     [exec] ----------------------------------------------------------------------
     [exec] make[1]: Nothing to be done for `install-data-am'.
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-amd64-64/libhdfs>'
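
The truncated block above is libtool's standard post-install notice for libhdfs: the freshly installed shared library sits outside the default linker search path, so anything loading it at run time has to be told where it lives. In shell terms, a sketch (the lib directory is inferred from the -L paths elsewhere in this log):

     # Make the just-built libhdfs.so findable at run time ...
     export LD_LIBRARY_PATH="$PWD/build/c++/Linux-amd64-64/lib:$LD_LIBRARY_PATH"
     # ... or bake the path into the binary at link time instead:
     #   g++ ... -Wl,-rpath -Wl,"$PWD/build/c++/Linux-amd64-64/lib"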

compile-contrib:

compile:

check-contrib:

init:
     [echo] contrib: capacity-scheduler
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/capacity-scheduler>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/capacity-scheduler/classes>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/capacity-scheduler/test>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/capacity-scheduler/system>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/capacity-scheduler/system/classes>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/capacity-scheduler/examples>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/capacity-scheduler/test/logs>

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#capacity-scheduler;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in default
[ivy:resolve] 	found junit#junit;4.5 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api;2.5-20081211 in maven2
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.0.1 in default
[ivy:resolve] 	found commons-codec#commons-codec;1.4 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-core-asl;1.0.1 in maven2
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in maven2
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 249ms :: artifacts dl 12ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   20  |   0   |   0   |   3   ||   17  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#capacity-scheduler [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	17 artifacts copied, 0 already retrieved (4642kB/25ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivysettings.xml>
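
The DEPRECATED notice above is harmless but easy to silence: the contrib builds still set ivy.conf.file where Ivy 2.1.0 expects ivy.settings.file. Locating the stragglers, as a sketch:

     # Find build files still using the deprecated property name.
     grep -rn 'ivy.conf.file' build.xml src/contrib/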

compile:
     [echo] contrib: capacity-scheduler
    [javac] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/capacity-scheduler/classes>
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.

check-contrib:

init:
     [echo] contrib: datajoin
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/datajoin>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/datajoin/classes>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/datajoin/test>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/datajoin/system>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/datajoin/system/classes>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/datajoin/examples>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/datajoin/test/logs>

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#datajoin;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in default
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve] 	found commons-collections#commons-collections;3.2.1 in maven2
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in default
[ivy:resolve] 	found commons-logging#commons-logging;1.1.1 in default
[ivy:resolve] 	found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve] 	found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] 	found junit#junit;4.5 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api;2.5-20081211 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-core-asl;1.0.1 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in maven2
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.0.1 in default
[ivy:resolve] :: resolution report :: resolve 189ms :: artifacts dl 11ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve] 	commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   19  |   0   |   0   |   3   ||   16  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#datajoin [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	16 artifacts copied, 0 already retrieved (4585kB/23ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivysettings.xml>

compile:
     [echo] contrib: datajoin
    [javac] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/datajoin/classes>
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

check-contrib:

init:
     [echo] contrib: eclipse-plugin
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/eclipse-plugin>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/eclipse-plugin/classes>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/eclipse-plugin/test>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/eclipse-plugin/system>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/eclipse-plugin/system/classes>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/eclipse-plugin/examples>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/eclipse-plugin/test/logs>

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#eclipse-plugin;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in default
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] :: resolution report :: resolve 25ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   2   |   0   |   0   |   0   ||   2   |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#eclipse-plugin [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	2 artifacts copied, 0 already retrieved (419kB/9ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivysettings.xml>

compile:
     [echo] contrib: eclipse-plugin
    [javac] Compiling 45 source files to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/contrib/eclipse-plugin/classes>
    [javac] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/HadoopApplicationLaunchShortcut.java>:35: cannot find symbol
    [javac] symbol  : class JavaApplicationLaunchShortcut
    [javac] location: package org.eclipse.jdt.debug.ui.launchConfigurations
    [javac] import org.eclipse.jdt.debug.ui.launchConfigurations.JavaApplicationLaunchShortcut;
    [javac]                                                     ^
    [javac] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/HadoopApplicationLaunchShortcut.java>:49: cannot find symbol
    [javac] symbol: class JavaApplicationLaunchShortcut
    [javac]     JavaApplicationLaunchShortcut {
    [javac]     ^
    [javac] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/HadoopApplicationLaunchShortcut.java>:66: cannot find symbol
    [javac] symbol  : variable super
    [javac] location: class org.apache.hadoop.eclipse.launch.HadoopApplicationLaunchShortcut
    [javac]         super.findLaunchConfiguration(type, configType);
    [javac]         ^
    [javac] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/HadoopApplicationLaunchShortcut.java>:67: cannot find symbol
    [javac] symbol  : variable super
    [javac] location: class org.apache.hadoop.eclipse.launch.HadoopApplicationLaunchShortcut
    [javac]     if (iConf == null) iConf = super.createConfiguration(type);
    [javac]                                ^
    [javac] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/launch/HadoopApplicationLaunchShortcut.java>:60: method does not override or implement a method from a supertype
    [javac]   @Override
    [javac]   ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 5 errors

BUILD FAILED
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:645: The following error occurred while executing this line:
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/build.xml>:30: The following error occurred while executing this line:
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/contrib/eclipse-plugin/build.xml>:61: Compile failed; see the compiler error output for details.

Total time: 1 minute 36 seconds
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 
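
The five eclipse-plugin errors above all trace back to one missing type: javac cannot resolve org.eclipse.jdt.debug.ui.launchConfigurations.JavaApplicationLaunchShortcut, so the subclass declaration, both super.* calls, and the @Override check fail in turn. That class is public JDT API only in newer Eclipse releases, so the likely culprit (our assumption, not confirmed in the log) is that -Declipse.home on the build slave points at an SDK too old to ship it. A compile-only sketch of what the contrib source assumes about the classpath:

    // Sketch only: assumes an Eclipse SDK whose org.eclipse.jdt.debug.ui bundle
    // exports this class as public API; older SDKs only had an internal variant.
    import org.eclipse.jdt.debug.ui.launchConfigurations.JavaApplicationLaunchShortcut;

    public class HadoopApplicationLaunchShortcut extends JavaApplicationLaunchShortcut {
        // With the superclass on the compile classpath, the members the compiler
        // rejected above -- super.findLaunchConfiguration(type, configType) and
        // super.createConfiguration(type) -- resolve, and @Override is satisfied.
    }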


Build failed in Jenkins: Hadoop-0.20.204-Build #16

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/16/>

------------------------------------------
[...truncated 3366 lines...]
A         src/examples/org/apache/hadoop/examples/ExampleDriver.java
A         src/examples/org/apache/hadoop/examples/RandomWriter.java
A         src/examples/org/apache/hadoop/examples/package.html
A         src/examples/org/apache/hadoop/examples/RandomTextWriter.java
A         src/examples/org/apache/hadoop/examples/terasort
A         src/examples/org/apache/hadoop/examples/terasort/job_history_summary.py
A         src/examples/org/apache/hadoop/examples/terasort/TeraSort.java
A         src/examples/org/apache/hadoop/examples/terasort/TeraInputFormat.java
A         src/examples/org/apache/hadoop/examples/terasort/TeraGen.java
A         src/examples/org/apache/hadoop/examples/terasort/TeraOutputFormat.java
A         src/examples/org/apache/hadoop/examples/terasort/TeraValidate.java
A         src/examples/org/apache/hadoop/examples/terasort/package.html
A         src/examples/org/apache/hadoop/examples/dancing
A         src/examples/org/apache/hadoop/examples/dancing/puzzle1.dta
A         src/examples/org/apache/hadoop/examples/dancing/OneSidedPentomino.java
A         src/examples/org/apache/hadoop/examples/dancing/DancingLinks.java
A         src/examples/org/apache/hadoop/examples/dancing/Pentomino.java
A         src/examples/org/apache/hadoop/examples/dancing/Sudoku.java
A         src/examples/org/apache/hadoop/examples/dancing/DistributedPentomino.java
A         src/examples/org/apache/hadoop/examples/dancing/package.html
A         src/examples/org/apache/hadoop/examples/WordCount.java
A         src/examples/org/apache/hadoop/examples/DBCountPageView.java
A         src/examples/org/apache/hadoop/examples/Sort.java
A         src/examples/org/apache/hadoop/examples/AggregateWordCount.java
A         src/examples/org/apache/hadoop/examples/Grep.java
A         src/packages
A         src/packages/hadoop-create-user.sh
A         src/packages/hadoop-setup-hdfs.sh
A         src/packages/hadoop-setup-single-node.sh
A         src/packages/hadoop-setup-conf.sh
A         src/packages/update-hadoop-env.sh
A         src/packages/deb
A         src/packages/deb/init.d
A         src/packages/deb/init.d/hadoop-tasktracker
A         src/packages/deb/init.d/hadoop-datanode
A         src/packages/deb/init.d/hadoop-jobtracker
A         src/packages/deb/init.d/hadoop-namenode
A         src/packages/deb/hadoop.control
A         src/packages/deb/hadoop.control/control
A         src/packages/deb/hadoop.control/postinst
A         src/packages/deb/hadoop.control/postrm
A         src/packages/deb/hadoop.control/preinst
A         src/packages/deb/hadoop.control/conffile
A         src/packages/deb/hadoop.control/prerm
A         src/packages/rpm
A         src/packages/rpm/init.d
A         src/packages/rpm/init.d/hadoop-tasktracker
A         src/packages/rpm/init.d/hadoop-datanode
A         src/packages/rpm/init.d/hadoop-jobtracker
A         src/packages/rpm/init.d/hadoop-namenode
A         src/packages/rpm/spec
A         src/packages/rpm/spec/hadoop.spec
A         src/packages/templates
A         src/packages/templates/conf
A         src/packages/templates/conf/hdfs-site.xml
A         src/packages/templates/conf/core-site.xml
A         src/packages/templates/conf/hadoop-env.sh
A         src/packages/templates/conf/mapred-site.xml
A         bin
A         bin/stop-jobhistoryserver.sh
AU        bin/start-dfs.sh
AU        bin/hadoop-daemon.sh
A         bin/hadoop-config.sh
A         bin/start-jobhistoryserver.sh
AU        bin/stop-balancer.sh
AU        bin/stop-all.sh
AU        bin/stop-mapred.sh
AU        bin/slaves.sh
AU        bin/hadoop-daemons.sh
AU        bin/rcc
AU        bin/stop-dfs.sh
AU        bin/hadoop
AU        bin/start-balancer.sh
AU        bin/start-all.sh
AU        bin/start-mapred.sh
A         README.txt
A         build.xml
 U        .
At revision 1152390
no revision recorded for http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-security-204 in the previous build
[Hadoop-0.20.204-Build] $ /bin/bash -xe /tmp/hudson5966530285364984988.sh
+ export JAVA_HOME=/home/hudson/tools/java/latest1.6-64
+ JAVA_HOME=/home/hudson/tools/java/latest1.6-64
+ export ANT_HOME=/home/hudson/tools/ant/apache-ant-1.7.1
+ ANT_HOME=/home/hudson/tools/ant/apache-ant-1.7.1
+ export FORREST_HOME=/home/nigel/tools/forrest/latest
+ FORREST_HOME=/home/nigel/tools/forrest/latest
+ export ECLIPSE_HOME=/home/nigel/tools/eclipse/latest
+ ECLIPSE_HOME=/home/nigel/tools/eclipse/latest
+ export XERCES_HOME=/home/hudson/tools/xerces/c/latest
+ XERCES_HOME=/home/hudson/tools/xerces/c/latest
+ export JAVA5_HOME=/home/hudson/tools/java/latest1.5
+ JAVA5_HOME=/home/hudson/tools/java/latest1.5
+ export FINDBUGS_HOME=/home/hudson/tools/findbugs/latest
+ FINDBUGS_HOME=/home/hudson/tools/findbugs/latest
+ cd <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk>
+ /home/hudson/tools/ant/apache-ant-1.7.1/bin/ant -Dversion=0.20.204 -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=true -Dlibrecordio=true -Dtest.junit.output.format=xml -Dxercescroot=/home/hudson/tools/xerces/c/latest -Declipse.home=/home/nigel/tools/eclipse/latest -Djava5.home=/home/hudson/tools/java/latest1.5 -Dforrest.home=/home/nigel/tools/forrest/latest -Dfindbugs.home=/home/hudson/tools/findbugs/latest veryclean tar test-c++-libhdfs test findbugs
Buildfile: build.xml

clean-contrib:

clean:

clean:
     [echo] contrib: capacity-scheduler

clean:
     [echo] contrib: datajoin

clean:
     [echo] contrib: eclipse-plugin

clean:
     [echo] contrib: failmon

clean:
     [echo] contrib: fairscheduler

check-libhdfs-fuse:

clean:

clean:
     [echo] contrib: gridmix
Trying to override old definition of task macro_tar

clean:
     [echo] contrib: hdfsproxy

clean:
     [echo] contrib: hod

clean:
     [echo] contrib: index

clean:
     [echo] contrib: streaming

clean:
     [echo] contrib: thriftfs

clean:
     [echo] contrib: vaidya

clean-sign:

clean-fi:

clean:

veryclean:

clover.setup:

clover.info:
     [echo] 
     [echo]      Clover not found. Code coverage reports disabled.
     [echo]   

clover:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivy-2.1.0.jar>

ivy-init-dirs:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/report>

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ ::
:: loading settings :: file = <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#Hadoop;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in default
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.0.1 in default
[ivy:resolve] 	found commons-codec#commons-codec;1.4 in maven2
[ivy:resolve] 	found commons-cli#commons-cli;1.2 in default
[ivy:resolve] 	found xmlenc#xmlenc;0.52 in default
[ivy:resolve] 	found commons-daemon#commons-daemon;1.0.1 in maven2
[ivy:resolve] 	found net.java.dev.jets3t#jets3t;0.6.1 in maven2
[ivy:resolve] 	found commons-net#commons-net;1.4.1 in default
[ivy:resolve] 	found oro#oro;2.0.8 in default
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api;2.5-20081211 in maven2
[ivy:resolve] 	found tomcat#jasper-runtime;5.5.12 in default
[ivy:resolve] 	found tomcat#jasper-compiler;5.5.12 in default
[ivy:resolve] 	found commons-el#commons-el;1.0 in default
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] 	found junit#junit;4.5 in maven2
[ivy:resolve] 	found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-api;1.4.3 in maven2
[ivy:resolve] 	found org.eclipse.jdt#core;3.1.1 in default
[ivy:resolve] 	found org.slf4j#slf4j-log4j12;1.4.3 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-core-asl;1.0.1 in maven2
[ivy:resolve] 	found org.mockito#mockito-all;1.8.5 in maven2
[ivy:resolve] 	found com.jcraft#jsch;0.1.42 in maven2
[ivy:resolve] 	found org.aspectj#aspectjrt;1.6.5 in maven2
[ivy:resolve] 	found org.aspectj#aspectjtools;1.6.5 in maven2
[ivy:resolve] 	found org.vafer#jdeb;0.8 in maven2
[ivy:resolve] :: resolution report :: resolve 3324ms :: artifacts dl 18ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   30  |   0   |   0   |   0   ||   29  |   0   |
	---------------------------------------------------------------------
[ivy:resolve] 
[ivy:resolve] :: problems summary ::
[ivy:resolve] :::: WARNINGS
[ivy:resolve] 	io problem while parsing ivy file: http://repo1.maven.org/maven2/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.pom: Resetting to invalid mark
[ivy:resolve] 	io problem while parsing ivy file: https://oss.sonatype.org/content/groups/public/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.pom: Resetting to invalid mark
[ivy:resolve] 		module not found: commons-configuration#commons-configuration;1.6
[ivy:resolve] 	==== local: tried
[ivy:resolve] 	  /home/hudson/.ivy2/local/commons-configuration/commons-configuration/1.6/ivys/ivy.xml
[ivy:resolve] 	  -- artifact commons-configuration#commons-configuration;1.6!commons-configuration.jar:
[ivy:resolve] 	  /home/hudson/.ivy2/local/commons-configuration/commons-configuration/1.6/jars/commons-configuration.jar
[ivy:resolve] 	==== maven2: tried
[ivy:resolve] 	  http://repo1.maven.org/maven2/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.pom
[ivy:resolve] 	==== oss-sonatype: tried
[ivy:resolve] 	  https://oss.sonatype.org/content/groups/public/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.pom
[ivy:resolve] 		::::::::::::::::::::::::::::::::::::::::::::::
[ivy:resolve] 		::          UNRESOLVED DEPENDENCIES         ::
[ivy:resolve] 		::::::::::::::::::::::::::::::::::::::::::::::
[ivy:resolve] 		:: commons-configuration#commons-configuration;1.6: not found
[ivy:resolve] 		::::::::::::::::::::::::::::::::::::::::::::::
[ivy:resolve] 
[ivy:resolve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS

BUILD FAILED
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:2324: impossible to resolve dependencies:
	resolve failed - see output for details

Total time: 7 seconds
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 
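
Build #16's resolve dies on the "Resetting to invalid mark" warnings, which appear to come straight from java.io.BufferedInputStream: Ivy marks the POM stream, reads past the mark's read-limit while inspecting it, and the later reset() throws, so commons-configuration is reported as not found even though the repositories were reachable. A self-contained demo that reproduces the exact exception message (the class name here is ours; BufferedInputStream is the real API):

    import java.io.BufferedInputStream;
    import java.io.ByteArrayInputStream;
    import java.io.IOException;

    public class MarkResetDemo {
        public static void main(String[] args) throws IOException {
            BufferedInputStream in = new BufferedInputStream(
                    new ByteArrayInputStream(new byte[8192]), 16); // 16-byte buffer
            in.mark(16);             // mark valid for at most the next 16 bytes
            in.read(new byte[1024]); // read past the limit: the mark is invalidated
            in.reset();              // java.io.IOException: Resetting to invalid mark
        }
    }

One plausible fix (our suggestion) is making sure Ant actually loads the newer Ivy jar: the configure line above still reports Ivy 2.0.0-rc2 even though ivy-2.1.0.jar was fetched by ivy-download.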


Build failed in Jenkins: Hadoop-0.20.204-Build #15

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/15/>

------------------------------------------
Started by user gkesavan
Building remotely on ubuntu2
Cleaning workspace <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/>
hudson.util.IOException2: remote file operation failed: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/> at hudson.remoting.Channel@4cb129bb:ubuntu2
	at hudson.FilePath.act(FilePath.java:754)
	at hudson.FilePath.act(FilePath.java:740)
	at hudson.scm.SubversionSCM.checkout(SubversionSCM.java:684)
	at hudson.scm.SubversionSCM.checkout(SubversionSCM.java:633)
	at hudson.model.AbstractProject.checkout(AbstractProject.java:1181)
	at hudson.model.AbstractBuild$AbstractRunner.checkout(AbstractBuild.java:536)
	at hudson.model.AbstractBuild$AbstractRunner.run(AbstractBuild.java:424)
	at hudson.model.Run.run(Run.java:1374)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
	at hudson.model.ResourceController.execute(ResourceController.java:88)
	at hudson.model.Executor.run(Executor.java:145)
Caused by: java.io.IOException: Unable to delete <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/test/aop/org/apache/.svn> - files in dir: [<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/test/aop/org/apache/.svn/tmp>, <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/test/aop/org/apache/.svn/all-wcprops>]
	at hudson.Util.deleteFile(Util.java:262)
	at hudson.Util.deleteRecursive(Util.java:305)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.Util.deleteRecursive(Util.java:304)
	at hudson.Util.deleteContentsRecursive(Util.java:224)
	at hudson.scm.subversion.CheckoutUpdater$1.perform(CheckoutUpdater.java:67)
	at hudson.scm.subversion.WorkspaceUpdater$UpdateTask.delegateTo(WorkspaceUpdater.java:135)
	at hudson.scm.SubversionSCM$CheckOutTask.perform(SubversionSCM.java:726)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:707)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:691)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1979)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:270)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
	at java.lang.Thread.run(Thread.java:636)
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 
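
Build #15's failure is different: the workspace wipe itself dies because hudson.Util.deleteRecursive aborts on the first .svn file it cannot remove (our reading: a transient lock left by the previous, aborted checkout). A minimal sketch of the retry-and-back-off delete that works around transient failures like this one (hypothetical helper, not the actual hudson.Util code):

    import java.io.File;

    public class RetryingDelete {
        /** Depth-first delete that retries each path a few times before giving up. */
        static boolean deleteRecursive(File f, int retries) throws InterruptedException {
            File[] children = f.listFiles();
            if (children != null) {
                for (File child : children) {
                    deleteRecursive(child, retries);
                }
            }
            for (int i = 0; i < retries; i++) {
                if (f.delete()) {
                    return true;
                }
                Thread.sleep(100); // back off: another process may still hold the file
            }
            return false;
        }
    }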


Build failed in Jenkins: Hadoop-0.20.204-Build #14

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/14/>

------------------------------------------
Started by user gkesavan
Building remotely on ubuntu2
SCM check out aborted
Archiving artifacts
Cleaning workspace <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/>
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 


Build failed in Jenkins: Hadoop-0.20.204-Build #13

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/13/>

------------------------------------------
[...truncated 3389 lines...]
A         src/examples/org/apache/hadoop/examples/terasort/TeraGen.java
A         src/examples/org/apache/hadoop/examples/terasort/TeraOutputFormat.java
A         src/examples/org/apache/hadoop/examples/terasort/TeraValidate.java
A         src/examples/org/apache/hadoop/examples/terasort/package.html
A         src/examples/org/apache/hadoop/examples/dancing
A         src/examples/org/apache/hadoop/examples/dancing/puzzle1.dta
A         src/examples/org/apache/hadoop/examples/dancing/OneSidedPentomino.java
A         src/examples/org/apache/hadoop/examples/dancing/DancingLinks.java
A         src/examples/org/apache/hadoop/examples/dancing/Pentomino.java
A         src/examples/org/apache/hadoop/examples/dancing/Sudoku.java
A         src/examples/org/apache/hadoop/examples/dancing/DistributedPentomino.java
A         src/examples/org/apache/hadoop/examples/dancing/package.html
A         src/examples/org/apache/hadoop/examples/WordCount.java
A         src/examples/org/apache/hadoop/examples/DBCountPageView.java
A         src/examples/org/apache/hadoop/examples/Sort.java
A         src/examples/org/apache/hadoop/examples/AggregateWordCount.java
A         src/examples/org/apache/hadoop/examples/Grep.java
A         src/packages
A         src/packages/hadoop-create-user.sh
A         src/packages/hadoop-setup-hdfs.sh
A         src/packages/hadoop-setup-single-node.sh
A         src/packages/hadoop-setup-conf.sh
A         src/packages/update-hadoop-env.sh
A         src/packages/deb
A         src/packages/deb/init.d
A         src/packages/deb/init.d/hadoop-tasktracker
A         src/packages/deb/init.d/hadoop-datanode
A         src/packages/deb/init.d/hadoop-jobtracker
A         src/packages/deb/init.d/hadoop-namenode
A         src/packages/deb/hadoop.control
A         src/packages/deb/hadoop.control/control
A         src/packages/deb/hadoop.control/postinst
A         src/packages/deb/hadoop.control/postrm
A         src/packages/deb/hadoop.control/preinst
A         src/packages/deb/hadoop.control/conffile
A         src/packages/deb/hadoop.control/prerm
A         src/packages/rpm
A         src/packages/rpm/init.d
A         src/packages/rpm/init.d/hadoop-tasktracker
A         src/packages/rpm/init.d/hadoop-datanode
A         src/packages/rpm/init.d/hadoop-jobtracker
A         src/packages/rpm/init.d/hadoop-namenode
A         src/packages/rpm/spec
A         src/packages/rpm/spec/hadoop.spec
A         src/packages/templates
A         src/packages/templates/conf
A         src/packages/templates/conf/hdfs-site.xml
A         src/packages/templates/conf/core-site.xml
A         src/packages/templates/conf/hadoop-env.sh
A         src/packages/templates/conf/mapred-site.xml
A         bin
A         bin/stop-jobhistoryserver.sh
AU        bin/start-dfs.sh
AU        bin/hadoop-daemon.sh
A         bin/hadoop-config.sh
A         bin/start-jobhistoryserver.sh
AU        bin/stop-balancer.sh
AU        bin/stop-all.sh
AU        bin/stop-mapred.sh
AU        bin/slaves.sh
AU        bin/hadoop-daemons.sh
AU        bin/rcc
AU        bin/stop-dfs.sh
AU        bin/hadoop
AU        bin/start-balancer.sh
AU        bin/start-all.sh
AU        bin/start-mapred.sh
A         README.txt
A         build.xml
 U        .
At revision 1152369
no revision recorded for http://svn.apache.org/repos/asf/hadoop/nightly in the previous build
no revision recorded for http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-security-204 in the previous build
[Hadoop-0.20.204-Build] $ /bin/bash -xe /tmp/hudson3424346922042454072.sh
+ export JAVA_HOME=/home/hudson/tools/java/latest1.6-64
+ JAVA_HOME=/home/hudson/tools/java/latest1.6-64
+ export ANT_HOME=/home/hudson/tools/ant/apache-ant-1.7.1
+ ANT_HOME=/home/hudson/tools/ant/apache-ant-1.7.1
+ export FORREST_HOME=/home/nigel/tools/forrest/latest
+ FORREST_HOME=/home/nigel/tools/forrest/latest
+ export ECLIPSE_HOME=/home/nigel/tools/eclipse/latest
+ ECLIPSE_HOME=/home/nigel/tools/eclipse/latest
+ export XERCES_HOME=/home/hudson/tools/xerces/c/latest
+ XERCES_HOME=/home/hudson/tools/xerces/c/latest
+ export JAVA5_HOME=/home/hudson/tools/java/latest1.5
+ JAVA5_HOME=/home/hudson/tools/java/latest1.5
+ export FINDBUGS_HOME=/home/hudson/tools/findbugs/latest
+ FINDBUGS_HOME=/home/hudson/tools/findbugs/latest
+ cd <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk>
+ /home/hudson/tools/ant/apache-ant-1.7.1/bin/ant -Dversion=0.20.204 -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=true -Dlibrecordio=true -Dtest.junit.output.format=xml -Dxercescroot=/home/hudson/tools/xerces/c/latest -Declipse.home=/home/nigel/tools/eclipse/latest -Djava5.home=/home/hudson/tools/java/latest1.5 -Dforrest.home=/home/nigel/tools/forrest/latest -Dfindbugs.home=/home/hudson/tools/findbugs/latest tar test-c++-libhdfs test findbugs
Buildfile: build.xml

clover.setup:

clover.info:
     [echo] 
     [echo]      Clover not found. Code coverage reports disabled.
     [echo]   

clover:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivy-2.1.0.jar>

ivy-init-dirs:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/report>

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ ::
:: loading settings :: file = <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#Hadoop;working@vesta.apache.org
[ivy:resolve] 	confs: [common]
[ivy:resolve] 	found commons-logging#commons-logging;1.0.4 in default
[ivy:resolve] 	found log4j#log4j;1.2.15 in maven2
[ivy:resolve] 	found commons-httpclient#commons-httpclient;3.0.1 in default
[ivy:resolve] 	found commons-codec#commons-codec;1.4 in maven2
[ivy:resolve] 	found commons-cli#commons-cli;1.2 in default
[ivy:resolve] 	found xmlenc#xmlenc;0.52 in default
[ivy:resolve] 	found commons-daemon#commons-daemon;1.0.1 in maven2
[ivy:resolve] 	found net.java.dev.jets3t#jets3t;0.6.1 in maven2
[ivy:resolve] 	found commons-net#commons-net;1.4.1 in default
[ivy:resolve] 	found oro#oro;2.0.8 in default
[ivy:resolve] 	found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve] 	found org.mortbay.jetty#servlet-api;2.5-20081211 in maven2
[ivy:resolve] 	found tomcat#jasper-runtime;5.5.12 in default
[ivy:resolve] 	found tomcat#jasper-compiler;5.5.12 in default
[ivy:resolve] 	found commons-el#commons-el;1.0 in default
[ivy:resolve] 	found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] 	found junit#junit;4.5 in maven2
[ivy:resolve] 	found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-api;1.4.3 in maven2
[ivy:resolve] 	found org.eclipse.jdt#core;3.1.1 in default
[ivy:resolve] 	found org.slf4j#slf4j-log4j12;1.4.3 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in maven2
[ivy:resolve] 	found org.codehaus.jackson#jackson-core-asl;1.0.1 in maven2
[ivy:resolve] 	found org.mockito#mockito-all;1.8.5 in maven2
[ivy:resolve] 	found com.jcraft#jsch;0.1.42 in maven2
[ivy:resolve] 	found org.aspectj#aspectjrt;1.6.5 in maven2
[ivy:resolve] 	found org.aspectj#aspectjtools;1.6.5 in maven2
[ivy:resolve] 	found org.vafer#jdeb;0.8 in maven2
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-codec/commons-codec/1.4/commons-codec-1.4.jar ...
[ivy:resolve] ...................................... (56kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] commons-codec#commons-codec;1.4!commons-codec.jar (892ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-daemon/commons-daemon/1.0.1/commons-daemon-1.0.1.jar ...
[ivy:resolve] ........ (13kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] commons-daemon#commons-daemon;1.0.1!commons-daemon.jar (798ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar ...
[ivy:resolve] .......................... (527kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.mortbay.jetty#jetty;6.1.26!jetty.jar (1320ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar ...
[ivy:resolve] ................................................................................................................... (172kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.mortbay.jetty#jetty-util;6.1.26!jetty-util.jar (992ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar ...
[ivy:resolve] ................................................ (130kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.mortbay.jetty#servlet-api;2.5-20081211!servlet-api.jar (985ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/commons/commons-math/2.1/commons-math-2.1.jar ...
[ivy:resolve] .......................... (812kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.apache.commons#commons-math;2.1!commons-math.jar (1374ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/mockito/mockito-all/1.8.5/mockito-all-1.8.5.jar ...
[ivy:resolve] .......................... (1386kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.mockito#mockito-all;1.8.5!mockito-all.jar (1168ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/com/jcraft/jsch/0.1.42/jsch-0.1.42.jar ...
[ivy:resolve] ...................................................................................................... (181kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] com.jcraft#jsch;0.1.42!jsch.jar (986ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/aspectj/aspectjrt/1.6.5/aspectjrt-1.6.5.jar ...
[ivy:resolve] ............................................................................ (113kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.aspectj#aspectjrt;1.6.5!aspectjrt.jar (981ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/aspectj/aspectjtools/1.6.5/aspectjtools-1.6.5.jar ...
[ivy:resolve] .......................... (8562kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.aspectj#aspectjtools;1.6.5!aspectjtools.jar (1932ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/vafer/jdeb/0.8/jdeb-0.8.jar ...
[ivy:resolve] ................................................ (215kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.vafer#jdeb;0.8!jdeb.jar(maven-plugin) (948ms)
[ivy:resolve] :: resolution report :: resolve 25817ms :: artifacts dl 13130ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   30  |   11  |   11  |   0   ||   29  |   11  |
	---------------------------------------------------------------------
[ivy:resolve] 
[ivy:resolve] :: problems summary ::
[ivy:resolve] :::: WARNINGS
[ivy:resolve] 	io problem while parsing ivy file: http://repo1.maven.org/maven2/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.pom: Resetting to invalid mark
[ivy:resolve] 	io problem while parsing ivy file: https://oss.sonatype.org/content/groups/public/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.pom: Resetting to invalid mark
[ivy:resolve] 		module not found: commons-configuration#commons-configuration;1.6
[ivy:resolve] 	==== local: tried
[ivy:resolve] 	  /home/hudson/.ivy2/local/commons-configuration/commons-configuration/1.6/ivys/ivy.xml
[ivy:resolve] 	  -- artifact commons-configuration#commons-configuration;1.6!commons-configuration.jar:
[ivy:resolve] 	  /home/hudson/.ivy2/local/commons-configuration/commons-configuration/1.6/jars/commons-configuration.jar
[ivy:resolve] 	==== maven2: tried
[ivy:resolve] 	  http://repo1.maven.org/maven2/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.pom
[ivy:resolve] 	==== oss-sonatype: tried
[ivy:resolve] 	  https://oss.sonatype.org/content/groups/public/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.pom
[ivy:resolve] 		::::::::::::::::::::::::::::::::::::::::::::::
[ivy:resolve] 		::          UNRESOLVED DEPENDENCIES         ::
[ivy:resolve] 		::::::::::::::::::::::::::::::::::::::::::::::
[ivy:resolve] 		:: commons-configuration#commons-configuration;1.6: not found
[ivy:resolve] 		::::::::::::::::::::::::::::::::::::::::::::::
[ivy:resolve] :::: ERRORS
[ivy:resolve] 	unknown resolver ibiblio
[ivy:resolve] 	unknown resolver chain
[ivy:resolve] 	unknown resolver ibiblio
[ivy:resolve] 	unknown resolver chain
[ivy:resolve] 	unknown resolver ibiblio
[ivy:resolve] 	unknown resolver chain
[ivy:resolve] 	unknown resolver ibiblio
[ivy:resolve] 	unknown resolver chain
[ivy:resolve] 	unknown resolver ibiblio
[ivy:resolve] 	unknown resolver chain
[ivy:resolve] 	unknown resolver ibiblio
[ivy:resolve] 	unknown resolver chain
[ivy:resolve] 	unknown resolver public
[ivy:resolve] 	unknown resolver ibiblio
[ivy:resolve] 	unknown resolver chain
[ivy:resolve] 
[ivy:resolve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS

BUILD FAILED
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:2324: impossible to resolve dependencies:
	resolve failed - see output for details

Total time: 41 seconds
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 


Build failed in Jenkins: Hadoop-0.20.204-Build #12

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/12/>

------------------------------------------
Started by user gkesavan
Building remotely on ubuntu2
Cleaning workspace <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/>
Checking out http://svn.apache.org/repos/asf/hadoop/nightly
SCM check out aborted
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
ERROR: Failed to check out http://svn.apache.org/repos/asf/hadoop/nightly
org.tmatesoft.svn.core.SVNCancelException: svn: REPORT /repos/asf/!svn/vcc/default failed
	at org.tmatesoft.svn.core.internal.io.dav.http.HTTPConnection.request(HTTPConnection.java:287)
	at org.tmatesoft.svn.core.internal.io.dav.http.HTTPConnection.request(HTTPConnection.java:276)
	at org.tmatesoft.svn.core.internal.io.dav.http.HTTPConnection.request(HTTPConnection.java:264)
	at org.tmatesoft.svn.core.internal.io.dav.DAVConnection.doReport(DAVConnection.java:266)
	at org.tmatesoft.svn.core.internal.io.dav.DAVRepository.runReport(DAVRepository.java:1263)
	at org.tmatesoft.svn.core.internal.io.dav.DAVRepository.update(DAVRepository.java:820)
	at org.tmatesoft.svn.core.wc.SVNUpdateClient.update(SVNUpdateClient.java:564)
	at org.tmatesoft.svn.core.wc.SVNUpdateClient.doCheckout(SVNUpdateClient.java:922)
	at hudson.scm.subversion.CheckoutUpdater$1.perform(CheckoutUpdater.java:83)
	at hudson.scm.subversion.WorkspaceUpdater$UpdateTask.delegateTo(WorkspaceUpdater.java:135)
	at hudson.scm.SubversionSCM$CheckOutTask.perform(SubversionSCM.java:726)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:707)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:691)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1979)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:270)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
	at java.lang.Thread.run(Thread.java:636)
Caused by: org.tmatesoft.svn.core.SVNCancelException: svn: REPORT request failed on '/repos/asf/!svn/vcc/default'
svn: Operation cancelled
	at org.tmatesoft.svn.core.internal.wc.SVNErrorManager.error(SVNErrorManager.java:60)
	at org.tmatesoft.svn.core.internal.wc.SVNErrorManager.error(SVNErrorManager.java:51)
	at org.tmatesoft.svn.core.internal.io.dav.http.HTTPConnection._request(HTTPConnection.java:638)
	at org.tmatesoft.svn.core.internal.io.dav.http.HTTPConnection.request(HTTPConnection.java:285)
	... 22 more
Caused by: org.tmatesoft.svn.core.SVNErrorMessage: svn: REPORT request failed on '/repos/asf/!svn/vcc/default'
	at org.tmatesoft.svn.core.SVNErrorMessage.create(SVNErrorMessage.java:200)
	at org.tmatesoft.svn.core.SVNErrorMessage.create(SVNErrorMessage.java:146)
	at org.tmatesoft.svn.core.SVNErrorMessage.create(SVNErrorMessage.java:89)
	at org.tmatesoft.svn.core.SVNErrorMessage.wrap(SVNErrorMessage.java:366)
	... 24 more
Caused by: org.tmatesoft.svn.core.SVNErrorMessage: svn: Operation cancelled
	at org.tmatesoft.svn.core.SVNErrorMessage.create(SVNErrorMessage.java:200)
	at org.tmatesoft.svn.core.SVNErrorMessage.create(SVNErrorMessage.java:146)
	at org.tmatesoft.svn.core.SVNErrorMessage.create(SVNErrorMessage.java:89)
	at org.tmatesoft.svn.core.SVNCancelException.<init>(SVNCancelException.java:33)
	at hudson.scm.subversion.SubversionUpdateEventHandler.checkCancelled(SubversionUpdateEventHandler.java:118)
	at org.tmatesoft.svn.core.wc.SVNBasicClient.checkCancelled(SVNBasicClient.java:460)
	at org.tmatesoft.svn.core.internal.wc.SVNCancellableEditor.targetRevision(SVNCancellableEditor.java:52)
	at org.tmatesoft.svn.core.internal.io.dav.handlers.DAVEditorHandler.startElement(DAVEditorHandler.java:264)
	at org.tmatesoft.svn.core.internal.io.dav.handlers.BasicDAVHandler.startElement(BasicDAVHandler.java:85)
	at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.startElement(AbstractSAXParser.java:504)
	at com.sun.org.apache.xerces.internal.parsers.AbstractXMLDocumentParser.emptyElement(AbstractXMLDocumentParser.java:182)
	at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.scanStartElement(XMLNSDocumentScannerImpl.java:353)
	at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:2732)
	at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:625)
	at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(XMLNSDocumentScannerImpl.java:116)
	at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:488)
	at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:812)
	at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:741)
	at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:123)
	at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1208)
	at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:525)
	at org.tmatesoft.svn.core.internal.io.dav.http.HTTPConnection.readData(HTTPConnection.java:754)
	at org.tmatesoft.svn.core.internal.io.dav.http.HTTPConnection.readData(HTTPConnection.java:719)
	at org.tmatesoft.svn.core.internal.io.dav.http.HTTPRequest.dispatch(HTTPRequest.java:216)
	at org.tmatesoft.svn.core.internal.io.dav.http.HTTPConnection._request(HTTPConnection.java:364)
	... 23 more
Description set: 


Build failed in Jenkins: Hadoop-0.20.204-Build #11

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/11/>

------------------------------------------
[...truncated 3236 lines...]
A         src/mapred/org/apache/hadoop/mapred/jobcontrol/JobControl.java
A         src/mapred/org/apache/hadoop/mapred/jobcontrol/package.html
A         src/mapred/org/apache/hadoop/mapred/TaskCompletionEvent.java
A         src/mapred/org/apache/hadoop/mapred/TaskScheduler.java
A         src/mapred/org/apache/hadoop/mapred/Counters.java
A         src/mapred/org/apache/hadoop/mapred/TaskUmbilicalProtocol.java
A         src/mapred/org/apache/hadoop/mapred/ShuffleServerInstrumentation.java
A         src/mapred/org/apache/hadoop/mapred/LimitTasksPerJobTaskScheduler.java
A         src/mapred/org/apache/hadoop/mapred/TaskTrackerMXBean.java
A         src/mapred/org/apache/hadoop/mapred/QueueMetrics.java
A         src/mapred/org/apache/hadoop/mapred/MRConstants.java
A         src/mapred/org/apache/hadoop/mapred/TextInputFormat.java
A         src/mapred/org/apache/hadoop/mapred/MapTaskCompletionEventsUpdate.java
A         src/mapred/org/apache/hadoop/mapred/IFileOutputStream.java
A         src/mapred/org/apache/hadoop/mapred/SequenceFileAsTextInputFormat.java
A         src/mapred/org/apache/hadoop/mapred/IndexCache.java
A         src/mapred/org/apache/hadoop/mapred/JobQueueInfo.java
A         src/mapred/org/apache/hadoop/mapred/JobQueueJobInProgressListener.java
A         src/mapred/org/apache/hadoop/mapred/MergeSorter.java
A         src/mapred/org/apache/hadoop/mapred/Partitioner.java
A         src/mapred/org/apache/hadoop/mapred/SortedRanges.java
A         src/mapred/org/apache/hadoop/mapred/SequenceFileInputFormat.java
A         src/mapred/org/apache/hadoop/mapred/JvmContext.java
A         src/mapred/org/apache/hadoop/mapred/ReinitTrackerAction.java
A         src/mapred/org/apache/hadoop/mapred/FileInputFormat.java
A         src/mapred/org/apache/hadoop/mapred/OutputFormat.java
A         src/mapred/org/apache/hadoop/mapred/DefaultTaskController.java
A         src/mapred/org/apache/hadoop/mapred/TextOutputFormat.java
A         src/mapred/org/apache/hadoop/mapred/JobContext.java
A         src/mapred/org/apache/hadoop/mapred/TaskLogsTruncater.java
A         src/mapred/org/apache/hadoop/mapred/Merger.java
A         src/mapred/org/apache/hadoop/mapred/SpillRecord.java
A         src/mapred/org/apache/hadoop/mapred/KeyValueTextInputFormat.java
A         src/mapred/org/apache/hadoop/mapred/FileOutputFormat_Counter.properties
A         src/mapred/org/apache/hadoop/mapred/Queue.java
A         src/mapred/org/apache/hadoop/mapred/InfoMap.java
A         src/mapred/org/apache/hadoop/mapred/MapTaskRunner.java
A         src/mapred/org/apache/hadoop/mapred/SequenceFileRecordReader.java
A         src/mapred/org/apache/hadoop/mapred/LaunchTaskAction.java
A         src/mapred/org/apache/hadoop/mapred/SequenceFileInputFilter.java
A         src/mapred/mapred-default.xml
A         src/benchmarks
A         src/benchmarks/gridmix
A         src/benchmarks/gridmix/webdatasort
A         src/benchmarks/gridmix/webdatasort/webdata_sort.small
A         src/benchmarks/gridmix/webdatasort/webdata_sort.large
A         src/benchmarks/gridmix/webdatasort/webdata_sort.medium
A         src/benchmarks/gridmix/streamsort
A         src/benchmarks/gridmix/streamsort/text-sort.small
A         src/benchmarks/gridmix/streamsort/text-sort.large
A         src/benchmarks/gridmix/streamsort/text-sort.medium
A         src/benchmarks/gridmix/submissionScripts
A         src/benchmarks/gridmix/submissionScripts/monsterQueriesHod
A         src/benchmarks/gridmix/submissionScripts/monsterQueriesToSameCluster
A         src/benchmarks/gridmix/submissionScripts/allToSameCluster
A         src/benchmarks/gridmix/submissionScripts/allThroughHod
A         src/benchmarks/gridmix/submissionScripts/maxentHod
A         src/benchmarks/gridmix/submissionScripts/maxentToSameCluster
A         src/benchmarks/gridmix/submissionScripts/textSortHod
A         src/benchmarks/gridmix/submissionScripts/textSortToSameCluster
A         src/benchmarks/gridmix/submissionScripts/webdataScanHod
A         src/benchmarks/gridmix/submissionScripts/webdataScanToSameCluster
A         src/benchmarks/gridmix/submissionScripts/sleep_if_too_busy
A         src/benchmarks/gridmix/submissionScripts/webdataSortHod
A         src/benchmarks/gridmix/submissionScripts/webdataSortToSameCluster
A         src/benchmarks/gridmix/pipesort
A         src/benchmarks/gridmix/pipesort/text-sort.small
A         src/benchmarks/gridmix/pipesort/text-sort.large
A         src/benchmarks/gridmix/pipesort/text-sort.medium
A         src/benchmarks/gridmix/gridmix-env
A         src/benchmarks/gridmix/javasort
A         src/benchmarks/gridmix/javasort/text-sort.small
A         src/benchmarks/gridmix/javasort/text-sort.large
A         src/benchmarks/gridmix/javasort/text-sort.medium
A         src/benchmarks/gridmix/maxent
A         src/benchmarks/gridmix/maxent/maxent.large
A         src/benchmarks/gridmix/webdatascan
A         src/benchmarks/gridmix/webdatascan/webdata_scan.small
A         src/benchmarks/gridmix/webdatascan/webdata_scan.large
A         src/benchmarks/gridmix/webdatascan/webdata_scan.medium
A         src/benchmarks/gridmix/README
A         src/benchmarks/gridmix/generateData.sh
A         src/benchmarks/gridmix/monsterQuery
A         src/benchmarks/gridmix/monsterQuery/monster_query.small
A         src/benchmarks/gridmix/monsterQuery/monster_query.large
A         src/benchmarks/gridmix/monsterQuery/monster_query.medium
A         src/benchmarks/gridmix2
A         src/benchmarks/gridmix2/README.gridmix2
A         src/benchmarks/gridmix2/generateGridmix2data.sh
A         src/benchmarks/gridmix2/gridmix_config.xml
A         src/benchmarks/gridmix2/src
A         src/benchmarks/gridmix2/src/java
A         src/benchmarks/gridmix2/src/java/org
A         src/benchmarks/gridmix2/src/java/org/apache
A         src/benchmarks/gridmix2/src/java/org/apache/hadoop
A         src/benchmarks/gridmix2/src/java/org/apache/hadoop/mapred
A         src/benchmarks/gridmix2/src/java/org/apache/hadoop/mapred/GenericMRLoadJobCreator.java
A         src/benchmarks/gridmix2/src/java/org/apache/hadoop/mapred/CombinerJobCreator.java
A         src/benchmarks/gridmix2/src/java/org/apache/hadoop/mapred/GridMixRunner.java
A         src/benchmarks/gridmix2/gridmix-env-2
A         src/benchmarks/gridmix2/rungridmix_2
A         src/benchmarks/gridmix2/build.xml
AU        src/saveVersion.sh
A         src/examples
A         src/examples/pipes
AU        src/examples/pipes/configure
A         src/examples/pipes/impl
A         src/examples/pipes/impl/wordcount-simple.cc
A         src/examples/pipes/impl/config.h.in
A         src/examples/pipes/impl/wordcount-nopipe.cc
A         src/examples/pipes/impl/sort.cc
A         src/examples/pipes/impl/wordcount-part.cc
A         src/examples/pipes/Makefile.in
A         src/examples/pipes/configure.ac
A         src/examples/pipes/conf
A         src/examples/pipes/conf/word.xml
A         src/examples/pipes/conf/word-part.xml
A         src/examples/pipes/depcomp
A         src/examples/pipes/Makefile.am
A         src/examples/pipes/missing
A         src/examples/pipes/config.guess
A         src/examples/pipes/README.txt
A         src/examples/pipes/aclocal.m4
A         src/examples/pipes/config.sub
A         src/examples/pipes/ltmain.sh
A         src/examples/pipes/.autom4te.cfg
A         src/examples/pipes/install-sh
A         src/examples/python
A         src/examples/python/pyAbacus
A         src/examples/python/pyAbacus/wordcountaggregator.spec
A         src/examples/python/pyAbacus/JyAbacusWCPlugIN.py
A         src/examples/python/pyAbacus/JythonAbacus.py
A         src/examples/python/pyAbacus/compile
A         src/examples/python/compile
A         src/examples/python/WordCount.py
A         src/examples/org
A         src/examples/org/apache
A         src/examples/org/apache/hadoop
A         src/examples/org/apache/hadoop/examples
A         src/examples/org/apache/hadoop/examples/Join.java
A         src/examples/org/apache/hadoop/examples/MultiFileWordCount.java
A         src/examples/org/apache/hadoop/examples/SecondarySort.java
A         src/examples/org/apache/hadoop/examples/AggregateWordHistogram.java
A         src/examples/org/apache/hadoop/examples/PiEstimator.java
A         src/examples/org/apache/hadoop/examples/SleepJob.java
A         src/examples/org/apache/hadoop/examples/ExampleDriver.java
A         src/examples/org/apache/hadoop/examples/RandomWriter.java
A         src/examples/org/apache/hadoop/examples/package.html
A         src/examples/org/apache/hadoop/examples/RandomTextWriter.java
A         src/examples/org/apache/hadoop/examples/terasort
A         src/examples/org/apache/hadoop/examples/terasort/job_history_summary.py
A         src/examples/org/apache/hadoop/examples/terasort/TeraSort.java
A         src/examples/org/apache/hadoop/examples/terasort/TeraInputFormat.java
A         src/examples/org/apache/hadoop/examples/terasort/TeraGen.java
A         src/examples/org/apache/hadoop/examples/terasort/TeraOutputFormat.java
A         src/examples/org/apache/hadoop/examples/terasort/TeraValidate.java
A         src/examples/org/apache/hadoop/examples/terasort/package.html
A         src/examples/org/apache/hadoop/examples/dancing
A         src/examples/org/apache/hadoop/examples/dancing/puzzle1.dta
A         src/examples/org/apache/hadoop/examples/dancing/OneSidedPentomino.java
A         src/examples/org/apache/hadoop/examples/dancing/DancingLinks.java
A         src/examples/org/apache/hadoop/examples/dancing/Pentomino.java
A         src/examples/org/apache/hadoop/examples/dancing/Sudoku.java
A         src/examples/org/apache/hadoop/examples/dancing/DistributedPentomino.java
A         src/examples/org/apache/hadoop/examples/dancing/package.html
A         src/examples/org/apache/hadoop/examples/WordCount.java
A         src/examples/org/apache/hadoop/examples/DBCountPageView.java
A         src/examples/org/apache/hadoop/examples/Sort.java
A         src/examples/org/apache/hadoop/examples/AggregateWordCount.java
A         src/examples/org/apache/hadoop/examples/Grep.java
A         src/packages
A         src/packages/hadoop-create-user.sh
A         src/packages/hadoop-setup-hdfs.sh
A         src/packages/hadoop-setup-single-node.sh
A         src/packages/hadoop-setup-conf.sh
A         src/packages/update-hadoop-env.sh
A         src/packages/deb
A         src/packages/deb/init.d
A         src/packages/deb/init.d/hadoop-tasktracker
A         src/packages/deb/init.d/hadoop-datanode
A         src/packages/deb/init.d/hadoop-jobtracker
A         src/packages/deb/init.d/hadoop-namenode
A         src/packages/deb/hadoop.control
A         src/packages/deb/hadoop.control/control
A         src/packages/deb/hadoop.control/postinst
A         src/packages/deb/hadoop.control/postrm
A         src/packages/deb/hadoop.control/preinst
A         src/packages/deb/hadoop.control/conffile
A         src/packages/deb/hadoop.control/prerm
A         src/packages/rpm
A         src/packages/rpm/init.d
A         src/packages/rpm/init.d/hadoop-tasktracker
A         src/packages/rpm/init.d/hadoop-datanode
A         src/packages/rpm/init.d/hadoop-jobtracker
A         src/packages/rpm/init.d/hadoop-namenode
A         src/packages/rpm/spec
A         src/packages/rpm/spec/hadoop.spec
A         src/packages/templates
A         src/packages/templates/conf
A         src/packages/templates/conf/hdfs-site.xml
A         src/packages/templates/conf/core-site.xml
A         src/packages/templates/conf/hadoop-env.sh
A         src/packages/templates/conf/mapred-site.xml
A         bin
A         bin/stop-jobhistoryserver.sh
AU        bin/start-dfs.sh
AU        bin/hadoop-daemon.sh
A         bin/hadoop-config.sh
A         bin/start-jobhistoryserver.sh
AU        bin/stop-balancer.sh
AU        bin/stop-all.sh
AU        bin/stop-mapred.sh
AU        bin/slaves.sh
AU        bin/hadoop-daemons.sh
AU        bin/rcc
AU        bin/stop-dfs.sh
AU        bin/hadoop
AU        bin/start-balancer.sh
AU        bin/start-all.sh
AU        bin/start-mapred.sh
A         README.txt
A         build.xml
 U        .
At revision 1152367
no revision recorded for http://svn.apache.org/repos/asf/hadoop/nightly in the previous build
no revision recorded for http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-security-204 in the previous build
[Hadoop-0.20.204-Build] $ /bin/bash -xe /tmp/hudson5971975965062063948.sh
+ export JAVA_HOME=/home/hudson/tools/java/latest1.6-64
+ JAVA_HOME=/home/hudson/tools/java/latest1.6-64
+ export ANT_HOME=/home/hudson/tools/ant/apache-ant-1.7.1
+ ANT_HOME=/home/hudson/tools/ant/apache-ant-1.7.1
+ export FORREST_HOME=/home/nigel/tools/forrest/latest
+ FORREST_HOME=/home/nigel/tools/forrest/latest
+ export ECLIPSE_HOME=/home/nigel/tools/eclipse/latest
+ ECLIPSE_HOME=/home/nigel/tools/eclipse/latest
+ export XERCES_HOME=/home/hudson/tools/xerces/c/latest
+ XERCES_HOME=/home/hudson/tools/xerces/c/latest
+ export JAVA5_HOME=/home/hudson/tools/java/latest1.5
+ JAVA5_HOME=/home/hudson/tools/java/latest1.5
+ export FINDBUGS_HOME=/home/hudson/tools/findbugs/latest
+ FINDBUGS_HOME=/home/hudson/tools/findbugs/latest
+ /home/hudson/tools/ant/apache-ant-1.7.1/bin/ant -Dversion=0.20.204 -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=true -Dlibrecordio=true -Dtest.junit.output.format=xml-Dxercescroot=/home/hudson/tools/xerces/c/latest -Declipse.home=/home/nigel/tools/eclipse/latest -Djava5.home=/home/hudson/tools/java/latest1.5 -Dforrest.home=/home/nigel/tools/forrest/latest -Dfindbugs.home=/home/hudson/tools/findbugs/latest tar test-c++-libhdfs test findbugs
Buildfile: build.xml does not exist!
Build failed
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 
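
Two problems are visible in the shell step above. First, the ant command line is missing a space between -Dtest.junit.output.format=xml and -Dxercescroot=..., so ant receives them as a single argument (test.junit.output.format gets the value "xml-Dxercescroot=/home/hudson/tools/xerces/c/latest") and xercescroot is never defined. Second, "Buildfile: build.xml does not exist!" means ant was started in a directory holding no build.xml; later builds in this job resolve build.xml under ws/trunk, which suggests the branch checkout landed in a subdirectory that this step never changes into. A corrected sketch of the step follows (the cd into trunk/ is an assumption inferred from the ws/trunk paths in builds #8 through #10; every flag is copied from the command above):

    cd "$WORKSPACE/trunk"
    /home/hudson/tools/ant/apache-ant-1.7.1/bin/ant \
      -Dversion=0.20.204 \
      -Dcompile.native=true -Dcompile.c++=true \
      -Dlibhdfs=true -Dlibrecordio=true \
      -Dtest.junit.output.format=xml \
      -Dxercescroot=/home/hudson/tools/xerces/c/latest \
      -Declipse.home=/home/nigel/tools/eclipse/latest \
      -Djava5.home=/home/hudson/tools/java/latest1.5 \
      -Dforrest.home=/home/nigel/tools/forrest/latest \
      -Dfindbugs.home=/home/hudson/tools/findbugs/latest \
      tar test-c++-libhdfs test findbugs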


Build failed in Jenkins: Hadoop-0.20.204-Build #10

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/10/>

------------------------------------------
Started by an SCM change
Building remotely on ubuntu2
Cleaning workspace <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/>
Checking out http://svn.apache.org/repos/asf/hadoop/nightly
AU        tar-munge
A         commitBuild.sh
A         hudsonEnv.sh
AU        hudsonBuildHadoopNightly.sh
A         buildMR-279Branch.sh
AU        hudsonBuildHadoopPatch.sh
AU        hudsonBuildHadoopRelease.sh
AU        processHadoopPatchEmailRemote.sh
AU        hudsonPatchQueueAdmin.sh
AU        processHadoopPatchEmail.sh
A         README.txt
A         test-patch
A         test-patch/test-patch.sh
At revision 1152350
Checking out http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-security-204
SCM check out aborted
Archiving artifacts
ERROR: Failed to check out http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-security-204
Recording test results
Publishing Javadoc
Recording fingerprints
org.tmatesoft.svn.core.SVNCancelException: svn: Operation cancelled
	at hudson.scm.subversion.SubversionUpdateEventHandler.checkCancelled(SubversionUpdateEventHandler.java:118)
	at org.tmatesoft.svn.core.wc.SVNBasicClient.checkCancelled(SVNBasicClient.java:460)
	at org.tmatesoft.svn.core.internal.wc.admin.SVNWCAccess.checkCancelled(SVNWCAccess.java:84)
	at org.tmatesoft.svn.core.internal.wc.admin.SVNWCAccess.doOpen(SVNWCAccess.java:383)
	at org.tmatesoft.svn.core.internal.wc.admin.SVNWCAccess.open(SVNWCAccess.java:274)
	at org.tmatesoft.svn.core.internal.wc.admin.SVNWCAccess.open(SVNWCAccess.java:267)
	at org.tmatesoft.svn.core.internal.wc.admin.SVNWCAccess.openAnchor(SVNWCAccess.java:162)
	at org.tmatesoft.svn.core.wc.SVNUpdateClient.update(SVNUpdateClient.java:512)
	at org.tmatesoft.svn.core.wc.SVNUpdateClient.doCheckout(SVNUpdateClient.java:922)
	at hudson.scm.subversion.CheckoutUpdater$1.perform(CheckoutUpdater.java:83)
	at hudson.scm.subversion.WorkspaceUpdater$UpdateTask.delegateTo(WorkspaceUpdater.java:135)
	at hudson.scm.SubversionSCM$CheckOutTask.perform(SubversionSCM.java:726)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:707)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:691)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1979)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:270)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
	at java.lang.Thread.run(Thread.java:636)
Caused by: org.tmatesoft.svn.core.SVNErrorMessage: svn: Operation cancelled
	at org.tmatesoft.svn.core.SVNErrorMessage.create(SVNErrorMessage.java:200)
	at org.tmatesoft.svn.core.SVNErrorMessage.create(SVNErrorMessage.java:146)
	at org.tmatesoft.svn.core.SVNErrorMessage.create(SVNErrorMessage.java:89)
	at org.tmatesoft.svn.core.SVNCancelException.<init>(SVNCancelException.java:33)
	... 24 more
Description set: 
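
This run failed in the SCM step itself: the checkout of branch-0.20-security-204 was cancelled partway through (SVNCancelException: Operation cancelled), which typically indicates Jenkins aborted the checkout (for example, a timeout or a superseding build) rather than the repository being unreachable; the nightly-scripts checkout just above it completed normally at r1152350. When a checkout aborts like this, a quick manual sanity check of the branch URL is possible (a suggestion, not part of the log):

    svn info http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-security-204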


Build failed in Jenkins: Hadoop-0.20.204-Build #9

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/9/changes>

Changes:

[ddas] Merge -r 1150859:1150860 from branch-0.20-security onto branch-0.20-security-204

[ddas] Merge -r 1150527:1150528 from branch-0.20-security onto branch-0.20-security-204

------------------------------------------
[...truncated 11489 lines...]
     [exec] checking whether it is safe to define __EXTENSIONS__... yes
     [exec] checking for special C compiler options needed for large files... no
     [exec] checking for _FILE_OFFSET_BITS value needed for large files... no
     [exec] checking for g++... g++
     [exec] checking whether we are using the GNU C++ compiler... yes
     [exec] checking whether g++ accepts -g... yes
     [exec] checking dependency style of g++... gcc3
     [exec] checking build system type... x86_64-unknown-linux-gnu
     [exec] checking host system type... x86_64-unknown-linux-gnu
     [exec] checking for a sed that does not truncate output... /bin/sed
     [exec] checking for ld used by gcc... /usr/bin/ld
     [exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
     [exec] checking for /usr/bin/ld option to reload object files... -r
     [exec] checking for BSD-compatible nm... /usr/bin/nm -B
     [exec] checking whether ln -s works... yes
     [exec] checking how to recognise dependent libraries... pass_all
     [exec] checking dlfcn.h usability... yes
     [exec] checking dlfcn.h presence... yes
     [exec] checking for dlfcn.h... yes
     [exec] checking how to run the C++ preprocessor... g++ -E
     [exec] checking for g77... no
     [exec] checking for xlf... no
     [exec] checking for f77... no
     [exec] checking for frt... no
     [exec] checking for pgf77... no
     [exec] checking for cf77... no
     [exec] checking for fort77... no
     [exec] checking for fl32... no
     [exec] checking for af77... no
     [exec] checking for xlf90... no
     [exec] checking for f90... no
     [exec] checking for pgf90... no
     [exec] checking for pghpf... no
     [exec] checking for epcf90... no
     [exec] checking for gfortran... no
     [exec] checking for g95... no
     [exec] checking for xlf95... no
     [exec] checking for f95... no
     [exec] checking for fort... no
     [exec] checking for ifort... no
     [exec] checking for ifc... no
     [exec] checking for efc... no
     [exec] checking for pgf95... no
     [exec] checking for lf95... no
     [exec] checking for ftn... no
     [exec] checking whether we are using the GNU Fortran 77 compiler... no
     [exec] checking whether  accepts -g... no
     [exec] checking the maximum length of command line arguments... 32768
     [exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
     [exec] checking for objdir... .libs
     [exec] checking for ar... ar
     [exec] checking for ranlib... ranlib
     [exec] checking for strip... strip
     [exec] checking if gcc static flag  works... yes
     [exec] checking if gcc supports -fno-rtti -fno-exceptions... no
     [exec] checking for gcc option to produce PIC... -fPIC
     [exec] checking if gcc PIC flag -fPIC works... yes
     [exec] checking if gcc supports -c -o file.o... yes
     [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking whether -lc should be explicitly linked in... no
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] checking if libtool supports shared libraries... yes
     [exec] checking whether to build shared libraries... yes
     [exec] checking whether to build static libraries... yes
     [exec] configure: creating libtool
     [exec] appending configuration tag "CXX" to libtool
     [exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
     [exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking for g++ option to produce PIC... -fPIC
     [exec] checking if g++ PIC flag -fPIC works... yes
     [exec] checking if g++ supports -c -o file.o... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] appending configuration tag "F77" to libtool
     [exec] checking for unistd.h... (cached) yes
     [exec] checking for stdbool.h that conforms to C99... yes
     [exec] checking for _Bool... no
     [exec] checking for an ANSI C-conforming const... yes
     [exec] checking for off_t... yes
     [exec] checking for size_t... yes
     [exec] checking whether strerror_r is declared... yes
     [exec] checking for strerror_r... yes
     [exec] checking whether strerror_r returns char *... yes
     [exec] checking for mkdir... yes
     [exec] checking for uname... yes
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating impl/config.h
     [exec] config.status: impl/config.h is unchanged
     [exec] config.status: executing depfiles commands

compile-c++-utils:
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/utils'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib">
     [exec]  /usr/bin/install -c -m 644 -C 'libhadooputils.a' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooputils.a'>
     [exec]  ranlib '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooputils.a'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop">
     [exec]  /usr/bin/install -c -m 644 -C '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/utils/api/hadoop/StringUtils.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/StringUtils.hh'>
     [exec]  /usr/bin/install -c -m 644 -C '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/utils/api/hadoop/SerialUtils.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/SerialUtils.hh'>
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/utils'>

compile-c++-pipes:
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/pipes'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib">
     [exec]  /usr/bin/install -c -m 644 -C 'libhadooppipes.a' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooppipes.a'>
     [exec]  ranlib '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooppipes.a'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop">
     [exec]  /usr/bin/install -c -m 644 -C '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/api/hadoop/Pipes.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/Pipes.hh'>
     [exec]  /usr/bin/install -c -m 644 -C '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/api/hadoop/TemplateFactory.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/TemplateFactory.hh'>
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/pipes'>

compile-c++:

compile-core:

test-c++-libhdfs:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/hdfs/name>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE_URL=\"\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs>     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_test.o -MD -MP -MF ".deps/hdfs_test.Tpo" -c -o hdfs_test.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c;> \
     [exec] 	then mv -f ".deps/hdfs_test.Tpo" ".deps/hdfs_test.Po"; else rm -f ".deps/hdfs_test.Tpo"; exit 1; fi
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>: In function ‘main’:
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:87: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:90: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:130: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:133: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:188: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:189: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:190: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:198: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:199: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:220: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:221: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:272: warning: implicit declaration of function ‘sleep’
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_test  hdfs_test.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_test hdfs_test.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE_URL=\"\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs>     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_read.o -MD -MP -MF ".deps/hdfs_read.Tpo" -c -o hdfs_read.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c;> \
     [exec] 	then mv -f ".deps/hdfs_read.Tpo" ".deps/hdfs_read.Po"; else rm -f ".deps/hdfs_read.Tpo"; exit 1; fi
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c>: In function ‘main’:
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c>:35: warning: unused variable ‘fileTotalSize’
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_read  hdfs_read.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_read hdfs_read.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE_URL=\"\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs>     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_write.o -MD -MP -MF ".deps/hdfs_write.Tpo" -c -o hdfs_write.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_write.c;> \
     [exec] 	then mv -f ".deps/hdfs_write.Tpo" ".deps/hdfs_write.Po"; else rm -f ".deps/hdfs_write.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_write  hdfs_write.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_write hdfs_write.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
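
The format warnings above are pre-existing in hdfs_test.c rather than a cause of this failure: tOffset is a 64-bit offset type in the libhdfs API, while this is a -m32 build in which long is 32 bits, so printf's %ld does not match it. The implicit-declaration warning for 'sleep' only means hdfs_test.c lacks an include of unistd.h. To confirm the typedef from the checkout (a suggestion, not part of the build):

    grep -n 'tOffset' src/c++/libhdfs/hdfs.h
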
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/test-libhdfs.sh>	
     [exec] ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
     [exec] LIB_JVM_DIR = /homes/hudson/tools/java/latest1.6/jre/lib/i386/server
     [exec] ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop>: line 53: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] 11/07/26 06:52:15 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
     [exec] 11/07/26 06:52:15 INFO namenode.NameNode: STARTUP_MSG: 
     [exec] /************************************************************
     [exec] STARTUP_MSG: Starting NameNode
     [exec] STARTUP_MSG:   host = h3/127.0.1.1
     [exec] STARTUP_MSG:   args = [-format]
     [exec] STARTUP_MSG:   version = 0.20.204
     [exec] STARTUP_MSG:   build = http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-security-204 -r 1150862; compiled by 'hudson' on Tue Jul 26 06:51:45 UTC 2011
     [exec] ************************************************************/
     [exec] 11/07/26 06:52:15 INFO util.GSet: VM type       = 32-bit
     [exec] 11/07/26 06:52:15 INFO util.GSet: 2% max memory = 17.77875 MB
     [exec] 11/07/26 06:52:15 INFO util.GSet: capacity      = 2^22 = 4194304 entries
     [exec] 11/07/26 06:52:15 INFO util.GSet: recommended=4194304, actual=4194304
     [exec] 11/07/26 06:52:15 INFO namenode.FSNamesystem: fsOwner=hudson
     [exec] 11/07/26 06:52:15 INFO namenode.FSNamesystem: supergroup=supergroup
     [exec] 11/07/26 06:52:15 INFO namenode.FSNamesystem: isPermissionEnabled=true
     [exec] 11/07/26 06:52:15 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
     [exec] 11/07/26 06:52:15 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
     [exec] 11/07/26 06:52:15 INFO namenode.NameNode: Caching file names occuring more than 10 times 
     [exec] 11/07/26 06:52:16 INFO common.Storage: Image file of size 112 saved in 0 seconds.
     [exec] 11/07/26 06:52:16 INFO common.Storage: Storage directory build/test/libhdfs/dfs/name has been successfully formatted.
     [exec] 11/07/26 06:52:16 INFO namenode.NameNode: SHUTDOWN_MSG: 
     [exec] /************************************************************
     [exec] SHUTDOWN_MSG: Shutting down NameNode at h3/127.0.1.1
     [exec] ************************************************************/
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] starting namenode, logging to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs/hadoop-hudson-namenode-h3.out>
     [exec] nice: /bin/hadoop: No such file or directory
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] starting datanode, logging to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs/hadoop-hudson-datanode-h3.out>
     [exec] nice: /bin/hadoop: No such file or directory
     [exec] CLASSPATH=<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/conf>:/homes/hudson/tools/java/latest1.6/lib/tools.jar:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/classes>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/classes>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/hsqldb-1.8.0.10.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/kfs-0.2.2.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/*.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/jsp-2.0/*.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/aspectjrt-1.6.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/aspectjtools-1.6.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-beanutils-1.7.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-beanutils-core-1.8.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-cli-1.2.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-codec-1.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-collections-3.2.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-configuration-1.6.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-daemon-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-digester-1.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-el-1.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-httpclient-3.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-lang-2.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-logging-1.1.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-logging-api-1.0.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-math-2.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-net-1.4.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/core-3.1.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jackson-core-asl-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jackson-mapper-asl-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jasper-compiler-5.5.12.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jasper-runtime-5.5.12.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jdeb-0.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jets3t-0.6.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jetty-6.1.26.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jetty-util-6.1.26.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jsch-0.1.42.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/junit-4.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/log4j-1.2.15.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/mockito-all-1.8.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/oro-2.0.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/servlet-api-2.5-20081211.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/slf4j-api-1.4.3.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/slf4j-log4j12-1.4.3.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/xmlenc-0.52.jar> LD_PRELOAD=<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so>:/homes/hudson/tools/java/latest1.6/jre/lib/i386/server/libjvm.so <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/libhdfs/hdfs_test>
     [exec] 11/07/26 06:52:42 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
     [exec] 11/07/26 06:52:42 WARN fs.FileSystem: "localhost:23000" is a deprecated filesystem name. Use "hdfs://localhost:23000/" instead.
     [exec] 11/07/26 06:52:44 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 0 time(s).
     [exec] 11/07/26 06:52:45 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 1 time(s).
     [exec] 11/07/26 06:52:46 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 2 time(s).
     [exec] 11/07/26 06:52:47 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 3 time(s).
     [exec] 11/07/26 06:52:48 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 4 time(s).
     [exec] 11/07/26 06:52:49 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 5 time(s).
     [exec] 11/07/26 06:52:50 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 6 time(s).
     [exec] 11/07/26 06:52:51 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 7 time(s).
     [exec] 11/07/26 06:52:52 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 8 time(s).
     [exec] 11/07/26 06:52:53 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 9 time(s).
     [exec] Exception in thread "main" java.net.ConnectException: Call to localhost/127.0.0.1:23000 failed on connection exception: java.net.ConnectException: Connection refused
     [exec] 	at org.apache.hadoop.ipc.Client.wrapException(Client.java:1057)
     [exec] 	at org.apache.hadoop.ipc.Client.call(Client.java:1033)
     [exec] 	at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:224)
     [exec] 	at $Proxy1.getProtocolVersion(Unknown Source)
     [exec] 	at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
     [exec] 	at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
     [exec] 	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:208)
     [exec] 	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:175)
     [exec] 	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
     [exec] 	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1310)
     [exec] 	at org.apache.hadoop.fs.FileSystem.access$100(FileSystem.java:65)
     [exec] 	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1328)
     [exec] 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
     [exec] 	at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:103)
     [exec] 	at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:101)
     [exec] 	at java.security.AccessController.doPrivileged(Native Method)
     [exec] 	at javax.security.auth.Subject.doAs(Subject.java:396)
     [exec] 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
     [exec] 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:101)
     [exec] Caused by: java.net.ConnectException: Connection refused
     [exec] 	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
     [exec] 	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
     [exec] 	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
     [exec] 	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:406)
     [exec] 	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:414)
     [exec] 	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:527)
     [exec] 	at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:187)
     [exec] 	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1164)
     [exec] 	at org.apache.hadoop.ipc.Client.call(Client.java:1010)
     [exec] 	... 17 more
     [exec] Call to org.apache.hadoop.fs.Filesystem::get(URI, Configuration) failed!
     [exec] Oops! Failed to connect to hdfs!
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] no datanode to stop
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] no namenode to stop
     [exec] exiting with 255
     [exec] make: *** [test] Error 255

BUILD FAILED
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:1857: exec returned: 2

Total time: 6 minutes 4 seconds
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 
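
The root cause of this run is visible in the test-c++-libhdfs stanza: both bin/hadoop (line 53) and bin/hadoop-daemon.sh (line 42) try to source bin/../libexec/hadoop-config.sh, but this source checkout keeps hadoop-config.sh under bin/ (see the file listing in the build #5 log above) and has no libexec/ directory. The daemon launches then degrade further: "nice: /bin/hadoop: No such file or directory" suggests HADOOP_HOME was never set, so the daemon script fell back to /bin/hadoop. As a result no namenode or datanode ever started, hdfs_test's connection to localhost:23000 was refused after ten retries, the test script exited 255, and make propagated the error up to build.xml:1857. A minimal sketch of the failure and one possible workaround (the workaround is an assumption, not something the log shows being done):

    # What the scripts attempt, per the log: bin/hadoop line 53 and
    # bin/hadoop-daemon.sh line 42 both do, in effect,
    #   bin=$(cd "$(dirname "$0")" && pwd)
    #   . "$bin"/../libexec/hadoop-config.sh   # -> No such file or directory
    # and the daemon launch "nice $HADOOP_HOME/bin/hadoop ..." resolves to
    # /bin/hadoop because HADOOP_HOME is unset.
    #
    # Possible workaround: give the scripts the layout they expect before
    # the test runs, using the hadoop-config.sh the checkout ships in bin/:
    mkdir -p "$WORKSPACE/trunk/libexec"
    cp "$WORKSPACE/trunk/bin/hadoop-config.sh" "$WORKSPACE/trunk/libexec/"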


Build failed in Jenkins: Hadoop-0.20.204-Build #8

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/8/>

------------------------------------------
[...truncated 11159 lines...]
     [exec] checking whether it is safe to define __EXTENSIONS__... yes
     [exec] checking for special C compiler options needed for large files... no
     [exec] checking for _FILE_OFFSET_BITS value needed for large files... no
     [exec] checking for g++... g++
     [exec] checking whether we are using the GNU C++ compiler... yes
     [exec] checking whether g++ accepts -g... yes
     [exec] checking dependency style of g++... gcc3
     [exec] checking build system type... x86_64-unknown-linux-gnu
     [exec] checking host system type... x86_64-unknown-linux-gnu
     [exec] checking for a sed that does not truncate output... /bin/sed
     [exec] checking for ld used by gcc... /usr/bin/ld
     [exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
     [exec] checking for /usr/bin/ld option to reload object files... -r
     [exec] checking for BSD-compatible nm... /usr/bin/nm -B
     [exec] checking whether ln -s works... yes
     [exec] checking how to recognise dependent libraries... pass_all
     [exec] checking dlfcn.h usability... yes
     [exec] checking dlfcn.h presence... yes
     [exec] checking for dlfcn.h... yes
     [exec] checking how to run the C++ preprocessor... g++ -E
     [exec] checking for g77... no
     [exec] checking for xlf... no
     [exec] checking for f77... no
     [exec] checking for frt... no
     [exec] checking for pgf77... no
     [exec] checking for cf77... no
     [exec] checking for fort77... no
     [exec] checking for fl32... no
     [exec] checking for af77... no
     [exec] checking for xlf90... no
     [exec] checking for f90... no
     [exec] checking for pgf90... no
     [exec] checking for pghpf... no
     [exec] checking for epcf90... no
     [exec] checking for gfortran... no
     [exec] checking for g95... no
     [exec] checking for xlf95... no
     [exec] checking for f95... no
     [exec] checking for fort... no
     [exec] checking for ifort... no
     [exec] checking for ifc... no
     [exec] checking for efc... no
     [exec] checking for pgf95... no
     [exec] checking for lf95... no
     [exec] checking for ftn... no
     [exec] checking whether we are using the GNU Fortran 77 compiler... no
     [exec] checking whether  accepts -g... no
     [exec] checking the maximum length of command line arguments... 32768
     [exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
     [exec] checking for objdir... .libs
     [exec] checking for ar... ar
     [exec] checking for ranlib... ranlib
     [exec] checking for strip... strip
     [exec] checking if gcc static flag  works... yes
     [exec] checking if gcc supports -fno-rtti -fno-exceptions... no
     [exec] checking for gcc option to produce PIC... -fPIC
     [exec] checking if gcc PIC flag -fPIC works... yes
     [exec] checking if gcc supports -c -o file.o... yes
     [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking whether -lc should be explicitly linked in... no
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] checking if libtool supports shared libraries... yes
     [exec] checking whether to build shared libraries... yes
     [exec] checking whether to build static libraries... yes
     [exec] configure: creating libtool
     [exec] appending configuration tag "CXX" to libtool
     [exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
     [exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking for g++ option to produce PIC... -fPIC
     [exec] checking if g++ PIC flag -fPIC works... yes
     [exec] checking if g++ supports -c -o file.o... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] appending configuration tag "F77" to libtool
     [exec] checking for unistd.h... (cached) yes
     [exec] checking for stdbool.h that conforms to C99... yes
     [exec] checking for _Bool... no
     [exec] checking for an ANSI C-conforming const... yes
     [exec] checking for off_t... yes
     [exec] checking for size_t... yes
     [exec] checking whether strerror_r is declared... yes
     [exec] checking for strerror_r... yes
     [exec] checking whether strerror_r returns char *... yes
     [exec] checking for mkdir... yes
     [exec] checking for uname... yes
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating impl/config.h
     [exec] config.status: impl/config.h is unchanged
     [exec] config.status: executing depfiles commands

compile-c++-utils:
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/utils'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib">
     [exec]  /usr/bin/install -c -m 644 -C 'libhadooputils.a' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooputils.a'>
     [exec]  ranlib '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooputils.a'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop">
     [exec]  /usr/bin/install -c -m 644 -C '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/utils/api/hadoop/StringUtils.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/StringUtils.hh'>
     [exec]  /usr/bin/install -c -m 644 -C '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/utils/api/hadoop/SerialUtils.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/SerialUtils.hh'>
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/utils'>

compile-c++-pipes:
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/pipes'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib">
     [exec]  /usr/bin/install -c -m 644 -C 'libhadooppipes.a' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooppipes.a'>
     [exec]  ranlib '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooppipes.a'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop">
     [exec]  /usr/bin/install -c -m 644 -C '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/api/hadoop/Pipes.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/Pipes.hh'>
     [exec]  /usr/bin/install -c -m 644 -C '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/api/hadoop/TemplateFactory.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/TemplateFactory.hh'>
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/pipes'>

compile-c++:

compile-core:

test-c++-libhdfs:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/hdfs/name>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE_URL=\"\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs>     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_test.o -MD -MP -MF ".deps/hdfs_test.Tpo" -c -o hdfs_test.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c;> \
     [exec] 	then mv -f ".deps/hdfs_test.Tpo" ".deps/hdfs_test.Po"; else rm -f ".deps/hdfs_test.Tpo"; exit 1; fi
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>: In function ‘main’:
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:87: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:90: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:130: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:133: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:188: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:189: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:190: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:198: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:199: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:220: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:221: warning: format ‘%ld’ expects type ‘long int’, but argument 3 has type ‘tOffset’
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:272: warning: implicit declaration of function ‘sleep’
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_test  hdfs_test.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_test hdfs_test.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE_URL=\"\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs>     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_read.o -MD -MP -MF ".deps/hdfs_read.Tpo" -c -o hdfs_read.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c;> \
     [exec] 	then mv -f ".deps/hdfs_read.Tpo" ".deps/hdfs_read.Po"; else rm -f ".deps/hdfs_read.Tpo"; exit 1; fi
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c>: In function ‘main’:
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c>:35: warning: unused variable ‘fileTotalSize’
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_read  hdfs_read.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_read hdfs_read.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE_URL=\"\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs>     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_write.o -MD -MP -MF ".deps/hdfs_write.Tpo" -c -o hdfs_write.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_write.c;> \
     [exec] 	then mv -f ".deps/hdfs_write.Tpo" ".deps/hdfs_write.Po"; else rm -f ".deps/hdfs_write.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_write  hdfs_write.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_write hdfs_write.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/test-libhdfs.sh>	
     [exec] ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
     [exec] LIB_JVM_DIR = /homes/hudson/tools/java/latest1.6/jre/lib/i386/server
     [exec] ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop>: line 53: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] 11/07/23 00:35:34 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
     [exec] 11/07/23 00:35:34 INFO namenode.NameNode: STARTUP_MSG: 
     [exec] /************************************************************
     [exec] STARTUP_MSG: Starting NameNode
     [exec] STARTUP_MSG:   host = h3/127.0.1.1
     [exec] STARTUP_MSG:   args = [-format]
     [exec] STARTUP_MSG:   version = 0.20.204
     [exec] STARTUP_MSG:   build = http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-security-204 -r 1149316; compiled by 'hudson' on Sat Jul 23 00:35:04 UTC 2011
     [exec] ************************************************************/
     [exec] 11/07/23 00:35:34 INFO util.GSet: VM type       = 32-bit
     [exec] 11/07/23 00:35:34 INFO util.GSet: 2% max memory = 17.77875 MB
     [exec] 11/07/23 00:35:34 INFO util.GSet: capacity      = 2^22 = 4194304 entries
     [exec] 11/07/23 00:35:34 INFO util.GSet: recommended=4194304, actual=4194304
     [exec] 11/07/23 00:35:34 INFO namenode.FSNamesystem: fsOwner=hudson
     [exec] 11/07/23 00:35:34 INFO namenode.FSNamesystem: supergroup=supergroup
     [exec] 11/07/23 00:35:34 INFO namenode.FSNamesystem: isPermissionEnabled=true
     [exec] 11/07/23 00:35:34 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
     [exec] 11/07/23 00:35:34 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
     [exec] 11/07/23 00:35:34 INFO namenode.NameNode: Caching file names occuring more than 10 times 
     [exec] 11/07/23 00:35:34 INFO common.Storage: Image file of size 112 saved in 0 seconds.
     [exec] 11/07/23 00:35:35 INFO common.Storage: Storage directory build/test/libhdfs/dfs/name has been successfully formatted.
     [exec] 11/07/23 00:35:35 INFO namenode.NameNode: SHUTDOWN_MSG: 
     [exec] /************************************************************
     [exec] SHUTDOWN_MSG: Shutting down NameNode at h3/127.0.1.1
     [exec] ************************************************************/
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] starting namenode, logging to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs/hadoop-hudson-namenode-h3.out>
     [exec] nice: /bin/hadoop: No such file or directory
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] starting datanode, logging to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs/hadoop-hudson-datanode-h3.out>
     [exec] nice: /bin/hadoop: No such file or directory
     [exec] CLASSPATH=<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/conf>:/homes/hudson/tools/java/latest1.6/lib/tools.jar:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/classes>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/classes>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/hsqldb-1.8.0.10.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/kfs-0.2.2.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/*.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/jsp-2.0/*.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/aspectjrt-1.6.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/aspectjtools-1.6.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-beanutils-1.7.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-beanutils-core-1.8.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-cli-1.2.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-codec-1.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-collections-3.2.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-configuration-1.6.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-daemon-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-digester-1.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-el-1.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-httpclient-3.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-lang-2.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-logging-1.1.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-logging-api-1.0.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-math-2.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-net-1.4.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/core-3.1.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jackson-core-asl-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jackson-mapper-asl-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jasper-compiler-5.5.12.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jasper-runtime-5.5.12.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jdeb-0.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jets3t-0.6.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jetty-6.1.26.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jetty-util-6.1.26.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jsch-0.1.42.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/junit-4.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/log4j-1.2.15.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/mockito-all-1.8.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/oro-2.0.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/servlet-api-2.5-20081211.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/slf4j-api-1.4.3.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/slf4j-log4j12-1.4.3.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/xmlenc-0.52.jar> LD_PRELOAD=<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so>:/homes/hudson/tools/java/latest1.6/jre/lib/i386/server/libjvm.so <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/libhdfs/hdfs_test>
     [exec] 11/07/23 00:36:01 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
     [exec] 11/07/23 00:36:01 WARN fs.FileSystem: "localhost:23000" is a deprecated filesystem name. Use "hdfs://localhost:23000/" instead.
     [exec] 11/07/23 00:36:02 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 0 time(s).
     [exec] 11/07/23 00:36:03 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 1 time(s).
     [exec] 11/07/23 00:36:04 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 2 time(s).
     [exec] 11/07/23 00:36:05 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 3 time(s).
     [exec] 11/07/23 00:36:06 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 4 time(s).
     [exec] 11/07/23 00:36:07 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 5 time(s).
     [exec] 11/07/23 00:36:08 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 6 time(s).
     [exec] 11/07/23 00:36:09 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 7 time(s).
     [exec] 11/07/23 00:36:10 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 8 time(s).
     [exec] 11/07/23 00:36:11 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 9 time(s).
     [exec] Exception in thread "main" java.net.ConnectException: Call to localhost/127.0.0.1:23000 failed on connection exception: java.net.ConnectException: Connection refused
     [exec] 	at org.apache.hadoop.ipc.Client.wrapException(Client.java:1057)
     [exec] 	at org.apache.hadoop.ipc.Client.call(Client.java:1033)
     [exec] 	at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:224)
     [exec] 	at $Proxy1.getProtocolVersion(Unknown Source)
     [exec] 	at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
     [exec] 	at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
     [exec] 	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:208)
     [exec] 	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:175)
     [exec] 	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
     [exec] 	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1310)
     [exec] 	at org.apache.hadoop.fs.FileSystem.access$100(FileSystem.java:65)
     [exec] 	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1328)
     [exec] 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
     [exec] 	at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:103)
     [exec] 	at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:101)
     [exec] 	at java.security.AccessController.doPrivileged(Native Method)
     [exec] 	at javax.security.auth.Subject.doAs(Subject.java:396)
     [exec] 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
     [exec] 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:101)
     [exec] Caused by: java.net.ConnectException: Connection refused
     [exec] 	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
     [exec] 	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
     [exec] 	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
     [exec] 	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:406)
     [exec] 	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:414)
     [exec] 	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:527)
     [exec] 	at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:187)
     [exec] 	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1164)
     [exec] 	at org.apache.hadoop.ipc.Client.call(Client.java:1010)
     [exec] 	... 17 more
     [exec] Call to org.apache.hadoop.fs.Filesystem::get(URI, Configuration) failed!
     [exec] Oops! Failed to connect to hdfs!
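The connect failure follows directly from the earlier 'hadoop-config.sh: No such file or directory' and 'nice: /bin/hadoop: No such file or directory' errors: the namenode and datanode never actually started, so nothing was listening on localhost:23000 and the test exhausted its retries. For reference, a minimal sketch of the kind of check hdfs_test performs through the libhdfs C API (hdfsConnect and hdfsDisconnect are the real API; the port mirrors the log and the exit code mirrors the script's 255 below, but the rest is illustrative):

    #include <stdio.h>
    #include "hdfs.h"  /* libhdfs C API header */

    int main(void) {
        /* With no namenode listening, the embedded JVM client retries
           (the ipc.Client lines above) and hdfsConnect returns NULL. */
        hdfsFS fs = hdfsConnect("localhost", 23000);
        if (fs == NULL) {
            fprintf(stderr, "Oops! Failed to connect to hdfs!\n");
            return 255;
        }
        hdfsDisconnect(fs);
        return 0;
    }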
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] no datanode to stop
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] no namenode to stop
     [exec] exiting with 255
     [exec] make: *** [test] Error 255

BUILD FAILED
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:1857: exec returned: 2

Total time: 5 minutes 49 seconds
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 


Build failed in Jenkins: Hadoop-0.20.204-Build #7

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/7/>

------------------------------------------
[...truncated 12598 lines...]
      [rpm] -rw-rw-r-- 0/0            3753 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/Hod/nodePool.py
      [rpm] -rw-rw-r-- 0/0             770 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/HodRing/__init__.py
      [rpm] -rw-rw-r-- 0/0           32604 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/HodRing/hodRing.py
      [rpm] -rw-rw-r-- 0/0             770 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/NodePools/__init__.py
      [rpm] -rw-rw-r-- 0/0           10359 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/NodePools/torque.py
      [rpm] -rw-rw-r-- 0/0             770 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/RingMaster/__init__.py
      [rpm] -rw-rw-r-- 0/0            9106 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/RingMaster/idleJobTracker.py
      [rpm] -rw-rw-r-- 0/0           35563 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/RingMaster/ringMaster.py
      [rpm] -rw-rw-r-- 0/0             770 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/Schedulers/__init__.py
      [rpm] -rw-rw-r-- 0/0            5568 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/Schedulers/torque.py
      [rpm] -rw-rw-r-- 0/0             770 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/ServiceProxy/__init__.py
      [rpm] -rw-rw-r-- 0/0            2235 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/ServiceProxy/serviceProxy.py
      [rpm] -rw-rw-r-- 0/0             770 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/ServiceRegistry/__init__.py
      [rpm] -rw-rw-r-- 0/0            5048 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/ServiceRegistry/serviceRegistry.py
      [rpm] -rw-rw-r-- 0/0             771 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/hodlib/__init__.py
      [rpm] -rw-rw-r-- 0/0             822 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/ivy.xml
      [rpm] -rw-rw-r-- 0/0             292 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/ivy/libraries.properties
      [rpm] -rw-rw-r-- 0/0            1801 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/support/checklimits.sh
      [rpm] -rw-rw-r-- 0/0            7108 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/support/logcondense.py
      [rpm] -rw-rw-r-- 0/0             770 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/__init__.py
      [rpm] -rw-rw-r-- 0/0            1122 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/helper.py
      [rpm] -rw-rw-r-- 0/0            3458 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/lib.py
      [rpm] -rw-rw-r-- 0/0            2928 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/main.py
      [rpm] -rw-rw-r-- 0/0            4674 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/testHadoop.py
      [rpm] -rw-rw-r-- 0/0           14428 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/testHod.py
      [rpm] -rw-rw-r-- 0/0            4063 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/testHodCleanup.py
      [rpm] -rw-rw-r-- 0/0            4351 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/testHodRing.py
      [rpm] -rw-rw-r-- 0/0            2187 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/testModule.py
      [rpm] -rw-rw-r-- 0/0            5759 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/testRingmasterRPCs.py
      [rpm] -rw-rw-r-- 0/0            3075 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/testThreads.py
      [rpm] -rw-rw-r-- 0/0            7386 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/testTypes.py
      [rpm] -rw-rw-r-- 0/0            1796 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/testUtil.py
      [rpm] -rw-rw-r-- 0/0            3655 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/hod/testing/testXmlrpc.py
      [rpm] -rw-rw-r-- 0/0           63178 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/index/hadoop-index-0.20.204.jar
      [rpm] -rw-rw-r-- 0/0           65353 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/streaming/hadoop-streaming-0.20.204.jar
      [rpm] -rw-rw-r-- 0/0           10435 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/thriftfs/hadoop-thriftfs-0.20.204.jar
      [rpm] -rw-rw-r-- 0/0            1770 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/vaidya/bin/vaidya.sh
      [rpm] -rw-rw-r-- 0/0            6067 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/vaidya/conf/postex_diagnosis_tests.xml
      [rpm] -rw-rw-r-- 0/0           44839 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/contrib/vaidya/hadoop-vaidya-0.20.204.jar
      [rpm] -rw-rw-r-- 0/0            6839 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/hadoop-ant-0.20.204.jar
      [rpm] -rw-rw-r-- 0/0         3511897 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/hadoop-core-0.20.204.jar
      [rpm] -rw-rw-r-- 0/0          142465 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/hadoop-examples-0.20.204.jar
      [rpm] -rw-rw-r-- 0/0         2310871 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/hadoop-test-0.20.204.jar
      [rpm] -rw-rw-r-- 0/0          284884 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/hadoop-tools-0.20.204.jar
      [rpm] -rw-rw-r-- 0/0          116039 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/aspectjrt-1.6.5.jar
      [rpm] -rw-rw-r-- 0/0         8768365 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/aspectjtools-1.6.5.jar
      [rpm] -rw-rw-r-- 0/0          188671 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-beanutils-1.7.0.jar
      [rpm] -rw-rw-r-- 0/0          206035 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-beanutils-core-1.8.0.jar
      [rpm] -rw-rw-r-- 0/0           41123 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-cli-1.2.jar
      [rpm] -rw-rw-r-- 0/0           58160 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-codec-1.4.jar
      [rpm] -rw-rw-r-- 0/0          575389 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-collections-3.2.1.jar
      [rpm] -rw-rw-r-- 0/0          298829 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-configuration-1.6.jar
      [rpm] -rw-rw-r-- 0/0           13619 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-daemon-1.0.1.jar
      [rpm] -rw-rw-r-- 0/0          143602 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-digester-1.8.jar
      [rpm] -rw-rw-r-- 0/0          112341 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-el-1.0.jar
      [rpm] -rw-rw-r-- 0/0          279781 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-httpclient-3.0.1.jar
      [rpm] -rw-rw-r-- 0/0          261809 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-lang-2.4.jar
      [rpm] -rw-rw-r-- 0/0           60686 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-logging-1.1.1.jar
      [rpm] -rw-rw-r-- 0/0           26202 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-logging-api-1.0.4.jar
      [rpm] -rw-rw-r-- 0/0          832410 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-math-2.1.jar
      [rpm] -rw-rw-r-- 0/0          180792 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/commons-net-1.4.1.jar
      [rpm] -rw-rw-r-- 0/0         3566844 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/core-3.1.1.jar
      [rpm] -rw-rw-r-- 0/0            3434 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/hsqldb-1.8.0.10.LICENSE.txt
      [rpm] -rw-rw-r-- 0/0          706710 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/hsqldb-1.8.0.10.jar
      [rpm] -rw-rw-r-- 0/0          136059 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jackson-core-asl-1.0.1.jar
      [rpm] -rw-rw-r-- 0/0          270781 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jackson-mapper-asl-1.0.1.jar
      [rpm] -rw-rw-r-- 0/0          405086 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jasper-compiler-5.5.12.jar
      [rpm] -rw-rw-r-- 0/0           76698 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jasper-runtime-5.5.12.jar
      [rpm] -rw-rw-r-- 0/0          220920 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jdeb-0.8.jar
      [rpm] -rw-rw-r-- 0/0         1811420 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jdiff/hadoop_0.17.0.xml
      [rpm] -rw-rw-r-- 0/0         1883089 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jdiff/hadoop_0.18.1.xml
      [rpm] -rw-rw-r-- 0/0         1637020 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jdiff/hadoop_0.18.2.xml
      [rpm] -rw-rw-r-- 0/0         1638706 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jdiff/hadoop_0.18.3.xml
      [rpm] -rw-rw-r-- 0/0         1852608 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jdiff/hadoop_0.19.0.xml
      [rpm] -rw-rw-r-- 0/0         1861926 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jdiff/hadoop_0.19.1.xml
      [rpm] -rw-rw-r-- 0/0         1861594 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jdiff/hadoop_0.19.2.xml
      [rpm] -rw-rw-r-- 0/0         2255765 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jdiff/hadoop_0.20.1.xml
      [rpm] -rw-rw-r-- 0/0         3107311 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jdiff/hadoop_0.20.204.xml
      [rpm] -rw-rw-r-- 0/0          321806 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jets3t-0.6.1.jar
      [rpm] -rw-rw-r-- 0/0          539912 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jetty-6.1.26.jar
      [rpm] -rw-rw-r-- 0/0          177131 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jetty-util-6.1.26.jar
      [rpm] -rw-rw-r-- 0/0          185746 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jsch-0.1.42.jar
      [rpm] -rw-rw-r-- 0/0         1024681 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jsp-2.1/jsp-2.1.jar
      [rpm] -rw-rw-r-- 0/0          134910 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/jsp-2.1/jsp-api-2.1.jar
      [rpm] -rw-rw-r-- 0/0          198945 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/junit-4.5.jar
      [rpm] -rw-rw-r-- 0/0           11428 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/kfs-0.2.2.jar
      [rpm] -rw-rw-r-- 0/0           11358 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/kfs-0.2.LICENSE.txt
      [rpm] -rw-rw-r-- 0/0          391834 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/log4j-1.2.15.jar
      [rpm] -rw-rw-r-- 0/0         1419869 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/mockito-all-1.8.5.jar
      [rpm] -rw-rw-r-- 0/0           65261 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/oro-2.0.8.jar
      [rpm] -rw-rw-r-- 0/0          134133 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/servlet-api-2.5-20081211.jar
      [rpm] -rw-rw-r-- 0/0           15345 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/slf4j-api-1.4.3.jar
      [rpm] -rw-rw-r-- 0/0            8601 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/slf4j-log4j12-1.4.3.jar
      [rpm] -rw-rw-r-- 0/0           15010 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/lib/xmlenc-0.52.jar
      [rpm] -rw-rw-r-- 0/0             274 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/templates/conf/core-site.xml
      [rpm] -rw-rw-r-- 0/0            2612 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/templates/conf/hadoop-env.sh
      [rpm] -rw-rw-r-- 0/0             547 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/templates/conf/hdfs-site.xml
      [rpm] -rw-rw-r-- 0/0             686 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/templates/conf/mapred-site.xml
      [rpm] -rw-rw-r-- 0/0            1494 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/datanode/WEB-INF/web.xml
      [rpm] -rw-rw-r-- 0/0            1542 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/hdfs/WEB-INF/web.xml
      [rpm] -rw-rw-r-- 0/0             246 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/hdfs/index.html
      [rpm] -rw-rw-r-- 0/0            4172 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/history/WEB-INF/web.xml
      [rpm] -rw-rw-r-- 0/0            8552 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/WEB-INF/web.xml
      [rpm] -rw-rw-r-- 0/0           10594 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/analysejobhistory.jsp
      [rpm] -rw-rw-r-- 0/0            2326 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/gethistory.jsp
      [rpm] -rw-rw-r-- 0/0             241 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/index.html
      [rpm] -rw-rw-r-- 0/0            1384 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/job_authorization_error.jsp
      [rpm] -rw-rw-r-- 0/0            2226 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/jobblacklistedtrackers.jsp
      [rpm] -rw-rw-r-- 0/0            1738 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/jobconf.jsp
      [rpm] -rw-rw-r-- 0/0            1964 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/jobconf_history.jsp
      [rpm] -rw-rw-r-- 0/0           19056 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/jobdetails.jsp
      [rpm] -rw-rw-r-- 0/0           13484 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/jobdetailshistory.jsp
      [rpm] -rw-rw-r-- 0/0            6283 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/jobfailures.jsp
      [rpm] -rw-rw-r-- 0/0            1235 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/jobhistory.jsp
      [rpm] -rw-rw-r-- 0/0           20250 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/jobhistoryhome.jsp
      [rpm] -rw-rw-r-- 0/0            1988 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/jobqueue_details.jsp
      [rpm] -rw-rw-r-- 0/0            5788 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/jobtasks.jsp
      [rpm] -rw-rw-r-- 0/0            2847 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/jobtaskshistory.jsp
      [rpm] -rw-rw-r-- 0/0            7044 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/jobtracker.jsp
      [rpm] -rw-rw-r-- 0/0           12180 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/legacyjobhistory.jsp
      [rpm] -rw-rw-r-- 0/0            1692 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/loadhistory.jsp
      [rpm] -rw-rw-r-- 0/0            6590 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/machines.jsp
      [rpm] -rw-rw-r-- 0/0           12774 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/taskdetails.jsp
      [rpm] -rw-rw-r-- 0/0            4838 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/taskdetailshistory.jsp
      [rpm] -rw-rw-r-- 0/0            2925 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/taskstats.jsp
      [rpm] -rw-rw-r-- 0/0            3722 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/job/taskstatshistory.jsp
      [rpm] -rw-rw-r-- 0/0            9443 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/static/hadoop-logo.jpg
      [rpm] -rw-rw-r-- 0/0            2703 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/static/hadoop.css
      [rpm] -rw-rw-r-- 0/0             461 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/static/jobconf.xsl
      [rpm] -rw-rw-r-- 0/0            3827 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/static/jobtracker.js
      [rpm] -rw-rw-r-- 0/0           16919 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/static/sorttable.js
      [rpm] -rw-rw-r-- 0/0             652 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/task/WEB-INF/web.xml
      [rpm] -rw-rw-r-- 0/0              61 2011-07-22 23:49 hadoop-0.20.204/share/hadoop/webapps/task/index.html
      [rpm] -rw-rw-r-- 0/0            1494 2011-07-22 23:48 hadoop-0.20.204/webapps/datanode/WEB-INF/web.xml
      [rpm] -rw-rw-r-- 0/0            1542 2011-07-22 23:48 hadoop-0.20.204/webapps/hdfs/WEB-INF/web.xml
      [rpm] -rw-rw-r-- 0/0             246 2011-07-22 23:48 hadoop-0.20.204/webapps/hdfs/index.html
      [rpm] -rw-rw-r-- 0/0            4172 2011-07-22 23:48 hadoop-0.20.204/webapps/history/WEB-INF/web.xml
      [rpm] -rw-rw-r-- 0/0            8552 2011-07-22 23:48 hadoop-0.20.204/webapps/job/WEB-INF/web.xml
      [rpm] -rw-rw-r-- 0/0           10594 2011-07-22 23:48 hadoop-0.20.204/webapps/job/analysejobhistory.jsp
      [rpm] -rw-rw-r-- 0/0            2326 2011-07-22 23:48 hadoop-0.20.204/webapps/job/gethistory.jsp
      [rpm] -rw-rw-r-- 0/0             241 2011-07-22 23:48 hadoop-0.20.204/webapps/job/index.html
      [rpm] -rw-rw-r-- 0/0            1384 2011-07-22 23:48 hadoop-0.20.204/webapps/job/job_authorization_error.jsp
      [rpm] -rw-rw-r-- 0/0            2226 2011-07-22 23:48 hadoop-0.20.204/webapps/job/jobblacklistedtrackers.jsp
      [rpm] -rw-rw-r-- 0/0            1738 2011-07-22 23:48 hadoop-0.20.204/webapps/job/jobconf.jsp
      [rpm] -rw-rw-r-- 0/0            1964 2011-07-22 23:48 hadoop-0.20.204/webapps/job/jobconf_history.jsp
      [rpm] -rw-rw-r-- 0/0           19056 2011-07-22 23:48 hadoop-0.20.204/webapps/job/jobdetails.jsp
      [rpm] -rw-rw-r-- 0/0           13484 2011-07-22 23:48 hadoop-0.20.204/webapps/job/jobdetailshistory.jsp
      [rpm] -rw-rw-r-- 0/0            6283 2011-07-22 23:48 hadoop-0.20.204/webapps/job/jobfailures.jsp
      [rpm] -rw-rw-r-- 0/0            1235 2011-07-22 23:48 hadoop-0.20.204/webapps/job/jobhistory.jsp
      [rpm] -rw-rw-r-- 0/0           20250 2011-07-22 23:48 hadoop-0.20.204/webapps/job/jobhistoryhome.jsp
      [rpm] -rw-rw-r-- 0/0            1988 2011-07-22 23:48 hadoop-0.20.204/webapps/job/jobqueue_details.jsp
      [rpm] -rw-rw-r-- 0/0            5788 2011-07-22 23:48 hadoop-0.20.204/webapps/job/jobtasks.jsp
      [rpm] -rw-rw-r-- 0/0            2847 2011-07-22 23:48 hadoop-0.20.204/webapps/job/jobtaskshistory.jsp
      [rpm] -rw-rw-r-- 0/0            7044 2011-07-22 23:48 hadoop-0.20.204/webapps/job/jobtracker.jsp
      [rpm] -rw-rw-r-- 0/0           12180 2011-07-22 23:48 hadoop-0.20.204/webapps/job/legacyjobhistory.jsp
      [rpm] -rw-rw-r-- 0/0            1692 2011-07-22 23:48 hadoop-0.20.204/webapps/job/loadhistory.jsp
      [rpm] -rw-rw-r-- 0/0            6590 2011-07-22 23:48 hadoop-0.20.204/webapps/job/machines.jsp
      [rpm] -rw-rw-r-- 0/0           12774 2011-07-22 23:48 hadoop-0.20.204/webapps/job/taskdetails.jsp
      [rpm] -rw-rw-r-- 0/0            4838 2011-07-22 23:48 hadoop-0.20.204/webapps/job/taskdetailshistory.jsp
      [rpm] -rw-rw-r-- 0/0            2925 2011-07-22 23:48 hadoop-0.20.204/webapps/job/taskstats.jsp
      [rpm] -rw-rw-r-- 0/0            3722 2011-07-22 23:48 hadoop-0.20.204/webapps/job/taskstatshistory.jsp
      [rpm] -rw-rw-r-- 0/0            9443 2011-07-22 23:48 hadoop-0.20.204/webapps/static/hadoop-logo.jpg
      [rpm] -rw-rw-r-- 0/0            2703 2011-07-22 23:48 hadoop-0.20.204/webapps/static/hadoop.css
      [rpm] -rw-rw-r-- 0/0             461 2011-07-22 23:48 hadoop-0.20.204/webapps/static/jobconf.xsl
      [rpm] -rw-rw-r-- 0/0            3827 2011-07-22 23:48 hadoop-0.20.204/webapps/static/jobtracker.js
      [rpm] -rw-rw-r-- 0/0           16919 2011-07-22 23:48 hadoop-0.20.204/webapps/static/sorttable.js
      [rpm] -rw-rw-r-- 0/0             652 2011-07-22 23:48 hadoop-0.20.204/webapps/task/WEB-INF/web.xml
      [rpm] -rw-rw-r-- 0/0              61 2011-07-22 23:48 hadoop-0.20.204/webapps/task/index.html
      [rpm] -rwxr-xr-x 0/0           13179 2011-07-22 23:48 hadoop-0.20.204/bin/hadoop
      [rpm] -rwxr-xr-x 0/0            2210 2011-07-22 23:48 hadoop-0.20.204/bin/hadoop-config.sh
      [rpm] -rwxr-xr-x 0/0            4095 2011-07-22 23:48 hadoop-0.20.204/bin/hadoop-daemon.sh
      [rpm] -rwxr-xr-x 0/0            1238 2011-07-22 23:48 hadoop-0.20.204/bin/hadoop-daemons.sh
      [rpm] -rwxr-xr-x 0/0            2721 2011-07-22 23:48 hadoop-0.20.204/bin/rcc
      [rpm] -rwxr-xr-x 0/0            2054 2011-07-22 23:48 hadoop-0.20.204/bin/slaves.sh
      [rpm] -rwxr-xr-x 0/0            1077 2011-07-22 23:48 hadoop-0.20.204/bin/start-all.sh
      [rpm] -rwxr-xr-x 0/0             976 2011-07-22 23:48 hadoop-0.20.204/bin/start-balancer.sh
      [rpm] -rwxr-xr-x 0/0            1656 2011-07-22 23:48 hadoop-0.20.204/bin/start-dfs.sh
      [rpm] -rwxr-xr-x 0/0            1056 2011-07-22 23:48 hadoop-0.20.204/bin/start-jobhistoryserver.sh
      [rpm] -rwxr-xr-x 0/0            1170 2011-07-22 23:48 hadoop-0.20.204/bin/start-mapred.sh
      [rpm] -rwxr-xr-x 0/0            1030 2011-07-22 23:48 hadoop-0.20.204/bin/stop-all.sh
      [rpm] -rwxr-xr-x 0/0            1027 2011-07-22 23:48 hadoop-0.20.204/bin/stop-balancer.sh
      [rpm] -rwxr-xr-x 0/0            1157 2011-07-22 23:48 hadoop-0.20.204/bin/stop-dfs.sh
      [rpm] -rwxr-xr-x 0/0            1042 2011-07-22 23:48 hadoop-0.20.204/bin/stop-jobhistoryserver.sh
      [rpm] -rwxr-xr-x 0/0            1079 2011-07-22 23:48 hadoop-0.20.204/bin/stop-mapred.sh
      [rpm] + STATUS=0
      [rpm] + [ 0 -ne 0 ]
      [rpm] + cd hadoop-0.20.204
      [rpm] + /bin/gzip -dc /tmp/hadoop_package_build_hudson/SOURCES/hadoop-0.20.204-script.tar.gz
      [rpm] + /bin/tar -xvvf -
      [rpm] -rwxr-xr-x 0/0            2014 2011-07-22 23:44 hadoop-datanode
      [rpm] -rwxr-xr-x 0/0            2042 2011-07-22 23:44 hadoop-jobtracker
      [rpm] -rwxr-xr-x 0/0            2207 2011-07-22 23:44 hadoop-namenode
      [rpm] -rwxr-xr-x 0/0            2053 2011-07-22 23:44 hadoop-tasktracker
      [rpm] + STATUS=0
      [rpm] + [ 0 -ne 0 ]
      [rpm] + exit 0
      [rpm] Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.uxcjC7
      [rpm] + umask 022
      [rpm] + cd /tmp/hadoop_package_build_hudson/BUILD
      [rpm] + cd hadoop-0.20.204
      [rpm] + [ -d /tmp/hadoop_package_build_hudson/BUILD/usr ]
      [rpm] + [ -d /tmp/hadoop_package_build_hudson/BUILD/var/log/hadoop ]
      [rpm] + [ -d /tmp/hadoop_package_build_hudson/BUILD/etc/hadoop ]
      [rpm] + [ -d /tmp/hadoop_package_build_hudson/BUILD/var/run/hadoop ]
      [rpm] + mkdir -p /tmp/hadoop_package_build_hudson/BUILD/usr
      [rpm] + mkdir -p /tmp/hadoop_package_build_hudson/BUILD/usr/bin
      [rpm] + mkdir -p /tmp/hadoop_package_build_hudson/BUILD/usr/include
      [rpm] + mkdir -p /tmp/hadoop_package_build_hudson/BUILD/usr/lib
      [rpm] + mkdir -p /tmp/hadoop_package_build_hudson/BUILD/usr/libexec
      [rpm] + mkdir -p /tmp/hadoop_package_build_hudson/BUILD/var/log/hadoop
      [rpm] + mkdir -p /tmp/hadoop_package_build_hudson/BUILD/etc/hadoop
      [rpm] + mkdir -p /tmp/hadoop_package_build_hudson/BUILD/usr/man
      [rpm] + mkdir -p /tmp/hadoop_package_build_hudson/BUILD/var/run/hadoop
      [rpm] + mkdir -p /tmp/hadoop_package_build_hudson/BUILD/usr/sbin
      [rpm] + mkdir -p /tmp/hadoop_package_build_hudson/BUILD/usr/share
      [rpm] + mkdir -p /tmp/hadoop_package_build_hudson/BUILD/var/lib/hadoop
      [rpm] + mkdir -p /tmp/hadoop_package_build_hudson/BUILD/etc/rc.d/init.d
      [rpm] + mv /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/hadoop-namenode /tmp/hadoop_package_build_hudson/BUILD/etc/rc.d/init.d/hadoop-namenode
      [rpm] + mv /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/hadoop-datanode /tmp/hadoop_package_build_hudson/BUILD/etc/rc.d/init.d/hadoop-datanode
      [rpm] + mv /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/hadoop-jobtracker /tmp/hadoop_package_build_hudson/BUILD/etc/rc.d/init.d/hadoop-jobtracker
      [rpm] + mv /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/hadoop-tasktracker /tmp/hadoop_package_build_hudson/BUILD/etc/rc.d/init.d/hadoop-tasktracker
      [rpm] + chmod 0755 /tmp/hadoop_package_build_hudson/BUILD/etc/rc.d/init.d/hadoop-datanode /tmp/hadoop_package_build_hudson/BUILD/etc/rc.d/init.d/hadoop-jobtracker /tmp/hadoop_package_build_hudson/BUILD/etc/rc.d/init.d/hadoop-namenode /tmp/hadoop_package_build_hudson/BUILD/etc/rc.d/init.d/hadoop-tasktracker
      [rpm] + chmod 0755 /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/sbin/hadoop-create-user.sh /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/sbin/hadoop-daemon.sh /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/sbin/hadoop-daemons.sh /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/sbin/hadoop-setup-conf.sh /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/sbin/hadoop-setup-hdfs.sh /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/sbin/hadoop-setup-single-node.sh
      [rpm] + exit 0
      [rpm] Executing(%install): /bin/sh -e /var/tmp/rpm-tmp.ZRfvER
      [rpm] + umask 022
      [rpm] + cd /tmp/hadoop_package_build_hudson/BUILD
      [rpm] + cd hadoop-0.20.204
      [rpm] + mv /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/capacity-scheduler.xml /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/configuration.xsl /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/core-site.xml /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/hadoop-env.sh /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/hadoop-metrics2.properties /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/hadoop-policy.xml /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/hdfs-site.xml /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/log4j.properties /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/mapred-queue-acls.xml /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/mapred-site.xml /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/masters /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/slaves /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/ssl-client.xml.example /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/ssl-server.xml.example /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc/hadoop/taskcontroller.cfg /tmp/hadoop_package_build_hudson/BUILD/etc/hadoop
      [rpm] + mv /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/CHANGES.txt /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/LICENSE.txt /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/NOTICE.txt /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/README.txt /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/bin /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/build.xml /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/c++ /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/conf /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/contrib /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/etc /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/hadoop-ant-0.20.204.jar /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/hadoop-core-0.20.204.jar /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/hadoop-examples-0.20.204.jar /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/hadoop-test-0.20.204.jar /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/hadoop-tools-0.20.204.jar /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/include /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/ivy /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/ivy.xml /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/lib /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/libexec /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/sbin /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/share /tmp/hadoop_package_build_hudson/BUILD/hadoop-0.20.204/webapps /tmp/hadoop_package_build_hudson/BUILD/usr
      [rpm] + [ /tmp/hadoop_package_build_hudson/BUILD/etc/hadoop != /tmp/hadoop_package_build_hudson/BUILD//usr/conf ]
      [rpm] + rm -rf /tmp/hadoop_package_build_hudson/BUILD//usr/etc
      [rpm] + /usr/lib/rpm/redhat/brp-compress
      [rpm] /var/tmp/rpm-tmp.ZRfvER: 1: /usr/lib/rpm/redhat/brp-compress: not found
      [rpm] error: Bad exit status from /var/tmp/rpm-tmp.ZRfvER (%install)
      [rpm]     Bad exit status from /var/tmp/rpm-tmp.ZRfvER (%install)
      [rpm] 
      [rpm] 
      [rpm] RPM build errors:

BUILD FAILED
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:1720: '/usr/bin/rpmbuild' failed with exit code 1

Total time: 4 minutes 44 seconds
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: 


Build failed in Jenkins: Hadoop-0.20.204-Build #6

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-0.20.204-Build/6/changes>

Changes:

[omalley] HADOOP-7475. Fix hadoop-setup-single-node.sh to reflect new layout. (eyang
via omalley)

------------------------------------------
[...truncated 8732 lines...]
     [exec] checking for ld used by gcc... /usr/bin/ld
     [exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
     [exec] checking for /usr/bin/ld option to reload object files... -r
     [exec] checking for BSD-compatible nm... /usr/bin/nm -B
     [exec] checking whether ln -s works... yes
     [exec] checking how to recognise dependent libraries... pass_all
     [exec] checking dlfcn.h usability... yes
     [exec] checking dlfcn.h presence... yes
     [exec] checking for dlfcn.h... yes
     [exec] checking how to run the C++ preprocessor... g++ -E
     [exec] checking for g77... no
     [exec] checking for xlf... no
     [exec] checking for f77... no
     [exec] checking for frt... no
     [exec] checking for pgf77... no
     [exec] checking for cf77... no
     [exec] checking for fort77... no
     [exec] checking for fl32... no
     [exec] checking for af77... no
     [exec] checking for xlf90... no
     [exec] checking for f90... no
     [exec] checking for pgf90... no
     [exec] checking for pghpf... no
     [exec] checking for epcf90... no
     [exec] checking for gfortran... no
     [exec] checking for g95... no
     [exec] checking for xlf95... no
     [exec] checking for f95... no
     [exec] checking for fort... no
     [exec] checking for ifort... no
     [exec] checking for ifc... no
     [exec] checking for efc... no
     [exec] checking for pgf95... no
     [exec] checking for lf95... no
     [exec] checking for ftn... no
     [exec] checking whether we are using the GNU Fortran 77 compiler... no
     [exec] checking whether  accepts -g... no
     [exec] checking the maximum length of command line arguments... 32768
     [exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
     [exec] checking for objdir... .libs
     [exec] checking for ar... ar
     [exec] checking for ranlib... ranlib
     [exec] checking for strip... strip
     [exec] checking if gcc static flag  works... yes
     [exec] checking if gcc supports -fno-rtti -fno-exceptions... no
     [exec] checking for gcc option to produce PIC... -fPIC
     [exec] checking if gcc PIC flag -fPIC works... yes
     [exec] checking if gcc supports -c -o file.o... yes
     [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking whether -lc should be explicitly linked in... no
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] checking if libtool supports shared libraries... yes
     [exec] checking whether to build shared libraries... yes
     [exec] checking whether to build static libraries... yes
     [exec] configure: creating libtool
     [exec] appending configuration tag "CXX" to libtool
     [exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
     [exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking for g++ option to produce PIC... -fPIC
     [exec] checking if g++ PIC flag -fPIC works... yes
     [exec] checking if g++ supports -c -o file.o... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] appending configuration tag "F77" to libtool
     [exec] checking for unistd.h... (cached) yes
     [exec] checking for stdbool.h that conforms to C99... yes
     [exec] checking for _Bool... no
     [exec] checking for an ANSI C-conforming const... yes
     [exec] checking for off_t... yes
     [exec] checking for size_t... yes
     [exec] checking whether strerror_r is declared... yes
     [exec] checking for strerror_r... yes
     [exec] checking whether strerror_r returns char *... yes
     [exec] checking for mkdir... yes
     [exec] checking for uname... yes
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating impl/config.h
     [exec] config.status: impl/config.h is unchanged
     [exec] config.status: executing depfiles commands

compile-c++-utils:
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/utils'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib">
     [exec]  /usr/bin/install -c -m 644 'libhadooputils.a' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooputils.a'>
     [exec]  ranlib '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooputils.a'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop">
     [exec]  /usr/bin/install -c -m 644 '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/utils/api/hadoop/StringUtils.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/StringUtils.hh'>
     [exec]  /usr/bin/install -c -m 644 '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/utils/api/hadoop/SerialUtils.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/SerialUtils.hh'>
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/utils'>

compile-c++-pipes:
     [exec] depbase=`echo impl/HadoopPipes.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] 	if g++ -DHAVE_CONFIG_H -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes> -I./impl    -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/api> -Wall -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include> -g -O2 -MT impl/HadoopPipes.o -MD -MP -MF "$depbase.Tpo" -c -o impl/HadoopPipes.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc;> \
     [exec] 	then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc>: In member function 'void HadoopPipes::TextUpwardProtocol::writeBuffer(const std::string&)':
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc>:129: warning: format not a string literal and no format arguments
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc>: In member function 'std::string HadoopPipes::BinaryProtocol::createDigest(std::string&, std::string&)':
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/impl/HadoopPipes.cc>:439: warning: value computed is not used
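Both HadoopPipes.cc warnings are familiar -Wall-era patterns rather than build breakers: passing a variable as a printf-style format string (flagged when GCC's format-security checking is enabled, as on this build host), and an expression whose computed value is discarded. A compressed, hypothetical C illustration of the two patterns (not the actual HadoopPipes code):

    #include <stdio.h>

    static void writeBuffer(const char *buf) {
        printf(buf);        /* warning: format not a string literal and no format arguments */
        printf("%s", buf);  /* the safe equivalent */
    }

    int main(void) {
        int n = 0;
        n++ + 1;            /* warning: value computed is not used */
        writeBuffer("hello\n");
        return 0;
    }

The first is worth fixing for real, since a '%' in the data would be interpreted as a conversion; the second usually indicates a leftover comparison or an intentionally ignored return value.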
     [exec] rm -f libhadooppipes.a
     [exec] ar cru libhadooppipes.a impl/HadoopPipes.o 
     [exec] ranlib libhadooppipes.a
     [exec] make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/pipes'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib">
     [exec]  /usr/bin/install -c -m 644 'libhadooppipes.a' '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooppipes.a'>
     [exec]  ranlib '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhadooppipes.a'>
     [exec] test -z "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop"> || mkdir -p -- "<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop">
     [exec]  /usr/bin/install -c -m 644 '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/api/hadoop/Pipes.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/Pipes.hh'>
     [exec]  /usr/bin/install -c -m 644 '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/pipes/api/hadoop/TemplateFactory.hh'> '<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/include/hadoop/TemplateFactory.hh'>
     [exec] make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/pipes'>

compile-c++:

compile-core:

test-c++-libhdfs:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/hdfs/name>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs>     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_test.o -MD -MP -MF ".deps/hdfs_test.Tpo" -c -o hdfs_test.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c;> \
     [exec] 	then mv -f ".deps/hdfs_test.Tpo" ".deps/hdfs_test.Po"; else rm -f ".deps/hdfs_test.Tpo"; exit 1; fi
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>: In function `main':
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:87: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:90: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:130: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:133: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:188: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:189: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:190: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:198: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:199: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:220: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:221: warning: long int format, different type arg (arg 3)
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_test.c>:272: warning: implicit declaration of function `sleep'
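
The "long int format, different type arg (arg 3)" warnings above come from printf-style calls in hdfs_test.c that pass libhdfs' 64-bit tOffset for a %ld conversion; under this -m32 build, long is only 32 bits, so format and argument disagree. The final warning just means sleep() is called without including <unistd.h>. A minimal sketch of the pattern and the usual fix (the variable name is hypothetical; the real code is in src/c++/libhdfs/hdfs_test.c):

    #include <inttypes.h>   /* PRId64 for exact-width printf conversions */
    #include <stdio.h>
    #include <unistd.h>     /* declares sleep(), silencing the implicit-declaration warning */

    typedef int64_t tOffset; /* libhdfs' 64-bit offset type */

    int main(void) {
        tOffset currentPos = 0;
        /* The warned-about pattern (arg 3 is 64-bit, %ld is 32-bit under -m32):
         *     fprintf(stderr, "Current position: %ld\n", currentPos);
         * Portable fix: use the exact-width conversion from <inttypes.h>.
         */
        fprintf(stderr, "Current position: %" PRId64 "\n", currentPos);
        sleep(1);
        return 0;
    }
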
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_test  hdfs_test.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_test hdfs_test.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs>     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_read.o -MD -MP -MF ".deps/hdfs_read.Tpo" -c -o hdfs_read.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c;> \
     [exec] 	then mv -f ".deps/hdfs_read.Tpo" ".deps/hdfs_read.Po"; else rm -f ".deps/hdfs_read.Tpo"; exit 1; fi
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c>: In function `main':
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_read.c>:35: warning: unused variable `fileTotalSize'
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_read  hdfs_read.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_read hdfs_read.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] if gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -DHAVE_STRDUP=1 -DHAVE_STRERROR=1 -DHAVE_STRTOUL=1 -DHAVE_FCNTL_H=1 -DHAVE__BOOL=1 -DHAVE_STDBOOL_H=1 -I. -I<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs>     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -MT hdfs_write.o -MD -MP -MF ".deps/hdfs_write.Tpo" -c -o hdfs_write.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/hdfs_write.c;> \
     [exec] 	then mv -f ".deps/hdfs_write.Tpo" ".deps/hdfs_write.Po"; else rm -f ".deps/hdfs_write.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CC gcc  -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes  -m32 -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server  -ljvm -shared -Wl,-x -o hdfs_write  hdfs_write.o <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.la>  -ldl -lpthread
     [exec] libtool: link: gcc -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -Wall -Wstrict-prototypes -m32 -Wl,-x -o hdfs_write hdfs_write.o  -L/homes/hudson/tools/java/latest1.6/jre/lib/i386/server <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so> -ljvm -ldl -lpthread -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib> -Wl,-rpath -Wl,<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib>
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/test-libhdfs.sh>	
     [exec] ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
     [exec] LIB_JVM_DIR = /homes/hudson/tools/java/latest1.6/jre/lib/i386/server
     [exec] ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop>: line 53: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] 11/07/22 10:12:32 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
     [exec] 11/07/22 10:12:32 INFO namenode.NameNode: STARTUP_MSG: 
     [exec] /************************************************************
     [exec] STARTUP_MSG: Starting NameNode
     [exec] STARTUP_MSG:   host = h4.grid.sp2.yahoo.net/127.0.1.1
     [exec] STARTUP_MSG:   args = [-format]
     [exec] STARTUP_MSG:   version = 0.20.204
     [exec] STARTUP_MSG:   build = http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-security-204 -r 1149316; compiled by 'hudson' on Fri Jul 22 08:56:57 UTC 2011
     [exec] ************************************************************/
     [exec] 11/07/22 10:12:32 INFO util.GSet: VM type       = 32-bit
     [exec] 11/07/22 10:12:32 INFO util.GSet: 2% max memory = 17.77875 MB
     [exec] 11/07/22 10:12:32 INFO util.GSet: capacity      = 2^22 = 4194304 entries
     [exec] 11/07/22 10:12:32 INFO util.GSet: recommended=4194304, actual=4194304
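
The three GSet lines are consistent with the NameNode sizing its blocks-map hash table as the largest power of two whose reference array fits in 2% of the maximum heap; on a 32-bit VM a reference is 4 bytes, so (reading the numbers back from the log, not from a spec):

    \[
      \frac{17.77875 \times 2^{20}\ \text{bytes}}{4\ \text{bytes/entry}} \approx 4.66 \times 10^{6},
      \qquad
      2^{22} = 4194304 \le 4.66 \times 10^{6} < 2^{23},
    \]

which matches both the recommended and the actual capacity of 2^22 = 4194304 entries reported above.
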
     [exec] 11/07/22 10:12:32 INFO namenode.FSNamesystem: fsOwner=hudson
     [exec] 11/07/22 10:12:32 INFO namenode.FSNamesystem: supergroup=supergroup
     [exec] 11/07/22 10:12:32 INFO namenode.FSNamesystem: isPermissionEnabled=true
     [exec] 11/07/22 10:12:32 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
     [exec] 11/07/22 10:12:32 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
     [exec] 11/07/22 10:12:32 INFO namenode.NameNode: Caching file names occurring more than 10 times 
     [exec] 11/07/22 10:12:33 INFO common.Storage: Image file of size 112 saved in 0 seconds.
     [exec] 11/07/22 10:12:33 INFO common.Storage: Storage directory build/test/libhdfs/dfs/name has been successfully formatted.
     [exec] 11/07/22 10:12:33 INFO namenode.NameNode: SHUTDOWN_MSG: 
     [exec] /************************************************************
     [exec] SHUTDOWN_MSG: Shutting down NameNode at h4.grid.sp2.yahoo.net/127.0.1.1
     [exec] ************************************************************/
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] starting namenode, logging to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs/hadoop-hudson-namenode-h4.grid.sp2.yahoo.net.out>
     [exec] nice: /bin/hadoop: No such file or directory
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] starting datanode, logging to <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/libhdfs/logs/hadoop-hudson-datanode-h4.grid.sp2.yahoo.net.out>
     [exec] nice: /bin/hadoop: No such file or directory
     [exec] CLASSPATH=<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/src/c++/libhdfs/tests/conf>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/conf>:/homes/hudson/tools/java/latest1.6/lib/tools.jar:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/classes>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/test/classes>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/hsqldb-1.8.0.10.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/kfs-0.2.2.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/*.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/lib/jsp-2.0/*.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/aspectjrt-1.6.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/aspectjtools-1.6.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-beanutils-1.7.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-beanutils-core-1.8.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-cli-1.2.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-codec-1.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-collections-3.2.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-configuration-1.6.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-daemon-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-digester-1.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-el-1.0.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-httpclient-3.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-lang-2.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-logging-1.1.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-logging-api-1.0.4.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-math-2.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/commons-net-1.4.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/core-3.1.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jackson-core-asl-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jackson-mapper-asl-1.0.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jasper-compiler-5.5.12.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jasper-runtime-5.5.12.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jdeb-0.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jets3t-0.6.1.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jetty-6.1.26.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jetty-util-6.1.26.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/jsch-0.1.42.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/junit-4.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/log4j-1.2.15.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/mockito-all-1.8.5.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/oro-2.0.8.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/servlet-api-2.5-20081211.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/slf4j-api-1.4.3.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/slf4j-log4j12-1.4.3.jar>:<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/ivy/lib/Hadoop/common/xmlenc-0.52.jar> LD_PRELOAD=<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++/Linux-i386-32/lib/libhdfs.so>:/homes/hudson/tools/java/latest1.6/jre/lib/i386/server/libjvm.so <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build/c++-build/Linux-i386-32/libhdfs/hdfs_test>
     [exec] 11/07/22 10:13:23 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
     [exec] 11/07/22 10:13:23 WARN fs.FileSystem: "localhost:23000" is a deprecated filesystem name. Use "hdfs://localhost:23000/" instead.
     [exec] 11/07/22 10:13:25 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 0 time(s).
     [exec] 11/07/22 10:13:26 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 1 time(s).
     [exec] 11/07/22 10:13:27 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 2 time(s).
     [exec] 11/07/22 10:13:28 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 3 time(s).
     [exec] 11/07/22 10:13:29 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 4 time(s).
     [exec] 11/07/22 10:13:30 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 5 time(s).
     [exec] 11/07/22 10:13:31 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 6 time(s).
     [exec] 11/07/22 10:13:32 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 7 time(s).
     [exec] 11/07/22 10:13:33 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 8 time(s).
     [exec] 11/07/22 10:13:34 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:23000. Already tried 9 time(s).
     [exec] Exception in thread "main" java.net.ConnectException: Call to localhost/127.0.0.1:23000 failed on connection exception: java.net.ConnectException: Connection refused
     [exec] 	at org.apache.hadoop.ipc.Client.wrapException(Client.java:1057)
     [exec] 	at org.apache.hadoop.ipc.Client.call(Client.java:1033)
     [exec] 	at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:224)
     [exec] 	at $Proxy1.getProtocolVersion(Unknown Source)
     [exec] 	at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
     [exec] 	at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
     [exec] 	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:208)
     [exec] 	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:175)
     [exec] 	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
     [exec] 	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1310)
     [exec] 	at org.apache.hadoop.fs.FileSystem.access$100(FileSystem.java:65)
     [exec] 	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1328)
     [exec] 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
     [exec] 	at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:103)
     [exec] 	at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:101)
     [exec] 	at java.security.AccessController.doPrivileged(Native Method)
     [exec] 	at javax.security.auth.Subject.doAs(Subject.java:396)
     [exec] 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
     [exec] 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:101)
     [exec] Caused by: java.net.ConnectException: Connection refused
     [exec] 	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
     [exec] 	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
     [exec] 	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
     [exec] 	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:406)
     [exec] 	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:414)
     [exec] 	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:527)
     [exec] 	at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:187)
     [exec] 	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1164)
     [exec] 	at org.apache.hadoop.ipc.Client.call(Client.java:1010)
     [exec] 	... 17 more
     [exec] Call to org.apache.hadoop.fs.Filesystem::get(URI, Configuration) failed!
     [exec] Oops! Failed to connect to hdfs!
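
The connection failure is a consequence of the earlier "No such file or directory" lines: hadoop-daemon.sh could not source bin/../libexec/hadoop-config.sh, so neither the namenode nor the datanode it tried to launch ever started (the scripts apparently fell back to invoking a bare /bin/hadoop, which does not exist), and all ten retries against localhost:23000 were refused. hdfs_test then bails out at its first connect. A minimal sketch of that failure path, assuming the test opens the filesystem roughly like this (host and port taken from the log; the real code is src/c++/libhdfs/hdfs_test.c):

    #include <stdio.h>
    #include <stdlib.h>
    #include "hdfs.h"   /* the libhdfs C API from src/c++/libhdfs */

    int main(void) {
        /* The mini-cluster under test listens on localhost:23000 per the log. */
        hdfsFS fs = hdfsConnect("localhost", 23000);
        if (!fs) {
            /* hdfsConnect returns NULL when the JVM-side FileSystem.get()
               throws, which produces the ConnectException trace above. */
            fprintf(stderr, "Oops! Failed to connect to hdfs!\n");
            exit(-1);
        }
        /* ... read/write checks would follow here ... */
        hdfsDisconnect(fs);
        return 0;
    }
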
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] no datanode to stop
     [exec] <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/hadoop-daemon.sh>: line 42: <https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/bin/../libexec/hadoop-config.sh>: No such file or directory
     [exec] no namenode to stop
     [exec] make: *** [test] Error 255
     [exec] exiting with 255

BUILD FAILED
<https://builds.apache.org/job/Hadoop-0.20.204-Build/ws/trunk/build.xml>:1857: exec returned: 2

Total time: 220 minutes 49 seconds
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Description set: