Posted to common-dev@hadoop.apache.org by Apache Hudson Server <hu...@hudson.apache.org> on 2010/11/19 12:01:51 UTC
Build failed in Hudson: Hadoop-Common-trunk #517
See <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/517/changes>
Changes:
[eli] HADOOP-7045. TestDU fails on systems with local file systems with extended attributes. Contributed by Eli Collins
[nigel] HADOOP-7042. Updates to test-patch.sh to include failed test names and improve other messaging. Contributed by nigel.
[nigel] HADOOP-7042. Updates to test-patch.sh to include failed test names and improve other messaging. Contributed by nigel.
[eli] HADOOP-7015. RawLocalFileSystem#listStatus does not deal with a directory whose entries are changing (e.g. in a multi-thread or multi-process environment). Contributed by Sanjay Radia
[nigel] HADOOP-7042. Updates to test-patch.sh to include failed test names and improve other messaging. Contributed by nigel.
[cos] Adding IntelliJ IDEA specific extentions to be ignored.
------------------------------------------
[...truncated 1547 lines...]
[exec] checking fcntl.h usability... yes
[exec] checking fcntl.h presence... yes
[exec] checking for fcntl.h... yes
[exec] checking for stdlib.h... (cached) yes
[exec] checking for string.h... (cached) yes
[exec] checking for unistd.h... (cached) yes
[exec] checking for an ANSI C-conforming const... yes
[exec] checking for memset... yes
[exec] configure: creating ./config.status
[exec] config.status: creating Makefile
[exec] config.status: creating config.h
[exec] config.status: executing depfiles commands
[exec] cd <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native> && /bin/bash <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/config/missing> --run automake-1.9 --gnu
[exec] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/config/missing>: line 52: automake-1.9: command not found
[exec] WARNING: `automake-1.9' is missing on your system. You should only need it if
[exec] you modified `Makefile.am', `acinclude.m4' or `configure.ac'.
[exec] You might want to install the `Automake' and `Perl' packages.
[exec] Grab them from any GNU archive site.
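The automake-1.9 warning above is non-fatal here: the autotools `missing` wrapper only needs automake when `Makefile.am`, `acinclude.m4` or `configure.ac` have changed, and in this run it fell back to the pre-generated files. A minimal probe sketch for whether any Automake is on the PATH (installing one, e.g. `apt-get install automake` on a Debian-derived host, would silence the warning; the package name is an assumption, since the Hudson slave's distribution is not shown in the log):

```shell
# Probe for an installed automake; the `missing' wrapper wanted the
# versioned automake-1.9 binary, but any automake on PATH indicates the
# toolchain is available for regenerating Makefile.in.
if command -v automake >/dev/null 2>&1; then
    status="automake present"
else
    status="automake absent"
fi
echo "$status"
```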
[exec] cd <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native> && /bin/bash <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/config/missing> --run autoconf
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_pic_works, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] ../../lib/autoconf/general.m4:1974: AC_CACHE_VAL is expanded from...
[exec] ../../lib/autoconf/general.m4:1994: AC_CACHE_CHECK is expanded from...
[exec] aclocal.m4:621: AC_LIBTOOL_COMPILER_OPTION is expanded from...
[exec] aclocal.m4:4829: AC_LIBTOOL_PROG_COMPILER_PIC is expanded from...
[exec] aclocal.m4:2674: _LT_AC_LANG_C_CONFIG is expanded from...
[exec] aclocal.m4:2673: AC_LIBTOOL_LANG_C_CONFIG is expanded from...
[exec] aclocal.m4:86: AC_LIBTOOL_SETUP is expanded from...
[exec] aclocal.m4:66: _AC_PROG_LIBTOOL is expanded from...
[exec] aclocal.m4:31: AC_PROG_LIBTOOL is expanded from...
[exec] configure.ac:46: the top level
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_static_works, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] aclocal.m4:666: AC_LIBTOOL_LINKER_OPTION is expanded from...
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_pic_works_CXX, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] aclocal.m4:2751: _LT_AC_LANG_CXX_CONFIG is expanded from...
[exec] aclocal.m4:2750: AC_LIBTOOL_LANG_CXX_CONFIG is expanded from...
[exec] aclocal.m4:1810: _LT_AC_TAGCONFIG is expanded from...
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_static_works_CXX, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_pic_works_F77, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] aclocal.m4:3914: _LT_AC_LANG_F77_CONFIG is expanded from...
[exec] aclocal.m4:3913: AC_LIBTOOL_LANG_F77_CONFIG is expanded from...
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_static_works_F77, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_pic_works_GCJ, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] aclocal.m4:4016: _LT_AC_LANG_GCJ_CONFIG is expanded from...
[exec] aclocal.m4:4015: AC_LIBTOOL_LANG_GCJ_CONFIG is expanded from...
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_static_works_GCJ, ...): suspicious cache-id, must contain _cv_ to be cached
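The repeated "suspicious cache-id" warnings above come from the libtool macros bundled into `aclocal.m4`: autoconf's `AC_CACHE_VAL` only persists a value in `config.cache` when the cache variable's name contains the `_cv_` infix, and these older `lt_prog_compiler_*` names lack it. Later libtool releases renamed the variables accordingly (e.g. `lt_cv_prog_compiler_pic_works`); the rename can be sketched as a text transform:

```shell
# Autoconf caches a variable only if its name matches *_cv_*; the old
# libtool names miss the infix, hence the warnings. Newer libtool inserts
# it right after the lt_ prefix:
echo 'lt_prog_compiler_pic_works' | sed 's/^lt_/lt_cv_/'
# prints: lt_cv_prog_compiler_pic_works
```

The warnings are harmless beyond losing caching of those checks, which is why configure still completes.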
[exec] /bin/bash ./config.status --recheck
[exec] running /bin/bash <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/configure> --no-create --no-recursion
[exec] checking for a BSD-compatible install... /usr/bin/install -c
[exec] checking whether build environment is sane... yes
[exec] checking for gawk... no
[exec] checking for mawk... mawk
[exec] checking whether make sets $(MAKE)... yes
[exec] checking for gcc... gcc
[exec] checking for C compiler default output file name... a.out
[exec] checking whether the C compiler works... yes
[exec] checking whether we are cross compiling... no
[exec] checking for suffix of executables...
[exec] checking for suffix of object files... o
[exec] checking whether we are using the GNU C compiler... yes
[exec] checking whether gcc accepts -g... yes
[exec] checking for gcc option to accept ISO C89... none needed
[exec] checking for style of include used by make... GNU
[exec] checking dependency style of gcc... gcc3
[exec] checking build system type... x86_64-unknown-linux-gnu
[exec] checking host system type... x86_64-unknown-linux-gnu
[exec] checking for a sed that does not truncate output... /bin/sed
[exec] checking for grep that handles long lines and -e... /bin/grep
[exec] checking for egrep... /bin/grep -E
[exec] checking for ld used by gcc... /usr/bin/ld
[exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
[exec] checking for /usr/bin/ld option to reload object files... -r
[exec] checking for BSD-compatible nm... /usr/bin/nm -B
[exec] checking whether ln -s works... yes
[exec] checking how to recognise dependent libraries... pass_all
[exec] checking how to run the C preprocessor... gcc -E
[exec] checking for ANSI C header files... yes
[exec] checking for sys/types.h... yes
[exec] checking for sys/stat.h... yes
[exec] checking for stdlib.h... yes
[exec] checking for string.h... yes
[exec] checking for memory.h... yes
[exec] checking for strings.h... yes
[exec] checking for inttypes.h... yes
[exec] checking for stdint.h... yes
[exec] checking for unistd.h... yes
[exec] checking dlfcn.h usability... yes
[exec] checking dlfcn.h presence... yes
[exec] checking for dlfcn.h... yes
[exec] checking for g++... g++
[exec] checking whether we are using the GNU C++ compiler... yes
[exec] checking whether g++ accepts -g... yes
[exec] checking dependency style of g++... gcc3
[exec] checking how to run the C++ preprocessor... g++ -E
[exec] checking for g77... no
[exec] checking for xlf... no
[exec] checking for f77... no
[exec] checking for frt... no
[exec] checking for pgf77... no
[exec] checking for cf77... no
[exec] checking for fort77... no
[exec] checking for fl32... no
[exec] checking for af77... no
[exec] checking for xlf90... no
[exec] checking for f90... no
[exec] checking for pgf90... no
[exec] checking for pghpf... no
[exec] checking for epcf90... no
[exec] checking for gfortran... no
[exec] checking for g95... no
[exec] checking for xlf95... no
[exec] checking for f95... no
[exec] checking for fort... no
[exec] checking for ifort... no
[exec] checking for ifc... no
[exec] checking for efc... no
[exec] checking for pgf95... no
[exec] checking for lf95... no
[exec] checking for ftn... no
[exec] checking whether we are using the GNU Fortran 77 compiler... no
[exec] checking whether accepts -g... no
[exec] checking the maximum length of command line arguments... 32768
[exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
[exec] checking for objdir... .libs
[exec] checking for ar... ar
[exec] checking for ranlib... ranlib
[exec] checking for strip... strip
[exec] checking if gcc supports -fno-rtti -fno-exceptions... no
[exec] checking for gcc option to produce PIC... -fPIC
[exec] checking if gcc PIC flag -fPIC works... yes
[exec] checking if gcc static flag -static works... yes
[exec] checking if gcc supports -c -o file.o... yes
[exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
[exec] checking whether -lc should be explicitly linked in... no
[exec] checking dynamic linker characteristics... GNU/Linux ld.so
[exec] checking how to hardcode library paths into programs... immediate
[exec] checking whether stripping libraries is possible... yes
[exec] checking if libtool supports shared libraries... yes
[exec] checking whether to build shared libraries... yes
[exec] checking whether to build static libraries... yes
[exec] configure: creating libtool
[exec] appending configuration tag "CXX" to libtool
[exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
[exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
[exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
[exec] checking for g++ option to produce PIC... -fPIC
[exec] checking if g++ PIC flag -fPIC works... yes
[exec] checking if g++ static flag -static works... yes
[exec] checking if g++ supports -c -o file.o... yes
[exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
[exec] checking dynamic linker characteristics... GNU/Linux ld.so
[exec] checking how to hardcode library paths into programs... immediate
[exec] appending configuration tag "F77" to libtool
[exec] checking for dlopen in -ldl... yes
[exec] checking for JNI_GetCreatedJavaVMs in -ljvm... no
[exec] checking for ANSI C header files... (cached) yes
[exec] checking stdio.h usability... yes
[exec] checking stdio.h presence... yes
[exec] checking for stdio.h... yes
[exec] checking stddef.h usability... yes
[exec] checking stddef.h presence... yes
[exec] checking for stddef.h... yes
[exec] checking jni.h usability... yes
[exec] checking jni.h presence... yes
[exec] checking for jni.h... yes
[exec] checking zlib.h usability... yes
[exec] checking zlib.h presence... yes
[exec] checking for zlib.h... yes
[exec] checking Checking for the 'actual' dynamic-library for '-lz'... "libz.so.1"
[exec] checking zconf.h usability... yes
[exec] checking zconf.h presence... yes
[exec] checking for zconf.h... yes
[exec] checking Checking for the 'actual' dynamic-library for '-lz'... (cached) "libz.so.1"
[exec] checking fcntl.h usability... yes
[exec] checking fcntl.h presence... yes
[exec] checking for fcntl.h... yes
[exec] checking for stdlib.h... (cached) yes
[exec] checking for string.h... (cached) yes
[exec] checking for unistd.h... (cached) yes
[exec] checking for an ANSI C-conforming const... yes
[exec] checking for memset... yes
[exec] configure: creating ./config.status
[exec] /bin/bash ./config.status
[exec] config.status: creating Makefile
[exec] config.status: WARNING: '<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/Makefile.in'> seems to ignore the --datarootdir setting
[exec] config.status: creating config.h
[exec] config.status: executing depfiles commands
[exec] cd <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native> && /bin/sh <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/config/missing> --run autoheader
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_pic_works, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] ../../lib/autoconf/general.m4:1974: AC_CACHE_VAL is expanded from...
[exec] ../../lib/autoconf/general.m4:1994: AC_CACHE_CHECK is expanded from...
[exec] aclocal.m4:621: AC_LIBTOOL_COMPILER_OPTION is expanded from...
[exec] aclocal.m4:4829: AC_LIBTOOL_PROG_COMPILER_PIC is expanded from...
[exec] aclocal.m4:2674: _LT_AC_LANG_C_CONFIG is expanded from...
[exec] aclocal.m4:2673: AC_LIBTOOL_LANG_C_CONFIG is expanded from...
[exec] aclocal.m4:86: AC_LIBTOOL_SETUP is expanded from...
[exec] aclocal.m4:66: _AC_PROG_LIBTOOL is expanded from...
[exec] aclocal.m4:31: AC_PROG_LIBTOOL is expanded from...
[exec] configure.ac:46: the top level
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_static_works, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] aclocal.m4:666: AC_LIBTOOL_LINKER_OPTION is expanded from...
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_pic_works_CXX, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] aclocal.m4:2751: _LT_AC_LANG_CXX_CONFIG is expanded from...
[exec] aclocal.m4:2750: AC_LIBTOOL_LANG_CXX_CONFIG is expanded from...
[exec] aclocal.m4:1810: _LT_AC_TAGCONFIG is expanded from...
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_static_works_CXX, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_pic_works_F77, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] aclocal.m4:3914: _LT_AC_LANG_F77_CONFIG is expanded from...
[exec] aclocal.m4:3913: AC_LIBTOOL_LANG_F77_CONFIG is expanded from...
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_static_works_F77, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_pic_works_GCJ, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] aclocal.m4:4016: _LT_AC_LANG_GCJ_CONFIG is expanded from...
[exec] aclocal.m4:4015: AC_LIBTOOL_LANG_GCJ_CONFIG is expanded from...
[exec] configure.ac:46: warning: AC_CACHE_VAL(lt_prog_compiler_static_works_GCJ, ...): suspicious cache-id, must contain _cv_ to be cached
[exec] rm -f stamp-h1
[exec] touch <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/config.h.in>
[exec] cd . && /bin/sh ./config.status config.h
[exec] config.status: creating config.h
[exec] config.status: config.h is unchanged
[exec] make all-am
[exec] make[1]: Entering directory `<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32'>
[exec] if /bin/sh ./libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I. -I<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native> -I. -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -I<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src> -Isrc/org/apache/hadoop/io/compress/zlib -Isrc/org/apache/hadoop/security -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF ".deps/ZlibCompressor.Tpo" -c -o ZlibCompressor.lo `test -f 'src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c' || echo '<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/'`src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c;> \
[exec] then mv -f ".deps/ZlibCompressor.Tpo" ".deps/ZlibCompressor.Plo"; else rm -f ".deps/ZlibCompressor.Tpo"; exit 1; fi
[exec] mkdir .libs
[exec] gcc -DHAVE_CONFIG_H -I. -I<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native> -I. -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -I<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src> -Isrc/org/apache/hadoop/io/compress/zlib -Isrc/org/apache/hadoop/security -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF .deps/ZlibCompressor.Tpo -c <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c> -fPIC -DPIC -o .libs/ZlibCompressor.o
[exec] In file included from /usr/include/features.h:354,
[exec] from /usr/include/stdio.h:28,
[exec] from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c>:24:
[exec] /usr/include/gnu/stubs.h:7:27: error: gnu/stubs-32.h: No such file or directory
[exec] make[1]: Leaving directory `<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32'>
[exec] make[1]: *** [ZlibCompressor.lo] Error 1
[exec] make: *** [all] Error 2
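The actual fatal error is the missing `gnu/stubs-32.h`: the x86_64 build host lacks the 32-bit glibc development headers that the `-m32` compile of the Linux-i386-32 natives requires. On a Debian-derived host the usual remedy is installing multilib support (e.g. `apt-get install gcc-multilib libc6-dev-i386`; package names are an assumption, as the slave's distribution is not shown in the log). A sketch that pulls the missing header out of the captured error line:

```shell
# The error line as captured in the log above; gcc's -m32 preprocessing of
# /usr/include/gnu/stubs.h needs the 32-bit stubs header, which the 64-bit
# host does not have installed.
err='/usr/include/gnu/stubs.h:7:27: error: gnu/stubs-32.h: No such file or directory'
missing=$(printf '%s\n' "$err" | sed -n 's/.*error: \(.*\): No such file.*/\1/p')
echo "$missing"
# prints: gnu/stubs-32.h
```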
BUILD FAILED
<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build.xml>:398: exec returned: 2
Total time: 33 seconds
mv: cannot stat `build/*.tar.tgz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
[WARNINGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Hudson build is back to normal : Hadoop-Common-trunk #520
Posted by Apache Hudson Server <hu...@hudson.apache.org>.
See <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/520/>
Build failed in Hudson: Hadoop-Common-trunk #519
Posted by Apache Hudson Server <hu...@hudson.apache.org>.
See <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/519/changes>
Changes:
[nigel] Fix bug in tar file name
------------------------------------------
[...truncated 1366 lines...]
[copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/conf/slaves.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/test/conf/slaves>
[copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/conf/masters.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/test/conf/masters>
[copy] Copying 1 file to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/contrib>
[copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/contrib/ec2/bin/hadoop-ec2-env.sh.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/contrib/ec2/bin/hadoop-ec2-env.sh>
record-parser:
compile-rcc-compiler:
[javac] Compiling 29 source files to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes>
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
compile-core-classes:
[javac] Compiling 393 source files to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes>
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:31: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
[javac] import sun.security.krb5.Config;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:32: warning: sun.security.krb5.KrbException is Sun proprietary API and may be removed in a future release
[javac] import sun.security.krb5.KrbException;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:81: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
[javac] private static Config kerbConf;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:39: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
[javac] import sun.security.jgss.krb5.Krb5Util;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:40: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
[javac] import sun.security.krb5.Credentials;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:41: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
[javac] import sun.security.krb5.PrincipalName;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:85: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
[javac] kerbConf = Config.getInstance();
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:87: warning: sun.security.krb5.KrbException is Sun proprietary API and may be removed in a future release
[javac] } catch (KrbException ke) {
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:120: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
[javac] Credentials serviceCred = null;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:122: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
[javac] PrincipalName principal = new PrincipalName(serviceName,
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:122: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
[javac] PrincipalName principal = new PrincipalName(serviceName,
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:123: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
[javac] PrincipalName.KRB_NT_SRV_HST);
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:125: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
[javac] .toString(), Krb5Util.ticketToCreds(getTgtFromSubject()));
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:124: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
[javac] serviceCred = Credentials.acquireServiceCreds(principal
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:134: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
[javac] .add(Krb5Util.credsToTicket(serviceCred));
[javac] ^
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[javac] 15 warnings
[copy] Copying 1 file to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes>
compile-core-native:
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib>
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/io/compress/zlib>
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/security>
[javah] [Search path = /homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/resources.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/rt.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/sunrsasign.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/jsse.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/jce.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/charsets.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/classes:<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes]>
[javah] [Loaded <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes/org/apache/hadoop/io/compress/zlib/ZlibCompressor.class]>
[javah] [Loaded /homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/rt.jar(java/lang/Object.class)]
[javah] [Forcefully writing file <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibCompressor.h]>
[javah] [Loaded <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes/org/apache/hadoop/io/compress/zlib/ZlibDecompressor.class]>
[javah] [Forcefully writing file <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibDecompressor.h]>
[javah] [Search path = /homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/resources.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/rt.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/sunrsasign.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/jsse.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/jce.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/charsets.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/classes:<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes]>
[javah] [Loaded <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes/org/apache/hadoop/security/JniBasedUnixGroupsMapping.class]>
[javah] [Loaded /homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/rt.jar(java/lang/Object.class)]
[javah] [Forcefully writing file <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/security/org_apache_hadoop_security_JniBasedUnixGroupsMapping.h]>
[exec] checking for a BSD-compatible install... /usr/bin/install -c
[exec] checking whether build environment is sane... yes
[exec] checking for gawk... no
[exec] checking for mawk... mawk
[exec] checking whether make sets $(MAKE)... yes
[exec] checking for gcc... gcc
[exec] checking for C compiler default output file name... a.out
[exec] checking whether the C compiler works... yes
[exec] checking whether we are cross compiling... no
[exec] checking for suffix of executables...
[exec] checking for suffix of object files... o
[exec] checking whether we are using the GNU C compiler... yes
[exec] checking whether gcc accepts -g... yes
[exec] checking for gcc option to accept ANSI C... none needed
[exec] checking for style of include used by make... GNU
[exec] checking dependency style of gcc... gcc3
[exec] checking build system type... x86_64-unknown-linux-gnu
[exec] checking host system type... x86_64-unknown-linux-gnu
[exec] checking for a sed that does not truncate output... /bin/sed
[exec] checking for egrep... grep -E
[exec] checking for ld used by gcc... /usr/bin/ld
[exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
[exec] checking for /usr/bin/ld option to reload object files... -r
[exec] checking for BSD-compatible nm... /usr/bin/nm -B
[exec] checking whether ln -s works... yes
[exec] checking how to recognise dependent libraries... pass_all
[exec] checking how to run the C preprocessor... gcc -E
[exec] checking for ANSI C header files... yes
[exec] checking for sys/types.h... yes
[exec] checking for sys/stat.h... yes
[exec] checking for stdlib.h... yes
[exec] checking for string.h... yes
[exec] checking for memory.h... yes
[exec] checking for strings.h... yes
[exec] checking for inttypes.h... yes
[exec] checking for stdint.h... yes
[exec] checking for unistd.h... yes
[exec] checking dlfcn.h usability... yes
[exec] checking dlfcn.h presence... yes
[exec] checking for dlfcn.h... yes
[exec] checking for g++... g++
[exec] checking whether we are using the GNU C++ compiler... yes
[exec] checking whether g++ accepts -g... yes
[exec] checking dependency style of g++... gcc3
[exec] checking how to run the C++ preprocessor... g++ -E
[exec] checking for g77... no
[exec] checking for f77... no
[exec] checking for xlf... no
[exec] checking for frt... no
[exec] checking for pgf77... no
[exec] checking for fort77... no
[exec] checking for fl32... no
[exec] checking for af77... no
[exec] checking for f90... no
[exec] checking for xlf90... no
[exec] checking for pgf90... no
[exec] checking for epcf90... no
[exec] checking for f95... no
[exec] checking for fort... no
[exec] checking for xlf95... no
[exec] checking for ifc... no
[exec] checking for efc... no
[exec] checking for pgf95... no
[exec] checking for lf95... no
[exec] checking for gfortran... no
[exec] checking whether we are using the GNU Fortran 77 compiler... no
[exec] checking whether accepts -g... no
[exec] checking the maximum length of command line arguments... 32768
[exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
[exec] checking for objdir... .libs
[exec] checking for ar... ar
[exec] checking for ranlib... ranlib
[exec] checking for strip... strip
[exec] checking if gcc supports -fno-rtti -fno-exceptions... no
[exec] checking for gcc option to produce PIC... -fPIC
[exec] checking if gcc PIC flag -fPIC works... yes
[exec] checking if gcc static flag -static works... yes
[exec] checking if gcc supports -c -o file.o... yes
[exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
[exec] checking whether -lc should be explicitly linked in... no
[exec] checking dynamic linker characteristics... GNU/Linux ld.so
[exec] checking how to hardcode library paths into programs... immediate
[exec] checking whether stripping libraries is possible... yes
[exec] checking if libtool supports shared libraries... yes
[exec] checking whether to build shared libraries... yes
[exec] checking whether to build static libraries... yes
[exec] configure: creating libtool
[exec] appending configuration tag "CXX" to libtool
[exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
[exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
[exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
[exec] checking for g++ option to produce PIC... -fPIC
[exec] checking if g++ PIC flag -fPIC works... yes
[exec] checking if g++ static flag -static works... yes
[exec] checking if g++ supports -c -o file.o... yes
[exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
[exec] checking dynamic linker characteristics... GNU/Linux ld.so
[exec] checking how to hardcode library paths into programs... immediate
[exec] appending configuration tag "F77" to libtool
[exec] checking for dlopen in -ldl... yes
[exec] checking for JNI_GetCreatedJavaVMs in -ljvm... no
[exec] checking for ANSI C header files... (cached) yes
[exec] checking stdio.h usability... yes
[exec] checking stdio.h presence... yes
[exec] checking for stdio.h... yes
[exec] checking stddef.h usability... yes
[exec] checking stddef.h presence... yes
[exec] checking for stddef.h... yes
[exec] checking jni.h usability... yes
[exec] checking jni.h presence... yes
[exec] checking for jni.h... yes
[exec] checking zlib.h usability... yes
[exec] checking zlib.h presence... yes
[exec] checking for zlib.h... yes
[exec] checking Checking for the 'actual' dynamic-library for '-lz'... "libz.so.1"
[exec] checking zconf.h usability... yes
[exec] checking zconf.h presence... yes
[exec] checking for zconf.h... yes
[exec] checking Checking for the 'actual' dynamic-library for '-lz'... (cached) "libz.so.1"
[exec] checking fcntl.h usability... yes
[exec] checking fcntl.h presence... yes
[exec] checking for fcntl.h... yes
[exec] checking for stdlib.h... (cached) yes
[exec] checking for string.h... (cached) yes
[exec] checking for unistd.h... (cached) yes
[exec] checking for an ANSI C-conforming const... yes
[exec] checking for memset... yes
[exec] configure: creating ./config.status
[exec] config.status: creating Makefile
[exec] config.status: creating config.h
[exec] config.status: executing depfiles commands
[exec] make all-am
[exec] make[1]: Entering directory `<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32>'
[exec] if /bin/bash ./libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I. -I<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native> -I. -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -I<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src> -Isrc/org/apache/hadoop/io/compress/zlib -Isrc/org/apache/hadoop/security -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF ".deps/ZlibCompressor.Tpo" -c -o ZlibCompressor.lo `test -f 'src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c' || echo '<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/>'`src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c; \
[exec] then mv -f ".deps/ZlibCompressor.Tpo" ".deps/ZlibCompressor.Plo"; else rm -f ".deps/ZlibCompressor.Tpo"; exit 1; fi
[exec] mkdir .libs
[exec] gcc -DHAVE_CONFIG_H -I. -I<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native> -I. -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -I<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src> -Isrc/org/apache/hadoop/io/compress/zlib -Isrc/org/apache/hadoop/security -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF .deps/ZlibCompressor.Tpo -c <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c> -fPIC -DPIC -o .libs/ZlibCompressor.o
[exec] In file included from /usr/include/features.h:354,
[exec] from /usr/include/stdio.h:28,
[exec] from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c>:24:
[exec] /usr/include/gnu/stubs.h:7:27: error: gnu/stubs-32.h: No such file or directory
[exec] make[1]: Leaving directory `<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32>'
[exec] make[1]: *** [ZlibCompressor.lo] Error 1
[exec] make: *** [all] Error 2
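The root cause above is not the zlib code itself: gcc is invoked with -m32 on a build slave whose 64-bit glibc lacks the 32-bit development headers, and /usr/include/gnu/stubs.h pulls in gnu/stubs-32.h only for -m32 compiles. A minimal diagnostic sketch follows; the helper name `check_stubs` and the package names in the comment are illustrative assumptions, not taken from this log.

```shell
# check_stubs is a hypothetical helper: report whether gnu/stubs-32.h
# exists under a given include root. gcc -m32 fails exactly as above
# when it is missing.
check_stubs() {
  if [ -e "$1/gnu/stubs-32.h" ]; then
    echo present
  else
    echo missing  # commonly provided by libc6-dev-i386 (Debian) or glibc-devel.i386 (RHEL)
  fi
}

check_stubs /usr/include
```

On a slave able to build 32-bit natives this prints "present"; the package names in the comment are common providers and were not verified against this host.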
BUILD FAILED
<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build.xml>:398: exec returned: 2
Total time: 21 seconds
======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================
mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
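The four mv errors are secondary failures: the build died before producing any tarball, jar, findbugs output, or javadoc, so the artifact-saving step trips over paths that were never created. An existence-guarded move, sketched below with hypothetical paths (the real workspace layout and commitBuild.sh are not shown in this log), would keep a failed build from cascading into this noise.

```shell
# Sketch of an existence-guarded artifact move. $dir stands in for the
# real workspace; one fake jar simulates a partially successful build.
dir=$(mktemp -d)
mkdir -p "$dir/build" "$dir/artifacts"
touch "$dir/build/common.jar"

for f in "$dir"/build/*.tar.gz "$dir"/build/*.jar; do
  # An unexpanded glob fails the -e test and is skipped silently.
  [ -e "$f" ] && mv "$f" "$dir/artifacts/"
done
ls "$dir/artifacts"   # prints "common.jar"
```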
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
[WARNINGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure

------------------------------------------
Build failed in Hudson: Hadoop-Common-trunk #518
Posted by Apache Hudson Server <hu...@hudson.apache.org>.
See <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/518/changes>
Changes:
[nigel] Add some comments to commitBuild.sh and put artifacts in a single directory that can be cleaned up.
------------------------------------------
[...truncated 3447 lines...]
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.163 sec
[junit] Running org.apache.hadoop.io.compress.TestCodec
[junit] Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 63.575 sec
[junit] Running org.apache.hadoop.io.compress.TestCodecFactory
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.387 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFile
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 1.459 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileByteArrays
[junit] Tests run: 25, Failures: 0, Errors: 0, Time elapsed: 3.396 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileComparator2
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 1.436 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileComparators
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 0.362 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileJClassComparatorByteArrays
[junit] Tests run: 25, Failures: 0, Errors: 0, Time elapsed: 3.477 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileLzoCodecsByteArrays
[junit] Tests run: 25, Failures: 0, Errors: 0, Time elapsed: 0.176 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileLzoCodecsStreams
[junit] Tests run: 19, Failures: 0, Errors: 0, Time elapsed: 0.157 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileNoneCodecsByteArrays
[junit] Tests run: 25, Failures: 0, Errors: 0, Time elapsed: 1.519 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileNoneCodecsJClassComparatorByteArrays
[junit] Tests run: 25, Failures: 0, Errors: 0, Time elapsed: 1.584 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileNoneCodecsStreams
[junit] Tests run: 19, Failures: 0, Errors: 0, Time elapsed: 1.821 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileSeek
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 5.478 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileSeqFileComparison
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 12.708 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileSplit
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 12.687 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileStreams
[junit] Tests run: 19, Failures: 0, Errors: 0, Time elapsed: 1.903 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestTFileUnsortedByteArrays
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.522 sec
[junit] Running org.apache.hadoop.io.file.tfile.TestVLong
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 2.981 sec
[junit] Running org.apache.hadoop.io.retry.TestRetryProxy
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 0.316 sec
[junit] Running org.apache.hadoop.io.serializer.TestWritableSerialization
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.179 sec
[junit] Running org.apache.hadoop.io.serializer.avro.TestAvroSerialization
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.404 sec
[junit] Running org.apache.hadoop.ipc.TestAvroRpc
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.741 sec
[junit] Running org.apache.hadoop.ipc.TestIPC
[junit] Tests run: 7, Failures: 0, Errors: 0, Time elapsed: 51.835 sec
[junit] Running org.apache.hadoop.ipc.TestIPCServerResponder
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 5.447 sec
[junit] Running org.apache.hadoop.ipc.TestMiniRPCBenchmark
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 1.984 sec
[junit] Running org.apache.hadoop.ipc.TestRPC
[junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 55.642 sec
[junit] Running org.apache.hadoop.ipc.TestSaslRPC
[junit] Tests run: 7, Failures: 0, Errors: 0, Time elapsed: 0.881 sec
[junit] Running org.apache.hadoop.log.TestLogLevel
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 34.854 sec
[junit] Running org.apache.hadoop.metrics.TestMetricsServlet
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 0.199 sec
[junit] Running org.apache.hadoop.metrics.spi.TestOutputRecord
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.165 sec
[junit] Running org.apache.hadoop.net.TestDNS
[junit] Tests run: 7, Failures: 0, Errors: 0, Time elapsed: 0.108 sec
[junit] Running org.apache.hadoop.net.TestNetUtils
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.303 sec
[junit] Running org.apache.hadoop.net.TestScriptBasedMapping
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.327 sec
[junit] Running org.apache.hadoop.net.TestSocketIOWithTimeout
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 3.16 sec
[junit] Running org.apache.hadoop.record.TestBuffer
[junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 0.082 sec
[junit] Running org.apache.hadoop.record.TestRecordIO
[junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 0.159 sec
[junit] Running org.apache.hadoop.record.TestRecordVersioning
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.166 sec
[junit] Running org.apache.hadoop.security.TestCredentials
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.501 sec
[junit] Running org.apache.hadoop.security.TestDoAsEffectiveUser
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 0.968 sec
[junit] Running org.apache.hadoop.security.TestJNIGroupsMapping
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.212 sec
[junit] Running org.apache.hadoop.security.TestKerberosName
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.311 sec
[junit] Running org.apache.hadoop.security.TestSecurityUtil
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 0.325 sec
[junit] Running org.apache.hadoop.security.TestUserGroupInformation
[junit] Tests run: 12, Failures: 0, Errors: 0, Time elapsed: 0.349 sec
[junit] Running org.apache.hadoop.security.authorize.TestAccessControlList
[junit] Tests run: 7, Failures: 0, Errors: 0, Time elapsed: 0.371 sec
[junit] Running org.apache.hadoop.security.token.TestToken
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.287 sec
[junit] Running org.apache.hadoop.security.token.delegation.TestDelegationToken
[junit] Tests run: 7, Failures: 0, Errors: 0, Time elapsed: 29.722 sec
[junit] Running org.apache.hadoop.util.TestAsyncDiskService
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.134 sec
[junit] Running org.apache.hadoop.util.TestCyclicIteration
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.155 sec
[junit] Running org.apache.hadoop.util.TestDiskChecker
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 0.553 sec
[junit] Running org.apache.hadoop.util.TestGenericOptionsParser
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.681 sec
[junit] Running org.apache.hadoop.util.TestGenericsUtil
[junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 0.322 sec
[junit] Running org.apache.hadoop.util.TestHostsFileReader
[junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 0.245 sec
[junit] Running org.apache.hadoop.util.TestIndexedSort
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 1.574 sec
[junit] Running org.apache.hadoop.util.TestOptions
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.156 sec
[junit] Running org.apache.hadoop.util.TestPureJavaCrc32
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 1.576 sec
[junit] Running org.apache.hadoop.util.TestReflectionUtils
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.652 sec
[junit] Running org.apache.hadoop.util.TestRunJar
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.24 sec
[junit] Running org.apache.hadoop.util.TestShell
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 4.256 sec
[junit] Running org.apache.hadoop.util.TestStringUtils
[junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.193 sec
checkfailure:
injectfaults:
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi>
ivy-download:
[get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
[get] To: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/ivy/ivy-2.1.0.jar>
[get] Not modified - so not downloaded
ivy-init-dirs:
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/ivy>
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/ivy/lib>
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/ivy/report>
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/ivy/maven>
ivy-probe-antlib:
ivy-init-antlib:
ivy-init:
[ivy:configure] :: loading settings :: file = <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/ivy/ivysettings.xml>
ivy-resolve-common:
ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/ivy/ivysettings.xml>
init:
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/classes>
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/src>
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/webapps>
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/test>
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/test/classes>
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/test/extraconf>
[touch] Creating /tmp/null805681371
[delete] Deleting: /tmp/null805681371
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/test/conf>
[copy] Copying 5 files to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/test/conf>
[copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/conf/hadoop-policy.xml.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/test/conf/hadoop-policy.xml>
[copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/conf/core-site.xml.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/test/conf/core-site.xml>
[copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/conf/hadoop-env.sh.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/test/conf/hadoop-env.sh>
[copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/conf/slaves.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/test/conf/slaves>
[copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/conf/masters.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/test/conf/masters>
record-parser:
compile-rcc-compiler:
[javac] Compiling 29 source files to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/classes>
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
Trying to override old definition of task recordcc
compile-core-classes:
[javac] Compiling 393 source files to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/classes>
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:31: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
[javac] import sun.security.krb5.Config;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:32: warning: sun.security.krb5.KrbException is Sun proprietary API and may be removed in a future release
[javac] import sun.security.krb5.KrbException;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:81: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
[javac] private static Config kerbConf;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:39: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
[javac] import sun.security.jgss.krb5.Krb5Util;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:40: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
[javac] import sun.security.krb5.Credentials;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:41: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
[javac] import sun.security.krb5.PrincipalName;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:85: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
[javac] kerbConf = Config.getInstance();
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:87: warning: sun.security.krb5.KrbException is Sun proprietary API and may be removed in a future release
[javac] } catch (KrbException ke) {
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:120: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
[javac] Credentials serviceCred = null;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:122: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
[javac] PrincipalName principal = new PrincipalName(serviceName,
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:122: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
[javac] PrincipalName principal = new PrincipalName(serviceName,
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:123: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
[javac] PrincipalName.KRB_NT_SRV_HST);
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:125: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
[javac] .toString(), Krb5Util.ticketToCreds(getTgtFromSubject()));
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:124: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
[javac] serviceCred = Credentials.acquireServiceCreds(principal
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:134: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
[javac] .add(Krb5Util.credsToTicket(serviceCred));
[javac] ^
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[javac] 15 warnings
[copy] Copying 1 file to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build-fi/classes>
ivy-resolve-test:
ivy-retrieve-test:
generate-test-records:
generate-avro-records:
BUILD FAILED
<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build.xml>:769: The following error occurred while executing this line:
<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/test/aop/build/aop.xml>:119: The following error occurred while executing this line:
<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/test/aop/build/aop.xml>:147: The following error occurred while executing this line:
<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build.xml>:477: taskdef class org.apache.avro.specific.SchemaTask cannot be found
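A "taskdef class ... cannot be found" error means the jar providing the class (here, the Avro jar containing org.apache.avro.specific.SchemaTask) is absent from the <taskdef> classpath, even though ivy-resolve-test reported success. A quick hypothetical check is to confirm the jar actually landed in the Ivy lib directory; `find_jar` and the build-fi/ivy/lib path below are illustrative assumptions, not taken from build.xml.

```shell
# find_jar is a hypothetical helper: list jars matching a prefix under a
# lib directory, to confirm whether ivy-retrieve actually fetched Avro.
find_jar() {
  find "$1" -name "$2-*.jar" 2>/dev/null
}

# On the real slave, usage might look like (path assumed, not shown in log):
#   find_jar build-fi/ivy/lib avro
```

If the jar is present, the problem is instead the classpath wired into the <taskdef> element at build.xml:477.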
Total time: 15 minutes 5 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
[WARNINGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure