Posted to common-dev@hadoop.apache.org by Apache Hudson Server <hu...@hudson.apache.org> on 2010/11/21 12:01:41 UTC
Build failed in Hudson: Hadoop-Common-trunk #521
See <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/521/>
------------------------------------------
[...truncated 1369 lines...]
[copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/conf/slaves.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/test/conf/slaves>
[copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/conf/masters.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/test/conf/masters>
[copy] Copying 1 file to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/contrib>
[copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/contrib/ec2/bin/hadoop-ec2-env.sh.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/contrib/ec2/bin/hadoop-ec2-env.sh>
record-parser:
compile-rcc-compiler:
[javac] Compiling 29 source files to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes>
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
compile-core-classes:
[javac] Compiling 393 source files to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes>
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:31: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
[javac] import sun.security.krb5.Config;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:32: warning: sun.security.krb5.KrbException is Sun proprietary API and may be removed in a future release
[javac] import sun.security.krb5.KrbException;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:81: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
[javac] private static Config kerbConf;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:39: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
[javac] import sun.security.jgss.krb5.Krb5Util;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:40: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
[javac] import sun.security.krb5.Credentials;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:41: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
[javac] import sun.security.krb5.PrincipalName;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:85: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
[javac] kerbConf = Config.getInstance();
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:87: warning: sun.security.krb5.KrbException is Sun proprietary API and may be removed in a future release
[javac] } catch (KrbException ke) {
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:120: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
[javac] Credentials serviceCred = null;
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:122: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
[javac] PrincipalName principal = new PrincipalName(serviceName,
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:122: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
[javac] PrincipalName principal = new PrincipalName(serviceName,
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:123: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
[javac] PrincipalName.KRB_NT_SRV_HST);
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:125: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
[javac] .toString(), Krb5Util.ticketToCreds(getTgtFromSubject()));
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:124: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
[javac] serviceCred = Credentials.acquireServiceCreds(principal
[javac] ^
[javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:134: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
[javac] .add(Krb5Util.credsToTicket(serviceCred));
[javac] ^
[javac] Note: Some input files use or override a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[javac] 15 warnings
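[Editor's note: the fifteen warnings above all come from Hadoop's security classes importing sun.security.* Kerberos types, which javac flags as Sun proprietary API. They are expected on a Sun JDK and do not fail the build. If they need silencing, one hedged sketch, assuming this build.xml exposes a javac.args property (an assumption about the Ant build, not verified here): -XDignore.symbol.file makes javac resolve classes against rt.jar directly instead of the ct.sym symbol table, which drops the proprietary-API warning.

```shell
# Sketch, not verified against this build.xml: pass an extra javac flag
# through Ant. -XDignore.symbol.file tells javac to read rt.jar directly
# rather than the ct.sym symbol file, suppressing the
# "Sun proprietary API" warnings for sun.security.* imports.
ant -Djavac.args="-XDignore.symbol.file" compile-core-classes
```
]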
[copy] Copying 1 file to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes>
compile-core-native:
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib>
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/io/compress/zlib>
[mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/security>
[javah] [Search path = /homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/resources.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/rt.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/sunrsasign.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/jsse.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/jce.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/charsets.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/classes:<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes]>
[javah] [Loaded <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes/org/apache/hadoop/io/compress/zlib/ZlibCompressor.class]>
[javah] [Loaded /homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/rt.jar(java/lang/Object.class)]
[javah] [Forcefully writing file <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibCompressor.h]>
[javah] [Loaded <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes/org/apache/hadoop/io/compress/zlib/ZlibDecompressor.class]>
[javah] [Forcefully writing file <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/io/compress/zlib/org_apache_hadoop_io_compress_zlib_ZlibDecompressor.h]>
[javah] [Search path = /homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/resources.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/rt.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/sunrsasign.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/jsse.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/jce.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/charsets.jar:/homes/hudson/tools/java/jdk1.6.0_11-32/jre/classes:<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes]>
[javah] [Loaded <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes/org/apache/hadoop/security/JniBasedUnixGroupsMapping.class]>
[javah] [Loaded /homes/hudson/tools/java/jdk1.6.0_11-32/jre/lib/rt.jar(java/lang/Object.class)]
[javah] [Forcefully writing file <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/security/org_apache_hadoop_security_JniBasedUnixGroupsMapping.h]>
[exec] checking for a BSD-compatible install... /usr/bin/install -c
[exec] checking whether build environment is sane... yes
[exec] checking for gawk... no
[exec] checking for mawk... mawk
[exec] checking whether make sets $(MAKE)... yes
[exec] checking for gcc... gcc
[exec] checking for C compiler default output file name... a.out
[exec] checking whether the C compiler works... yes
[exec] checking whether we are cross compiling... no
[exec] checking for suffix of executables...
[exec] checking for suffix of object files... o
[exec] checking whether we are using the GNU C compiler... yes
[exec] checking whether gcc accepts -g... yes
[exec] checking for gcc option to accept ANSI C... none needed
[exec] checking for style of include used by make... GNU
[exec] checking dependency style of gcc... gcc3
[exec] checking build system type... x86_64-unknown-linux-gnu
[exec] checking host system type... x86_64-unknown-linux-gnu
[exec] checking for a sed that does not truncate output... /bin/sed
[exec] checking for egrep... grep -E
[exec] checking for ld used by gcc... /usr/bin/ld
[exec] checking if the linker (/usr/bin/ld) is GNU ld... yes
[exec] checking for /usr/bin/ld option to reload object files... -r
[exec] checking for BSD-compatible nm... /usr/bin/nm -B
[exec] checking whether ln -s works... yes
[exec] checking how to recognise dependent libraries... pass_all
[exec] checking how to run the C preprocessor... gcc -E
[exec] checking for ANSI C header files... yes
[exec] checking for sys/types.h... yes
[exec] checking for sys/stat.h... yes
[exec] checking for stdlib.h... yes
[exec] checking for string.h... yes
[exec] checking for memory.h... yes
[exec] checking for strings.h... yes
[exec] checking for inttypes.h... yes
[exec] checking for stdint.h... yes
[exec] checking for unistd.h... yes
[exec] checking dlfcn.h usability... yes
[exec] checking dlfcn.h presence... yes
[exec] checking for dlfcn.h... yes
[exec] checking for g++... g++
[exec] checking whether we are using the GNU C++ compiler... yes
[exec] checking whether g++ accepts -g... yes
[exec] checking dependency style of g++... gcc3
[exec] checking how to run the C++ preprocessor... g++ -E
[exec] checking for g77... no
[exec] checking for f77... no
[exec] checking for xlf... no
[exec] checking for frt... no
[exec] checking for pgf77... no
[exec] checking for fort77... no
[exec] checking for fl32... no
[exec] checking for af77... no
[exec] checking for f90... no
[exec] checking for xlf90... no
[exec] checking for pgf90... no
[exec] checking for epcf90... no
[exec] checking for f95... no
[exec] checking for fort... no
[exec] checking for xlf95... no
[exec] checking for ifc... no
[exec] checking for efc... no
[exec] checking for pgf95... no
[exec] checking for lf95... no
[exec] checking for gfortran... no
[exec] checking whether we are using the GNU Fortran 77 compiler... no
[exec] checking whether accepts -g... no
[exec] checking the maximum length of command line arguments... 32768
[exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
[exec] checking for objdir... .libs
[exec] checking for ar... ar
[exec] checking for ranlib... ranlib
[exec] checking for strip... strip
[exec] checking if gcc supports -fno-rtti -fno-exceptions... no
[exec] checking for gcc option to produce PIC... -fPIC
[exec] checking if gcc PIC flag -fPIC works... yes
[exec] checking if gcc static flag -static works... yes
[exec] checking if gcc supports -c -o file.o... yes
[exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
[exec] checking whether -lc should be explicitly linked in... no
[exec] checking dynamic linker characteristics... GNU/Linux ld.so
[exec] checking how to hardcode library paths into programs... immediate
[exec] checking whether stripping libraries is possible... yes
[exec] checking if libtool supports shared libraries... yes
[exec] checking whether to build shared libraries... yes
[exec] checking whether to build static libraries... yes
[exec] configure: creating libtool
[exec] appending configuration tag "CXX" to libtool
[exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
[exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
[exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
[exec] checking for g++ option to produce PIC... -fPIC
[exec] checking if g++ PIC flag -fPIC works... yes
[exec] checking if g++ static flag -static works... yes
[exec] checking if g++ supports -c -o file.o... yes
[exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
[exec] checking dynamic linker characteristics... GNU/Linux ld.so
[exec] checking how to hardcode library paths into programs... immediate
[exec] appending configuration tag "F77" to libtool
[exec] checking for dlopen in -ldl... yes
[exec] checking for JNI_GetCreatedJavaVMs in -ljvm... no
[exec] checking for ANSI C header files... (cached) yes
[exec] checking stdio.h usability... yes
[exec] checking stdio.h presence... yes
[exec] checking for stdio.h... yes
[exec] checking stddef.h usability... yes
[exec] checking stddef.h presence... yes
[exec] checking for stddef.h... yes
[exec] checking jni.h usability... yes
[exec] checking jni.h presence... yes
[exec] checking for jni.h... yes
[exec] checking zlib.h usability... yes
[exec] checking zlib.h presence... yes
[exec] checking for zlib.h... yes
[exec] checking Checking for the 'actual' dynamic-library for '-lz'... "libz.so.1"
[exec] checking zconf.h usability... yes
[exec] checking zconf.h presence... yes
[exec] checking for zconf.h... yes
[exec] checking Checking for the 'actual' dynamic-library for '-lz'... (cached) "libz.so.1"
[exec] checking fcntl.h usability... yes
[exec] checking fcntl.h presence... yes
[exec] checking for fcntl.h... yes
[exec] checking for stdlib.h... (cached) yes
[exec] checking for string.h... (cached) yes
[exec] checking for unistd.h... (cached) yes
[exec] checking for an ANSI C-conforming const... yes
[exec] checking for memset... yes
[exec] configure: creating ./config.status
[exec] config.status: creating Makefile
[exec] config.status: creating config.h
[exec] config.status: executing depfiles commands
[exec] make all-am
[exec] make[1]: Entering directory `<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32'>
[exec] if /bin/bash ./libtool --tag=CC --mode=compile gcc -DHAVE_CONFIG_H -I. -I<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native> -I. -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -I<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src> -Isrc/org/apache/hadoop/io/compress/zlib -Isrc/org/apache/hadoop/security -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF ".deps/ZlibCompressor.Tpo" -c -o ZlibCompressor.lo `test -f 'src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c' || echo '<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/'`src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c;> \
[exec] then mv -f ".deps/ZlibCompressor.Tpo" ".deps/ZlibCompressor.Plo"; else rm -f ".deps/ZlibCompressor.Tpo"; exit 1; fi
[exec] mkdir .libs
[exec] gcc -DHAVE_CONFIG_H -I. -I<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native> -I. -I/homes/hudson/tools/java/latest1.6/include -I/homes/hudson/tools/java/latest1.6/include/linux -I<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src> -Isrc/org/apache/hadoop/io/compress/zlib -Isrc/org/apache/hadoop/security -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF .deps/ZlibCompressor.Tpo -c <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c> -fPIC -DPIC -o .libs/ZlibCompressor.o
[exec] In file included from /usr/include/features.h:354,
[exec] from /usr/include/stdio.h:28,
[exec] from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c>:24:
[exec] /usr/include/gnu/stubs.h:7:27: error: gnu/stubs-32.h: No such file or directory
[exec] make[1]: Leaving directory `<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32'>
[exec] make[1]: *** [ZlibCompressor.lo] Error 1
[exec] make: *** [all] Error 2
BUILD FAILED
<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build.xml>:398: exec returned: 2
Total time: 23 seconds
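[Editor's note: the actual failure is the missing gnu/stubs-32.h header. The native build targets Linux-i386-32 and forces -m32, but this x86_64 slave lacks the 32-bit glibc development headers, so the very first native compile aborts. A quick diagnostic plus the usual remedy, assuming a Debian/Ubuntu host (the package names are an assumption; adjust for the slave's distribution):

```shell
# Confirm the 32-bit glibc stub header the compiler could not find:
ls /usr/include/gnu/stubs-32.h || echo "32-bit glibc headers missing"

# Typical remedy on Debian/Ubuntu (assumed packaging; verify locally):
#   sudo apt-get install gcc-multilib libc6-dev-i386
# Alternatively, build the native libraries 64-bit instead of forcing -m32.
```
]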
======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================
mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
[WARNINGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
------------------------------------------
Posted to common-dev@hadoop.apache.org by Apache Hudson Server <hu...@hudson.apache.org>
Hudson build is back to normal : Hadoop-Common-trunk #522
See <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk/522/>