Posted to hdfs-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2011/11/01 00:19:51 UTC

Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #58

See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/58/changes>

Changes:

[todd] HDFS-2512. Add textual error message to data transfer protocol responses. Contributed by Todd Lipcon.

[szetszwo] svn merge -c 1195656 from trunk for HDFS-2385.

[acmurthy] Merge -c 1195579 from trunk to branch-0.23 to fix MAPREDUCE-3275.

[acmurthy] Merge -c 1195575 from trunk to branch-0.23 to fix MAPREDUCE-3035.

[amarrk] MAPREDUCE-3241. [Rumen] Fix Rumen to ignore the AMStartedEvent. (amarrk)

[amarrk] MAPREDUCE-3166. [Rumen] Make Rumen use job history api instead of relying on current history file name format. (Ravi Gummadi via amarrk)

[amarrk] MAPREDUCE-3157. [Rumen] Fix TraceBuilder to handle 0.20 history file names also. (Ravi Gummadi via amarrk)

------------------------------------------
[...truncated 7787 lines...]
Downloaded: http://repo1.maven.org/maven2/org/apache/maven/plugins/maven-checkstyle-plugin/2.6/maven-checkstyle-plugin-2.6.jar (81 KB at 137.9 KB/sec)
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS 0.23.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target>
[INFO] 
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (hdfs) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/classes>
[INFO] Compiling 8 JSP source files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
log4j:WARN No appenders could be found for logger (org.apache.jasper.JspC).
log4j:WARN Please initialize the log4j system properly.
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.270
[INFO] 
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (secondary) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
[INFO] Compiling 1 JSP source file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.016
[INFO] 
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (datanode) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
[INFO] Compiling 3 JSP source files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.021
[INFO] 
[INFO] --- build-helper-maven-plugin:1.5:add-source (add-source) @ hadoop-hdfs ---
[INFO] Source directory: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp> added.
[INFO] 
[INFO] --- maven-resources-plugin:2.4.3:resources (default-resources) @ hadoop-hdfs ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] 
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ hadoop-hdfs ---
[INFO] Compiling 328 source files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/classes>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-web-xmls) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/hdfs/WEB-INF>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/secondary/WEB-INF>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/datanode/WEB-INF>
     [copy] Copying 6 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (compile) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
     [copy] Copying 15 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native>
     [copy] Copied 6 empty directories to 2 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native>
[INFO] Executed tasks
[INFO] 
[INFO] --- make-maven-plugin:1.0-beta-1:autoreconf (compile) @ hadoop-hdfs ---
[INFO] 
[INFO] --- make-maven-plugin:1.0-beta-1:configure (compile) @ hadoop-hdfs ---
[INFO] checking for a BSD-compatible install... /usr/bin/install -c
[INFO] checking whether build environment is sane... yes
[INFO] checking for a thread-safe mkdir -p... /bin/mkdir -p
[INFO] checking for gawk... no
[INFO] checking for mawk... mawk
[INFO] checking whether make sets $(MAKE)... yes
[INFO] checking build system type... x86_64-unknown-linux-gnu
[INFO] checking host system type... x86_64-unknown-linux-gnu
[INFO] checking for style of include used by make... GNU
[INFO] checking for gcc... gcc
[INFO] checking whether the C compiler works... yes
[INFO] checking for C compiler default output file name... a.out
[INFO] checking for suffix of executables... 
[INFO] checking whether we are cross compiling... no
[INFO] checking for suffix of object files... o
[INFO] checking whether we are using the GNU C compiler... yes
[INFO] checking whether gcc accepts -g... yes
[INFO] checking for gcc option to accept ISO C89... none needed
[INFO] checking dependency style of gcc... gcc3
[INFO] checking for a sed that does not truncate output... /bin/sed
[INFO] checking for grep that handles long lines and -e... /bin/grep
[INFO] checking for egrep... /bin/grep -E
[INFO] checking for fgrep... /bin/grep -F
[INFO] checking for ld used by gcc... /usr/bin/ld
[INFO] checking if the linker (/usr/bin/ld) is GNU ld... yes
[INFO] checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
[INFO] checking the name lister (/usr/bin/nm -B) interface... BSD nm
[INFO] checking whether ln -s works... yes
[INFO] checking the maximum length of command line arguments... 1572864
[INFO] checking whether the shell understands some XSI constructs... yes
[INFO] checking whether the shell understands "+="... yes
[INFO] checking for /usr/bin/ld option to reload object files... -r
[INFO] checking for objdump... objdump
[INFO] checking how to recognize dependent libraries... pass_all
[INFO] checking for ar... ar
[INFO] checking for strip... strip
[INFO] checking for ranlib... ranlib
[INFO] checking command to parse /usr/bin/nm -B output from gcc object... ok
[INFO] checking how to run the C preprocessor... gcc -E
[INFO] checking for ANSI C header files... yes
[INFO] checking for sys/types.h... yes
[INFO] checking for sys/stat.h... yes
[INFO] checking for stdlib.h... yes
[INFO] checking for string.h... yes
[INFO] checking for memory.h... yes
[INFO] checking for strings.h... yes
[INFO] checking for inttypes.h... yes
[INFO] checking for stdint.h... yes
[INFO] checking for unistd.h... yes
[INFO] checking for dlfcn.h... yes
[INFO] checking for objdir... .libs
[INFO] checking if gcc supports -fno-rtti -fno-exceptions... no
[INFO] checking for gcc option to produce PIC... -fPIC -DPIC
[INFO] checking if gcc PIC flag -fPIC -DPIC works... yes
[INFO] checking if gcc static flag -static works... yes
[INFO] checking if gcc supports -c -o file.o... yes
[INFO] checking if gcc supports -c -o file.o... (cached) yes
[INFO] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
[INFO] checking whether -lc should be explicitly linked in... no
[INFO] checking dynamic linker characteristics... GNU/Linux ld.so
[INFO] checking how to hardcode library paths into programs... immediate
[INFO] checking whether stripping libraries is possible... yes
[INFO] checking if libtool supports shared libraries... yes
[INFO] checking whether to build shared libraries... yes
[INFO] checking whether to build static libraries... yes
[INFO] *** Current host ***
[INFO] checking cached host system type... ok
[INFO] *** C-Language compilation tools ***
[INFO] checking for gcc... (cached) gcc
[INFO] checking whether we are using the GNU C compiler... (cached) yes
[INFO] checking whether gcc accepts -g... (cached) yes
[INFO] checking for gcc option to accept ISO C89... (cached) none needed
[INFO] checking dependency style of gcc... (cached) gcc3
[INFO] checking for ranlib... (cached) ranlib
[INFO] *** Host support ***
[INFO] checking C flags dependant on host system type... ok
[INFO] *** Java compilation tools ***
[INFO] checking for sablevm... NONE
[INFO] checking for kaffe... NONE
[INFO] checking for javac... /home/jenkins/tools/java/latest/bin/javac
[INFO] /home/jenkins/tools/java/latest/bin/javac
[INFO] checking wether the Java compiler (/home/jenkins/tools/java/latest/bin/javac) works... yes
[INFO] checking for jar... /home/jenkins/tools/java/latest/bin/jar
[INFO] checking where on earth this jvm library is..... ohh u there ... /home/jenkins/tools/java/latest/jre/lib/i386/server 
[INFO] VALUE OF JVM_ARCH IS :32
[INFO] gcc flags added
[INFO] checking for gcc... (cached) gcc
[INFO] checking whether we are using the GNU C compiler... (cached) yes
[INFO] checking whether gcc accepts -g... (cached) yes
[INFO] checking for gcc option to accept ISO C89... (cached) none needed
[INFO] checking dependency style of gcc... (cached) gcc3
[INFO] checking for size_t... no
[INFO] checking for strdup... no
[INFO] checking for strerror... no
[INFO] checking for strtoul... no
[INFO] checking fcntl.h usability... no
[INFO] checking fcntl.h presence... yes
[INFO] configure: WARNING: fcntl.h: present but cannot be compiled
[INFO] configure: WARNING: fcntl.h:     check for missing prerequisite headers?
[INFO] configure: WARNING: fcntl.h: see the Autoconf documentation
[INFO] configure: WARNING: fcntl.h:     section "Present But Cannot Be Compiled"
[INFO] configure: WARNING: fcntl.h: proceeding with the compiler's result
[INFO] configure: WARNING:     ## --------------------------------- ##
[INFO] configure: WARNING:     ## Report this to omalley@apache.org ##
[INFO] configure: WARNING:     ## --------------------------------- ##
[INFO] checking for fcntl.h... no
[INFO] checking for an ANSI C-conforming const... yes
[INFO] checking for working volatile... yes
[INFO] checking for stdbool.h that conforms to C99... yes
[INFO] checking for _Bool... no
[INFO] configure: creating ./config.status
[INFO] config.status: creating Makefile
[INFO] config.status: executing depfiles commands
[INFO] config.status: executing libtool commands
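The run of "no" answers above (size_t, strdup, strerror, strtoul, fcntl.h, _Bool) is the classic signature of a broken test compiler rather than of genuinely missing features: once "VALUE OF JVM_ARCH IS :32" added -m32 to the gcc flags on this x86_64 slave, every compile-based configure check failed, while the preprocessor-only "presence" check still passed -- hence "fcntl.h: present but cannot be compiled". A minimal sketch of the kind of probe involved, assuming a 64-bit host without the 32-bit glibc development headers (the file name conftest.c is illustrative, not taken from this log):

    /* conftest.c -- roughly the probe behind "checking fcntl.h
     * usability", compiled as: gcc -m32 -c conftest.c
     * On an x86_64 host without 32-bit glibc dev headers, the
     * include chain fcntl.h -> features.h -> gnu/stubs.h fails
     * looking for gnu/stubs-32.h, so the check reports "no".
     */
    #include <stdio.h>
    #include <fcntl.h>   /* the header actually under test */

    int main(void)
    {
        return 0;   /* "yes" only if this compiles */
    }

The "presence" check only preprocesses the include, apparently without the -m32 that was added to CFLAGS, which is likely why it still says "yes"; and the failed size_t probe is what makes configure inject -Dsize_t="unsigned int" into the compile command in the next step.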
[INFO] 
[INFO] --- make-maven-plugin:1.0-beta-1:make-install (compile) @ hadoop-hdfs ---
[INFO] /bin/bash ./libtool --tag=CC   --mode=compile gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE_URL=\"\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -Dsize_t=unsigned\ int -DHAVE_STDBOOL_H=1 -I.     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/home/jenkins/tools/java/latest/include -I/home/jenkins/tools/java/latest/include/linux -Wall -Wstrict-prototypes -MT hdfs.lo -MD -MP -MF .deps/hdfs.Tpo -c -o hdfs.lo hdfs.c
[INFO] libtool: compile:  gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" "-DPACKAGE_STRING=\"libhdfs 0.1.0\"" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE_URL=\"\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" "-Dsize_t=unsigned int" -DHAVE_STDBOOL_H=1 -I. -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/home/jenkins/tools/java/latest/include -I/home/jenkins/tools/java/latest/include/linux -Wall -Wstrict-prototypes -MT hdfs.lo -MD -MP -MF .deps/hdfs.Tpo -c hdfs.c  -fPIC -DPIC -o .libs/hdfs.o
[INFO] In file included from /usr/include/features.h:378,
[INFO]                  from /usr/include/sys/types.h:27,
[INFO]                  from hdfs.h:22,
[INFO]                  from hdfs.c:19:
[INFO] /usr/include/gnu/stubs.h:7:27: error: gnu/stubs-32.h: No such file or directory
[INFO] In file included from /usr/include/sys/types.h:147,
[INFO]                  from hdfs.h:22,
[INFO]                  from hdfs.c:19:
[INFO] /usr/lib/gcc/x86_64-linux-gnu/4.4.3/include/stddef.h:211: error: duplicate 'unsigned'
[INFO] /usr/lib/gcc/x86_64-linux-gnu/4.4.3/include/stddef.h:211: error: two or more data types in declaration specifiers
[INFO] make: *** [hdfs.lo] Error 1
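Both compile errors trace back to the configure step: the i386 JRE found at "checking where on earth this jvm library is" made configure pass -m32, but this x86_64 slave is missing the 32-bit glibc development files, so <gnu/stubs-32.h> cannot be found; and because the -m32 test compiler was broken, "checking for size_t" answered "no" and configure injected -Dsize_t=unsigned\ int (visible in the libtool command line above), which collides with the real typedef in stddef.h. A minimal reproduction of that second error, with an illustrative file name:

    /* size_t_clash.c -- sketch of the stddef.h errors above.
     * With size_t macro-defined on the command line, the typedef in
     * stddef.h expands to
     *     typedef long unsigned int unsigned int;
     * which is exactly "duplicate 'unsigned'" plus "two or more data
     * types in declaration specifiers".
     */
    #include <stddef.h>   /* provides: typedef long unsigned int size_t; */

    int main(void)
    {
        return 0;
    }

Compiling it as gcc -Dsize_t="unsigned int" size_t_clash.c should reproduce the same two errors. Installing the 32-bit glibc headers (presumably libc6-dev-i386 on an Ubuntu slave of this vintage) or building against a 64-bit JVM would avoid the -m32 breakage in the first place; note that build #59 below fails identically, so the slave environment, not the day's commits, is the likely culprit.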
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [22.162s]
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23.890s
[INFO] Finished at: Mon Oct 31 22:09:40 UTC 2011
[INFO] Final Memory: 27M/275M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:make-maven-plugin:1.0-beta-1:make-install (compile) on project hadoop-hdfs: make returned an exit value != 0. Aborting build; see command output above for more information. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Publishing Clover coverage report...
Publishing Clover HTML report...
Publishing Clover XML report...
Publishing Clover coverage results...
Recording test results
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
<https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
	at org.apache.tools.ant.types.AbstractFileSet.getDirectoryScanner(AbstractFileSet.java:474)
	at hudson.FilePath$34.hasMatch(FilePath.java:1801)
	at hudson.FilePath$34.invoke(FilePath.java:1710)
	at hudson.FilePath$34.invoke(FilePath.java:1701)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1995)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:287)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating MAPREDUCE-3157
Updating MAPREDUCE-3166
Updating MAPREDUCE-3035
Updating HDFS-2385
Updating MAPREDUCE-3275
Updating HDFS-2512
Updating MAPREDUCE-3241


Hadoop-Hdfs-0.23-Build - Build # 59 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/59/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 7867 lines...]
[INFO]                  from /usr/include/sys/types.h:27,
[INFO]                  from hdfs.h:22,
[INFO]                  from hdfs.c:19:
[INFO] /usr/include/gnu/stubs.h:7:27: error: gnu/stubs-32.h: No such file or directory
[INFO] In file included from /usr/include/sys/types.h:147,
[INFO]                  from hdfs.h:22,
[INFO]                  from hdfs.c:19:
[INFO] /usr/lib/gcc/x86_64-linux-gnu/4.4.3/include/stddef.h:211: error: duplicate 'unsigned'
[INFO] /usr/lib/gcc/x86_64-linux-gnu/4.4.3/include/stddef.h:211: error: two or more data types in declaration specifiers
[INFO] make: *** [hdfs.lo] Error 1
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [22.602s]
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23.145s
[INFO] Finished at: Tue Nov 01 05:02:48 UTC 2011
[INFO] Final Memory: 25M/242M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:make-maven-plugin:1.0-beta-1:make-install (compile) on project hadoop-hdfs: make returned an exit value != 0. Aborting build; see command output above for more information. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Publishing Clover coverage report...
Publishing Clover HTML report...
Publishing Clover XML report...
Publishing Clover coverage results...
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
	at org.apache.tools.ant.types.AbstractFileSet.getDirectoryScanner(AbstractFileSet.java:474)
	at hudson.FilePath$34.hasMatch(FilePath.java:1801)
	at hudson.FilePath$34.invoke(FilePath.java:1710)
	at hudson.FilePath$34.invoke(FilePath.java:1701)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1995)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:287)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemotePermissionException: This issue does not exist or you don't have permission to view it.
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed

Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #59

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/59/changes>

Changes:

[suresh] Fix HDFS-2552 to HDFS-2522

[suresh] Disable TestDfsOverAvroRpc in 0.23. Contributed by Suresh Srinivas.

[tomwhite] Merge -r 1195816:1195817 from trunk to branch-0.23. Fixes: HADOOP-7782.

[mahadev] MAPREDUCE-3317. Rumen TraceBuilder is emiting null as hostname. (Ravi Gummadi via mahadev) - Merging r1195814 from trunk.

[mahadev] MAPREDUCE-3316. Rebooted link is not working properly. (Bhallamudi Venkata Siva Kamesh via mahadev) - Merging r1195805 from trunk.

[acmurthy] Merge -c 1195792 from trunk to branch-0.23 to fix MAPREDUCE-3237.

[acmurthy] Fixing CHANGES.txt to reflect 0.23 content.

[acmurthy] Merge -c 1195764 from trunk to branch-0.23 to fix MAPREDUCE-3322.

[mahadev] MAPREDUCE-3103. Implement Job ACLs for MRAppMaster. (mahadev) - Merging r1195761 from trunk.

[szetszwo] svn merge -c 1195760 from trunk for HADOOP-7771.

[szetszwo] svn merge -c 1195754 from trunk for HDFS-2038.

[acmurthy] Merge -c 1195745 from trunk to branch-0.23 to fix MAPREDUCE-3220.

[acmurthy] Merge -c 1195743 from trunk to branch-0.23 to fix MAPREDUCE-3321.

[szetszwo] svn merge -c 1195731 from trunk for HDFS-2065.

------------------------------------------
[...truncated 7674 lines...]
[WARNING] 
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING] 
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING] 
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] Apache Hadoop HDFS
[INFO] Apache Hadoop HDFS Project
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS 0.23.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target>
[INFO] 
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (hdfs) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/classes>
[INFO] Compiling 8 JSP source files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
log4j:WARN No appenders could be found for logger (org.apache.jasper.JspC).
log4j:WARN Please initialize the log4j system properly.
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.261
[INFO] 
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (secondary) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
[INFO] Compiling 1 JSP source file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.017
[INFO] 
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (datanode) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
[INFO] Compiling 3 JSP source files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.025
[INFO] 
[INFO] --- build-helper-maven-plugin:1.5:add-source (add-source) @ hadoop-hdfs ---
[INFO] Source directory: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp> added.
[INFO] 
[INFO] --- maven-resources-plugin:2.4.3:resources (default-resources) @ hadoop-hdfs ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] 
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ hadoop-hdfs ---
[INFO] Compiling 328 source files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/classes>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-web-xmls) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/hdfs/WEB-INF>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/secondary/WEB-INF>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/datanode/WEB-INF>
     [copy] Copying 6 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (compile) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
     [copy] Copying 15 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native>
     [copy] Copied 6 empty directories to 2 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native>
[INFO] Executed tasks
[INFO] 
[INFO] --- make-maven-plugin:1.0-beta-1:autoreconf (compile) @ hadoop-hdfs ---
[INFO] 
[INFO] --- make-maven-plugin:1.0-beta-1:configure (compile) @ hadoop-hdfs ---
[INFO] checking for a BSD-compatible install... /usr/bin/install -c
[INFO] checking whether build environment is sane... yes
[INFO] checking for a thread-safe mkdir -p... /bin/mkdir -p
[INFO] checking for gawk... no
[INFO] checking for mawk... mawk
[INFO] checking whether make sets $(MAKE)... yes
[INFO] checking build system type... x86_64-unknown-linux-gnu
[INFO] checking host system type... x86_64-unknown-linux-gnu
[INFO] checking for style of include used by make... GNU
[INFO] checking for gcc... gcc
[INFO] checking whether the C compiler works... yes
[INFO] checking for C compiler default output file name... a.out
[INFO] checking for suffix of executables... 
[INFO] checking whether we are cross compiling... no
[INFO] checking for suffix of object files... o
[INFO] checking whether we are using the GNU C compiler... yes
[INFO] checking whether gcc accepts -g... yes
[INFO] checking for gcc option to accept ISO C89... none needed
[INFO] checking dependency style of gcc... gcc3
[INFO] checking for a sed that does not truncate output... /bin/sed
[INFO] checking for grep that handles long lines and -e... /bin/grep
[INFO] checking for egrep... /bin/grep -E
[INFO] checking for fgrep... /bin/grep -F
[INFO] checking for ld used by gcc... /usr/bin/ld
[INFO] checking if the linker (/usr/bin/ld) is GNU ld... yes
[INFO] checking for BSD- or MS-compatible name lister (nm)... /usr/bin/nm -B
[INFO] checking the name lister (/usr/bin/nm -B) interface... BSD nm
[INFO] checking whether ln -s works... yes
[INFO] checking the maximum length of command line arguments... 1572864
[INFO] checking whether the shell understands some XSI constructs... yes
[INFO] checking whether the shell understands "+="... yes
[INFO] checking for /usr/bin/ld option to reload object files... -r
[INFO] checking for objdump... objdump
[INFO] checking how to recognize dependent libraries... pass_all
[INFO] checking for ar... ar
[INFO] checking for strip... strip
[INFO] checking for ranlib... ranlib
[INFO] checking command to parse /usr/bin/nm -B output from gcc object... ok
[INFO] checking how to run the C preprocessor... gcc -E
[INFO] checking for ANSI C header files... yes
[INFO] checking for sys/types.h... yes
[INFO] checking for sys/stat.h... yes
[INFO] checking for stdlib.h... yes
[INFO] checking for string.h... yes
[INFO] checking for memory.h... yes
[INFO] checking for strings.h... yes
[INFO] checking for inttypes.h... yes
[INFO] checking for stdint.h... yes
[INFO] checking for unistd.h... yes
[INFO] checking for dlfcn.h... yes
[INFO] checking for objdir... .libs
[INFO] checking if gcc supports -fno-rtti -fno-exceptions... no
[INFO] checking for gcc option to produce PIC... -fPIC -DPIC
[INFO] checking if gcc PIC flag -fPIC -DPIC works... yes
[INFO] checking if gcc static flag -static works... yes
[INFO] checking if gcc supports -c -o file.o... yes
[INFO] checking if gcc supports -c -o file.o... (cached) yes
[INFO] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
[INFO] checking whether -lc should be explicitly linked in... no
[INFO] checking dynamic linker characteristics... GNU/Linux ld.so
[INFO] checking how to hardcode library paths into programs... immediate
[INFO] checking whether stripping libraries is possible... yes
[INFO] checking if libtool supports shared libraries... yes
[INFO] checking whether to build shared libraries... yes
[INFO] checking whether to build static libraries... yes
[INFO] *** Current host ***
[INFO] checking cached host system type... ok
[INFO] *** C-Language compilation tools ***
[INFO] checking for gcc... (cached) gcc
[INFO] checking whether we are using the GNU C compiler... (cached) yes
[INFO] checking whether gcc accepts -g... (cached) yes
[INFO] checking for gcc option to accept ISO C89... (cached) none needed
[INFO] checking dependency style of gcc... (cached) gcc3
[INFO] checking for ranlib... (cached) ranlib
[INFO] *** Host support ***
[INFO] checking C flags dependant on host system type... ok
[INFO] *** Java compilation tools ***
[INFO] checking for sablevm... NONE
[INFO] checking for kaffe... NONE
[INFO] checking for javac... /home/jenkins/tools/java/latest/bin/javac
[INFO] /home/jenkins/tools/java/latest/bin/javac
[INFO] checking wether the Java compiler (/home/jenkins/tools/java/latest/bin/javac) works... yes
[INFO] checking for jar... /home/jenkins/tools/java/latest/bin/jar
[INFO] checking where on earth this jvm library is..... ohh u there ... /home/jenkins/tools/java/latest/jre/lib/i386/server 
[INFO] VALUE OF JVM_ARCH IS :32
[INFO] gcc flags added
[INFO] checking for gcc... (cached) gcc
[INFO] checking whether we are using the GNU C compiler... (cached) yes
[INFO] checking whether gcc accepts -g... (cached) yes
[INFO] checking for gcc option to accept ISO C89... (cached) none needed
[INFO] checking dependency style of gcc... (cached) gcc3
[INFO] checking for size_t... no
[INFO] checking for strdup... no
[INFO] checking for strerror... no
[INFO] checking for strtoul... no
[INFO] checking fcntl.h usability... no
[INFO] checking fcntl.h presence... yes
[INFO] configure: WARNING: fcntl.h: present but cannot be compiled
[INFO] configure: WARNING: fcntl.h:     check for missing prerequisite headers?
[INFO] configure: WARNING: fcntl.h: see the Autoconf documentation
[INFO] configure: WARNING: fcntl.h:     section "Present But Cannot Be Compiled"
[INFO] configure: WARNING: fcntl.h: proceeding with the compiler's result
[INFO] configure: WARNING:     ## --------------------------------- ##
[INFO] configure: WARNING:     ## Report this to omalley@apache.org ##
[INFO] configure: WARNING:     ## --------------------------------- ##
[INFO] checking for fcntl.h... no
[INFO] checking for an ANSI C-conforming const... yes
[INFO] checking for working volatile... yes
[INFO] checking for stdbool.h that conforms to C99... yes
[INFO] checking for _Bool... no
[INFO] configure: creating ./config.status
[INFO] config.status: creating Makefile
[INFO] config.status: executing depfiles commands
[INFO] config.status: executing libtool commands
[INFO] 
[INFO] --- make-maven-plugin:1.0-beta-1:make-install (compile) @ hadoop-hdfs ---
[INFO] /bin/bash ./libtool --tag=CC   --mode=compile gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE_URL=\"\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" -Dsize_t=unsigned\ int -DHAVE_STDBOOL_H=1 -I.     -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/home/jenkins/tools/java/latest/include -I/home/jenkins/tools/java/latest/include/linux -Wall -Wstrict-prototypes -MT hdfs.lo -MD -MP -MF .deps/hdfs.Tpo -c -o hdfs.lo hdfs.c
[INFO] libtool: compile:  gcc -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\" "-DPACKAGE_STRING=\"libhdfs 0.1.0\"" -DPACKAGE_BUGREPORT=\"omalley@apache.org\" -DPACKAGE_URL=\"\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" "-Dsize_t=unsigned int" -DHAVE_STDBOOL_H=1 -I. -g -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m32 -I/home/jenkins/tools/java/latest/include -I/home/jenkins/tools/java/latest/include/linux -Wall -Wstrict-prototypes -MT hdfs.lo -MD -MP -MF .deps/hdfs.Tpo -c hdfs.c  -fPIC -DPIC -o .libs/hdfs.o
[INFO] In file included from /usr/include/features.h:378,
[INFO]                  from /usr/include/sys/types.h:27,
[INFO]                  from hdfs.h:22,
[INFO]                  from hdfs.c:19:
[INFO] /usr/include/gnu/stubs.h:7:27: error: gnu/stubs-32.h: No such file or directory
[INFO] In file included from /usr/include/sys/types.h:147,
[INFO]                  from hdfs.h:22,
[INFO]                  from hdfs.c:19:
[INFO] /usr/lib/gcc/x86_64-linux-gnu/4.4.3/include/stddef.h:211: error: duplicate 'unsigned'
[INFO] /usr/lib/gcc/x86_64-linux-gnu/4.4.3/include/stddef.h:211: error: two or more data types in declaration specifiers
[INFO] make: *** [hdfs.lo] Error 1
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [22.602s]
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23.145s
[INFO] Finished at: Tue Nov 01 05:02:48 UTC 2011
[INFO] Final Memory: 25M/242M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:make-maven-plugin:1.0-beta-1:make-install (compile) on project hadoop-hdfs: make returned an exit value != 0. Aborting build; see command output above for more information. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Publishing Clover coverage report...
Publishing Clover HTML report...
Publishing Clover XML report...
Publishing Clover coverage results...
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
<https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
	at org.apache.tools.ant.types.AbstractFileSet.getDirectoryScanner(AbstractFileSet.java:474)
	at hudson.FilePath$34.hasMatch(FilePath.java:1801)
	at hudson.FilePath$34.invoke(FilePath.java:1710)
	at hudson.FilePath$34.invoke(FilePath.java:1701)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1995)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:287)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemotePermissionException: This issue does not exist or you don't have permission to view it.