Posted to common-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2015/11/28 04:25:55 UTC

Build failed in Jenkins: Hadoop-Common-trunk #2051

See <https://builds.apache.org/job/Hadoop-Common-trunk/2051/changes>

Changes:

[aw] HADOOP-11505. Various native parts use bswap incorrectly and unportably

------------------------------------------
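The change above, HADOOP-11505, touches the byte-swapping helpers in the
native code. Purely as an illustration of the portability problem named in
the commit message (the actual patch is not shown in this log, and
hadoop_htobe64 is a made-up name), an endian-aware wrapper of roughly this
shape is the usual alternative to calling a raw bswap unconditionally:

    /* Sketch only: convert a 64-bit value from host to big-endian order.
     * A bare bswap_64()/__builtin_bswap64() always swaps, which is only a
     * correct "to big-endian" conversion on little-endian hosts; checking
     * the host byte order first keeps the code portable. */
    #include <stdint.h>
    #include <stdio.h>

    static inline uint64_t hadoop_htobe64(uint64_t v) {
    #if defined(__BYTE_ORDER__) && __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
      return v;                     /* big-endian host: already in network order */
    #else
      return __builtin_bswap64(v);  /* little-endian host: swap (GCC/Clang builtin) */
    #endif
    }

    int main(void) {
      printf("%016llx\n", (unsigned long long) hadoop_htobe64(0x0102030405060708ULL));
      return 0;
    }
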
[...truncated 4352 lines...]
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target>
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-auth-examples ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-auth-examples ---
[INFO] There are 20 errors reported by Checkstyle 6.6 with checkstyle/checkstyle.xml ruleset.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-auth-examples ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-auth-examples ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-auth-examples ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 3 source files to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/classes>
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hadoop-auth-examples ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/src/test/resources>
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hadoop-auth-examples ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.17:test (default-test) @ hadoop-auth-examples ---
[INFO] No tests to run.
[INFO] 
[INFO] --- maven-war-plugin:2.4:war (default-war) @ hadoop-auth-examples ---
[INFO] Packaging webapp
[INFO] Assembling webapp [hadoop-auth-examples] in [<https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/hadoop-auth-examples-3.0.0-SNAPSHOT>]
[INFO] Processing war project
[INFO] Copying webapp resources [<https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/src/main/webapp>]
[INFO] Webapp assembled in [58 msecs]
[INFO] Building war: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/hadoop-auth-examples.war>
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-auth-examples ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/hadoop-auth-examples-3.0.0-SNAPSHOT-sources.jar>
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-auth-examples ---
[INFO] No sources in project. Archive not created.
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-auth-examples ---
[INFO] 
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-auth-examples ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-auth-examples ---
[INFO] 
Loading source files for package org.apache.hadoop.security.authentication.examples...
Constructing Javadoc information...
Standard Doclet version 1.7.0_55
Building tree for all the packages and classes...
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/org/apache/hadoop/security/authentication/examples/RequestLoggerFilter.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/org/apache/hadoop/security/authentication/examples/WhoClient.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/org/apache/hadoop/security/authentication/examples/WhoServlet.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/org/apache/hadoop/security/authentication/examples/package-frame.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/org/apache/hadoop/security/authentication/examples/package-summary.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/org/apache/hadoop/security/authentication/examples/package-tree.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/constant-values.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/serialized-form.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/org/apache/hadoop/security/authentication/examples/class-use/WhoServlet.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/org/apache/hadoop/security/authentication/examples/class-use/RequestLoggerFilter.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/org/apache/hadoop/security/authentication/examples/class-use/WhoClient.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/org/apache/hadoop/security/authentication/examples/package-use.html...>
Building index for all the packages and classes...
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/overview-tree.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/index-all.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/deprecated-list.html...>
Building index for all classes...
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/allclasses-frame.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/allclasses-noframe.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/index.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/help-doc.html...>
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth-examples/target/hadoop-auth-examples-3.0.0-SNAPSHOT-javadoc.jar>
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop Common 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-common ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target>
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/site/markdown> (includes = [UnixShellAPI.md], excludes = [])
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/site/resources> (includes = [configuration.xsl, core-default.xml], excludes = [])
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-common ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-common ---
[INFO] There are 11072 errors reported by Checkstyle 6.6 with checkstyle/checkstyle.xml ruleset.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-os) @ hadoop-common ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-common ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/test-dir>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/test/data>
[INFO] Executed tasks
[INFO] 
[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) @ hadoop-common ---
[INFO] Wrote protoc checksums to file <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/hadoop-maven-plugins-protoc-checksums.json>
[INFO] 
[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:version-info (version-info) @ hadoop-common ---
[WARNING] [svn, info] failed with error code 1
[INFO] SCM: GIT
[INFO] Computed MD5: d878fb9f3ae435ca4741e2d5a9f49fb2
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-common ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 7 resources
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-common ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 861 source files to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/classes>
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/FastByteComparisons.java>:[25,16] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java>:[54,19] sun.net.dns.ResolverConfiguration is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java>:[55,20] sun.net.util.IPAddressUtil is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/NativeIO.java>:[46,16] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:[21,16] sun.misc.Signal is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:[22,16] sun.misc.SignalHandler is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:[44,43] sun.misc.SignalHandler is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/LdapGroupsMapping.java>:[308,28] com.sun.jndi.ldap.LdapCtxFactory is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/FastByteComparisons.java>:[136,20] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/FastByteComparisons.java>:[142,22] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/FastByteComparisons.java>:[147,29] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java>:[536,9] sun.net.dns.ResolverConfiguration is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java>:[554,11] sun.net.util.IPAddressUtil is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java>:[556,21] sun.net.util.IPAddressUtil is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java>:[558,18] sun.net.util.IPAddressUtil is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SecurityUtil.java>:[560,21] sun.net.util.IPAddressUtil is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/NativeIO.java>:[329,39] sun.nio.ch.DirectBuffer is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/NativeIO.java>:[330,17] sun.misc.Cleaner is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/NativeIO.java>:[331,25] sun.nio.ch.DirectBuffer is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/NativeIO.java>:[684,17] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/NativeIO.java>:[686,7] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/io/nativeio/NativeIO.java>:[686,24] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:[46,19] sun.misc.SignalHandler is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:[50,39] sun.misc.Signal is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:[50,21] sun.misc.Signal is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/SignalLogger.java>:[59,24] sun.misc.Signal is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoStreamUtils.java>:[39,37] sun.nio.ch.DirectBuffer is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoStreamUtils.java>:[40,21] sun.misc.Cleaner is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/crypto/CryptoStreamUtils.java>:[41,23] sun.nio.ch.DirectBuffer is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java>: Some input files use or override a deprecated API.
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java>: Recompile with -Xlint:deprecation for details.
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java>: Some input files use unchecked or unsafe operations.
[WARNING] <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FileSystem.java>: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- native-maven-plugin:1.0-alpha-8:javah (default) @ hadoop-common ---
[INFO] /bin/sh -c cd <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common> && /home/jenkins/tools/java/jdk1.7.0_55/bin/javah -d <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/native/javah> -classpath <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/classes>:<https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-annotations/target/hadoop-annotations-3.0.0-SNAPSHOT.jar>:/home/jenkins/tools/java/jdk1.7.0_55/jre/../lib/tools.jar:/home/jenkins/.m2/repository/com/google/guava/guava/11.0.2/guava-11.0.2.jar:/home/jenkins/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-math3/3.1.1/commons-math3-3.1.1.jar:/home/jenkins/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/home/jenkins/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/home/jenkins/.m2/repository/commons-codec/commons-codec/1.4/commons-codec-1.4.jar:/home/jenkins/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/home/jenkins/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/home/jenkins/.m2/repository/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar:/home/jenkins/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar:/home/jenkins/.m2/repository/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar:/home/jenkins/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/home/jenkins/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/home/jenkins/.m2/repository/com/sun/jersey/jersey-json/1.9/jersey-json-1.9.jar:/home/jenkins/.m2/repository/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar:/home/jenkins/.m2/repository/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar:/home/jenkins/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/home/jenkins/.m2/repository/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar:/home/jenkins/.m2/repository/javax/activation/activation/1.1/activation-1.1.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-jaxrs/1.9.13/jackson-jaxrs-1.9.13.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-xc/1.9.13/jackson-xc-1.9.13.jar:/home/jenkins/.m2/repository/com/sun/jersey/jersey-server/1.9/jersey-server-1.9.jar:/home/jenkins/.m2/repository/asm/asm/3.2/asm-3.2.jar:/home/jenkins/.m2/repository/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar:/home/jenkins/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/home/jenkins/.m2/repository/net/java/dev/jets3t/jets3t/0.9.0/jets3t-0.9.0.jar:/home/jenkins/.m2/repository/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar:/home/jenkins/.m2/repository/org/apache/httpcomponents/httpcore/4.2.5/httpcore-4.2.5.jar:/home/jenkins/.m2/repository/com/jamesmurty/utils/java-xmlbuilder/0.4/java-xmlbuilder-0.4.jar:/home/jenkins/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/home/jenkins/.m2/repository/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/home/jenkins/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/home/jenkins/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/home/jenkins/.m2/repository/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:/home/jenkins/.m2/repository/org/slf4j/slf4j-api/1.7.10/slf4j-api-1.7.10.jar:/home/jen
kins/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/home/jenkins/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/home/jenkins/.m2/repository/org/apache/avro/avro/1.7.4/avro-1.7.4.jar:/home/jenkins/.m2/repository/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar:/home/jenkins/.m2/repository/org/xerial/snappy/snappy-java/1.0.4.1/snappy-java-1.0.4.1.jar:/home/jenkins/.m2/repository/com/google/re2j/re2j/1.0/re2j-1.0.jar:/home/jenkins/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/home/jenkins/.m2/repository/com/google/code/gson/gson/2.2.4/gson-2.2.4.jar:<https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth/target/hadoop-auth-3.0.0-SNAPSHOT.jar>:/home/jenkins/.m2/repository/com/nimbusds/nimbus-jose-jwt/3.9/nimbus-jose-jwt-3.9.jar:/home/jenkins/.m2/repository/net/jcip/jcip-annotations/1.0/jcip-annotations-1.0.jar:/home/jenkins/.m2/repository/net/minidev/json-smart/1.1.1/json-smart-1.1.1.jar:/home/jenkins/.m2/repository/org/apache/directory/server/apacheds-kerberos-codec/2.0.0-M15/apacheds-kerberos-codec-2.0.0-M15.jar:/home/jenkins/.m2/repository/org/apache/directory/server/apacheds-i18n/2.0.0-M15/apacheds-i18n-2.0.0-M15.jar:/home/jenkins/.m2/repository/org/apache/directory/api/api-asn1-api/1.0.0-M20/api-asn1-api-1.0.0-M20.jar:/home/jenkins/.m2/repository/org/apache/directory/api/api-util/1.0.0-M20/api-util-1.0.0-M20.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-framework/2.7.1/curator-framework-2.7.1.jar:/home/jenkins/.m2/repository/com/jcraft/jsch/0.1.51/jsch-0.1.51.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-client/2.7.1/curator-client-2.7.1.jar:/home/jenkins/.m2/repository/org/apache/curator/curator-recipes/2.7.1/curator-recipes-2.7.1.jar:/home/jenkins/.m2/repository/com/google/code/findbugs/jsr305/3.0.0/jsr305-3.0.0.jar:/home/jenkins/.m2/repository/org/apache/htrace/htrace-core4/4.0.1-incubating/htrace-core4-4.0.1-incubating.jar:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6.jar:/home/jenkins/.m2/repository/io/netty/netty/3.6.2.Final/netty-3.6.2.Final.jar:/home/jenkins/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/home/jenkins/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar org.apache.hadoop.io.compress.zlib.ZlibCompressor org.apache.hadoop.io.compress.zlib.ZlibDecompressor org.apache.hadoop.io.compress.bzip2.Bzip2Compressor org.apache.hadoop.io.compress.bzip2.Bzip2Decompressor org.apache.hadoop.security.JniBasedUnixGroupsMapping org.apache.hadoop.io.nativeio.NativeIO org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory org.apache.hadoop.security.JniBasedUnixGroupsNetgroupMapping org.apache.hadoop.io.compress.snappy.SnappyCompressor org.apache.hadoop.io.compress.snappy.SnappyDecompressor org.apache.hadoop.io.compress.lz4.Lz4Compressor org.apache.hadoop.io.compress.lz4.Lz4Decompressor org.apache.hadoop.io.erasurecode.ErasureCodeNative org.apache.hadoop.crypto.OpensslCipher org.apache.hadoop.crypto.random.OpensslSecureRandom org.apache.hadoop.util.NativeCrc32 org.apache.hadoop.net.unix.DomainSocket org.apache.hadoop.net.unix.DomainSocketWatcher
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-common ---
[INFO] Executing tasks

main:
     [exec] -- The C compiler identification is GNU 4.8.4
     [exec] -- The CXX compiler identification is GNU 4.8.4
     [exec] -- Check for working C compiler: /usr/bin/cc
     [exec] -- Check for working C compiler: /usr/bin/cc -- works
     [exec] -- Detecting C compiler ABI info
     [exec] -- Detecting C compiler ABI info - done
     [exec] -- Check for working CXX compiler: /usr/bin/c++
     [exec] -- Check for working CXX compiler: /usr/bin/c++ -- works
     [exec] -- Detecting CXX compiler ABI info
     [exec] JAVA_HOME=, JAVA_JVM_LIBRARY=/home/jenkins/tools/java/jdk1.7.0_55/jre/lib/amd64/server/libjvm.so
     [exec] JAVA_INCLUDE_PATH=/home/jenkins/tools/java/jdk1.7.0_55/include, JAVA_INCLUDE_PATH2=/home/jenkins/tools/java/jdk1.7.0_55/include/linux
     [exec] Located all JNI components successfully.
     [exec] -- Detecting CXX compiler ABI info - done
     [exec] -- Found JNI: /home/jenkins/tools/java/jdk1.7.0_55/jre/lib/amd64/libjawt.so  
     [exec] -- Check if the system is big endian
     [exec] -- Searching 16 bit integer
     [exec] -- Looking for sys/types.h
     [exec] -- Looking for sys/types.h - found
     [exec] -- Looking for stdint.h
     [exec] -- Looking for stdint.h - found
     [exec] -- Looking for stddef.h
     [exec] -- Looking for stddef.h - found
     [exec] -- Check size of unsigned short
     [exec] -- Check size of unsigned short - done
     [exec] -- Using unsigned short
     [exec] -- Check if the system is big endian - little endian
     [exec] -- Looking for endian.h
     [exec] -- Looking for endian.h - found
     [exec] -- Looking for be64toh
     [exec] CMake Error: File <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/src/hadoop_endian.h.cmake> does not exist.
     [exec] CMake Error at CMakeLists.txt:164 (configure_file):
     [exec]   configure_file Problem configuring file
     [exec] 
     [exec] 
     [exec] CUSTOM_OPENSSL_PREFIX = 
     [exec] -- Looking for be64toh - found
     [exec] -- Found ZLIB: /lib/x86_64-linux-gnu/libz.so.1 (found version "1.2.8") 
     [exec] -- Found Snappy: /usr/lib/libsnappy.so.1
     [exec] -- Performing Test HAS_NEW_ENOUGH_OPENSSL
     [exec] -- Performing Test HAS_NEW_ENOUGH_OPENSSL - Success
     [exec] -- Looking for sync_file_range
     [exec] -- Looking for sync_file_range - found
     [exec] -- Looking for posix_fadvise
     [exec] -- Looking for posix_fadvise - found
     [exec] -- Looking for dlopen in dl
     [exec] -- Looking for dlopen in dl - found
     [exec] -- Configuring incomplete, errors occurred!
     [exec] See also "<https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/native/CMakeFiles/CMakeOutput.log>".
     [exec] See also "<https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/native/CMakeFiles/CMakeError.log>".
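
To read the failure above: line 164 of the native CMakeLists.txt calls
configure_file() on src/hadoop_endian.h.cmake, but that template is missing
from the workspace, so CMake aborts configuration even though the be64toh
probe itself succeeds. The template's real contents are not part of this
log; as a hedged sketch only (HADOOP_HTOBE64/HADOOP_BE64TOH and the layout
are assumed, not taken from the Hadoop source), the header that such a
configure_file step generates typically looks something like this:

    /* Illustrative sketch of a header produced from a *.h.cmake template,
     * with the probe result already substituted in by CMake. */
    #ifndef HADOOP_ENDIAN_H
    #define HADOOP_ENDIAN_H

    #define HAVE_BE64TOH 1                  /* filled in from the be64toh check */

    #ifdef HAVE_BE64TOH
    #include <endian.h>                     /* glibc: htobe64/be64toh are endian-aware */
    #define HADOOP_HTOBE64(x) htobe64(x)
    #define HADOOP_BE64TOH(x) be64toh(x)
    #else                                   /* fallback: assume a little-endian host */
    #define HADOOP_HTOBE64(x) __builtin_bswap64(x)
    #define HADOOP_BE64TOH(x) __builtin_bswap64(x)
    #endif

    #endif /* HADOOP_ENDIAN_H */

Until the missing template is added (or the change is reverted, as build
#2054 below shows was done), configure_file cannot complete and the native
build fails here.
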
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Annotations ......................... SUCCESS [  9.433 s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [ 22.950 s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [03:56 min]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [  3.911 s]
[INFO] Apache Hadoop Common .............................. FAILURE [ 20.966 s]
[INFO] Apache Hadoop NFS ................................. SKIPPED
[INFO] Apache Hadoop KMS ................................. SKIPPED
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:55 min
[INFO] Finished at: 2015-11-28T03:25:57+00:00
[INFO] Final Memory: 85M/1410M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec dir="<https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/native>" executable="cmake" failonerror="true">... @ 4:161 in <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/antrun/build-main.xml>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common
result: 1

################
# mvn -eaf test -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
################
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results

Jenkins build is back to normal : Hadoop-Common-trunk #2055

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Common-trunk/2055/changes>


Build failed in Jenkins: Hadoop-Common-trunk #2054

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Common-trunk/2054/changes>

Changes:

[cnauroth] Revert "HADOOP-11505. Various native parts use bswap incorrectly and

[cnauroth] Revert "HADOOP-11505. Various native parts use bswap incorrectly and

------------------------------------------
[...truncated 5387 lines...]
Running org.apache.hadoop.metrics2.lib.TestInterns
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.243 sec - in org.apache.hadoop.metrics2.lib.TestInterns
Running org.apache.hadoop.metrics2.source.TestJvmMetrics
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.593 sec - in org.apache.hadoop.metrics2.source.TestJvmMetrics
Running org.apache.hadoop.metrics2.filter.TestPatternFilter
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.528 sec - in org.apache.hadoop.metrics2.filter.TestPatternFilter
Running org.apache.hadoop.conf.TestConfigurationSubclass
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.447 sec - in org.apache.hadoop.conf.TestConfigurationSubclass
Running org.apache.hadoop.conf.TestGetInstances
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.354 sec - in org.apache.hadoop.conf.TestGetInstances
Running org.apache.hadoop.conf.TestConfigurationDeprecation
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.379 sec - in org.apache.hadoop.conf.TestConfigurationDeprecation
Running org.apache.hadoop.conf.TestDeprecatedKeys
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.689 sec - in org.apache.hadoop.conf.TestDeprecatedKeys
Running org.apache.hadoop.conf.TestConfiguration
Tests run: 62, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.549 sec - in org.apache.hadoop.conf.TestConfiguration
Running org.apache.hadoop.conf.TestReconfiguration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.017 sec - in org.apache.hadoop.conf.TestReconfiguration
Running org.apache.hadoop.conf.TestConfServlet
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.685 sec - in org.apache.hadoop.conf.TestConfServlet
Running org.apache.hadoop.test.TestJUnitSetup
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.179 sec - in org.apache.hadoop.test.TestJUnitSetup
Running org.apache.hadoop.test.TestMultithreadedTestUtil
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.191 sec - in org.apache.hadoop.test.TestMultithreadedTestUtil
Running org.apache.hadoop.test.TestGenericTestUtils
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.216 sec - in org.apache.hadoop.test.TestGenericTestUtils
Running org.apache.hadoop.test.TestTimedOutTestsListener
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.222 sec - in org.apache.hadoop.test.TestTimedOutTestsListener
Running org.apache.hadoop.metrics.TestMetricsServlet
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.094 sec - in org.apache.hadoop.metrics.TestMetricsServlet
Running org.apache.hadoop.metrics.spi.TestOutputRecord
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.062 sec - in org.apache.hadoop.metrics.spi.TestOutputRecord
Running org.apache.hadoop.metrics.ganglia.TestGangliaContext
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.212 sec - in org.apache.hadoop.metrics.ganglia.TestGangliaContext
Running org.apache.hadoop.net.TestNetUtils
Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.839 sec - in org.apache.hadoop.net.TestNetUtils
Running org.apache.hadoop.net.TestDNS
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.38 sec - in org.apache.hadoop.net.TestDNS
Running org.apache.hadoop.net.TestSocketIOWithTimeout
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.437 sec - in org.apache.hadoop.net.TestSocketIOWithTimeout
Running org.apache.hadoop.net.TestNetworkTopologyWithNodeGroup
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.229 sec - in org.apache.hadoop.net.TestNetworkTopologyWithNodeGroup
Running org.apache.hadoop.net.TestClusterTopology
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.26 sec - in org.apache.hadoop.net.TestClusterTopology
Running org.apache.hadoop.net.TestScriptBasedMappingWithDependency
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.705 sec - in org.apache.hadoop.net.TestScriptBasedMappingWithDependency
Running org.apache.hadoop.net.TestTableMapping
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.686 sec - in org.apache.hadoop.net.TestTableMapping
Running org.apache.hadoop.net.TestScriptBasedMapping
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.737 sec - in org.apache.hadoop.net.TestScriptBasedMapping
Running org.apache.hadoop.net.unix.TestDomainSocketWatcher
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.691 sec - in org.apache.hadoop.net.unix.TestDomainSocketWatcher
Running org.apache.hadoop.net.unix.TestDomainSocket
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.922 sec - in org.apache.hadoop.net.unix.TestDomainSocket
Running org.apache.hadoop.net.TestSwitchMapping
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.419 sec - in org.apache.hadoop.net.TestSwitchMapping
Running org.apache.hadoop.net.TestStaticMapping
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.802 sec - in org.apache.hadoop.net.TestStaticMapping
Running org.apache.hadoop.cli.TestCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.714 sec - in org.apache.hadoop.cli.TestCLI
Running org.apache.hadoop.io.TestSortedMapWritable
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.233 sec - in org.apache.hadoop.io.TestSortedMapWritable
Running org.apache.hadoop.io.TestIOUtils
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.471 sec - in org.apache.hadoop.io.TestIOUtils
Running org.apache.hadoop.io.TestSequenceFile
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.479 sec - in org.apache.hadoop.io.TestSequenceFile
Running org.apache.hadoop.io.TestEnumSetWritable
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.54 sec - in org.apache.hadoop.io.TestEnumSetWritable
Running org.apache.hadoop.io.TestWritableName
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.206 sec - in org.apache.hadoop.io.TestWritableName
Running org.apache.hadoop.io.TestBoundedByteArrayOutputStream
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.078 sec - in org.apache.hadoop.io.TestBoundedByteArrayOutputStream
Running org.apache.hadoop.io.TestSequenceFileAppend
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.004 sec - in org.apache.hadoop.io.TestSequenceFileAppend
Running org.apache.hadoop.io.TestBytesWritable
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.218 sec - in org.apache.hadoop.io.TestBytesWritable
Running org.apache.hadoop.io.TestSequenceFileSerialization
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.939 sec - in org.apache.hadoop.io.TestSequenceFileSerialization
Running org.apache.hadoop.io.TestDataByteBuffers
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.413 sec - in org.apache.hadoop.io.TestDataByteBuffers
Running org.apache.hadoop.io.file.tfile.TestTFileComparators
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.864 sec - in org.apache.hadoop.io.file.tfile.TestTFileComparators
Running org.apache.hadoop.io.file.tfile.TestTFileSeek
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.129 sec - in org.apache.hadoop.io.file.tfile.TestTFileSeek
Running org.apache.hadoop.io.file.tfile.TestTFileLzoCodecsByteArrays
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.427 sec - in org.apache.hadoop.io.file.tfile.TestTFileLzoCodecsByteArrays
Running org.apache.hadoop.io.file.tfile.TestTFileLzoCodecsStreams
Tests run: 19, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.403 sec - in org.apache.hadoop.io.file.tfile.TestTFileLzoCodecsStreams
Running org.apache.hadoop.io.file.tfile.TestTFileUnsortedByteArrays
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.009 sec - in org.apache.hadoop.io.file.tfile.TestTFileUnsortedByteArrays
Running org.apache.hadoop.io.file.tfile.TestTFileStreams
Tests run: 19, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.907 sec - in org.apache.hadoop.io.file.tfile.TestTFileStreams
Running org.apache.hadoop.io.file.tfile.TestTFile
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.459 sec - in org.apache.hadoop.io.file.tfile.TestTFile
Running org.apache.hadoop.io.file.tfile.TestTFileNoneCodecsJClassComparatorByteArrays
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.554 sec - in org.apache.hadoop.io.file.tfile.TestTFileNoneCodecsJClassComparatorByteArrays
Running org.apache.hadoop.io.file.tfile.TestTFileJClassComparatorByteArrays
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.635 sec - in org.apache.hadoop.io.file.tfile.TestTFileJClassComparatorByteArrays
Running org.apache.hadoop.io.file.tfile.TestTFileNoneCodecsStreams
Tests run: 19, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.037 sec - in org.apache.hadoop.io.file.tfile.TestTFileNoneCodecsStreams
Running org.apache.hadoop.io.file.tfile.TestTFileSplit
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.766 sec - in org.apache.hadoop.io.file.tfile.TestTFileSplit
Running org.apache.hadoop.io.file.tfile.TestTFileComparator2
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.663 sec - in org.apache.hadoop.io.file.tfile.TestTFileComparator2
Running org.apache.hadoop.io.file.tfile.TestTFileNoneCodecsByteArrays
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.557 sec - in org.apache.hadoop.io.file.tfile.TestTFileNoneCodecsByteArrays
Running org.apache.hadoop.io.file.tfile.TestTFileByteArrays
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.735 sec - in org.apache.hadoop.io.file.tfile.TestTFileByteArrays
Running org.apache.hadoop.io.file.tfile.TestTFileSeqFileComparison
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.27 sec - in org.apache.hadoop.io.file.tfile.TestTFileSeqFileComparison
Running org.apache.hadoop.io.file.tfile.TestVLong
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.917 sec - in org.apache.hadoop.io.file.tfile.TestVLong
Running org.apache.hadoop.io.TestTextNonUTF8
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.201 sec - in org.apache.hadoop.io.TestTextNonUTF8
Running org.apache.hadoop.io.TestArrayWritable
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.209 sec - in org.apache.hadoop.io.TestArrayWritable
Running org.apache.hadoop.io.erasurecode.rawcoder.TestXORRawCoder
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.18 sec - in org.apache.hadoop.io.erasurecode.rawcoder.TestXORRawCoder
Running org.apache.hadoop.io.erasurecode.rawcoder.TestDummyRawCoder
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.099 sec - in org.apache.hadoop.io.erasurecode.rawcoder.TestDummyRawCoder
Running org.apache.hadoop.io.erasurecode.rawcoder.TestRSRawCoder
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.391 sec - in org.apache.hadoop.io.erasurecode.rawcoder.TestRSRawCoder
Running org.apache.hadoop.io.erasurecode.TestECSchema
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.089 sec - in org.apache.hadoop.io.erasurecode.TestECSchema
Running org.apache.hadoop.io.erasurecode.coder.TestXORCoder
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.26 sec - in org.apache.hadoop.io.erasurecode.coder.TestXORCoder
Running org.apache.hadoop.io.erasurecode.coder.TestRSErasureCoder
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.161 sec - in org.apache.hadoop.io.erasurecode.coder.TestRSErasureCoder
Running org.apache.hadoop.io.TestMapWritable
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.248 sec - in org.apache.hadoop.io.TestMapWritable
Running org.apache.hadoop.io.TestText
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.802 sec - in org.apache.hadoop.io.TestText
Running org.apache.hadoop.io.TestBooleanWritable
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.218 sec - in org.apache.hadoop.io.TestBooleanWritable
Running org.apache.hadoop.io.compress.zlib.TestZlibCompressorDecompressor
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.145 sec - in org.apache.hadoop.io.compress.zlib.TestZlibCompressorDecompressor
Running org.apache.hadoop.io.compress.TestCompressionStreamReuse
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.679 sec - in org.apache.hadoop.io.compress.TestCompressionStreamReuse
Running org.apache.hadoop.io.compress.lz4.TestLz4CompressorDecompressor
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.345 sec - in org.apache.hadoop.io.compress.lz4.TestLz4CompressorDecompressor
Running org.apache.hadoop.io.compress.TestCodecFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.655 sec - in org.apache.hadoop.io.compress.TestCodecFactory
Running org.apache.hadoop.io.compress.TestCodec
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 146.959 sec - in org.apache.hadoop.io.compress.TestCodec
Running org.apache.hadoop.io.compress.bzip2.TestBzip2CompressorDecompressor
Tests run: 2, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 0.498 sec - in org.apache.hadoop.io.compress.bzip2.TestBzip2CompressorDecompressor
Running org.apache.hadoop.io.compress.TestCompressorDecompressor
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.705 sec - in org.apache.hadoop.io.compress.TestCompressorDecompressor
Running org.apache.hadoop.io.compress.TestCodecPool
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.547 sec - in org.apache.hadoop.io.compress.TestCodecPool
Running org.apache.hadoop.io.compress.snappy.TestSnappyCompressorDecompressor
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.48 sec - in org.apache.hadoop.io.compress.snappy.TestSnappyCompressorDecompressor
Running org.apache.hadoop.io.compress.TestBlockDecompressorStream
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.08 sec - in org.apache.hadoop.io.compress.TestBlockDecompressorStream
Running org.apache.hadoop.io.compress.TestCompressorStream
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.094 sec - in org.apache.hadoop.io.compress.TestCompressorStream
Running org.apache.hadoop.io.TestMD5Hash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.216 sec - in org.apache.hadoop.io.TestMD5Hash
Running org.apache.hadoop.io.TestSecureIOUtils
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.884 sec - in org.apache.hadoop.io.TestSecureIOUtils
Running org.apache.hadoop.io.TestMapFile
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.551 sec - in org.apache.hadoop.io.TestMapFile
Running org.apache.hadoop.io.TestGenericWritable
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.593 sec - in org.apache.hadoop.io.TestGenericWritable
Running org.apache.hadoop.io.nativeio.TestNativeIO
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.177 sec - in org.apache.hadoop.io.nativeio.TestNativeIO
Running org.apache.hadoop.io.nativeio.TestSharedFileDescriptorFactory
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.4 sec - in org.apache.hadoop.io.nativeio.TestSharedFileDescriptorFactory
Running org.apache.hadoop.io.TestDefaultStringifier
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.454 sec - in org.apache.hadoop.io.TestDefaultStringifier
Running org.apache.hadoop.io.TestArrayPrimitiveWritable
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.217 sec - in org.apache.hadoop.io.TestArrayPrimitiveWritable
Running org.apache.hadoop.io.TestUTF8
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.836 sec - in org.apache.hadoop.io.TestUTF8
Running org.apache.hadoop.io.TestBloomMapFile
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.803 sec - in org.apache.hadoop.io.TestBloomMapFile
Running org.apache.hadoop.io.retry.TestRetryProxy
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.239 sec - in org.apache.hadoop.io.retry.TestRetryProxy
Running org.apache.hadoop.io.retry.TestDefaultRetryPolicy
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.507 sec - in org.apache.hadoop.io.retry.TestDefaultRetryPolicy
Running org.apache.hadoop.io.retry.TestFailoverProxy
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.107 sec - in org.apache.hadoop.io.retry.TestFailoverProxy
Running org.apache.hadoop.io.TestVersionedWritable
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.184 sec - in org.apache.hadoop.io.TestVersionedWritable
Running org.apache.hadoop.io.TestObjectWritableProtos
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.359 sec - in org.apache.hadoop.io.TestObjectWritableProtos
Running org.apache.hadoop.io.TestSequenceFileSync
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.159 sec - in org.apache.hadoop.io.TestSequenceFileSync
Running org.apache.hadoop.io.serializer.TestSerializationFactory
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.576 sec - in org.apache.hadoop.io.serializer.TestSerializationFactory
Running org.apache.hadoop.io.serializer.avro.TestAvroSerialization
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.72 sec - in org.apache.hadoop.io.serializer.avro.TestAvroSerialization
Running org.apache.hadoop.io.serializer.TestWritableSerialization
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.439 sec - in org.apache.hadoop.io.serializer.TestWritableSerialization
Running org.apache.hadoop.io.TestWritable
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.397 sec - in org.apache.hadoop.io.TestWritable
Running org.apache.hadoop.io.TestWritableUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.211 sec - in org.apache.hadoop.io.TestWritableUtils
Running org.apache.hadoop.io.TestArrayFile
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.05 sec - in org.apache.hadoop.io.TestArrayFile
Running org.apache.hadoop.io.TestSetFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.184 sec - in org.apache.hadoop.io.TestSetFile

Results :

Failed tests: 
  TestReloadingX509TrustManager.testReload:112 expected:<2> but was:<1>

Tests run: 3206, Failures: 1, Errors: 0, Skipped: 86

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Annotations ......................... SUCCESS [  9.492 s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [ 23.911 s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [03:58 min]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [  3.967 s]
[INFO] Apache Hadoop Common .............................. FAILURE [23:22 min]
[INFO] Apache Hadoop NFS ................................. SKIPPED
[INFO] Apache Hadoop KMS ................................. SKIPPED
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 28:00 min
[INFO] Finished at: 2015-11-29T02:48:56+00:00
[INFO] Final Memory: 81M/940M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-common: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-common/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common
result: 1

################
# mvn -eaf test -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
################
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results

Build failed in Jenkins: Hadoop-Common-trunk #2053

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Common-trunk/2053/changes>

Changes:

[cnauroth] HADOOP-12600. FileContext and AbstractFileSystem should be annotated as

------------------------------------------
[...truncated 3877 lines...]
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-annotations/target/deprecated-list.html...>
Building index for all classes...
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-annotations/target/allclasses-frame.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-annotations/target/allclasses-noframe.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-annotations/target/index.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-annotations/target/overview-summary.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-annotations/target/help-doc.html...>
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-annotations/target/hadoop-annotations-3.0.0-SNAPSHOT-javadoc.jar>
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop MiniKDC 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-minikdc ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target>
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-minikdc ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-minikdc ---
[INFO] There are 10 errors reported by Checkstyle 6.6 with checkstyle/checkstyle.xml ruleset.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-minikdc ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-minikdc ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-minikdc ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 2 source files to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/classes>
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hadoop-minikdc ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/src/test/resources>
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hadoop-minikdc ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 2 source files to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/test-classes>
[INFO] 
[INFO] --- maven-surefire-plugin:2.17:test (default-test) @ hadoop-minikdc ---
[INFO] Surefire report directory: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/surefire-reports>

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.hadoop.minikdc.TestChangeOrgNameAndDomain
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.189 sec - in org.apache.hadoop.minikdc.TestChangeOrgNameAndDomain
Running org.apache.hadoop.minikdc.TestMiniKdc
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.263 sec - in org.apache.hadoop.minikdc.TestMiniKdc

Results :

Tests run: 6, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-jar-plugin:2.5:jar (default-jar) @ hadoop-minikdc ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/hadoop-minikdc-3.0.0-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-minikdc ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/hadoop-minikdc-3.0.0-SNAPSHOT-sources.jar>
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-minikdc ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/hadoop-minikdc-3.0.0-SNAPSHOT-test-sources.jar>
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-minikdc ---
[INFO] 
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-minikdc ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-minikdc ---
[INFO] 
Loading source files for package org.apache.hadoop.minikdc...
Constructing Javadoc information...
Standard Doclet version 1.7.0_55
Building tree for all the packages and classes...
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/org/apache/hadoop/minikdc/KerberosSecurityTestcase.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/org/apache/hadoop/minikdc/MiniKdc.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/org/apache/hadoop/minikdc/package-frame.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/org/apache/hadoop/minikdc/package-summary.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/org/apache/hadoop/minikdc/package-tree.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/constant-values.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/org/apache/hadoop/minikdc/class-use/KerberosSecurityTestcase.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/org/apache/hadoop/minikdc/class-use/MiniKdc.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/org/apache/hadoop/minikdc/package-use.html...>
Building index for all the packages and classes...
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/overview-tree.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/index-all.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/deprecated-list.html...>
Building index for all classes...
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/allclasses-frame.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/allclasses-noframe.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/index.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/help-doc.html...>
2 warnings
[WARNING] Javadoc Warnings
[WARNING] /home/jenkins/.m2/repository/org/apache/directory/api/api-ldap-model/1.0.0-M20/api-ldap-model-1.0.0-M20.jar(org/apache/directory/api/ldap/model/name/Dn.class): warning: Cannot find annotation method 'value()' in type 'SuppressWarnings': class file for edu.umd.cs.findbugs.annotations.SuppressWarnings not found
[WARNING] /home/jenkins/.m2/repository/org/apache/directory/api/api-ldap-model/1.0.0-M20/api-ldap-model-1.0.0-M20.jar(org/apache/directory/api/ldap/model/name/Dn.class): warning: Cannot find annotation method 'justification()' in type 'SuppressWarnings'
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-minikdc/target/hadoop-minikdc-3.0.0-SNAPSHOT-javadoc.jar>
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop Auth 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-auth ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth/target>
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-auth ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-auth ---
[INFO] There are 258 errors reported by Checkstyle 6.6 with checkstyle/checkstyle.xml ruleset.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-auth ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-auth ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth/src/main/resources>
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-auth ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 25 source files to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth/target/classes>
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hadoop-auth ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth/src/test/resources>
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hadoop-auth ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 24 source files to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth/target/test-classes>
[INFO] 
[INFO] --- maven-surefire-plugin:2.17:test (default-test) @ hadoop-auth ---
[INFO] Surefire report directory: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth/target/surefire-reports>

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.hadoop.security.authentication.server.TestAuthenticationToken
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.081 sec - in org.apache.hadoop.security.authentication.server.TestAuthenticationToken
Running org.apache.hadoop.security.authentication.server.TestJWTRedirectAuthentictionHandler
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.85 sec - in org.apache.hadoop.security.authentication.server.TestJWTRedirectAuthentictionHandler
Running org.apache.hadoop.security.authentication.server.TestPseudoAuthenticationHandler
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.348 sec - in org.apache.hadoop.security.authentication.server.TestPseudoAuthenticationHandler
Running org.apache.hadoop.security.authentication.server.TestAuthenticationFilter
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.735 sec - in org.apache.hadoop.security.authentication.server.TestAuthenticationFilter
Running org.apache.hadoop.security.authentication.server.TestKerberosAuthenticationHandler
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.663 sec - in org.apache.hadoop.security.authentication.server.TestKerberosAuthenticationHandler
Running org.apache.hadoop.security.authentication.server.TestAltKerberosAuthenticationHandler
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.876 sec - in org.apache.hadoop.security.authentication.server.TestAltKerberosAuthenticationHandler
Running org.apache.hadoop.security.authentication.util.TestCertificateUtil
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.108 sec - in org.apache.hadoop.security.authentication.util.TestCertificateUtil
Running org.apache.hadoop.security.authentication.util.TestStringSignerSecretProvider
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.066 sec - in org.apache.hadoop.security.authentication.util.TestStringSignerSecretProvider
Running org.apache.hadoop.security.authentication.util.TestRandomSignerSecretProvider
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.178 sec - in org.apache.hadoop.security.authentication.util.TestRandomSignerSecretProvider
Running org.apache.hadoop.security.authentication.util.TestJaasConfiguration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.076 sec - in org.apache.hadoop.security.authentication.util.TestJaasConfiguration
Running org.apache.hadoop.security.authentication.util.TestSigner
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.096 sec - in org.apache.hadoop.security.authentication.util.TestSigner
Running org.apache.hadoop.security.authentication.util.TestKerberosUtil
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.256 sec - in org.apache.hadoop.security.authentication.util.TestKerberosUtil
Running org.apache.hadoop.security.authentication.util.TestKerberosName
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.243 sec - in org.apache.hadoop.security.authentication.util.TestKerberosName
Running org.apache.hadoop.security.authentication.util.TestRolloverSignerSecretProvider
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.171 sec - in org.apache.hadoop.security.authentication.util.TestRolloverSignerSecretProvider
Running org.apache.hadoop.security.authentication.util.TestZKSignerSecretProvider
Tests run: 3, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 16.971 sec <<< FAILURE! - in org.apache.hadoop.security.authentication.util.TestZKSignerSecretProvider
testMultipleInit(org.apache.hadoop.security.authentication.util.TestZKSignerSecretProvider)  Time elapsed: 4.097 sec  <<< FAILURE!
java.lang.AssertionError: expected null, but was:<[B@142bad79>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotNull(Assert.java:664)
	at org.junit.Assert.assertNull(Assert.java:646)
	at org.junit.Assert.assertNull(Assert.java:656)
	at org.apache.hadoop.security.authentication.util.TestZKSignerSecretProvider.testMultipleInit(TestZKSignerSecretProvider.java:149)

Running org.apache.hadoop.security.authentication.util.TestFileSignerSecretProvider
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.073 sec - in org.apache.hadoop.security.authentication.util.TestFileSignerSecretProvider
Running org.apache.hadoop.security.authentication.util.TestAuthToken
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.219 sec - in org.apache.hadoop.security.authentication.util.TestAuthToken
Running org.apache.hadoop.security.authentication.client.TestAuthenticatedURL
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.331 sec - in org.apache.hadoop.security.authentication.client.TestAuthenticatedURL
Running org.apache.hadoop.security.authentication.client.TestPseudoAuthenticator
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.003 sec - in org.apache.hadoop.security.authentication.client.TestPseudoAuthenticator
Running org.apache.hadoop.security.authentication.client.TestKerberosAuthenticator
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.098 sec - in org.apache.hadoop.security.authentication.client.TestKerberosAuthenticator

Results :

Failed tests: 
  TestZKSignerSecretProvider.testMultipleInit:149 expected null, but was:<[B@142bad79>

Tests run: 112, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Annotations ......................... SUCCESS [  9.457 s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [ 23.639 s]
[INFO] Apache Hadoop Auth ................................ FAILURE [03:40 min]
[INFO] Apache Hadoop Auth Examples ....................... SKIPPED
[INFO] Apache Hadoop Common .............................. SKIPPED
[INFO] Apache Hadoop NFS ................................. SKIPPED
[INFO] Apache Hadoop KMS ................................. SKIPPED
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:14 min
[INFO] Finished at: 2015-11-29T00:25:13+00:00
[INFO] Final Memory: 63M/1314M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-auth: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-auth/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-auth
result: 1

################
# mvn -eaf test -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
################
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results

Build failed in Jenkins: Hadoop-Common-trunk #2052

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Common-trunk/2052/changes>

Changes:

[aw] HADOOP-11505. Various native parts use bswap incorrectly and unportably

------------------------------------------
[...truncated 9045 lines...]
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/READ3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/READDIR3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/READDIR3Response.Entry3.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/READDIR3Response.DirList3.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/READLINK3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/SYMLINK3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/RENAME3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/CREATE3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/MKDIR3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/ACCESS3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/REMOVE3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/WccData.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/WRITE3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/PATHCONF3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/LINK3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/WccAttr.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/FSSTAT3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/class-use/FSINFO3Response.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/class-use/Nfs3Interface.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/class-use/Nfs3Base.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/class-use/Nfs3FileAttributes.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/class-use/Nfs3FileAttributes.Specdata3.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/class-use/Nfs3Constant.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/class-use/Nfs3Constant.NFSPROC3.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/class-use/Nfs3Constant.WriteStableHow.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/class-use/Nfs3Status.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/class-use/FileHandle.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcUtil.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RegistrationClient.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/SimpleTcpClient.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcMessage.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcMessage.Type.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcCall.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcResponse.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcReply.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcReply.ReplyState.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/SimpleTcpServer.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcProgram.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/SimpleTcpClientHandler.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/SimpleUdpClient.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/SimpleUdpServer.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcDeniedReply.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcDeniedReply.RejectState.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcCallCache.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcCallCache.CacheEntry.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcCallCache.ClientRequest.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/XDR.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/XDR.State.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcAcceptedReply.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcAcceptedReply.AcceptState.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/class-use/RpcInfo.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/security/class-use/VerifierGSS.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/security/class-use/Verifier.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/security/class-use/SysSecurityHandler.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/security/class-use/Credentials.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/security/class-use/CredentialsNone.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/security/class-use/CredentialsSys.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/security/class-use/SecurityHandler.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/security/class-use/RpcAuthInfo.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/security/class-use/RpcAuthInfo.AuthFlavor.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/security/class-use/CredentialsGSS.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/security/class-use/VerifierNone.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/mount/class-use/MountEntry.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/mount/class-use/MountdBase.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/mount/class-use/MountResponse.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/mount/class-use/MountInterface.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/mount/class-use/MountInterface.MNTPROC.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/mount/package-use.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/package-use.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/package-use.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/request/package-use.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/nfs/nfs3/response/package-use.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/package-use.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/oncrpc/security/package-use.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/org/apache/hadoop/portmap/package-use.html...>
Building index for all the packages and classes...
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/overview-tree.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/index-all.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/deprecated-list.html...>
Building index for all classes...
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/allclasses-frame.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/allclasses-noframe.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/index.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/overview-summary.html...>
Generating <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/help-doc.html...>
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/hadoop-nfs-3.0.0-SNAPSHOT-javadoc.jar>
[INFO] 
[INFO] --- maven-assembly-plugin:2.4:single (dist) @ hadoop-nfs ---
[INFO] Reading assembly descriptor: ../../hadoop-assemblies/src/main/resources/assemblies/hadoop-nfs-dist.xml
[WARNING] The following patterns were never triggered in this artifact exclusion filter:
o  'org.apache.hadoop:hadoop-common'
o  'org.apache.hadoop:hadoop-hdfs'
o  'org.hsqldb:hsqldb'

[INFO] Copying files to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-nfs/target/hadoop-nfs-3.0.0-SNAPSHOT>
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop KMS 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-kms ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-kms/target>
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-kms ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-kms ---
[INFO] There are 45 errors reported by Checkstyle 6.6 with checkstyle/checkstyle.xml ruleset.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-kms ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-kms/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-kms ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-kms/src/main/resources>
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-kms ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 14 source files to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-kms/target/classes>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-web-xmls) @ hadoop-kms ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-kms/target/test-classes/kms-webapp>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-kms/target/test-classes/kms-webapp>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hadoop-kms ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hadoop-kms ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 6 source files to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-kms/target/test-classes>
[INFO] 
[INFO] --- maven-surefire-plugin:2.17:test (default-test) @ hadoop-kms ---
[WARNING] The parameter forkMode is deprecated since version 2.14. Use forkCount and reuseForks instead.
[INFO] Surefire report directory: <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-kms/target/surefire-reports>

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.hadoop.crypto.key.kms.server.TestKMSWithZK
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.687 sec - in org.apache.hadoop.crypto.key.kms.server.TestKMSWithZK
Running org.apache.hadoop.crypto.key.kms.server.TestKeyAuthorizationKeyProvider
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.847 sec - in org.apache.hadoop.crypto.key.kms.server.TestKeyAuthorizationKeyProvider
Running org.apache.hadoop.crypto.key.kms.server.TestKMSAudit
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.796 sec - in org.apache.hadoop.crypto.key.kms.server.TestKMSAudit
Running org.apache.hadoop.crypto.key.kms.server.TestKMS
Tests run: 21, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 71.026 sec <<< FAILURE! - in org.apache.hadoop.crypto.key.kms.server.TestKMS
testKMSWithZKDTSM(org.apache.hadoop.crypto.key.kms.server.TestKMS)  Time elapsed: 1.998 sec  <<< ERROR!
java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: GSSException: No valid credentials provided (Mechanism level: Cannot get a KDC reply)
	at sun.security.krb5.KdcComm.send(KdcComm.java:250)
	at sun.security.krb5.KdcComm.send(KdcComm.java:191)
	at sun.security.krb5.KrbTgsReq.send(KrbTgsReq.java:187)
	at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:202)
	at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:311)
	at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:115)
	at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:449)
	at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:641)
	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
	at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
	at org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:311)
	at org.apache.hadoop.security.authentication.client.KerberosAuthenticator$1.run(KerberosAuthenticator.java:287)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.doSpnegoSequence(KerberosAuthenticator.java:287)
	at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:205)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.authenticate(DelegationTokenAuthenticator.java:128)
	at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:215)
	at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticatedURL.openConnection(DelegationTokenAuthenticatedURL.java:322)
	at org.apache.hadoop.crypto.key.kms.KMSClientProvider$1.run(KMSClientProvider.java:483)
	at org.apache.hadoop.crypto.key.kms.KMSClientProvider$1.run(KMSClientProvider.java:478)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1669)
	at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createConnection(KMSClientProvider.java:478)
	at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createKeyInternal(KMSClientProvider.java:677)
	at org.apache.hadoop.crypto.key.kms.KMSClientProvider.createKey(KMSClientProvider.java:695)
	at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$10.call(LoadBalancingKMSClientProvider.java:259)
	at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider$10.call(LoadBalancingKMSClientProvider.java:256)
	at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.doOp(LoadBalancingKMSClientProvider.java:94)
	at org.apache.hadoop.crypto.key.kms.LoadBalancingKMSClientProvider.createKey(LoadBalancingKMSClientProvider.java:256)
	at org.apache.hadoop.crypto.key.kms.server.TestKMS$13$1.run(TestKMS.java:1783)
	at org.apache.hadoop.crypto.key.kms.server.TestKMS$13$1.run(TestKMS.java:1779)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1669)
	at org.apache.hadoop.crypto.key.kms.server.TestKMS.doAs(TestKMS.java:266)
	at org.apache.hadoop.crypto.key.kms.server.TestKMS.access$100(TestKMS.java:75)

Running org.apache.hadoop.crypto.key.kms.server.TestKMSACLs
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.829 sec - in org.apache.hadoop.crypto.key.kms.server.TestKMSACLs

Results :

Tests in error: 
  TestKMS.testKMSWithZKDTSM:1714->doKMSWithZK:1796->runServer:112->runServer:130->access$100:75->doAs:266 » IO

Tests run: 30, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Annotations ......................... SUCCESS [ 10.911 s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [ 29.399 s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [04:13 min]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [  4.941 s]
[INFO] Apache Hadoop Common .............................. SUCCESS [32:06 min]
[INFO] Apache Hadoop NFS ................................. SUCCESS [ 17.729 s]
[INFO] Apache Hadoop KMS ................................. FAILURE [01:31 min]
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 38:55 min
[INFO] Finished at: 2015-11-28T06:00:27+00:00
[INFO] Final Memory: 102M/873M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-kms: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Common-trunk/ws/hadoop-common-project/hadoop-kms/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-kms
result: 1

################
# mvn -eaf test -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
################
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results