Posted to hdfs-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2012/02/29 20:06:19 UTC

Hadoop-Hdfs-trunk-Commit - Build # 1882 - Failure

See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/1882/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9390 lines...]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs <<<
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (xprepare-package-hadoop-daemon) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
      [get] Destination already exists (skipping): /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/hadoop-hdfs-project/hadoop-hdfs/downloads/commons-daemon-1.0.3-bin-linux-i686.tar.gz
   [delete] Deleting directory /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/hadoop-hdfs-project/hadoop-hdfs/target/commons-daemon.staging
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/hadoop-hdfs-project/hadoop-hdfs/target/commons-daemon.staging
    [untar] Expanding: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/hadoop-hdfs-project/hadoop-hdfs/downloads/commons-daemon-1.0.3-bin-linux-i686.tar.gz into /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/hadoop-hdfs-project/hadoop-hdfs/target/commons-daemon.staging
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-hdfs ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs ---
[INFO] 
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ hadoop-hdfs ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-0.24.0-SNAPSHOT.jar to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-hdfs/0.24.0-SNAPSHOT/hadoop-hdfs-0.24.0-SNAPSHOT.jar
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/hadoop-hdfs-project/hadoop-hdfs/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-hdfs/0.24.0-SNAPSHOT/hadoop-hdfs-0.24.0-SNAPSHOT.pom
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-0.24.0-SNAPSHOT-tests.jar to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-hdfs/0.24.0-SNAPSHOT/hadoop-hdfs-0.24.0-SNAPSHOT-tests.jar
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-0.24.0-SNAPSHOT-sources.jar to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-hdfs/0.24.0-SNAPSHOT/hadoop-hdfs-0.24.0-SNAPSHOT-sources.jar
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-0.24.0-SNAPSHOT-test-sources.jar to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-hdfs/0.24.0-SNAPSHOT/hadoop-hdfs-0.24.0-SNAPSHOT-test-sources.jar
[INFO] 
[INFO] --- maven-deploy-plugin:2.5:deploy (default-deploy) @ hadoop-hdfs ---
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-hdfs/0.24.0-SNAPSHOT/maven-metadata.xml

[WARNING] Could not transfer metadata org.apache.hadoop:hadoop-hdfs:0.24.0-SNAPSHOT/maven-metadata.xml from/to apache.snapshots.https (https://repository.apache.org/content/repositories/snapshots): Error transferring file: Server returned HTTP response code: 503 for URL: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-hdfs/0.24.0-SNAPSHOT/maven-metadata.xml
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [24.844s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25.270s
[INFO] Finished at: Wed Feb 29 19:06:18 UTC 2012
[INFO] Final Memory: 24M/337M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:2.5:deploy (default-deploy) on project hadoop-hdfs: Failed to retrieve remote metadata org.apache.hadoop:hadoop-hdfs:0.24.0-SNAPSHOT/maven-metadata.xml: Could not transfer metadata org.apache.hadoop:hadoop-hdfs:0.24.0-SNAPSHOT/maven-metadata.xml from/to apache.snapshots.https (https://repository.apache.org/content/repositories/snapshots): Error transferring file: Server returned HTTP response code: 503 for URL: https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-hdfs/0.24.0-SNAPSHOT/maven-metadata.xml -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
Updating HDFS-2991
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.
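
The failure above is not a test failure: the build died in `maven-deploy-plugin` because repository.apache.org answered the snapshot-metadata request with HTTP 503, a typical transient outage. Maven's own advice in the log is to re-run with `-e` or `-X` for diagnostics; for a transient 503, simply retrying the deploy often suffices. As a sketch only (the retry count, backoff, and the idea of wrapping `mvn deploy` are assumptions, not part of the original job), a small bounded-retry wrapper could look like:

```shell
#!/bin/sh
# Hypothetical retry wrapper for a transient HTTP 503 during "mvn deploy".
# Assumption: the failure is intermittent on the server side, so a bounded
# retry with a short backoff is an acceptable workaround.

retry() {
  n=0
  max=3                      # assumed retry budget; tune as needed
  until "$@"; do
    n=$((n + 1))
    if [ "$n" -ge "$max" ]; then
      return 1               # give up after $max failed attempts
    fi
    sleep "$n"               # simple linear backoff between attempts
  done
  return 0
}

# Usage (not run here): retry the deploy with full stack traces enabled,
# as the log's [ERROR] hints suggest.
#   retry mvn deploy -e
```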