Posted to hdfs-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2013/01/06 13:35:06 UTC
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #486
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/486/changes>
Changes:
[tgraves] MAPREDUCE-4913. TestMRAppMaster#testMRAppMasterMissingStaging occasionally exits (Jason Lowe via tgraves)
------------------------------------------
[...truncated 9010 lines...]
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/webapp/resources> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.375s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.811s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.537s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.498s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.172s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.111s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.397s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.223s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.47s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.026s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.146s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.096s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.154s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.079s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.124s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.064s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.026s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.132s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.054s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.015s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.098s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.278s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.141s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.153s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.135s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.182s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.382s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.092s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.059s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.167s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.011s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.01s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.012s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.011s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.01s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.01s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.27s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 17 seconds
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:52.243s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:52.850s
[INFO] Finished at: Sun Jan 06 11:35:03 UTC 2013
[INFO] Final Memory: 36M/430M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating MAPREDUCE-4913
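Every failing build above ends by pointing at a broken-links.xml report under target/docs-src/build/site/. A small script like the following could summarize that report to see at a glance which documents Cocoon flagged. This is a sketch only: the element and attribute names ("link", "uri", "message") are assumptions about the report's shape, not taken from the build log, and may differ between Forrest versions; the sample input is modeled on the BROKEN entries in the console output above.

```python
# Sketch: summarize a Cocoon/Forrest broken-links.xml report.
# ASSUMPTION: the report is a <broken-links> root containing <link>
# elements with "uri" and "message" attributes; verify against the
# actual file before relying on this.
import xml.etree.ElementTree as ET

def summarize_broken_links(xml_text):
    """Return a list of (target, message) pairs from a broken-links report."""
    root = ET.fromstring(xml_text)
    return [(link.get("uri", ""), link.get("message", ""))
            for link in root.findall("link")]

# Example input modeled on the BROKEN entries in the console output above.
sample = """<broken-links>
  <link uri="images/hdfs-logo.jpg" message="No such file or directory"/>
  <link uri="hdfs_design.html" message="No such file or directory"/>
  <link uri="images/hdfsarchitecture.gif" message="No such file or directory"/>
</broken-links>"""

for target, message in summarize_broken_links(sample):
    print(f"{target}: {message}")
```

Pointed at the real broken-links.xml path printed in the log, a summary like this would narrow the failure to the missing xdocs sources; note, for instance, that the log resolves the logo as images.hdfs-logo.jpg, with a dot where a path separator would be expected.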
Hadoop-Hdfs-0.23-Build - Build # 487 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/487/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 14505 lines...]
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 18 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:59.415s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:00.016s
[INFO] Finished at: Mon Jan 07 11:35:12 UTC 2013
[INFO] Final Memory: 36M/334M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Hadoop-Hdfs-0.23-Build - Build # 488 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/488/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 14491 lines...]
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 19 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:56.074s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:56.675s
[INFO] Finished at: Tue Jan 08 11:35:52 UTC 2013
[INFO] Final Memory: 36M/333M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Hadoop-Hdfs-0.23-Build - Build # 489 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/489/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 23574 lines...]
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 19 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:59.629s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:00.710s
[INFO] Finished at: Wed Jan 09 11:35:53 UTC 2013
[INFO] Final Memory: 37M/517M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
REGRESSION: org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner.testDirectoryScanner
Error Message:
IPC server unable to read call parameters: readObject can't find class org.apache.hadoop.io.Writable
Stack Trace:
java.lang.RuntimeException: IPC server unable to read call parameters: readObject can't find class org.apache.hadoop.io.Writable
at org.apache.hadoop.ipc.Client.call(Client.java:1088)
at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:195)
at $Proxy11.addBlock(Unknown Source)
at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:102)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:67)
at $Proxy11.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1097)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:973)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:455)
Hadoop-Hdfs-0.23-Build - Build # 491 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/491/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 14541 lines...]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 17 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:50.652s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:51.254s
[INFO] Finished at: Fri Jan 11 11:35:41 UTC 2013
[INFO] Final Memory: 37M/459M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating HDFS-2757
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Hadoop-Hdfs-0.23-Build - Build # 493 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/493/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9203 lines...]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 17 seconds
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:56.449s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:57.051s
[INFO] Finished at: Sun Jan 13 11:35:36 UTC 2013
[INFO] Final Memory: 35M/378M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Hadoop-Hdfs-0.23-Build - Build # 494 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/494/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9205 lines...]
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 17 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:55.038s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:55.673s
[INFO] Finished at: Mon Jan 14 11:35:37 UTC 2013
[INFO] Final Memory: 36M/395M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
REGRESSION: org.apache.hadoop.hdfs.TestCrcCorruption.testCrcCorruption
Error Message:
IPC server unable to read call parameters: readObject can't find class java.lang.String
Stack Trace:
java.lang.RuntimeException: IPC server unable to read call parameters: readObject can't find class java.lang.String
at org.apache.hadoop.ipc.Client.call(Client.java:1088)
at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:195)
at $Proxy13.complete(Unknown Source)
at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:102)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:67)
at $Proxy13.complete(Unknown Source)
at org.apache.hadoop.hdfs.DFSOutputStream.completeFile(DFSOutputStream.java:1671)
at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:1658)
at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:66)
at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:99)
at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:200)
at org.apache.hadoop.hdfs.DFSTestUtil.createFiles(DFSTestUtil.java:170)
at org.apache.hadoop.hdfs.TestCrcCorruption.thistest(TestCrcCorruption.java:78)
at org.apache.hadoop.hdfs.TestCrcCorruption.__CLR3_0_269rbwcx3v(TestCrcCorruption.java:210)
at org.apache.hadoop.hdfs.TestCrcCorruption.testCrcCorruption(TestCrcCorruption.java:202)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
at org.junit.runners.BlockJUnit4ClassRunner.runNotIgnored(BlockJUnit4ClassRunner.java:79)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:71)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:49)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:252)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:141)
at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:112)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:115)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:75)
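[Editor's note: when a single test regresses as TestCrcCorruption does above, it is usually faster to re-run just that class locally than the whole module. Surefire's -Dtest property selects one test class; the sketch below only composes and prints the command rather than executing it, since the exact module directory and profile flags depend on the local checkout.]

```shell
# Compose the Maven invocation for re-running the one regressed class.
# -Dtest= is Surefire's single-class selector; run it from the
# hadoop-hdfs-project/hadoop-hdfs module directory seen in the log.
MVN_CMD="mvn test -Dtest=TestCrcCorruption"
echo "$MVN_CMD"

# As the console itself suggests, add -e (or -X) for fuller diagnostics:
#   mvn -e test -Dtest=TestCrcCorruption
```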
Hadoop-Hdfs-0.23-Build - Build # 498 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/498/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9210 lines...]
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 15 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:51.973s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:52.570s
[INFO] Finished at: Fri Jan 18 11:35:58 UTC 2013
[INFO] Final Memory: 37M/435M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating HADOOP-7886
Updating MAPREDUCE-4278
Updating HADOOP-9147
Updating HADOOP-9155
Updating HADOOP-8849
Updating HADOOP-9216
Updating MAPREDUCE-4907
Updating HADOOP-9212
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
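[Editor's note: every build in this thread also aborts the Javadoc publisher with the same IllegalStateException, because target/site/api never gets created once the earlier site goal fails. A defensive existence check of the kind sketched below, run before the publish step, would turn that hard abort into a skip. The path is taken from the exception message; the temporary workspace is only there to make the sketch self-contained, and a real job would point at its own checkout instead.]

```shell
# Directory the JavadocArchiver expects, per the exception in the log.
API_DIR="target/site/api"

# Stand-in workspace so the check can be demonstrated anywhere; here the
# parent exists but the api/ javadoc output does not, mirroring the failure.
WS="$(mktemp -d)"
mkdir -p "$WS/target/site"

if [ -d "$WS/$API_DIR" ]; then
  echo "javadoc dir present - safe to publish"
else
  echo "javadoc dir missing - skipping publish"
fi
```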
Hadoop-Hdfs-0.23-Build - Build # 503 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/503/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9203 lines...]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 18 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:58.238s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:58.848s
[INFO] Finished at: Wed Jan 23 11:35:17 UTC 2013
[INFO] Final Memory: 35M/357M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating MAPREDUCE-4946
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Hadoop-Hdfs-0.23-Build - Build # 507 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/507/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 14831 lines...]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 17 seconds
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:50.629s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:51.233s
[INFO] Finished at: Sun Jan 27 11:34:44 UTC 2013
[INFO] Final Memory: 35M/332M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Hadoop-Hdfs-0.23-Build - Build # 508 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/508/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 15265 lines...]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 17 seconds
[exec] * [52/2] [0/0] 0.24s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:53.103s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:53.704s
[INFO] Finished at: Mon Jan 28 11:34:49 UTC 2013
[INFO] Final Memory: 37M/474M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Jenkins build is back to normal : Hadoop-Hdfs-0.23-Build #510
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/510/changes>
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #509
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/509/changes>
Changes:
[kihwal] merge -r 1439652:1439653 Merging YARN-133 to branch-0.23
[tgraves] HADOOP-9255. relnotes.py missing last jira (tgraves)
[suresh] HADOOP-9247. Merge r1438698 from trunk
[tgraves] Fix HDFS change log from left over merge entries
------------------------------------------
[...truncated 14607 lines...]
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.384s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.762s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.491s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.478s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.177s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.122s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.386s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.236s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.364s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.281s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.094s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.148s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.069s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.141s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.066s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.024s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.141s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.051s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.014s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.101s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.276s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.125s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.152s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.139s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.325s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.227s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.092s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.053s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.171s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.011s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0080s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.017s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.01s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.0080s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0090s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.288s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 17 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:54.771s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:55.375s
[INFO] Finished at: Tue Jan 29 11:35:07 UTC 2013
[INFO] Final Memory: 36M/303M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating YARN-133
Updating HADOOP-9255
Updating HADOOP-9247
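The console output above documents its own status legend ("in column 1: *=okay X=brokenLink ^=pageSkipped"), so the per-document results can be tallied mechanically from a saved console log. A minimal sketch (the sample lines are inlined here as an assumption; in real use you would point the greps at the downloaded Jenkins console output instead):

```shell
# Tally Cocoon document statuses from a Forrest/Cocoon console log.
# Sample log lines are embedded inline for illustration only.
cat > /tmp/cocoon-sample.log <<'EOF'
     [exec] * [2/26]   [1/29]    0.762s 19.4Kb  hdfs_permissions_guide.html
     [exec] X [0]      images/hdfs-logo.jpg BROKEN: (No such file or directory)
     [exec] ^ api/index.html
EOF

# Count each status marker; the column-1 character encodes the result.
okay=$(grep -c '\[exec\] \* ' /tmp/cocoon-sample.log)
broken=$(grep -c '\[exec\] X ' /tmp/cocoon-sample.log)
skipped=$(grep -c '\[exec\] \^ ' /tmp/cocoon-sample.log)
echo "okay=$okay broken=$broken skipped=$skipped"
```

Since Forrest fails the whole site build if even one link is broken (as the log notes), a nonzero `broken` count here is enough to explain the downstream "BUILD FAILED" without reading the full 9000+ truncated lines.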
Hadoop-Hdfs-0.23-Build - Build # 509 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/509/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 14800 lines...]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 17 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:54.771s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:55.375s
[INFO] Finished at: Tue Jan 29 11:35:07 UTC 2013
[INFO] Final Memory: 36M/303M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating YARN-133
Updating HADOOP-9255
Updating HADOOP-9247
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #508
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/508/>
------------------------------------------
[...truncated 15072 lines...]
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/webapp/resources> not found.
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.488s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.763s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 1.365s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.312s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.193s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.122s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.995s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.189s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.351s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.271s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.1s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.149s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.074s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.177s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.067s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.04s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.135s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.056s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.015s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.096s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.149s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.262s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.16s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.024s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.135s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.179s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.231s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.233s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.049s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.165s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.018s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.0090s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.017s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.01s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.0090s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0090s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 17 seconds
[exec] * [52/2] [0/0] 0.24s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:53.103s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:53.704s
[INFO] Finished at: Mon Jan 28 11:34:49 UTC 2013
[INFO] Final Memory: 37M/474M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #507
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/507/>
------------------------------------------
[...truncated 14638 lines...]
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/webapp/resources> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.363s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.814s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.417s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.643s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.176s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.125s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.397s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.226s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.347s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.018s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.269s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.094s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.143s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.077s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.125s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.066s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.024s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.134s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.053s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.014s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.097s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.148s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.261s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.163s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.139s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.181s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.374s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.081s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.05s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.166s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.01s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.015s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.0090s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.01s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.016s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.247s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 17 seconds
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:50.629s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:51.233s
[INFO] Finished at: Sun Jan 27 11:34:44 UTC 2013
[INFO] Final Memory: 35M/332M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #506
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/506/>
------------------------------------------
[...truncated 14643 lines...]
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.428s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.78s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.407s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.57s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.185s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.114s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.394s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.211s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.46s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.147s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.094s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.147s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.073s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.131s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.062s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.029s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.254s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.051s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.017s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.1s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.145s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.134s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.155s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.136s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.33s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.228s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.085s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.051s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.189s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.0090s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.011s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.01s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.018s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.01s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0090s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.283s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 17 seconds
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:54.074s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:54.679s
[INFO] Finished at: Sat Jan 26 11:35:11 UTC 2013
[INFO] Final Memory: 37M/452M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Hadoop-Hdfs-0.23-Build - Build # 506 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/506/
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #505
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/505/>
------------------------------------------
[...truncated 9010 lines...]
[exec] validate-skinconf:
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.498s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.841s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 1.309s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 2.806s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.177s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.119s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.576s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.178s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.301s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.016s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.255s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.093s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.141s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.072s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.125s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.068s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.024s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.128s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.05s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.014s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.099s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.141s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.247s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.153s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.013s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.138s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.181s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.224s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.226s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.05s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.175s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.01s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0080s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.01s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.0090s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.01s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.367s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 12 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
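For anyone triaging this from the archive: the broken-links.xml file named above is a small XML report, and the offending URIs can be pulled out of it with a one-liner. The element and attribute names below are assumptions about typical Forrest report layout (illustrated with the two paths reported broken in this build), not verified against this workspace:

```shell
# Create a sample report matching the assumed Forrest broken-links.xml layout
# (element/attribute names here are illustrative, not taken from this build).
cat > broken-links.xml <<'EOF'
<broken-links>
  <link message="No such file or directory" uri="images/hdfs-logo.jpg"/>
  <link message="No such file or directory" uri="hdfs_design.html"/>
</broken-links>
EOF

# List just the broken URIs, one per line.
grep -o 'uri="[^"]*"' broken-links.xml | sed 's/uri="//; s/"$//'
```

Run against the real report on the build slave, this would list each page or image Cocoon flagged with `X` in the status column.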
[exec]
[exec] Total time: 15 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:52.667s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:53.327s
[INFO] Finished at: Fri Jan 25 11:35:02 UTC 2013
[INFO] Final Memory: 37M/403M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Hadoop-Hdfs-0.23-Build - Build # 505 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/505/
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #504
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/504/changes>
Changes:
[jlowe] svn merge -c 1437775 FIXES: YARN-354. WebAppProxyServer exits immediately after startup. Contributed by Liang Xie
[suresh] HDFS-4426. Merge change 1437627 from trunk.
------------------------------------------
[...truncated 9011 lines...]
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/webapp/resources> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.357s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.803s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.766s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.091s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.191s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.134s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.391s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.23s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.385s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.317s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.119s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.158s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.077s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.313s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.074s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.024s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.136s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.051s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.019s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.025s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.28s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.133s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.154s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.135s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.18s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.361s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.084s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.05s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.066s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.011s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.0080s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.015s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.0090s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0080s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.253s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 17 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:54.629s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:55.227s
[INFO] Finished at: Thu Jan 24 11:34:49 UTC 2013
[INFO] Final Memory: 36M/396M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating YARN-354
Updating HDFS-4426
Hadoop-Hdfs-0.23-Build - Build # 504 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/504/
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #503
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/503/changes>
Changes:
[sseth] MAPREDUCE-4946. Fix a performance problem for large jobs by reducing the number of map completion event type conversions. Contributed by Jason Lowe.
------------------------------------------
[...truncated 9010 lines...]
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.426s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.766s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.258s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.796s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.183s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.134s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.496s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.226s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.459s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.016s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.15s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.112s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.146s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.075s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.128s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.064s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.032s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.262s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.05s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.019s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.097s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.149s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.135s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.152s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.283s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.192s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.222s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.084s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.057s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.17s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.0090s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.01s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.01s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0080s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.012s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.0090s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0090s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.361s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0060s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 18 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:58.238s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:58.848s
[INFO] Finished at: Wed Jan 23 11:35:17 UTC 2013
[INFO] Final Memory: 35M/357M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating MAPREDUCE-4946
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #502
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/502/>
------------------------------------------
[...truncated 9011 lines...]
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.522s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.694s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 1.27s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.234s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.182s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.124s 12.3Kb hdfs_editsviewer.pdf
[exec] Fontconfig error: Cannot load default config file
[exec] * [8/22] [0/0] 1.012s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.177s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.473s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.158s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.105s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.149s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.077s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.176s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.068s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.035s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.274s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.051s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.015s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.105s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.151s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.149s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.162s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.025s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.286s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.204s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.242s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.087s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.056s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.169s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.0090s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.015s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.01s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0040s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.0090s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.015s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] Java Result: 1
[exec] * [52/2] [0/0] 0.28s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 18 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:54.294s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:54.983s
[INFO] Finished at: Tue Jan 22 11:36:09 UTC 2013
[INFO] Final Memory: 36M/380M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Hadoop-Hdfs-0.23-Build - Build # 502 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/502/
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #501
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/501/>
------------------------------------------
[...truncated 9010 lines...]
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/webapp/resources> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.361s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.802s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.146s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.577s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.171s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.115s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.41s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.2s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.32s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.264s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.095s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.148s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.075s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.124s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.066s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.024s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.132s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.049s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.014s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.106s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.143s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.249s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.152s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.017s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.131s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.195s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.228s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.09s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.049s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.169s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.0090s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.014s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0080s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0080s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.01s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.01s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0080s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.28s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 16 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:51.368s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:51.968s
[INFO] Finished at: Mon Jan 21 11:35:37 UTC 2013
[INFO] Final Memory: 37M/404M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Hadoop-Hdfs-0.23-Build - Build # 501 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/501/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9203 lines...]
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 16 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:51.368s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:51.968s
[INFO] Finished at: Mon Jan 21 11:35:37 UTC 2013
[INFO] Final Memory: 37M/404M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #500
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/500/>
------------------------------------------
[...truncated 9010 lines...]
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] 12 file(s) have been successfully validated.
[exec] ...validated xdocs
[exec]
[exec] validate-skinconf:
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.36s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.764s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 1.676s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 2.814s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.17s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.121s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.4s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.186s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.388s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.27s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.103s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.145s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.075s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.234s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.068s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.026s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.131s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.057s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.014s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.036s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.268s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.128s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.161s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.013s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.137s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.336s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.216s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.086s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.054s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.066s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.016s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.01s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.014s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0040s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.0090s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0090s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 16 seconds
[exec] * [52/2] [0/0] 0.305s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 12 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:51.880s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:52.477s
[INFO] Finished at: Sun Jan 20 11:35:45 UTC 2013
[INFO] Final Memory: 36M/367M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Hadoop-Hdfs-0.23-Build - Build # 500 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/500/
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #499
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/499/changes>
Changes:
[jeagles] MAPREDUCE-4458. Warn if java.library.path is used for AM or Task (Robert Parker via jeagles)
------------------------------------------
[...truncated 14616 lines...]
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.33s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.834s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.883s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 2.919s 127.4Kb webhdfs.pdf
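[exec] (Editor's note: the two property names in the layout-overflow warnings above are the standard WebHDFS SPNEGO authentication settings. For reference, a typical hdfs-site.xml fragment looks like the following — only the property names come from the log; the values are illustrative.)

```xml
<!-- Illustrative values; only the property names appear in this build log. -->
<property>
  <name>dfs.web.authentication.kerberos.principal</name>
  <value>HTTP/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>dfs.web.authentication.kerberos.keytab</name>
  <value>/etc/security/keytabs/spnego.service.keytab</value>
</property>
```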
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.178s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.119s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.391s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.229s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.356s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.266s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.098s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.146s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.077s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.145s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.067s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.025s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.132s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.063s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.015s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.097s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.285s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.14s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.159s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.145s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.327s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.22s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.091s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.054s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.163s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.018s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.014s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.0090s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0040s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.0090s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.013s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.222s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 17 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:54.259s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:54.868s
[INFO] Finished at: Sat Jan 19 11:35:41 UTC 2013
[INFO] Final Memory: 36M/360M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating MAPREDUCE-4458
Hadoop-Hdfs-0.23-Build - Build # 499 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/499/
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
REGRESSION: org.apache.hadoop.hdfs.server.datanode.TestBlockRecovery.testRaceBetweenReplicaRecoveryAndFinalizeBlock
Error Message:
Problem binding to [localhost:50070] java.net.BindException: Address already in use; For more details see: http://wiki.apache.org/hadoop/BindException
Stack Trace:
java.net.BindException: Problem binding to [localhost:50070] java.net.BindException: Address already in use; For more details see: http://wiki.apache.org/hadoop/BindException
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:689)
at org.apache.hadoop.ipc.Server.bind(Server.java:258)
at org.apache.hadoop.ipc.Server$Listener.<init>(Server.java:348)
at org.apache.hadoop.ipc.Server.<init>(Server.java:1648)
at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:609)
at org.apache.hadoop.ipc.WritableRpcEngine$Server.<init>(WritableRpcEngine.java:350)
at org.apache.hadoop.ipc.WritableRpcEngine.getServer(WritableRpcEngine.java:300)
at org.apache.hadoop.ipc.RPC.getServer(RPC.java:581)
at org.apache.hadoop.ipc.RPC.getServer(RPC.java:570)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.<init>(NameNodeRpcServer.java:146)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createRpcServer(NameNode.java:360)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:338)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:462)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:454)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:755)
at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:678)
at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:573)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:282)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:88)
at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:268)
at org.apache.hadoop.hdfs.server.datanode.TestBlockRecovery.__CLR3_0_2uxpta118jm(TestBlockRecovery.java:570)
at org.apache.hadoop.hdfs.server.datanode.TestBlockRecovery.testRaceBetweenReplicaRecoveryAndFinalizeBlock(TestBlockRecovery.java:566)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
at org.junit.internal.runners.statements.FailOnTimeout$1.run(FailOnTimeout.java:28)
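(Editor's note: the BindException above comes from the test binding a fixed well-known port — 50070, the default NameNode HTTP port — which fails whenever another process already holds it. The usual remedy in test code is to request an ephemeral port by binding to port 0. A minimal sketch, assuming nothing about the actual HDFS fix — the class name and helper are illustrative:)

```java
import java.net.ServerSocket;

// Sketch of the common remedy for "Address already in use" in tests:
// binding to port 0 asks the OS for any free ephemeral port, so parallel
// test runs cannot collide on a fixed port such as 50070.
public class EphemeralPortExample {

    static int freePort() throws Exception {
        try (ServerSocket socket = new ServerSocket(0)) {
            return socket.getLocalPort();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("picked free port " + freePort());
    }
}
```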
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #498
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/498/changes>
Changes:
[tgraves] HADOOP-9216. CompressionCodecFactory#getCodecClasses should trim the result of parsing by Configuration. (Tsuyoshi Ozawa via todd)
[tgraves] HADOOP-9212. Potential deadlock in FileSystem.Cache/IPC/UGI (Tom White via tgraves)
[tgraves] MAPREDUCE-4278. cannot run two local jobs in parallel from the same gateway. (Sandy Ryza via tgraves)
[bobby] svn merge -c 1434868 FIXES: HADOOP-8849. FileUtil#fullyDelete should grant the target directories +rwx permissions (Ivan A. Veselovsky via bobby)
[tgraves] HADOOP-9155. FsPermission should have different default value, 777 for directory and 666 for file (Binglin Chang via tgraves)
[tgraves] HADOOP-9147. Add missing fields to FIleStatus.toString.(Jonathan Allen via suresh)
[tgraves] HADOOP-7886 Add toString to FileStatus (SreeHari via tgraves)
[tgraves] MAPREDUCE-4907. TrackerDistributedCacheManager issues too many getFileStatus calls (Sandy Ryza via tgraves)
------------------------------------------
[...truncated 9017 lines...]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.355s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.76s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 1.379s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 2.708s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.165s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.119s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.388s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.185s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.433s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.143s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.095s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.141s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.076s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.126s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.067s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.025s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.133s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.053s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.015s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.095s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.276s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.128s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.152s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.017s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.135s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.334s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.23s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.102s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.055s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.418s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.016s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.01s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.01s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.0090s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0090s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.238s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 12 seconds, Site size: 694,383 Site pages: 43
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 15 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:51.973s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:52.570s
[INFO] Finished at: Fri Jan 18 11:35:58 UTC 2013
[INFO] Final Memory: 37M/435M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating HADOOP-7886
Updating MAPREDUCE-4278
Updating HADOOP-9147
Updating HADOOP-9155
Updating HADOOP-8849
Updating HADOOP-9216
Updating MAPREDUCE-4907
Updating HADOOP-9212
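The failure message above directs readers to `broken-links.xml` in the generated site root. Forrest's exact schema for that file can vary by version, so the sketch below parses a hypothetical excerpt in the shape Forrest typically writes (a `<broken-links>` root with `<link>` children carrying `uri` and `message` attributes) to triage which documents are broken:

```python
import xml.etree.ElementTree as ET

# Hypothetical excerpt in the shape Forrest typically writes to
# broken-links.xml; the real file's schema may differ by version.
SAMPLE = """<broken-links>
  <link message="No such file or directory" uri="images/hdfs-logo.jpg">
    <referrer uri="index.html"/>
  </link>
  <link message="No such file or directory" uri="hdfs_design.html">
    <referrer uri="linkmap.html"/>
  </link>
</broken-links>"""

def broken_links(xml_text):
    """Return (uri, message) pairs for every <link> entry."""
    root = ET.fromstring(xml_text)
    return [(link.get("uri"), link.get("message"))
            for link in root.findall("link")]

for uri, msg in broken_links(SAMPLE):
    print(f"{uri}: {msg}")
```

Running something like this against the real file on the build slave would list every broken document at once, instead of scraping the `X [0] ... BROKEN:` lines out of 9000+ lines of console output.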
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #497
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/497/changes>
Changes:
[jlowe] svn merge -c 1303634 FIXES: HADOOP-8157. Fix race condition in Configuration that could cause spurious ClassNotFoundExceptions after a GC. Contributed by Todd Lipcon.
[tgraves] Preparing for 0.23.7 development
[tgraves] Preparing for release 0.23.6
------------------------------------------
[...truncated 9008 lines...]
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/webapp/resources> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.397s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.766s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 1.337s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 2.711s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.173s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.116s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.46s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.183s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.479s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.016s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.146s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.097s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.14s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.072s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.128s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.063s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.035s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.261s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.052s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.017s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.096s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.168s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.134s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.163s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.28s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.208s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.22s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.088s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.058s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.19s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.011s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.0090s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.01s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0080s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.01s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.013s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0090s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.276s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 12 seconds, Site size: 694,383 Site pages: 43
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 15 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:56.005s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:56.607s
[INFO] Finished at: Thu Jan 17 11:36:13 UTC 2013
[INFO] Final Memory: 36M/443M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating HADOOP-8157
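The `JavadocArchiver` stack trace in both builds is a downstream symptom, not a separate bug: the `site` goal fails before javadoc output is generated, so `target/site/api` never exists when the publisher scans it. A minimal sketch of the guard that would avoid the `IllegalStateException` (assuming the publisher simply needs the basedir to exist):

```python
import os
import tempfile

def javadoc_ready(basedir):
    """Mirror the precondition JavadocArchiver trips over: the configured
    javadoc basedir must exist before archiving is attempted."""
    return os.path.isdir(basedir)

# The failed site goal means target/site/api was never created, so the
# publisher aborts. A pre-check would skip archiving instead of aborting:
with tempfile.TemporaryDirectory() as ws:
    missing = os.path.join(ws, "target", "site", "api")
    print(javadoc_ready(missing))   # directory never created -> False
    os.makedirs(missing)
    print(javadoc_ready(missing))   # once javadoc output exists -> True
```

In other words, fixing the Forrest broken links should make this publisher error disappear on its own; the archiver failure does not need independent attention.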
Hadoop-Hdfs-0.23-Build - Build # 497 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/497/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9201 lines...]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 15 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:56.005s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:56.607s
[INFO] Finished at: Thu Jan 17 11:36:13 UTC 2013
[INFO] Final Memory: 36M/443M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating HADOOP-8157
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
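Note that every failing build reports the same two classes of broken link: a genuinely missing source (`hdfs_design.xml`) and image references where a `.` appears in place of the path separator (`images.hdfs-logo.jpg`, `images.hdfsarchitecture.gif` versus the working `images/hadoop-logo.jpg`). A small, hypothetical checker for that second pattern — the directory list and helper name are illustrative, not from the build:

```python
# Known doc image directories in this tree; a hypothetical list for the sketch.
IMAGE_DIRS = ("images", "skin/images")

def dot_for_slash(path):
    """Flag paths like 'images.hdfs-logo.jpg' that look like a known
    directory name fused to a file name with '.' instead of '/'.
    Returns the suggested corrected path, or None if nothing matches."""
    for d in IMAGE_DIRS:
        prefix = d.replace("/", ".") + "."
        if path.startswith(prefix) and "/" not in path:
            return d + "/" + path[len(prefix):]
    return None

print(dot_for_slash("images.hdfs-logo.jpg"))   # suggests images/hdfs-logo.jpg
print(dot_for_slash("images/hadoop-logo.jpg")) # already well-formed -> None
```

If the malformed references live in the xdocs sources, a sweep with a check like this would point at the exact `src/documentation/content/xdocs` files to patch.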
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #496
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/496/changes>
Changes:
[suresh] HADOOP-9217. Merging change 1433713 from trunk
[tgraves] HDFS-4399. precommit release audit warnings (tgraves)
[bobby] MAPREDUCE-4921. JobClient should acquire HS token with RM principal (Daryn Sharp via bobby)
[harsh] MAPREDUCE-4925. The pentomino option parser may be buggy. Contributed by Karthik Kambatla. (harsh)
[harsh] MAPREDUCE-4678. Running the Pentomino example with defaults throws java.lang.NegativeArraySizeException. Contributed by Chris McConnell. (harsh)
------------------------------------------
[...truncated 14616 lines...]
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.54s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.781s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.823s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.275s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.176s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.124s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 1.125s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.23s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.521s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.153s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.098s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.144s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.081s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.218s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.065s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.045s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.262s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.052s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.017s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.024s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.16s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.136s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.153s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.14s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.333s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.221s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.091s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.055s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.065s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.011s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.0090s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.0090s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.01s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.248s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0070s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 15 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 19 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:55.267s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:55.901s
[INFO] Finished at: Wed Jan 16 11:36:10 UTC 2013
[INFO] Final Memory: 35M/300M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating MAPREDUCE-4678
Updating MAPREDUCE-4921
Updating HADOOP-9217
Updating HDFS-4399
Updating MAPREDUCE-4925
Hadoop-Hdfs-0.23-Build - Build # 496 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/496/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 14809 lines...]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 19 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:55.267s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:55.901s
[INFO] Finished at: Wed Jan 16 11:36:10 UTC 2013
[INFO] Final Memory: 35M/300M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating MAPREDUCE-4678
Updating MAPREDUCE-4921
Updating HADOOP-9217
Updating HDFS-4399
Updating MAPREDUCE-4925
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #495
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/495/changes>
Changes:
[tgraves] HADOOP-9181. Set daemon flag for HttpServer's QueuedThreadPool (Liang Xie via tgraves)
[tgraves] YARN-170. NodeManager stop() gets called twice on shutdown (Sandy Ryza via tgraves)
[tgraves] HADOOP-9097. Maven RAT plugin is not checking all source files (tgraves)
[tgraves] HDFS-4385. Maven RAT plugin is not checking all source files (tgraves)
[tgraves] MAPREDUCE-4934. Maven RAT plugin is not checking all source files (tgraves)
[tgraves] YARN-334. Maven RAT plugin is not checking all source files (tgraves)
------------------------------------------
[...truncated 9014 lines...]
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.557s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.845s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.589s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.684s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.184s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.125s 12.3Kb hdfs_editsviewer.pdf
[exec] Fontconfig error: Cannot load default config file
[exec] * [8/22] [0/0] 0.728s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.232s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.368s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.138s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.161s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.108s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.155s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.086s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.183s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.067s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.043s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.153s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.064s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.016s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.1s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.159s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.266s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.157s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.015s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.141s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.191s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.393s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.088s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.05s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.134s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.012s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.01s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.01s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.011s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.011s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0090s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.217s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 15 seconds, Site size: 694,383 Site pages: 43
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 18 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:58.467s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:59.083s
[INFO] Finished at: Tue Jan 15 11:35:42 UTC 2013
[INFO] Final Memory: 37M/468M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating HADOOP-9181
Updating YARN-334
Updating HADOOP-9097
Updating YARN-170
Updating HDFS-4385
Updating MAPREDUCE-4934
Hadoop-Hdfs-0.23-Build - Build # 495 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/495/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9207 lines...]
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 18 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:58.467s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:59.083s
[INFO] Finished at: Tue Jan 15 11:35:42 UTC 2013
[INFO] Final Memory: 37M/468M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating HADOOP-9181
Updating YARN-334
Updating HADOOP-9097
Updating YARN-170
Updating HDFS-4385
Updating MAPREDUCE-4934
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #494
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/494/>
------------------------------------------
[...truncated 9012 lines...]
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/webapp/resources> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.366s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.837s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.711s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.04s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.167s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.11s 12.3Kb hdfs_editsviewer.pdf
[exec] Fontconfig error: Cannot load default config file
[exec] * [8/22] [0/0] 0.471s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.242s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.473s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.145s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.099s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.146s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.071s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.155s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.079s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.025s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.266s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.051s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.015s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.099s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.14s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.142s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.164s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.019s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.275s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.198s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.238s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.087s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.049s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.175s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.01s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.0080s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.016s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.01s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.011s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.01s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.018s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.219s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 17 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:55.038s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:55.673s
[INFO] Finished at: Mon Jan 14 11:35:37 UTC 2013
[INFO] Final Memory: 36M/395M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occurred: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #493
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/493/>
------------------------------------------
[...truncated 9010 lines...]
[exec] validate-skinconf:
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.4s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.772s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.489s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.399s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.19s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.118s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.401s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.251s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.501s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.142s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.104s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.16s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.071s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.269s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.075s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.026s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.139s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.051s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.015s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.028s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.156s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.138s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.161s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.017s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.147s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.185s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.365s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.092s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.051s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.072s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.019s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.01s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.011s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0060s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.01s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0090s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.259s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 17 seconds
[exec] Total time: 0 minutes 14 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:56.449s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:57.051s
[INFO] Finished at: Sun Jan 13 11:35:36 UTC 2013
[INFO] Final Memory: 35M/378M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occurred: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #492
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/492/changes>
Changes:
[tgraves] YARN-80. Add support for delaying rack-local containers in CapacityScheduler. (acmurthy)
[jlowe] svn merge -c 1429872 to add svn:ignore properties to hadoop-mapreduce-client-hs-plugins
[tgraves] MAPREDUCE-4810. Added new admin command options for MR AM. (Jerry Chen via tgraves)
------------------------------------------
[...truncated 14614 lines...]
[exec] ...validated xdocs
[exec]
[exec] validate-skinconf:
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.381s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.794s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 1.541s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 2.748s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.187s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.126s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.392s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.178s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.454s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.156s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.093s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.134s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.075s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.126s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.066s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.025s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.133s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.05s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.134s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.096s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.145s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.129s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.172s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.146s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.33s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.224s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.084s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.053s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.174s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.012s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.01s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.01s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.011s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0090s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 15 seconds
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.243s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 12 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:53.923s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:54.528s
[INFO] Finished at: Sat Jan 12 11:35:45 UTC 2013
[INFO] Final Memory: 39M/483M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating YARN-80
Updating MAPREDUCE-4810
Hadoop-Hdfs-0.23-Build - Build # 492 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/492/
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #491
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/491/changes>
Changes:
[kihwal] merge -r 1382408:1382409 for HDFS-2757
------------------------------------------
[...truncated 14348 lines...]
[exec] 12 file(s) have been successfully validated.
[exec] ...validated xdocs
[exec]
[exec] validate-skinconf:
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.374s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.794s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.06s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 2.963s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.196s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.113s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.404s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.24s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.491s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.017s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.147s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.103s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.155s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.069s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.156s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.067s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.025s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.251s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.053s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.015s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.106s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.133s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.135s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.16s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.144s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.343s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.222s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.087s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.049s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.203s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.0090s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.013s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0080s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.0090s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.01s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0080s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.264s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 13 seconds, Site size: 694,383 Site pages: 43
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 17 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:50.652s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:51.254s
[INFO] Finished at: Fri Jan 11 11:35:41 UTC 2013
[INFO] Final Memory: 37M/459M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating HDFS-2757
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #490
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/490/changes>
Changes:
[jlowe] svn merge -c 1431131 FIXES: MAPREDUCE-4848. TaskAttemptContext cast error during AM recovery. Contributed by Jerry Chen
[tgraves] YARN-325. RM CapacityScheduler can deadlock when getQueueInfo() is called and a container is completing (Arun C Murthy via tgraves)
[sseth] YARN-320. RM should always be able to renew its own tokens. Contributed by Daryn Sharp
------------------------------------------
[...truncated 14337 lines...]
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.356s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.735s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 1.622s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 2.729s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.199s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.113s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.395s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.187s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.35s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.143s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.138s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.104s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.148s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.072s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.125s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.062s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.027s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.134s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.05s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.017s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.097s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.267s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.135s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.162s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.017s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.139s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.331s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.223s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.087s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.056s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.435s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.01s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.0080s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.013s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.01s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.014s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0090s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.25s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 12 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 16 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:50.547s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:51.152s
[INFO] Finished at: Thu Jan 10 11:35:21 UTC 2013
[INFO] Final Memory: 36M/374M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating YARN-325
Updating YARN-320
Updating MAPREDUCE-4848
Hadoop-Hdfs-0.23-Build - Build # 490 - Still Failing
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/490/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 14530 lines...]
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml
[exec]
[exec] Total time: 16 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:50.547s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:51.152s
[INFO] Finished at: Thu Jan 10 11:35:21 UTC 2013
[INFO] Final Memory: 36M/374M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Updating YARN-325
Updating YARN-320
Updating MAPREDUCE-4848
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #489
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/489/>
------------------------------------------
[...truncated 23381 lines...]
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.597s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.798s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.438s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.607s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.184s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.121s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.719s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.221s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.375s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.016s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.271s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.101s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.156s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.084s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.255s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.068s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.025s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.132s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.055s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.015s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.026s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.138s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.26s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.159s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.135s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.187s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.392s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.081s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.05s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.065s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.01s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.011s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.01s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0050s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.0090s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.011s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.306s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 15 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 19 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:59.629s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:00.710s
[INFO] Finished at: Wed Jan 09 11:35:53 UTC 2013
[INFO] Final Memory: 37M/517M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #488
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/488/>
------------------------------------------
[...truncated 14298 lines...]
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/webapp/resources> not found.
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.562s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.845s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.402s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.562s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.18s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.118s 12.3Kb hdfs_editsviewer.pdf
[exec] * [8/22] [0/0] 0.963s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.234s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.497s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.016s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.147s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.097s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.143s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.074s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.254s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.068s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.069s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.256s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.058s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.015s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.027s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.14s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.13s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.152s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.145s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.338s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.234s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.09s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.05s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.064s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.013s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.01s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.01s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.015s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0060s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.01s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.01s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.384s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 15 seconds, Site size: 694,383 Site pages: 43
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 19 seconds
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
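The broken-links.xml file that the Forrest hint above points to lists the failed documents, but the same information is already in this console output: Cocoon prefixes each broken document with "X" and the word "BROKEN:". A minimal sketch of pulling those lines out of a saved copy of the log; build.log is a hypothetical local file, seeded here with sample lines copied from the output above:

```shell
# Seed a sample log with lines in the format Cocoon printed above.
printf '%s\n' \
  '[exec] * [24/11] [0/0] 0.058s 8.0Kb index.pdf' \
  '[exec] X [0] images/hdfs-logo.jpg BROKEN: missing (No such file or directory)' \
  '[exec] X [0] hdfs_design.html BROKEN: missing (No such file or directory)' \
  > build.log

# Count the broken documents (prints 2 for the sample above).
grep -c 'BROKEN:' build.log

# Field 4 of each BROKEN line is the document name.
grep 'BROKEN:' build.log | awk '{print $4}'
```

Run against the real console log, this gives the same list as broken-links.xml without waiting for the workspace to be archived.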
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:56.074s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:56.675s
[INFO] Finished at: Tue Jan 08 11:35:52 UTC 2013
[INFO] Final Memory: 36M/333M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
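The [ERROR] hints above can be combined with Maven's resume option to re-run only the failing module with full diagnostics. A sketch, with assumptions: the `:hadoop-hdfs` coordinate is inferred from the reactor summary, and the `site` goal is inferred from the failing antrun execution (the job's actual goal list is in the truncated part of the log):

```shell
# -e prints the full stack trace, -X enables debug logging, and
# -rf (--resume-from) restarts the reactor at the failing module.
# MVN matches the Maven path used elsewhere in this log.
MVN="/home/jenkins/tools/maven/latest/bin/mvn"
CMD="$MVN -e -X -rf :hadoop-hdfs site"
echo "$CMD"
```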
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
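The IllegalStateException above is a secondary failure: target/site/api was never created because the docs build aborted earlier, so the Javadoc publisher has nothing to scan. A guard like the following in the job's shell step would turn that into a skip rather than an aborted publisher; the path is the one from the exception, relative to the workspace trunk:

```shell
# Check for the Javadoc output dir before letting the publisher run.
API_DIR="hadoop-hdfs-project/hadoop-hdfs/target/site/api"
if [ -d "$API_DIR" ]; then
  echo "javadoc present"
else
  echo "javadoc missing; skipping archive"
fi
```

This does not fix the underlying site-build failure (the broken links reported by Cocoon), it only keeps the post-build steps from compounding it.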
Recording fingerprints
Build failed in Jenkins: Hadoop-Hdfs-0.23-Build #487
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/487/>
------------------------------------------
[...truncated 14312 lines...]
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/webapp/resources> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt/images> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/common> not found.
[exec] Warning: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/skins/pelt> not found.
[exec] Copied 1 empty directory to 1 empty directory under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Copying main skin images to site ...
[exec] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 20 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying 14 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin/images>
[exec] Copying project skin images to site ...
[exec] Copying main skin css and js files to site ...
[exec] Copying 11 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copied 4 empty directories to 3 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying 4 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/skin>
[exec] Copying project skin css and js files to site ...
[exec]
[exec] Finished copying the non-generated resources.
[exec] Now Cocoon will generate the rest.
[exec]
[exec]
[exec] Static site will be generated at:
[exec] <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec]
[exec] Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec]
[exec] ------------------------------------------------------------------------
[exec] cocoon 2.1.12-dev
[exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
[exec] ------------------------------------------------------------------------
[exec]
[exec]
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [1/26] [26/30] 2.472s 8.6Kb linkmap.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [2/26] [1/29] 0.781s 19.4Kb hdfs_permissions_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] ^ api/org/apache/hadoop/fs/FileStatus.html
[exec] ^ api/org/apache/hadoop/fs/Path.html
[exec] * [3/26] [1/63] 2.181s 67.6Kb webhdfs.html
[exec] WARN - Line 1 of a paragraph overflows the available area by 30000mpt. (fo:block, "dfs.web.authentication.kerberos.principal")
[exec] WARN - Line 1 of a paragraph overflows the available area by 12000mpt. (fo:block, "dfs.web.authentication.kerberos.keytab")
[exec] * [4/25] [0/0] 3.434s 127.4Kb webhdfs.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [6/24] [1/29] 0.171s 10.8Kb hdfs_editsviewer.html
[exec] * [7/23] [0/0] 0.112s 12.3Kb hdfs_editsviewer.pdf
[exec] Fontconfig error: Cannot load default config file
[exec] * [8/22] [0/0] 0.929s 348b skin/images/rc-b-l-15-1body-2menu-3menu.png
[exec] X [0] images/hdfs-logo.jpg BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfs-logo.jpg> (No such file or directory)
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [11/20] [1/29] 0.23s 27.3Kb hdfs_imageviewer.html
[exec] * [12/19] [0/0] 0.506s 31.0Kb hdfs_imageviewer.pdf
[exec] * [13/18] [0/0] 0.016s 766b images/favicon.ico
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] ^ api/org/apache/hadoop/fs/FileSystem.html
[exec] * [14/18] [1/30] 0.147s 9.5Kb libhdfs.html
[exec] * [15/17] [0/0] 0.1s 10.1Kb linkmap.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [16/17] [1/29] 0.149s 11.7Kb hdfs_quota_admin_guide.html
[exec] * [17/16] [0/0] 0.076s 13.9Kb hdfs_quota_admin_guide.pdf
[exec] * [18/15] [0/0] 0.159s 1.2Kb skin/print.css
[exec] X [0] hdfs_design.html BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/hdfs_design.xml> (No such file or directory)
[exec] * [20/13] [0/0] 0.066s 14.0Kb libhdfs.pdf
[exec] * [21/12] [0/0] 0.026s 4.4Kb skin/profile.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [23/12] [2/31] 0.258s 7.0Kb index.html
[exec] * [24/11] [0/0] 0.051s 8.0Kb index.pdf
[exec] * [25/10] [0/0] 0.014s 9.2Kb images/hadoop-logo.jpg
[exec] * [27/8] [0/0] 0.104s 2.9Kb skin/basic.css
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [28/8] [1/29] 0.147s 14.5Kb SLG_user_guide.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [29/8] [1/29] 0.141s 8.4Kb hftp.html
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [30/9] [2/30] 0.154s 20.1Kb faultinject_framework.html
[exec] * [31/8] [0/0] 0.014s 30.2Kb images/FI-framework.gif
[exec] * [32/7] [0/0] 0.142s 23.4Kb hdfs_permissions_guide.pdf
[exec] ^ api/index.html
[exec] ^ jdiff/changes.html
[exec] ^ releasenotes.html
[exec] ^ changes.html
[exec] * [33/8] [2/31] 0.33s 35.8Kb hdfs_user_guide.html
[exec] * [34/7] [0/0] 0.209s 48.3Kb hdfs_user_guide.pdf
[exec] X [0] images/hdfsarchitecture.gif BROKEN: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/src/documentation/content/xdocs/images.hdfsarchitecture.gif> (No such file or directory)
[exec] * [36/5] [0/0] 0.088s 15.7Kb SLG_user_guide.pdf
[exec] * [37/4] [0/0] 0.049s 10.6Kb hftp.pdf
[exec] * [38/16] [13/13] 0.177s 12.3Kb skin/screen.css
[exec] * [39/15] [0/0] 0.01s 214b skin/images/rc-t-r-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [40/14] [0/0] 0.0090s 319b skin/images/rc-b-r-15-1body-2menu-3menu.png
[exec] * [41/13] [0/0] 0.012s 199b skin/images/rc-t-l-5-1header-2tab-unselected-3tab-unselected.png
[exec] * [43/11] [0/0] 0.0090s 215b skin/images/rc-t-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [44/10] [0/0] 0.0090s 199b skin/images/rc-t-l-5-1header-2searchbox-3searchbox.png
[exec] * [46/8] [0/0] 0.0090s 214b skin/images/rc-t-r-5-1header-2searchbox-3searchbox.png
[exec] * [47/7] [0/0] 0.021s 390b skin/images/rc-t-r-15-1body-2menu-3menu.png
[exec] * [48/6] [0/0] 0.0040s 285b images/instruction_arrow.png
[exec] * [49/5] [0/0] 0.0090s 200b skin/images/rc-b-r-5-1header-2tab-selected-3tab-selected.png
[exec] * [51/3] [0/0] 0.0090s 209b skin/images/rc-t-l-5-1header-2tab-selected-3tab-selected.png
[exec] WARN - Page 5: Unresolved id reference "Putting+it+all+together" found.
[exec] WARN - Page 6: Unresolved id reference "Putting+it+all+together" found.
[exec] * [52/2] [0/0] 0.269s 55.6Kb faultinject_framework.pdf
[exec] * [53/1] [0/0] 0.0050s 1.8Kb images/built-with-forrest-button.png
[exec] Total time: 0 minutes 15 seconds, Site size: 694,383 Site pages: 43
[exec]
[exec] Copying broken links file to site root.
[exec]
[exec] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site>
[exec] Java Result: 1
[exec]
[exec] BUILD FAILED
[exec] /home/jenkins/tools/forrest/latest/main/targets/site.xml:224: Error building site.
[exec]
[exec] There appears to be a problem with your site build.
[exec]
[exec] Read the output above:
[exec] * Cocoon will report the status of each document:
[exec] - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
[exec] * Even if only one link is broken, you will still get "failed".
[exec] * Your site would still be generated, but some pages would be broken.
[exec] - See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/docs-src/build/site/broken-links.xml>
[exec]
[exec] Total time: 18 seconds
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:59.415s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:00.016s
[INFO] Finished at: Mon Jan 07 11:35:12 UTC 2013
[INFO] Final Memory: 36M/334M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (site) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
java.lang.IllegalStateException: basedir <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
at org.apache.tools.ant.DirectoryScanner.scan(DirectoryScanner.java:879)
at hudson.FilePath$37.hasMatch(FilePath.java:2109)
at hudson.FilePath$37.invoke(FilePath.java:2006)
at hudson.FilePath$37.invoke(FilePath.java:1996)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2309)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:326)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Recording fingerprints